
Presentation Transcript


  1. Measuring ICT: A Need for Expertise. How will we know when “the most dynamic and competitive knowledge-based economy in the world has been created”? Vesna Luzar-Stiffler, Ph.D., University Computing Centre and CAIR Research Centre, Zagreb, Croatia; Charles Stiffler, Ph.D., CAIR Research Centre, Zagreb, Croatia. vluzar@srce.hr, charles.stiffler@cair-center.hr

  2. Introduction/Overview • Lisbon strategy/background (Feb 2004) • eSociety aims • Harmonization Strategy • Origin of Benchmarking • Case Study (structured around market research stages) • Conclusions/Recommendations for future research and additional expertise

  3. Lisbon strategy/background (Feb 2004) • How will we know? • Strategy • EU indicators • SEE indicators • Croatian strategic indicators • Benchmarking data, measures, methods • Tasks/timing • Budgeting

  4. eSociety Aims • Modernization and transparency (codification of procedure) • Harmonization, the big picture • Fiscal reform • Privatization • Banking reform • Uniform commercial code • Agriculture • FDI • SMEs

  5. Chain Reaction in 6 years?

  6. “The mother of all benchmarking” • TQM/Statistics 1949 (How fast do we have to grow? … assuming 6 years and 4.5%...) • Real learning organization (Shewhart control charts) • Systems theory (design it and throw it over the fence?) • Variation (what is inhibiting and promoting growth rates) • Psychology of human motivation (job security and growth)
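The bullet above asks how fast an economy has to grow, assuming a 6-year horizon and a 4.5% benchmark growth rate. One possible reading of that question is a simple compound-growth calculation; the sketch below uses those two figures from the slide, while the starting gap ratio is purely illustrative.

```python
# Compound-growth sketch for "how fast do we have to grow?"
# Assumptions: the 6-year horizon and 4.5% benchmark growth come from the slide;
# the starting gap (benchmark at 1.5x our level) is invented for illustration.
years = 6
benchmark_growth = 0.045
gap_ratio = 1.5  # benchmark level / our level (hypothetical)

# To close the gap in `years`, we need (1 + g)^years = gap_ratio * (1 + 0.045)^years.
required_growth = (gap_ratio * (1 + benchmark_growth) ** years) ** (1 / years) - 1
print(f"Required annual growth to catch up in {years} years: {required_growth:.1%}")
```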

  7. A Case Study in Nine Steps: Preview • ICT in Education: census of 40,000+ teachers / 1,300+ schools, 2003 • Data sources, variables measured, methods of collection • What we developed from the study (10 “comparable” indicators) • Exploratory/pilot study (no hypothesis test) • Overview of all 9 stages in the research process (we were brought in at stage 7) • Juggling 4 balls to create the system of research • Overview of major shortcomings • Opportunity to reuse/improve the next project

  8. A Case Study in Nine Steps: Census of 40,000+ teachers / 1,300+ schools

  9. A Case Study in Nine Steps: Census of 40,000+ teachers / 1,300+ schools • EU: data sources: infrastructure (HW/SW, ISPs), national statistics organizations, government; variables measured: 134/34 indicators, etc.; methods of collection: face-to-face, mail, telephone, panels, etc. • Case Study: data sources: infrastructure (HW/SW, ISPs), national statistics organization, Ministry of Education; variables measured: 200+ questions / 10 indicators; method of collection: Web

  10. What we developed from the study (10 “comparable” indicators)

  11. Number of computers per 100 students in primary education (Croatia vs. eEurope+, 2003) Croatia: 4.3 Source: eEurope+ Progress Report, Feb 2003

  12. Number of computers per 100 students in secondary education (Croatia vs. eEurope+, 2003) Croatia: 5.6 Source: eEurope+ Progress Report, Feb 2003

  13. Student-to-computer ratio in secondary education (Croatia, by county)
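Slides 11-13 report computers per 100 students and the student-to-computer ratio. A minimal sketch of how such indicators are derived from raw counts; the school-level figures below are invented for illustration, not taken from the census.

```python
# Minimal sketch: deriving the two access indicators from raw counts.
# The counts below are invented; the report's real values (e.g. 4.3 computers
# per 100 primary students) come from the census data.
schools = [
    {"county": "Zagreb", "students": 1200, "computers": 60},
    {"county": "Split",  "students": 800,  "computers": 30},
    {"county": "Osijek", "students": 500,  "computers": 15},
]

for s in schools:
    per_100 = 100 * s["computers"] / s["students"]  # computers per 100 students
    ratio = s["students"] / s["computers"]          # students per computer
    print(f'{s["county"]}: {per_100:.1f} computers/100 students, '
          f'{ratio:.1f} students per computer')
```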

  14. Grand Overview of Key Stages of Market Research 1. Formulation of the Problem 2. Secondary Data Issues 3. Methodology 4. Sampling Methods 5. Instrument Design 6. Collection of Data 7. Data Analyses (this is where we came in …) 8. Evaluation of Data and Methods 9. Report Preparation

  15. 1. Formulate the Problem • Handout of overview, kinds, uses • Decision support / careers • Ethics: sales calls vs. anonymous research • One-shot vs. ongoing nature (flash, candle) • Is it worth it? EVPI - research cost = value • What to collect, how, from whom, and WHY?

  16. DSS Example • EV(PI) = EV(C) - EV(no clue): 28 = 98 - 70.0 • “Max-max” • State of product demand: p(Low) = .6, p(Med) = .3, p(Heavy) = .1 • Price options (actions): H: EV(A1) = .6(100) + .3(50) + .1(-50) = 70.0; M: EV(A2) = .6(50) + .3(100) + .1(-25) = 57.5; L: EV(A3) = .6(-50) + .3(0) + .1(80) = -22.0
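For readers who prefer code to the payoff table above, a small runnable restatement of the same numbers: the expected value of each pricing action, the expected value under certainty, and the resulting EVPI.

```python
# Expected value of perfect information (EVPI), using the payoffs on this slide.
p = {"Low": 0.6, "Med": 0.3, "Heavy": 0.1}   # state-of-demand probabilities
payoff = {                                    # rows: price action, cols: demand state
    "High":   {"Low": 100, "Med": 50,  "Heavy": -50},
    "Medium": {"Low": 50,  "Med": 100, "Heavy": -25},
    "Low":    {"Low": -50, "Med": 0,   "Heavy": 80},
}

ev = {a: sum(p[s] * payoff[a][s] for s in p) for a in payoff}
ev_no_info = max(ev.values())                 # best without research: 70.0
ev_perfect = sum(p[s] * max(payoff[a][s] for a in payoff) for s in p)  # 98.0
evpi = ev_perfect - ev_no_info                # 28.0 = upper bound on research value
print(ev, ev_no_info, ev_perfect, evpi)
```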

  17. DSS Example • Prior analysis + pre-posterior analysis (prior research correlated with prior outcomes … data warehouse) • Issue: EV(research) = EV(A1) +/- (?): prior data + quality research

  18. 2. Secondary Data Issues (sources for the Schools study) • Why bother? Fast, cheap, perfect answers! (forecasts) • Where to go? Internet, Statistics Bureau (NSO), IRS, media, associations, library, competitors • What to find? Demographics, sales, inventory, share, methods, questions

  19. 3. Methodology Part I: Research Design (survey only) • For the study type (plan of action) • Observation, survey, experimentation • Exploratory - no hypothesis (e.g., focus groups/qualitative, cases, etc.) • Descriptive - hypothesis, enumerative (who, what, when, where, how) • Longitudinal - panel, omnibus; cross-sectional • Causal - experimental

  20. 3. Methodology Part I: Communication (survey only) • For the communication method • Mail - no pressure, cheap?; uncontrollable, bias, response rate, language, culture • Telephone - fast, cheaper, can monitor; reach?, down-and-dirty only, shallow • Personal interview - visual, depth; expense, time • Web / e-mail / fax

  21. 3. Methodology Part II: Steps to Drawing a Sample • Define the population • Identify the sample frame • Select the sampling procedure • Determine the sample size • Select the designated sample elements • Collect the sample data

  22. 4. Sampling Designs and Terms • Non-probability type designs • Convenience • Quota • Judgment • Panels • Focus groups • Depth interviews (informants)

  23. 4. Sampling Designs and Terms • Probability type designs (i.e., scientific) • Simple Random Sample (SRS) • Random Sample (RS) • Systematic Random Sample (SY) • Stratified Sampling Design (ST) (proportional, disproportionate) • Cluster Sampling Design (CL) (systematic, area) • Complex Estimators (efficiency, effectiveness) - formulas
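A minimal sketch contrasting a simple random sample with a proportionally allocated stratified sample over a hypothetical frame of 1,300 schools; it uses only the Python standard library and is not the sampling code used in the study.

```python
import random

# Hypothetical sampling frame: 1,300 schools tagged with a stratum (school type).
random.seed(42)
frame = [{"id": i, "stratum": random.choice(["primary", "secondary", "vocational"])}
         for i in range(1300)]
n = 130  # target sample size

# Simple random sample (SRS): every school has the same selection probability.
srs = random.sample(frame, n)

# Stratified design with proportional allocation: sample within each stratum
# in proportion to its share of the frame, guaranteeing representation.
strata = {}
for school in frame:
    strata.setdefault(school["stratum"], []).append(school)

stratified = []
for name, members in strata.items():
    k = round(n * len(members) / len(frame))
    stratified.extend(random.sample(members, k))

print(len(srs), len(stratified))
```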

  24. 4. Sampling Designs and Terms • Summary • 1. Validity • 2. Reliability • 3. Efficiency/Cost/Economics

  25. Non-sampling Biases • Collecting Data • field procedure and non-sampling errors (increase w/ sample size!) • Non-sampling Biases (Survey Sampling) • Non-observation • noncoverage - fail to include/”frame” errors • nonresponse - designated info missing on element • possible outcomes (phone, CATI, CAII)

  26. Case Study response rate (secondary school educators)

  27. Possible outcomes when attempting to contact respondents for interactive surveys • Respondent not contacted: no number available, no answer, not at home, disconnected, busy, nonworking number • Respondent contacted (prescreening): household refusal, other refusal (by respondent), reject (by interviewer), cooperating • Cooperating (postscreening): eligible, eligible but over quota, ineligible, nonhousehold, did not pass screening • Final dispositions: completed, terminate, refusal, reject, not usable
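Slide 26 reports the case study response rate and slide 27 lists the possible contact outcomes. A small sketch of turning outcome counts into a response rate; the counts are hypothetical, and the eligibility convention used is only one of several in practice.

```python
# Hypothetical contact-outcome counts in the spirit of slide 27; real counts
# would come from the survey administration logs.
outcomes = {
    "completed": 820,
    "refusal": 95,
    "terminate": 40,
    "not_usable": 25,
    "eligible_not_contacted": 120,  # e.g. no answer, not at home
    "ineligible": 60,               # e.g. nonhousehold, did not pass screening
}

# One convention: response rate = completed interviews / all eligible cases.
eligible = sum(v for k, v in outcomes.items() if k != "ineligible")
response_rate = outcomes["completed"] / eligible
print(f"Response rate: {response_rate:.1%} ({outcomes['completed']} / {eligible} eligible)")
```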

  28. Methodology Part III: Sample Size Calculation Issues • Probability sampling only • Mean • Proportion • Confidence interval (CI) • Non-probability sampling: doesn't matter, no way to estimate • More complex designs (with computer) to follow
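A sketch of the textbook sample-size formulas for a mean and a proportion under simple random sampling at a 95% confidence level; the inputs (standard deviation guess, margins of error) are illustrative, not values from the study.

```python
import math

Z = 1.96  # z-value for a 95% confidence interval

def n_for_mean(sigma, margin, z=Z):
    """SRS sample size to estimate a mean within +/- margin: n = (z*sigma/e)^2."""
    return math.ceil((z * sigma / margin) ** 2)

def n_for_proportion(p, margin, z=Z):
    """SRS sample size to estimate a proportion within +/- margin: n = z^2 p(1-p)/e^2."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

# Illustrative inputs: mean ICT hours/week (sd guess 5) within +/- 0.5 hours,
# and a usage proportion (worst case p = 0.5) within +/- 3 percentage points.
print(n_for_mean(sigma=5, margin=0.5))       # -> 385
print(n_for_proportion(p=0.5, margin=0.03))  # -> 1068
```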

  29. 5. Primary Data Collection Forms (Instrumentation) • Content and form • Topic - see below + ? research purpose • Wording - language, clarity, etc. • Sequence - general to specific • Scaling - “entire courses”

  30. 5. Primary Data Collection Forms (Content and Structure) • Primary data topics - general • Demographics / socio-economic • Psychological / lifestyle • Attitudes / opinions - EIB/EI • Awareness / knowledge • Intentions / motivation • Behaviors • Structure of instrument • Structure - open / closed response • Disguise - open / closed purpose

  31. 6. Data Collection • Web • Time period: fall term 2002/2003 • Data cleaning / preparation / transformations • Repeated submissions (up to 25x!) • Format of data collected (shown on slide)
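The slide notes repeated web submissions (up to 25 per respondent). A minimal pandas sketch of one possible cleaning rule, keeping only the latest submission per respondent; the column names are hypothetical, not the study's actual schema.

```python
import pandas as pd

# Hypothetical raw web submissions; column names are made up for illustration.
raw = pd.DataFrame({
    "teacher_id": [101, 101, 102, 103, 103, 103],
    "submitted_at": pd.to_datetime([
        "2002-10-01", "2002-10-05", "2002-10-02",
        "2002-10-03", "2002-10-04", "2002-10-06",
    ]),
    "uses_pc_in_class": ["no", "yes", "yes", "no", "no", "yes"],
})

# One possible cleaning rule: keep only the most recent submission per teacher.
clean = (raw.sort_values("submitted_at")
            .drop_duplicates(subset="teacher_id", keep="last"))
print(clean)
```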

  32. 7. Analyses - Compare, Infer, Associate • editing, coding, tabulation • exploration • cross tabulation frequency distributions • condensation - means, proportions, std, INDICATORS (see report for equations/ estimation) • t-tests of Hypotheses • chi-square, ANOVA, regression • advanced multivariate analyses • Graphics, graphics, graphics
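A minimal sketch of the "compare, infer, associate" step above: a cross-tabulation of two categorical survey variables and a chi-square test of independence with scipy; the data extract and variable names are invented.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Invented survey extract: school level vs. whether the educator uses a PC in class.
df = pd.DataFrame({
    "school_level": ["primary"] * 6 + ["secondary"] * 6,
    "uses_pc":      ["yes", "no", "no", "no", "yes", "no",
                     "yes", "yes", "no", "yes", "no", "yes"],
})

# Cross-tabulation (joint frequency distribution of the two variables).
table = pd.crosstab(df["school_level"], df["uses_pc"])
print(table)

# Chi-square test of independence between school level and classroom PC use.
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
```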

  33. Educator skills and knowledge levels: % Educators using computers

  34. Future training needs: ICT course supply/demand ratios by course type

  35. Interest/desire for taking and/or teaching ICT courses vs. skills available (chart; segment shares: 81.4%, 8%, 7%, 3.3%, 0.3%)

  36. 8. Evaluation • Research objective achieved? • Valid/reliable findings, procedures • Limitations, errors, further research

  37. 9. Preparation of Report • Outline (usually good) • Title page • Table of Contents • Summary (executive summary, one page) • Introduction • Results • Conclusions • Recommendations • Introduction

  38. 9. Preparation of Report • Body • Methodology • Results • Limitations • Conclusions and Recommendations • Appendix (Appendices) • Copies of data collection forms • Detailed calculations - sample size, tests, etc. • Tables not included in Body • Bibliography

  39. Research is Like a Juggling Act • Juggling 4 balls to create the system of research • Whom do we need to talk to? • What do we need to ask them? • What is the best way to reach them? • Why are we doing this research?

  40. Limitations • Overview of major shortcomings • The questionnaire was too long (forgot gender!) • The web may not have been the best way to collect the data • It was the wrong time of year (end and beginning of term?) • Responses were forced and required • Non-standard indicators collected • Other issues you may find in the Report!?

  41. Opportunities to reuse/improve the next project • Based on the concept of “nation process re-engineering” • We will change the indicators measured • Shorten the questionnaire • Change whom we gather data from • Change how we collect data

  42. Conclusions/Recommendations • We obtained 10 indicators for measuring/benchmarking ICT in Education (pictures of 2 shown) • We need at least 34 others for measuring the Information Society: • Infrastructure • Availability • Impact data • What drives or correlates with classroom use? (impact picture: classification tree)

  43. What is driving the targeted educator behavior, “PC usage in the classroom”? 12.6% of educators use PCs in the classroom. A data mining model (classification tree) identified the associated variables: having free and frequent access to a computer at work, PC ownership, networking skills, knowledge of the operating system, use of manuals for self-instruction, and knowledge of text-processing software.
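The slide describes a classification tree fitted to the census responses. A minimal scikit-learn sketch of fitting such a tree; the feature table below is an invented stand-in for the real survey variables, so the output is illustrative only.

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Invented stand-in for the survey variables the slide lists as associated
# with classroom PC use; the real model was fit on the census responses.
df = pd.DataFrame({
    "free_access_at_work": [1, 1, 0, 0, 1, 0, 1, 0, 1, 0],
    "owns_pc":             [1, 0, 0, 1, 1, 0, 1, 0, 1, 0],
    "networking_skills":   [1, 0, 0, 0, 1, 0, 1, 0, 0, 0],
    "uses_pc_in_class":    [1, 0, 0, 0, 1, 0, 1, 0, 1, 0],  # target variable
})

X = df.drop(columns="uses_pc_in_class")
y = df["uses_pc_in_class"]

# Fit a shallow tree and print its decision rules.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))
```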

  44. What we can do, where we need help • SRCE – University Computing Centre • Track indicator trends/forecasts, with graphical presentations/visualization on the web (NSO Census 2001, 4,000+ linked graphs example: http://www.dzs.hr) • Record what events are impacting the indicators • Continuing contact with the EU, business, industry, and government policy makers • Funding • Additional expertise • Major support from telecommunication industry leadership, Internet service providers and other infrastructure providers, and National Statistical Offices (NSOs) • An innovative virtual think tank for collaborative application development, to integrate Information Society technology into citizens' everyday use, business investment, and eGovernment • EU experts on legal policy issues, statistics/data mining, and sampling experts from NSOs • Economic experts on technological developments and innovation • Statistical methodology expertise • NGO liaison with the public • CRM with business, industry, government, education, and the EU • FDI/proposal money, money, money

  45. Tables vs. Graphs (www.dzs.hr): e.g. ethnic diversity table

  46. Tables vs. Graphs (www.dzs.hr): e.g. ethnic diversity map (w/ Croats)

  47. Tables vs. Graphs (www.dzs.hr): e.g. ethnic diversity map (w/out Croats)

  48. DEMO: Site map (http://www.dzs.hr/Eng/Census/census2001.htm) • Linked graphs by topic: density, education, gender, ethnicity, age (incl. age pyramids), households, other • Breakdowns: % by counties and % by municipalities

  49. DEMO: Main table (“Other”), linked to graphs by topic: ethnicity, education, gender, age, households, other

  50. Proposal: create an interactive eSEE benchmarking visualization web application (tables linked to graphs) covering indicators for the EU, regions, countries, counties, …
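A minimal sketch of the "tables linked to graphs" idea: read an indicator table and write one bar chart per indicator, by region, so that each table row could link to its own graph. The data layout, values, and file names are assumptions for illustration, not the proposed application.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Assumed input layout: one row per (region, indicator) with a value column;
# the numbers below are invented for illustration.
df = pd.DataFrame({
    "region":    ["Zagreb", "Split", "Osijek"] * 2,
    "indicator": ["computers_per_100_students"] * 3 + ["students_per_computer"] * 3,
    "value":     [5.0, 4.5, 4.0, 20.0, 22.2, 25.0],
})

# One chart per indicator, so a table row can link straight to its graph.
for name, group in df.groupby("indicator"):
    ax = group.plot.bar(x="region", y="value", legend=False, title=name)
    ax.set_ylabel(name)
    plt.tight_layout()
    plt.savefig(f"{name}.png")
    plt.close()
```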
