
C R E S S T / U C L A


Presentation Transcript


  1. C R E S S T / U C L A Evaluating the DoD Presidential Technology Initiative: Innovative Methods to Measure Student Outcomes. Davina C. D. Klein & Christina Glaubke, CRESST/UCLA; Louise Yarnall, Center for Technology in Learning, SRI International; Harold F. O’Neil, Jr., CRESST/USC. Paper presented as part of the symposium “Quantitative and Qualitative Strategies for Evaluating Technology Use in Classrooms,” AERA, New Orleans, April 2000

  2. PTI Program Background • In 1995 President Clinton set goals: • computer access for students and teachers • connectivity to the Internet for classrooms • courseware to support quality curriculum • competent teachers trained in technology • DoDEA’s response was the Presidential Technology Initiative

  3. PTI Participants • PTI program implemented at 11 selected DoDEA school testbed sites across the world • Selected testbed sites required: • Minimum hardware and connectivity configurations • Technology implementation plans • School-wide support (e.g., staff)

  4. PTI Project Goals • “To develop and implement effective strategies for curriculum and technology integration” • Local site objectives included: • Evaluation and alignment of courseware • Development of technology integration plans • Integration of software and PTI courseware tools into the DoDEA curriculum

  5. Evaluation Steps • Step 1: Identify program goals • Specific expectations • Our focus was on PTI students • General achievement measures • Student attitudinal measures • Content-specific performance measures • Technology-specific performance measures

  6. Examining Technology Outcomes • Student Outcomes: increased performance; better attitudes • Classroom Outcomes: integration of technology and curriculum; new instructional practices • Teacher Outcomes: skilled teachers • System Outcomes: computers, connectivity, courseware; professional development; support for innovative teaching

  7. Evaluation Steps (cont.) • Step 2: Describe how program plans to achieve goals • Theory of Action for PTI program

  8. Achieving Technology Goals • System outcomes → Classroom outcomes → Teacher outcomes → Student outcomes

  9. Evaluation Steps (cont.) • Step 3: Measure intended outcomes • Students’ attitudes toward technology • Students’ content-specific knowledge (focus on courseware tools) • Students’ Web fluency • Student-perceived classroom practices

  10. Measurement Instrumentation • Common measures • General impact of PTI on all students • Technology Questionnaire • On-line Web Expertise Assessment (WEA) • Student interviews • Courseware-specific measures • Detailed, courseware-by-courseware examination of tool impact on students • Content-specific performance-based assessment • PTI courseware usability studies

  11. Technology Questionnaire • Purpose: Measure students’ attitudes toward technology and perceptions of classroom practices • 36-item paper-and-pencil survey • Students rated statements on scale of 1 (“I really don’t agree”) to 5 (“I really agree”) • “I feel comfortable using computers” • “In class we use computers to solve problems or answer questions”

  12. Web Expertise Assessment • Purpose: Examine effects of Web usage in the classroom • Student training, then 20-minute session • Presented students with authentic search tasks • Asked students to navigate and search for relevant information in a closed Web-based environment, then bookmark relevant findings • All measures logged and coded

  13. WEA Search Task • Imagine you are learning about the U.S. presidents in your history class. Your teacher has asked you to write a report about what presidents said during their speeches when first elected to office. She has asked you to find out which presidents spoke of the importance of an educational system available to all without charge. • Use WEA to find this information for your report. • Find as many useful pages as you can. • Bookmark pages by clicking on the Add Bookmark button near the top of your screen. • You may bookmark as many useful pages as you think necessary.


  15. WWW Background Questionnaire • Purpose: Evaluate students’ background knowledge regarding the World Wide Web • 7-item paper-and-pencil survey • Students rated statements on scale of 1 (“I really don’t agree”) to 5 (“I really agree”) • “The information on the World Wide Web is not very useful”

  16. Student Interviews • Purpose: To obtain further information about students’ attitudes toward technology and their perceptions of classroom practices • Brief 5- to 10-minute interviews • Three students interviewed per class • Qualitative data supplements quantitative findings

  17. Evaluation Participants • 6 schools at 2 DoDEA sites • 3 elementary schools • 2 middle schools • 1 middle/high school • 21 classrooms • 181 students participated in both pre- and posttest sessions

  18. Pre-Post Comparisons • Data aggregated to PTI program intervention level (classroom) • N = 14 classrooms • 4 of 21 classrooms not included because they completed modified questionnaire due to the young age of the students • 3 additional classrooms dropped because of lack of overlap between pretest and posttest samples
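The aggregation step above (rolling student scores up to the classroom, the level at which the PTI intervention operated) can be sketched with pandas. The column names and scores here are hypothetical illustrations, not the study's data:

```python
import pandas as pd

# Hypothetical student-level records; classroom IDs and scores are illustrative
students = pd.DataFrame({
    "classroom":  ["A", "A", "B", "B", "B", "C", "C"],
    "pre_score":  [4.1, 3.9, 3.5, 3.8, 4.0, 4.4, 4.2],
    "post_score": [4.0, 4.1, 3.6, 3.7, 3.9, 4.3, 4.4],
})

# Aggregate to the intervention level: one pre/post mean per classroom
classrooms = students.groupby("classroom")[["pre_score", "post_score"]].mean()
```

Each classroom then contributes a single pair of means to the pre-post comparison, so N becomes the number of classrooms rather than the number of students.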

  19. Results: Technology Questionnaire • 140 students completed TQs • Two scales created • Attitudes toward technology • 19 items • α = .92 • Student perceptions of classroom practices • 8 items • α = .79
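The scale reliabilities reported above are Cronbach's alpha. A minimal sketch of the computation, using a hypothetical matrix of 1-5 Likert responses (six students by four items, not the study's data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of respondent totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical 1-5 Likert responses: 6 students x 4 items
scores = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
])
alpha = cronbach_alpha(scores)
```

Values near the .92 and .79 reported for the two TQ scales indicate that the items within each scale covary strongly enough to be averaged into one score.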

  20. Attitudes Toward Technology • In general, positive attitudes held (pre/post; 1-5 scale): • Students agreed it is fun figuring out how things work on a computer (4.1/3.9) • Students agreed/strongly agreed they feel comfortable using computers (4.3/4.4) • Students disagreed/strongly disagreed schoolwork on computer is waste of time (1.5/1.6) • Students agreed/strongly agreed it would be helpful to learn how to use WWW (4.4/4.4)

  21. Reported Classroom Practices • In general, limited computer use reported • High use (pre/post; 1-5 scale) • Used presentations, essays, portfolios (4.1/3.7) • Typed reports on computer after writing (3.8/3.6) • Worked in small groups (3.6/3.4)

  22. Classroom Practices (cont.) • Moderate use (pre/post; 1-5 scale) • Computers used for different assignments (3.3/3.5) • Computers used to explore things (3.2/3.2) • Low use (pre/post; 1-5 scale) • Computers used to practice basics (3.0/2.9) • Computers used to solve problems (3.2/3.0) • Many computer programs used (3.0/2.9)

  23. TQ Pre-Post Comparisons • No significant differences found from fall to spring in: • Attitudes toward technology (t(13) = -0.92, p = .37) • Reported classroom practices (t(13) = -1.3, p = .21)
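A classroom-level paired comparison like the one above can be run with SciPy's paired-samples t-test; with N = 14 classrooms the test has 13 degrees of freedom. The pre/post means below are hypothetical stand-ins, not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical classroom-level means (N = 14 classrooms), fall and spring
pre  = np.array([4.1, 3.8, 4.3, 3.9, 4.0, 4.2, 3.7, 4.1, 3.9, 4.4, 4.0, 3.8, 4.2, 4.1])
post = np.array([4.0, 3.9, 4.2, 4.0, 4.1, 4.1, 3.8, 4.2, 3.8, 4.3, 4.1, 3.9, 4.3, 4.0])

# Paired t-test on the classroom means; df = N - 1 = 13
t_stat, p_value = stats.ttest_rel(pre, post)
```

A p-value above .05, as in the reported results, means the fall-to-spring change is not distinguishable from sampling noise at the classroom level.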

  24. Results: WEA • 142 students completed WEA • Four scales created • Students’ background Web knowledge • 4 items, α = .77 • Students’ finding ability • 3 items, α = .88 • Students’ searching expertise • 3 items, α = .68 • Students’ navigational strategies • 2 items, α = .72

  25. Background Web Knowledge • In general, students familiar with Web • Students were neutral/agreed that information on WWW is accurate (3.6/3.3) • Students disagreed/strongly disagreed that information on WWW is not useful (1.6/1.8) • Students disagreed that there is not a lot of detailed or in-depth information on WWW (2.0/2.1) • Students agreed/strongly agreed that WWW is helpful in finding information (4.5/4.3)

  26. Finding Ability • In general, students able to find info • Average bookmark peripherally relevant to task (2.2/2.0 on 0-3 scale) • Quality of bookmark response set was good (2.2/2.0 on 0-3 scale) • About one third of pages bookmarked appropriately (efficiency of .31/.32)
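The efficiency figure above is the share of bookmarked pages judged appropriate to the task. A one-line sketch with hypothetical counts (the function name and the example numbers are illustrative):

```python
def bookmark_efficiency(relevant: int, total: int) -> float:
    """Fraction of a student's bookmarked pages judged relevant to the task."""
    return relevant / total if total else 0.0

# e.g., a hypothetical student who bookmarked 16 pages, 5 of them relevant
eff = bookmark_efficiency(5, 16)
```

An efficiency near .31 means roughly one of every three bookmarks was on target.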

  27. Searching Expertise • In general, students had difficulty searching (consistent with literature) • Quality of keyword searching set rather poor (1.6/1.6 on 0-3 scale) • Number of good searches low (3.0/1.7) • Students redirected searches, browsing search output before selection (2.2/2.2)

  28. Navigational Strategies • In general, students navigated well • Students revisited over half the information pages visited, orienting themselves in the Web space (7/6) • Students completed more steps, a sign of better searching (86/113) • [Use of back missing]

  29. WEA Pre-Post Comparisons • No significant differences found from fall to spring in: • Students’ Web knowledge (t(13) = 0.61, p = .55) • Students’ finding ability (t(13) = 0.43, p = .68) • Students’ searching expertise (t(13) = 0.54, p = .60) • Students’ navigational strategies (t(13) = 0.15, p = .88)

  30. Evaluation Steps (cont.) • Step 4: Review implementation of plans • If antecedents don’t occur, expected outcomes won’t occur • With technology, pay close attention to: • Hardware/software • Measures of use or exposure • Technology integration

  31. PTI Implementation • Only 9 evaluation teachers planned to use courseware • Of these 9, only 5 used courseware • Courseware usage for these 5 was sparse • Teacher training/support was an issue • Student-reported classroom technology integration was weak

  32. Evaluation Steps (cont.) • Step 5: Evaluate progress toward goals • No progress yet... • Not surprising that we found no student effects of the PTI program, as teacher- and classroom-level effects were not evident

  33. Conclusions • Need to find sensitive, innovative measures to reveal best use of technology to instruct, assess, evaluate • WEA and TQ are sample approaches • Our general approach involves: • Defining where benefits are expected based on particular high-technology environment • Creating/finding innovative measures that will be sensitive to changes within given area • Ensuring that expectations required “below” or before goal levels are being met

  34. For More Information • Visit our Web site at: • http://www.cse.ucla.edu/CRESST/pages/aera00.htm • Available: • Overheads of this presentation • Full paper • And much, much more...
