
META Education Research Group Multimedia Education & Technology Assessment




Presentation Transcript


  1. META Education Research Group Multimedia Education & Technology Assessment Matthew T. Marino, Ph.D. Debra Foley, M.S.S.W. http://www.meta-erg.com/ 509.592.3760 (West) 860.459.2988 (East) 120 SW Cedar ST Pullman, WA 99163 meta.erg@gmail.com Sign up to participate in future game-enhanced projects before you leave today!

  2. Game-Enhanced Middle School Science A collaborative project between Filament Games and META Education Research Group.

  3. Game-enhanced Science ~ Project Goals • Apply evidence-based instruction and assessment principles to video games • Increase accessibility of middle school science curricular materials for ALL students • Reduce the science achievement gap by improving learning outcomes for students with disabilities and those who are at risk of learning failure • Incorporate Universal Design for Learning and Response to Intervention progress monitoring tools in the games • Strategically target traditionally marginalized students during middle school, before they become disengaged from science

  4. What’s So Difficult About Middle School Science? • More new vocabulary than in the first year of a high school foreign language course • Complex expository texts that limit poor readers’ comprehension (Lee & Erdogan, 2007) • New information is covered at a rapid pace • Increased emphasis on using the scientific method to solve complex multi-step problems

  5. 8th Grade Science Performance of Students Without Disabilities [chart] Source: U.S. Department of Education, National Center for Education Statistics (2010)

  6. 8th Grade Science Performance of Students With Disabilities [chart] Outcome: Only 5% of students with disabilities (SWD) enter the STEM workforce (Leddy, 2010)

  7. Students with Learning Disabilities • Have difficulty activating prior knowledge • Are reluctant to pose questions or hypotheses • Are less likely to have a systematic plan to approach problems • Struggle to implement instructor feedback • Have difficulty making inferences during inductive and deductive reasoning processes • Seldom transfer knowledge across contexts • Are less likely to be aware of their metacognitive processes

  8. Challenges for Secondary Science Teachers • Developing science curricular materials that meet the needs of ALL students is extremely time-intensive (Dymond et al., 2006) • Many secondary teachers do not have the expertise to meet the needs of struggling readers (Alston et al., 2002; Robinson, 2002) • Limited financial support to diversify instructional materials (Moriarty, 2007) • Lack of professional development opportunities (Mastropieri et al., 2003) How can teachers overcome these barriers?

  9. Who Plays Video Games? • Random sample of 1,102 adolescents ages 12–17 • 99% of boys and 94% of girls play video games • 39% of boys & 22% of girls play daily (≈500,000 of these daily players have LD) • 34% of boys & 18% of girls play for >2 hours daily • 86% play on consoles • 73% on computers • 60% on portable devices • 48% on cell phones Note: margin of error ±3 percentage points. Lenhart, A. (2008). Teens, video games and civics. Pew Internet and American Life Project. http://www.pewinternet.org/Reports/2008/Teens-Video-Games-and-Civics.aspx
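The ±3-point note can be verified with the standard margin-of-error formula for a simple random sample. A minimal Python sketch, assuming the conservative worst case p = 0.5:

```python
# Approximate 95% margin of error for a simple random sample of n = 1,102,
# using the conservative p = 0.5 case: MOE = z * sqrt(p(1-p)/n).
import math

n = 1102  # sample size from the Lenhart (2008) survey
p = 0.5   # worst-case proportion (maximizes the margin)
z = 1.96  # z-score for a 95% confidence level

moe = z * math.sqrt(p * (1 - p) / n)
print(f"Margin of error: ±{moe * 100:.1f} percentage points")  # ≈ ±3.0
```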

  10. “Serious” Video Games Educational video games represent the next generation of technology-enhanced instructional materials Students can become immersed in fun and engaging standards-based environments (e.g., undersea exploration) that are unobtainable in traditional classrooms (U.S. Department of Education, 2010) Resilient Planet ~ http://www.youtube.com/watch?v=qUWoNACJ-7Q

  11. Preliminary Research on Gaming and SWD • Can be more effective than traditional instruction (Twyman & Tindal, 2006) • Increases motivation (Charlton, Williams, & McLaughlin, 2005) • Promotes self-esteem (Harris & Rea, 2009) • Improves skills for extended periods after the game ends (Beaumont & Sofronoff, 2008) • Accelerates learning (Charlton et al., 2005)

  12. Criticisms of Science Video Games • The core game mechanics in many educational science video games are completely unrelated to the learning objectives. http://www.wisdomtools.com/projects/edu_astro.html - $100 • Many games immerse students in simulated environments that contain dramatic abstractions (e.g., cancer cells with teeth). These misrepresentations can negatively impact student learning (Kim et al., 2000) http://www.kendallhunt.com/nanolegends_trailer/ - $799

  13. Some Teachers Are Hesitant to Use Technology “With a dearth of information from educational software publishers concerning the production of their software, educators are basically on their own… Often educators find that the software they have purchased is not adaptable, does not teach what it purports to teach, or does not support what is occurring in the classroom” Boone and Higgins (2007, p. 138)

  14. Enter Universal Design for Learning • Provide scientific data using multiple means of representation (e.g., pictorial representations, tables, simulations, etc.) • Provide options for students to demonstrate their comprehension of concepts and phenomena (i.e., multiple forms of assessments) • Allow students to engage with the materials in a diverse manner that fosters their motivation and unique learning needs Center for Applied Special Technology (2009) http://www.cast.org/
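The slides do not show how these UDL options are encoded in the games, so the following is a purely hypothetical sketch of how a game might store multiple representations, assessment formats, and engagement settings for a single concept. All names and fields are invented for illustration.

```python
# Hypothetical UDL configuration for one science concept: multiple means of
# representation, multiple assessment formats, and engagement options.
CONCEPT_CONFIG = {
    "concept": "photosynthesis",
    "representations": ["narrated_animation", "data_table", "interactive_simulation"],
    "assessments": ["performance_task", "drag_and_drop_diagram", "short_response"],
    "engagement_options": {"difficulty": "adaptive", "collaboration": True},
}

def available_representations(config: dict, preferred: str) -> list[str]:
    """Order representations so the student's preferred mode comes first."""
    reps = list(config["representations"])
    if preferred in reps:
        reps.remove(preferred)
        reps.insert(0, preferred)
    return reps

print(available_representations(CONCEPT_CONFIG, "data_table"))
```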

  15. UDL in a Game Environment

  16. Another UDL Game Example

  17. Evidence-based Instruction in the Games • Clear goals and objectives • Expert modeling via a virtual mentor • Extended learning and practice opportunities • Immediate corrective feedback • Advanced organizers to assist with planning and problem solving • Collaborative grouping & peer tutoring options • Iterative learning cycles ~ each level builds on and reiterates previously learned knowledge and skills

  18. Integrating RtI Progress Monitoring • Each level of game-play is strategically aligned with Project 2061 Benchmarks for Science Literacy • Every aspect of game-play is recorded and available to teachers in real time • Game difficulty adjusts dynamically to ensure students play within their zone of proximal development • Dashboard simplifies progress monitoring by providing data at increasing levels of specificity
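The slides do not describe the actual adjustment algorithm Filament Games uses, so the sketch below shows one common generic approach: track a rolling success rate and nudge difficulty to keep it inside a target band, approximating play within the zone of proximal development.

```python
# Minimal sketch of dynamic difficulty adjustment (a generic approach, not
# the game's actual algorithm): keep the student's recent success rate
# inside a target band by raising or lowering a 1-5 difficulty level.
from collections import deque

class DifficultyAdjuster:
    def __init__(self, window: int = 10, low: float = 0.6, high: float = 0.85):
        self.outcomes = deque(maxlen=window)  # recent task results (1 = success)
        self.low, self.high = low, high       # target success-rate band
        self.level = 3                        # difficulty on a 1-5 scale

    def record(self, success: bool) -> int:
        self.outcomes.append(1 if success else 0)
        rate = sum(self.outcomes) / len(self.outcomes)
        if rate > self.high and self.level < 5:
            self.level += 1   # too easy: raise difficulty
        elif rate < self.low and self.level > 1:
            self.level -= 1   # too hard: lower difficulty
        return self.level

adjuster = DifficultyAdjuster()
for result in [True, True, True, True, True, False, False, False]:
    level = adjuster.record(result)
print("Current difficulty:", level)
```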

  19. Dashboard Home Page

  20. Sean Hilbert’s Performance Report ~ Page 2 Allows accurate interpretation of Project 2061 Benchmarks

  21. Sean Hilbert’s Performance Report ~ Page 3 Benchmark-specific proficiency reports

  22. Research Questions • What are the content, feasibility, and usability issues associated with the prototype, and how can they be refined to promote the science learning of students with LD? • Are there significant pre/post intervention differences in the learning of students with LD, as measured by the unit’s pre/posttest? • Are there significant differences on pre/posttest measures between students with LD across conditions (game vs. no game)? • Are there significant differences between students with LD and their general education peers across conditions? • Is there an interaction between condition and disability status? If so, what is the nature of the interaction? • What is the strength of the relationship among time spent playing the game, students’ use of tools within the game, and assessment outcomes?

  23. Preliminary Hypotheses • There will be a statistically significant difference favoring students in the game condition over their peers in the comparison group • Performance-based assessments in the game will be a more accurate measure of students’ science knowledge than traditional paper-and-pencil tests because the games remove confounding factors, such as students’ reading and writing abilities, from the assessment process • The game will increase the efficiency of IEP team data collection, assessment, and dissemination because data can be easily collected, stored, and accessed remotely using a password-protected website (Future Study) • Game-play information will provide researchers with large datasets that could be examined using multivariate analyses without any extraneous burden on school personnel (Future Study)

  24. Iterative Mixed Method Design We plan to include 112 students with LD to account for a potential 10% attrition rate. Power analyses were conducted to ascertain minimum detectable effect sizes (MDEs) using Optimal Design software (Raudenbush, Spybrook, Liu, & Congdon, 2006). Given 64 classrooms, intraclass correlations (ICCs) of 0.10 and 0.24, a level-2 covariate explaining 10% of the variance, 25 students per classroom, and power equal to .80, the minimum detectable effect size is 0.22 (ICC = 0.10) or 0.31 (ICC = 0.24). These ICC values are in accord with estimates obtained from large national databases published to aid the planning of group randomized trials in education (Hedges & Hedberg, 2007). Quantitative analysis: 2 x 2 experimental design. A rough cross-check of this power calculation appears below.
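As a rough cross-check of the Optimal Design results, the standard closed-form MDES approximation for a two-level cluster randomized trial can be computed directly. The sketch below assumes a 50/50 split of classrooms and a cluster-level covariate; it lands in the same neighborhood as the reported 0.22 and 0.31, with small differences expected because Optimal Design's exact assumptions (covariate handling, degrees of freedom) may differ.

```python
# Approximate minimum detectable effect size (MDES) for a two-level cluster
# randomized trial. Rough cross-check only; Optimal Design's exact values
# (0.22 and 0.31 on the slide) differ slightly from this approximation.
import math
from scipy import stats

def mdes(J, n, icc, r2_level2, alpha=0.05, power=0.80, p_treat=0.5):
    """MDES for a 2-level CRT with a cluster-level covariate."""
    df = J - 2
    multiplier = stats.t.ppf(1 - alpha / 2, df) + stats.t.ppf(power, df)
    var = (icc * (1 - r2_level2) / (p_treat * (1 - p_treat) * J)
           + (1 - icc) / (p_treat * (1 - p_treat) * J * n))
    return multiplier * math.sqrt(var)

for icc in (0.10, 0.24):
    print(f"ICC = {icc}: MDES ≈ {mdes(J=64, n=25, icc=icc, r2_level2=0.10):.2f}")
```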

  25. Measures • Demographic surveys • Paper and pencil pre/posttest • Semi-structured web-based interviews • Transcripts of messages sent through the game interface • Game-play statistics • Game performance-based assessments • Norm-referenced exam • Pre/post game-play surveys (treatment condition)

  26. Procedures • Random assignment at school level to minimize diffusion effects • Teacher professional development to ensure fidelity of implementation • Demographic survey • Paper & pencil pretest (all students) • Initial game-play survey (all students) • Implement chapter (1 – 2 hours game play) • Posttest & TerraNova 3rd Ed. Science Battery (all students) • Post-game survey (games condition) • Web-based focus group interviews with teachers and students (games condition)
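School-level random assignment itself is straightforward; a minimal sketch with hypothetical school IDs is shown below. Assigning whole schools (rather than students or classrooms) keeps everyone in a building in the same condition, which is what minimizes diffusion.

```python
# Sketch of random assignment at the school level (hypothetical IDs):
# all classrooms in a school share one condition, minimizing diffusion.
import random

schools = [f"school_{i:02d}" for i in range(1, 9)]
random.seed(42)  # reproducible assignment
random.shuffle(schools)
half = len(schools) // 2
assignment = {s: "game" for s in schools[:half]}
assignment.update({s: "comparison" for s in schools[half:]})

for school, condition in sorted(assignment.items()):
    print(school, "->", condition)
```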

  27. Qualitative Analysis Question: “What are the content, usability, and playability issues associated with the prototype and how can they be refined in order to promote science learning for students with LD?” • Qualitative analysis of the semi-structured interviews, surveys, and videos of student game-play. • Interviews will be transcribed in full (including pauses, facial expressions, and tone of voice) and analyzed using the constant comparative method (Merriam, 2002), facilitated by NVIVO qualitative data analysis software. • The transcripts of interviews and videos of game-play will be examined for emergent analytic categories by multiple analysts whose coding will be compared through NVIVO’s reliability function. • Validity will be addressed through data triangulation between interview data, game-play statistics, and student test scores.
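NVIVO's coding-comparison feature reports agreement statistics internally; for readers who want to see what is being computed, the sketch below calculates Cohen's kappa on two analysts' code assignments (the codes and labels here are hypothetical).

```python
# Inter-rater agreement as Cohen's kappa on two coders' (hypothetical)
# category assignments for the same five transcript segments.
from sklearn.metrics import cohen_kappa_score

coder_a = ["engagement", "usability", "usability", "content", "engagement"]
coder_b = ["engagement", "usability", "content",   "content", "engagement"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")  # 1.0 = perfect agreement, 0 = chance
```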

  28. Quantitative Analysis • A 2 x 2 (i.e., disability status x condition) ANCOVA with students’ pretest scores as the covariate will be used for between-group analysis. • Caveat: ANCOVA assumes independent observations. There is some dependence in scores in this sample with students linked together by teachers and classrooms. • Due to the small sample size in Phase I, gain scores will be included in the analysis. • Correlation analyses (e.g., bivariate correlations, regression) will be used to assess the following question: “What is the strength of the relationship among time spent playing the game, students’ use of tools within the game and assessment outcomes?”
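A sketch of the planned 2 x 2 ANCOVA using statsmodels is shown below, with simulated data standing in for the real sample. As the caveat on the slide notes, observations here are not fully independent; a multilevel model would be needed to account for the clustering of students within classrooms.

```python
# Hedged sketch of a 2 x 2 ANCOVA (disability status x condition) with
# pretest as the covariate, on simulated data shaped like the study's.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "pretest": rng.normal(50, 10, n),
    "disability": rng.choice(["LD", "none"], n),
    "condition": rng.choice(["game", "no_game"], n),
})
df["posttest"] = df["pretest"] + rng.normal(5, 8, n)  # simulated outcome

model = ols("posttest ~ pretest + C(disability) * C(condition)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # Type II ANCOVA table
```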

  29. Construct Validity of Game-based Assessments • Structural equation modeling (SEM) to assess construct validity of the game-based assessment and reliability of indicators • Sub-components of the game will be modeled as indicators of a “Scientific Knowledge” construct • Evaluate the significance of factor loadings to determine the reliability of individual game components in measuring science ability • Inter-item correlations and composite (scale) reliability will be estimated
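Inter-item correlations and a basic composite reliability estimate (Cronbach's alpha) can be computed directly from component scores; the sketch below uses simulated data in place of actual game sub-component scores. The full SEM would additionally estimate factor loadings for each indicator.

```python
# Inter-item correlations and Cronbach's alpha over simulated scores from
# three game sub-components that all reflect one latent ability.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
ability = rng.normal(0, 1, 150)  # latent "scientific knowledge"
items = pd.DataFrame({
    f"component_{i}": ability + rng.normal(0, 0.8, 150) for i in range(1, 4)
})

print(items.corr().round(2))  # inter-item correlation matrix

k = items.shape[1]
item_vars = items.var(axis=0, ddof=1).sum()
total_var = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_vars / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")
```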

  30. Concurrent Validity of Game-based Assessments • Assess the relationship between performances on the video game assessment and on validated assessments of science knowledge including current state-level standardized science tests and a commercial assessment (e.g., TerraNova, 3rd Ed. Science Battery). • Given that current paper and pencil assessments require verbal and writing skills, as well as science knowledge, we expect to find a strong correlation between scores on these and the video-based assessments for students without disabilities, but weaker correlations in scores for students with LD. • For the commercial assessment (which will provide item-level data), we will use SEM to “partial out” variance in test scores not relevant to science ability. After doing so, we expect the correlation between the science knowledge factors measured by the two assessment methods will be higher for students with LD, and will more accurately indicate the concurrent validity of the video-based assessment.
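The predicted difference in correlation strength between the two groups can be tested formally; one standard option is a Fisher r-to-z test for independent correlations, sketched below with purely illustrative values.

```python
# Fisher r-to-z test for a difference between two independent correlations
# (e.g., test/game-score correlation for general-education students vs.
# students with LD). The r and n values below are illustrative only.
import math
from scipy import stats

def fisher_z_test(r1, n1, r2, n2):
    """Two-tailed test for a difference between independent correlations."""
    z1, z2 = math.atanh(r1), math.atanh(r2)
    se = math.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    z = (z1 - z2) / se
    return z, 2 * (1 - stats.norm.cdf(abs(z)))

z, p = fisher_z_test(r1=0.75, n1=80, r2=0.45, n2=40)
print(f"z = {z:.2f}, p = {p:.3f}")
```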

  31. Participating in the Pilot • Game-based Interactive Life Science (GILS) Pilot testing 2011 – 2012 all year Learn more and sign up http://www.filamentgames.com/gils/ • Game-based Interactive Physical Science (GIPS) Pilot testing April – June 2011 Learn more and sign up http://www.filamentgames.com/gips/ • Coming Soon ~ Game-enhanced Earth Science

  32. META Education Research Group Multimedia Education & Technology Assessment Matthew T. Marino, CEO Debra Foley, CFO http://www.meta-erg.com/ 509.592.3760 (West) 860.459.2988 (East) 120 SW Cedar ST. Pullman, WA 99163 meta.erg@gmail.com
