This presentation, delivered by Brian Zuckerman at the American Evaluation Association, explores how evaluation of the Pittsburgh Science of Learning Center (PSLC) has adapted over time. The PSLC aims to integrate cognitive theory and computational modeling to improve educational outcomes. Changes in the PSLC's structure, research focus, and evaluation strategies across its lifecycle show a shift toward practical, data-driven learning research. This evolution reflects the need for agile evaluation in a changing educational landscape, while the Center draws on its extensive data resources for insights into effective learning.
It's an Evolution: Changing Roles and Approaches in the Evaluation of the Pittsburgh Science of Learning Center
Brian Zuckerman
American Evaluation Association
November 2, 2011
Pittsburgh Science of Learning Center
PI: Ken Koedinger (CMU); Co-PIs: Chuck Perfetti (UPitt), David Klahr (CMU), Lauren Resnick (UPitt)
Ed technology + wide dissemination = "basic research at scale"
• Purpose: Leverage cognitive theory and computational modeling to identify the conditions that cause robust student learning.
• Goals: Fundamentally transform
  • translational research in education
  • generation of learning science theory
PSLC: Transforming Translational Research
[Slide diagram: Researchers, Schools, LearnLab; example LearnLab courses: Algebra Intelligent Tutor, Chemistry Virtual Lab, English Reading Tutor]
• LearnLab = social & technical infrastructure to support field-based basic research
  • Controlled experiments in real courses
  • Educational technologies => Data!
• Practice-relevant discovery
  • What lab-based theory survives translation?
  • Field-based data drives discovery
PSLC: Transforming Theory Generation
• PSLC capacity building
  • Vast student data repository
  • New field: Educational Data Mining
• 2010 KDD Cup
  • Annual competition of the Knowledge Discovery and Data Mining (KDD) conference
  • Task: Predict step-by-step performance of 10,000 algebra students across a school year (illustrated in the sketch below)
  • pslcdatashop.web.cmu.edu/KDDCup/
• Emerging "Computational Learning Science"
  • Data mining & computational modeling techniques
  • New data sources: brain imaging, classroom video, student interactions with ed tech
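To make the prediction task concrete, here is a minimal sketch, assuming a hypothetical log layout of (student, skill, correct-first-attempt) rows rather than the actual KDD Cup data schema: a smoothed per-skill success rate used as a baseline predictor of whether a student will get a tutoring step right on the first try.

```python
# Minimal baseline sketch for step-level performance prediction.
# Hypothetical log layout: (student_id, skill, correct_first_attempt) rows;
# not the actual KDD Cup schema or any PSLC code.
from collections import defaultdict

def fit_skill_rates(history):
    """Estimate a smoothed correct-first-attempt rate for each skill."""
    counts = defaultdict(lambda: [0, 0])  # skill -> [num_correct, num_attempts]
    for _student, skill, correct in history:
        counts[skill][0] += int(correct)
        counts[skill][1] += 1
    # Laplace smoothing: rarely seen skills drift toward a neutral 0.5
    return {skill: (c + 1) / (n + 2) for skill, (c, n) in counts.items()}

def predict(rates, skill):
    """Predicted probability of a correct first attempt on a step tagged with `skill`."""
    return rates.get(skill, 0.5)

# Example with made-up log rows
history = [
    ("s1", "combine-like-terms", 1),
    ("s1", "distribute", 0),
    ("s2", "combine-like-terms", 1),
    ("s2", "distribute", 1),
]
rates = fit_skill_rates(history)
print(predict(rates, "combine-like-terms"))  # 0.75
print(predict(rates, "isolate-variable"))    # 0.5 (unseen skill)
```

The point of the sketch is only to show what a "step-by-step prediction" is: an estimated probability of success for each (student, step) pair; competition entries used far richer student- and skill-level models.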
PSLC: Research to Practice
• Translation is built in
  • LearnLab embeds experiments & scientific data collection within running courses
  • Many outcomes of 200+ learning studies incorporated into courses
• Ed tech dissemination partners
  • Carnegie Learning, Inc.: >600,000 K-12 math students a year; Cognitive Tutor 2010/11 release uses PSLC results (Butcher & Aleven, 2008); in use in all 50 states
  • Open Learning Initiative: thousands of college student users a semester
PSLC Activities Relative to Program Goals
• Conducts large-scale research
  • Large-scale theory development
  • Formation/nucleation of new fields or subdisciplines
• Develops and maintains infrastructure useful to the community
  • Large-scale infrastructure development (LearnLab, DataShop, tools)
• Educates a diverse, highly competent, and globally engaged workforce
  • Education of graduate students and postdoctoral researchers
  • Broadening participation (e.g., PSLC summer internships)
  • Ancillary education efforts that require center mass, serving the broader learning sciences community (e.g., PSLC summer school)
• Forges valuable partnerships
  • Among PSLC researchers in interdisciplinary collaborations
  • With industry/external stakeholders
Changes in PSLC Organization
• Reorganization around renewal
  • Four clusters became three thrusts
  • Change in a co-PI on the Executive Committee
Evaluation Context
• Evaluator context
  • STPI funded by the Center
  • STPI came on board around the first site visit in 2005
  • Center passed through its five-year review; now in year 7
• Shifts in PSLC activities and logic model
  • Change in organization
  • Shifting emphasis on the goal of theoretical framework development
  • Other smaller changes (e.g., shift in diversity goals toward long-term expansion of the field)
Evolution of Evaluation Effort Matches Changes in PSLC Lifecycle
• Years 1-2: Predominantly focused on growth of "centerness," with data collection internal to the Center
  • Management processes (interviews)
  • Collaboration formation (interviews, collaboration survey)
  • Development of Center-wide language and culture (interviews)
• Years 3-4: In preparation for the site review, focus shifted to:
  • External investigators' knowledge of Center research and predictions of its future value
  • Theoretical framework development / wiki analysis
  • Bibliometric analysis of publications to date
• Years 5-6: Center reorganization led to a refocus on "centerness"
  • Return to the interview approach of Years 1-2
  • Analysis of changes in thrust plans over time
• Present: Evaluation effort largely dormant until plans for an SLC-wide evaluation become evident
  • Some continuing activities around sustainability
Data Collection Changes Over Time (Partial list of measures and approaches)
Reflections
• Evaluation in a complex context
  • Pressure from the government funder to demonstrate results even during the first five-year period
  • Change in PSLC organization and goals over time
• Required a nimble evaluation approach as a result
  • Evaluation plans developed in Years 1 and 5 served as a point of departure rather than a blueprint
  • Difficult-to-maintain balance between the need for continuity in data collection and shifting priorities