
Cognitively-Based Assessment Enabled by Technology

Presentation Transcript


  1. Cognitively-Based Assessment Enabled by Technology
  Eva L. Baker
  UCLA Graduate School of Education & Information Studies
  Center for the Study of Evaluation (CSE)
  National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
  AERA 44.38, April 2001

  2. Technology Principles for the Design and Use of Educational Information
  • Problem definition
  • Assessment
  • Data interpretation and representation
  • Examples and inferred principles
  • Key research

  3. Problem
  • Global notions of assessment design—matched or aligned to standards, illustrating a preferred format, normed interpretation
  • Naive view that mere access to data will improve performance
  • Policy now expects multiple purposes to be served by limited assessment(s)
  • One-at-a-time mentality
  • Assessment “systems” remain to be achieved

  4. To Be Productive in Technology-Based Assessment/Improvement Systems
  • Design reusable components—tasks, data modules, scoring protocols, reporting
  • Specify details guiding the integration of system elements
  • Plan for rapidly changing technology
  • Include in the system data elements, user models, and interpretive options
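The component view on this slide can be made concrete. Below is a minimal sketch in Python, assuming a hypothetical component model: the class names, fields, and the way scoring protocols are registered are illustrative, not the actual CRESST interfaces. It shows how tasks, scoring protocols, and reporting modules might be specified separately so each can be reused or swapped as the underlying technology changes.

  # Sketch of reusable assessment components; names are illustrative assumptions.
  from dataclasses import dataclass, field
  from typing import Callable, Dict, List

  @dataclass
  class AssessmentTask:
      """A reusable task: prompt plus the cognitive demand it targets."""
      task_id: str
      cognitive_demand: str          # e.g. "problem solving", "content understanding"
      prompt: str

  @dataclass
  class ScoringProtocol:
      """A reusable scoring rule mapping a raw response to a score."""
      name: str
      score: Callable[[str], float]

  @dataclass
  class ReportModule:
      """A reusable reporting element rendering scores for some audience."""
      audience: str                  # e.g. "teacher", "administrator"
      render: Callable[[Dict[str, float]], str]

  @dataclass
  class AssessmentSystem:
      """Integrates tasks, scoring, and reporting so each part can be replaced."""
      tasks: List[AssessmentTask] = field(default_factory=list)
      protocols: Dict[str, ScoringProtocol] = field(default_factory=dict)
      reports: List[ReportModule] = field(default_factory=list)

      def score_responses(self, responses: Dict[str, str]) -> Dict[str, float]:
          # Each task is scored by whichever protocol is registered for it.
          return {tid: self.protocols[tid].score(text)
                  for tid, text in responses.items()}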

  5. Assessment Design Strategy
  • Start with cognitive demands
  • Guide task development, test integration, and scoring elements
  • Implement in subject matter domains or skills (soft or hard)
  • Monitor precursor or developmental sequence
  • Review for linguistic appropriateness
  • Determine key data elements or processes to be collected

  6. Families of Cognitive Demands: Both Domain-Dependent and Domain-Independent Features

  7. Authoring Tools
  • Assessment tasks and tests
  • Data representation
  • Interpretation
  • Public reporting

  8. CRESST Authoring System Plan: Part 1
  • Templates based on current model-based assessments
  • Web-based with expert and peer review
  • Automated scoring using extant- or expert-based systems
  • Correspondence with “content and performance standards” or other system goals
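One way to picture the template idea: a model-based template carries the fixed structure (prompt pattern, rubric, standards links, review status) and the author fills only the content slots. The sketch below is a hypothetical illustration; the template name, rubric criteria, standard label, and topic are placeholders, not items from the actual CRESST authoring system.

  # Hypothetical sketch of template-based task authoring; fields are assumptions.
  from string import Template

  class ModelBasedTemplate:
      """Fixed assessment structure; authors fill only the content slots."""

      def __init__(self, name, prompt_pattern, rubric, standards):
          self.name = name
          self.prompt = Template(prompt_pattern)   # "$topic"-style slots
          self.rubric = rubric                     # reused scoring criteria
          self.standards = standards               # linked content standards

      def fill(self, **slots):
          """Instantiate a concrete task from the reusable template."""
          return {
              "template": self.name,
              "prompt": self.prompt.substitute(**slots),
              "rubric": self.rubric,
              "standards": self.standards,
              "review_status": "pending expert/peer review",  # Web-based review step
          }

  # Example with placeholder rubric, standard, and topic:
  explanation = ModelBasedTemplate(
      name="explanation-essay",
      prompt_pattern="Explain the causes and consequences of $topic.",
      rubric=["prior knowledge", "use of evidence", "misconceptions"],
      standards=["example content standard"],
  )
  task = explanation.fill(topic="an assigned historical event")
  print(task["prompt"])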

  9. Principles for Assessment Design Today
  • Contain cost by automation
  • Start with pervasive rather than ephemeral elements (e.g., cognitive demands)
  • Implement in content and skill domains
  • Assess and correct linguistic complexity and other likely sources of construct-irrelevant variance
  • Generate reusable structures, including support by users (teachers, administrators, publishers)
  • Link to other existing system elements

  10. Automation: Part 2
  • Depends on realization of “Learnome” maps of domains
  • Proofs of concept in literacy, geography, math, technical skills, chemistry
  • Selection of primitives or objects
  • Links to Web-enabled content classification
  • Default conditions supporting validity for purpose, reliability, and flexibility
  • Interactive user trials in real and controlled settings
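The “Learnome” notion of mapping a domain into primitives that link to Web-classified content can be pictured as a small prerequisite graph. The sketch below is purely illustrative; the primitive names, links, and content tags are assumptions about what such a map might hold, and the precursor walk echoes the developmental-sequence monitoring named on slide 5.

  # Illustrative "Learnome"-style map: primitives with prerequisite links and
  # tags into Web-classified content. Names and tags are assumptions only.
  learnome = {
      "place value": {"requires": [], "content_tags": ["math/number-sense"]},
      "fraction equivalence": {"requires": ["place value"],
                               "content_tags": ["math/fractions"]},
      "proportional reasoning": {"requires": ["fraction equivalence"],
                                 "content_tags": ["math/ratio", "math/fractions"]},
  }

  def precursors(primitive, graph):
      """Walk the map to list everything a learner needs before a primitive."""
      needed = []
      for req in graph[primitive]["requires"]:
          needed.extend(precursors(req, graph) + [req])
      return needed

  print(precursors("proportional reasoning", learnome))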

  11. Principles for a Rapidly Changing World
  • Automate design based on Learnome primitives
  • Technological support for test administration
  • Automate data collection for on-the-fly technical quality monitoring
  • Create “add an egg” versions with talkies
  • Develop comparability indices

  12. System Data Interpreter(s) and Reporting Systems
  • Early version—QSP data manager: intuitive for novice users, disaggregation, query-based, longitudinal story for individual, unit, institution, or program
  • Multiple purposes—feedback, evaluation, accountability, and individual diagnosis
  • Additional data—meeting requirements or supporting validity interpretations
  • Top-down, bottom-up
  • Massive differences in user knowledge requirements and expectations
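The disaggregation and longitudinal-query idea can be illustrated over a toy record set. The sketch below uses fabricated toy values and assumed field names; it is not QSP's actual schema or query interface.

  # Toy sketch of disaggregation and longitudinal querying; fields and values
  # are illustrative assumptions, not the QSP data manager's real schema.
  from collections import defaultdict
  from statistics import mean

  records = [
      # year, school, subgroup, scale score (toy values)
      {"year": 1999, "school": "A", "subgroup": "ELL", "score": 310},
      {"year": 1999, "school": "A", "subgroup": "non-ELL", "score": 355},
      {"year": 2000, "school": "A", "subgroup": "ELL", "score": 325},
      {"year": 2000, "school": "A", "subgroup": "non-ELL", "score": 360},
  ]

  def disaggregate(rows, *keys):
      """Group rows by the requested keys and report the mean score per group."""
      groups = defaultdict(list)
      for row in rows:
          groups[tuple(row[k] for k in keys)].append(row["score"])
      return {group: mean(scores) for group, scores in groups.items()}

  # Longitudinal story for each subgroup across years, per school:
  print(disaggregate(records, "school", "subgroup", "year"))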

  13. New Version
  • Expanded user set
  • User-selected data elements and representations
  • Local flexibility expanded
  • Scenarios to simulate consequences of selected actions on groups, schools, or system

  14. Report Card Generator
  • Automated representations of extant data elements
  • Iconic, metaphorical, intuitive
  • Multiple media—Web
  • Institutional, program, individuals
  • Inexpensive and fast
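A minimal sketch of the “inexpensive and fast” generation step, assuming simple extant data elements: scores flow directly into a Web-ready fragment with a crude iconic bar. Function names, fields, and the icon scale are illustrative only, not the CRESST generator's design.

  # Illustrative report card generator; all names and values are assumptions.
  def icon_bar(percent, width=10):
      """Render a score as a simple iconic bar (intuitive, at-a-glance)."""
      filled = round(percent / 100 * width)
      return "#" * filled + "-" * (width - filled)

  def report_card_html(unit_name, elements):
      """Turn extant data elements into a Web-ready report fragment."""
      rows = "".join(
          f"<tr><td>{label}</td><td>{value}%</td>"
          f"<td><code>{icon_bar(value)}</code></td></tr>"
          for label, value in elements.items()
      )
      return f"<h2>{unit_name}</h2><table>{rows}</table>"

  # Example: an institutional report from a handful of toy data elements.
  print(report_card_html("School A", {"Reading proficiency": 62, "Math proficiency": 48}))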

  15. Current Example: Static Data Representation

  16. Next Generation Reporting
  • Multiple metaphors
  • Intuitive, dynamic, and progressive
  • Extensible and portable
  • User selection of options based on personal mental model

  17. http://vv.arts.ucla.edu/

  18. Principles for Data Representation and Interpretation
  • Explicit user models—purposes and element preferences
  • Responsive timing
  • Local automation of some functions
  • Representation flexibility
  • System supports for mental models and partial knowledge
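An explicit user model can be as small as a record of role, purpose, and preferred data elements, with representation defaults keyed to purpose and open to override. The sketch below is an assumption-laden illustration of that principle, not a description of any CRESST implementation.

  # Sketch of an explicit user model driving representation choices;
  # fields and the selection rule are illustrative assumptions.
  from dataclasses import dataclass
  from typing import List

  @dataclass
  class UserModel:
      role: str                      # e.g. "teacher", "parent", "administrator"
      purpose: str                   # e.g. "diagnosis", "accountability"
      preferred_elements: List[str]  # data elements this user wants surfaced

  def choose_representation(user: UserModel) -> str:
      """Pick a default representation from the user's purpose; users may override."""
      if user.purpose == "diagnosis":
          return "student-level progress chart"
      if user.purpose == "accountability":
          return "school-level trend summary"
      return "interactive query view"

  teacher = UserModel("teacher", "diagnosis", ["item-level errors", "growth"])
  print(choose_representation(teacher))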

  19. Key Research
  • Learnome mapping and primitive development
  • Limits of on-the-fly technical quality supports
  • Flexibility by mental model of user(s)
  • Updates of “prescription” selection and scenario building
  • Integrating the user in the representation
