


Recommendations for Technology and Innovation in Assessment

Edys S. Quellmalz

Michael J. Timms

Barbara C. Buckley

WestEd

Invited presentation at the Race to the Top Assessment Program Public and Expert Input Meeting, Boston, MA

November 13, 2009



Question 1: How can innovative technologies be deployed to create better assessments?

  • Break the mold, transform, don’t transition

  • Go beyond delivery, scoring, reporting

  • Take advantage of the capabilities of technology to represent domain principles and support use of “tools of the trade”

  • Focus new development on what is not currently well tested in paper formats, e.g., integrated knowledge and active processes

  • Reform test form designs and timing

  • Form collaboratives to develop collections of innovative tasks

  • Create a common core of state and classroom standards, specifications, and task banks

  • Create common platforms for authoring and administration





Increasing Use of Innovative Tasks and Simulations in Summative Assessments

  • PISA Computer-Based Science Assessment

  • 2009 NAEP Science Interactive Computer Tasks

  • Minnesota State Science Test

  • 2009 PISA Electronic Text

  • 2011 NAEP Writing

  • 2012 NAEP Technological Literacy



Technology Affordances

  • Rich, authentic environments and problems provide context and forge integration via models, discourse structures, and problem types

  • Multiple modalities (static, active, interactive)

  • Iterative, active processing and inquiry with multiple trials

  • Multiple representations and symbols/overlays

  • Multiple response formats (control of pacing, replay, iteration)

  • Multiple modalities may benefit English language learners (ELL) and students with disabilities (SWD)

  • Dynamic representations of spatial, causal, temporal science phenomena

  • Support access to collections of information sources and expertise

  • Support formal and informal forms of collaboration and social networking



SimScientists: Model Progressions



Advantages of Simulation-Based Assessments for Formative Assessment

  • Can log problem solving sequences during inquiry

  • Can provide immediate, individualized feedback

  • Can provide customized, graduated coaching

  • Can cue with highlighting, worked examples (a simple coaching sketch follows this list)

  • Can provide adaptive tasks and items
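
As a concrete illustration of graduated coaching (not the actual SimScientists coaching logic, which is not specified here), the sketch below escalates support with each incorrect attempt, moving from highlighting to a hint to a worked example. The messages and the function are invented for illustration.

```python
# Hypothetical graduated coaching: support escalates with each incorrect attempt.
COACHING_LEVELS = [
    "Highlight the organisms whose arrows are drawn incorrectly.",
    "Hint: arrows in a food web point from the organism being eaten to the one that eats it.",
    "Worked example: show one completed arrow (algae -> insect larvae) and ask the student to finish the web.",
]

def coach(attempts_failed: int) -> str:
    """Return the coaching message for the current number of failed attempts."""
    level = min(attempts_failed, len(COACHING_LEVELS)) - 1
    if level < 0:
        return ""  # no coaching before the first incorrect attempt
    return COACHING_LEVELS[level]

for failed in range(4):
    print(failed, "->", coach(failed))
```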



Example of SimScientists Formative Assessment Task

Figure 1 – Screen Shot of the Mountain Lake Food Web Embedded Assessment with Coaching



English Language Arts: Digital/Cyber Literacy

  • Authentic, integrated tasks representing discourse aims and structures

  • Multimodal/multimedia “tools of the trade”

    • Static, active, interactive

    • images, graphics, symbols, multimedia

  • Search and find

  • Comprehend

  • Highlight

  • Summarize, take notes

  • Select, assemble, represent, transform



Formative Assessment: Student Self-Assessment of Constructed Responses: Scientific Explanation

In embedded assessments students revise and evaluate constructed responses.

Diagram: students Write a response to a task using given criteria, Revise it with the help of an example, and then Evaluate their revised response.



Mathematics: Digital/Cyber Learning

  • Authentic, integrated tasks representing significant problem types

  • Multimodal/multimedia “tools of the trade”

    • Static, active, interactive

    • images, graphics, symbols, multimedia

  • Search and find

  • Analyze data, visualizations, simulations

  • Run iterative solutions

  • Transform representations (tables, graphs)

  • Select and present best evidence

  • Present, explain, display processes and solutions



SimScientists: Simulations for Iterative Investigations, Interpretations of Multiple Data Representations and Analyses

Figure 2 – Screen Shot of the Australian Grassland Population Dynamics Benchmark Assessment



Question 2: What would be the features of a system to develop, administer, score, and report with quality and cost-effectiveness?

  • Design templates and re-usable components for rapid and cost-effective development (a minimal template sketch follows this list)

  • Web-based for easy access from any site

  • Should run from a browser so that no software has to be loaded on school computers

  • Scoring should be computer-based and computer-assisted

  • Provide formative and summative reports to students and teachers
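
The design templates and re-usable components mentioned in the first bullet can be illustrated with a minimal sketch: a task structure is defined once and instantiated with different content. The class, field names, and render method below are hypothetical and are not drawn from any actual authoring platform.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SimulationTaskTemplate:
    """Hypothetical reusable template for a simulation-based assessment task."""
    ecosystem: str          # content slot: which simulated environment to load
    organisms: List[str]    # content slot: organisms shown in the simulation
    prompt: str             # task directive shown to the student
    targets: List[str]      # knowledge/skill targets used for scoring and reporting

    def render(self) -> dict:
        """Produce a delivery-ready task specification (e.g., for a web-based player)."""
        return {
            "simulation": {"environment": self.ecosystem, "organisms": self.organisms},
            "prompt": self.prompt,
            "scoring_targets": self.targets,
        }

# The same template authored twice with different content (organism lists are invented).
mountain_lake = SimulationTaskTemplate(
    ecosystem="mountain lake",
    organisms=["algae", "insect larvae", "trout"],
    prompt="Observe the organisms, then draw the food web.",
    targets=["food web structure", "energy flow"],
)
grassland = SimulationTaskTemplate(
    ecosystem="Australian grassland",
    organisms=["grass", "grasshopper", "lizard"],
    prompt="Run the simulation and describe how the populations change.",
    targets=["population dynamics", "data interpretation"],
)

print(mountain_lake.render())
print(grassland.render())
```

Because the structure is shared, the same template could feed both classroom formative tasks and secure benchmark forms.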



Technology Support for Teacher Scoring of Constructed Responses on Unit Benchmarks



Costs and Benefits

Benefits

  • No printing and shipping of assessments
  • No scanning of bubble sheets
  • No human-scoring sessions
  • Results could be sent electronically (less mailing)
  • Reuse of templates and components via authoring system
  • Allows authoring by developers and teachers
  • Easier to add accommodations such as large print or read-aloud (text-to-speech)

Additional Costs

  • Increased site administration costs due to need for more skilled personnel
  • Increased item development costs
  • Initial system development costs

Ways to limit additional costs

  • States could form consortia to spread the costs of system and assessment development
  • Strategic use of complex item types, e.g., matrix sampling of specific knowledge/skills (see the sketch after this list)
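
Matrix sampling limits cost by giving each student only a subset of the complex tasks while the group as a whole covers the full pool. A minimal sketch of the idea follows; the task names, class size, and tasks-per-student are made up for illustration.

```python
import random

# Hypothetical pool of complex, expensive-to-administer tasks.
task_pool = ["food-web simulation", "population-dynamics model",
             "data-transformation task", "written explanation"]

students = [f"student_{i:02d}" for i in range(1, 21)]
tasks_per_student = 2  # each student sees only part of the pool

random.seed(0)
assignments = {s: random.sample(task_pool, tasks_per_student) for s in students}

# Every task is still taken often enough to estimate group-level performance.
coverage = {t: sum(t in form for form in assignments.values()) for t in task_pool}
print(coverage)
```

Because each individual score rests on fewer tasks, matrix sampling suits group-level reporting better than high-stakes individual decisions.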



Question 3: How could the technology platform support development of high-quality interim assessments?

  • Design templates, specification shell, storyboard, and re-usable components for rapid and cost-effective development

  • Common task design specifications for core tasks at state and classroom levels

  • Common core collection of secure and public tasks

  • Models for embedding secure tasks in end-of-unit benchmarks



Multilevel Balanced State Assessment System

Diagram: A common core of standards, specifications, common tasks, and an item bank feeds assessments at all levels of the system. The state and district levels report proficiency by standards, while the classroom level uses benchmark summative unit assessments and embedded formative assessments.



SimScientists Embedded Formative Assessments and Unit Benchmark Assessments

Diagram: Embedded formative assessments are delivered as online assessments with feedback and coaching; they generate a progress report and lead into a follow-up classroom reflection activity. Benchmark summative unit assessments are delivered online without feedback; the teacher scores constructed responses, and a Bayes net combines the evidence into a proficiency report (see the sketch below).
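
The Bayes net in this flow combines evidence from many scored responses into an estimate of proficiency. As a minimal illustration of the idea (not the SimScientists network, whose structure and probabilities are not given here), the sketch below treats proficiency as a single hidden variable and applies Bayes' rule to a handful of item outcomes; the item names and all probabilities are invented.

```python
# Minimal Bayes-net-style inference: one latent proficiency node, observed item outcomes.
# All probabilities are illustrative, not estimates from real data.

prior = {"proficient": 0.5, "not_proficient": 0.5}

# P(correct | proficiency state) for each scored response (selected or constructed).
p_correct = {
    "food_web_item":    {"proficient": 0.85, "not_proficient": 0.30},
    "population_item":  {"proficient": 0.80, "not_proficient": 0.25},
    "explanation_item": {"proficient": 0.70, "not_proficient": 0.20},
}

observed = {"food_web_item": True, "population_item": True, "explanation_item": False}

def posterior(prior, p_correct, observed):
    """Apply Bayes' rule: P(state | responses) is proportional to P(state) * product of P(response | state)."""
    scores = {}
    for state, p in prior.items():
        for item, correct in observed.items():
            pc = p_correct[item][state]
            p *= pc if correct else (1.0 - pc)
        scores[state] = p
    total = sum(scores.values())
    return {state: s / total for state, s in scores.items()}

print(posterior(prior, p_correct, observed))
# approximately {'proficient': 0.77, 'not_proficient': 0.23}
```

A full Bayes net would add separate nodes for each knowledge target so the proficiency report can break results down by standard.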



SimScientists Embedded Formative Assessment Class Report



SimScientists Unit Benchmark Summative Proficiency Report



Developing Multilevel Balanced State Assessment Systems

  • The assessment systems leverage the power of collaboration to share the costs and logistics of fully developing, maintaining, and articulating science assessments.

  • The systems develop explicit plans for connecting and integrating assessment designs and results gathered from multiple levels of the system.

  • The science assessment systems document alignment of current and planned assessment tasks and items with content and performance standards. These standards are further specified to define the knowledge, skills, and strategies and levels of performance that comprise the assessment targets of a student outcome model.

  • The systems employ common task and item specifications to shape pools of tasks and items that can be accessed for assessments at multiple levels of the system and that will elicit and link evidence of achievement of science standards.

  • From Quellmalz, E.S. & Moody, M. (2004). Developing Multilevel State Science Assessment Systems. Report commissioned by the National Research Council Committee on Test Design for K-12 Science Achievement.



Developing Multilevel Balanced State Assessment Systems

  • The systems develop strategies for sampling from the collections to build and connect test forms at different levels of the system.

  • The science assessment systems design and implement professional development on assessment, item and task development, administration, and use of results.

  • The systems place an emphasis on the use of science assessment for learning, i.e., for diagnosis at the classroom level.

  • The assessment systems draw upon the capabilities of technologies to support assessment design, administration, scoring, interpretation, and assessment literacy.

  • From Quellmalz, E.S. & Moody, M. (2004). Models for Multilevel State Science Assessment Systems. Report commissioned by the National Research Council Committee on Test Design for K-12 Science Achievement.





Recommendations for Technology and Innovation in Assessment

http://simscientists.org

Edys S. Quellmalz

[email protected]

Michael J. Timms

[email protected]

Barbara C. Buckley

[email protected]



Recommendations on Psychometrics

  • Need more research on effective methods to assess learning in complex tasks in games and simulations.

  • Expand our range of psychometric tools to include such things as:

    • Bayes Nets

    • Artificial Neural Networks

    • Model Tracing

    • Rule-based methods (see the sketch after this list)

  • Education researchers need to work with other disciplines, such as computer science, to find additional methods.

  • Train future psychometricians in a wider range of methods.
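
Of the methods listed above, rule-based scoring of a logged problem-solving sequence is the simplest to illustrate. The sketch below checks an invented inquiry log against two hand-written rules (ran enough trials; varied only one factor between runs); the log format and rules are hypothetical, not drawn from any of the assessments described here.

```python
# Rule-based scoring of a logged problem-solving sequence (hypothetical log format).
log = [
    {"action": "set", "variable": "lynx_added", "value": 0},
    {"action": "run"},
    {"action": "set", "variable": "lynx_added", "value": 50},
    {"action": "run"},
    {"action": "set", "variable": "lynx_added", "value": 100},
    {"action": "run"},
]

def ran_multiple_trials(log, minimum=2):
    """Rule 1: the student ran the simulation at least `minimum` times."""
    return sum(1 for e in log if e["action"] == "run") >= minimum

def varied_one_factor(log):
    """Rule 2: between runs, at most one variable setting was changed."""
    changes = 0
    for e in log:
        if e["action"] == "set":
            changes += 1
        elif e["action"] == "run":
            if changes > 1:
                return False
            changes = 0
    return True

score = {"multiple_trials": ran_multiple_trials(log),
         "controlled_variation": varied_one_factor(log)}
print(score)  # {'multiple_trials': True, 'controlled_variation': True}
```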



Illustrative Innovative Tasks



SimScientists: Ecosystem Unit Benchmark Task

Screenshot of SimScientists Ecosystems Benchmark Assessment Showing a Food Web Diagram Interactively Produced by a Student After Observing the Behaviors of Organisms in the Simulated Australian Grasslands Environment.



NSF Project: Toward a Coordinated Framework for the Design of ICT Assessments

Quellmalz, E. S. & Kozma, R. (2003). Designing assessments of learning with technology. Assessment in Education, 10(3), 389-407.



Coordinated ICT Framework

ICT Assessment Framework



Integrated Performance Assessments with Technology (IPAT) Modular Design

  • Performance assessments designed to test problem-based reasoning using technology

    • Core components
      • planning
      • accessing information
      • reasoning with the information
      • drawing conclusions
      • communication



IPAT Modular Design

  • Modules can be based on an ICT strategy, technology tool, subject matter of the problem, or complexity

  • Modules can be inserted or deleted without disrupting the flow of the investigation

  • Permits custom design, adaptive assessment

  • Permits separate score reports for domain knowledge, strategies, technology use



ICT Performance Assessment

Middle School Level Prototype

  • Predator-Prey ICT Performance Assessment Task
    • Intended audience: grade 8 / pop. 2 / 13-year-olds
    • Problem within a science context that is well taught and learned
    • Modules for ICT strategies
    • Technologies
      • Common Internet and productivity tools
      • NetLogo modeling tool



Predator and Prey Performance Assessment



Too many hares!

  • Read the following passage.

Park rangers in Canada have observed that too many hares in the parks are causing problems. They eat small plants that other animals depend on for food. Some park rangers suggest that the government bring in more lynx (which eat hares) to help reduce the hare population.

The park service hired Dr. Kloss at the Arctic Research Institute to investigate. Dr. Kloss wants teams made up of students from different schools to help. You'll be working with two other students, Filo from York and Kari from Ottawa.

For this project you will use technology to answer the question: should the government bring in more lynx?



    Collect information, take notes, and cite sources

    Data / information to collect:

    • When does the hare population start to go down?

    • What are reasons that can cause it to go down?

    » Canadian lynx

    » The lynx and hare

    » Predator-prey cycles

    Notes

    Lynx mostly eat hares. They don’t eat much else. If there aren’t many hares, they start to get hungry.

    http://dspace.dial.pipex.com/agarman/



    Data from the last 4 years

    Access & Organize Information / Data

    Analyze & Interpret – infer trends & patterns

Here is the data for the number of hares in the parks over the last four years. Last year (2002) there were about 95,000 hares. The year before that (2001) there were about 80,000. In 2000, there were 25,000. And in 1999, there were only about 1,000 hares. Organize the data and describe the population trend. Pick a tool to use (one way to organize the data is sketched below).
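
One way a student (or a scoring key) might organize the four data points from the passage is a simple table with year-over-year change; the snippet below is only an illustration of the expected organization and trend description, not part of the assessment itself.

```python
# Hare counts taken from the passage above.
hares = {1999: 1_000, 2000: 25_000, 2001: 80_000, 2002: 95_000}

print(f"{'Year':<6}{'Hares':>10}{'Change':>10}")
previous = None
for year, count in sorted(hares.items()):
    change = "" if previous is None else f"{count - previous:+,}"
    print(f"{year:<6}{count:>10,}{change:>10}")
    previous = count
# The population rises every year; the largest jump was between 2000 and 2001.
```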



    More Data

    Collaborate

    Represent & Transform Information / Data

    Hi Maribel,

    I saw the data that you got from the Canadian park service. I did some more research and found a list that goes back 25 years and shows how many hares and lynx there were. I attached the information in a spreadsheet but I can’t understand it. We need to figure out how to make sense of all this. Can you find a better way to analyze and display the data?

    Thanks,Kari



    Transform Data



    Use Modeling Tools to Investigate, Compare, Test, Analyze & Interpret Information / Data

Your Web research indicated that although lynx depend on hares for food, introducing more lynx might decrease the hare population too quickly. But how quickly is too quickly?

Dr. Kloss would like to use a modeling program to predict what will happen to the hare and lynx populations if you added more lynx to the parks.

Once you've used the modeling tool, record your results below.

Size of hare population if lynx are NOT added to the park (2003 and 2008): increase a lot / increase a little / decrease a little / decrease a lot

Size of hare population if lynx ARE added to the park (2003 and 2008): increase a lot / increase a little / decrease a little / decrease a lot



    Use a model to predict outcomes
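
The task points students to the NetLogo modeling tool; as a rough stand-in, the sketch below runs a simple discrete-time predator-prey (Lotka-Volterra-style) model under two scenarios, with and without extra lynx. All parameters and starting values are invented for illustration and are not calibrated to the Canadian data.

```python
def simulate(hares, lynx, years=6, r=0.5, a=0.0004, c=0.000004, d=0.2):
    """Crude discrete-time predator-prey model; all parameters are illustrative."""
    history = [(round(hares), round(lynx))]
    for _ in range(years):
        hares, lynx = (
            max(0.0, hares + r * hares - a * hares * lynx),  # births minus predation
            max(0.0, lynx + c * hares * lynx - d * lynx),    # growth from prey minus deaths
        )
        history.append((round(hares), round(lynx)))
    return history

# Scenario 1: no lynx are added (assumed current lynx population of 500).
# Scenario 2: the park service adds lynx (assumed 2,000 in total).
for label, start_lynx in [("no lynx added", 500), ("lynx added", 2000)]:
    print(label, simulate(95_000, start_lynx))
```

In this illustrative run the hare population climbs for several years and then turns down as the lynx build up when none are added, and falls steeply from the start when extra lynx are introduced, which is the kind of comparison the recording form above asks students to make.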



    Create a presentation

  • Dr. Kloss would like you to prepare a short presentation (3 pages/slides) to the park service with your recommendations on what to do about the hare and lynx populations.

  • Be sure your presentation is clearly organized and includes:
    • A statement of the problem
    • Your group's recommendation
    • Information, data, and explanations to support your recommendation

