Dimensions of Success for STEM Learning (DOS): Observation Tool for Assessing STEM Learning in Out-of-School Time Settings

Presentation Transcript


  1. Dimensions of Success for STEM Learning (DOS): Observation Tool for Assessing STEM Learning in Out-of-School Time Settings. 4H Webinar, June 16, 2009

  2. Background

  3. Why Is STEM Afterschool Assessment Important?
  • Increasing concern about declining interest in STEM and STEM careers (Campbell et al., 2002; Jacobs & Simpkins, 2005; NSF, 2008)
  • Concern about assessment of STEM afterschool
  • Informal Learning in Science Afterschool (ILSA) Survey Data
    • Over half of programs (55%) had no formal evaluation for science programming, N = 736 (Dahlgren et al., 2008)

  4. New STEM Assessment Tools: Program Rather than Outcome
  • Inquiry Science Instruction Observation Protocol (ISIOP)
    • Still in development
    • Classroom (school-based) STEM observation tool
  • Dimensions of Success in STEM Learning (DOS)

  5. Development of DOS

  6. What does DOS offer?
  • Observation tool to assess the quality of science programming
    • Informal learning/inquiry based
    • Afterschool programming
  • Flexible, so it can be used to assess science programming outside of these domains
    • Easily modifiable
  • DOS can be used with other assessment tools
    • Add DOS to a broader program assessment (e.g., PPRS) to specifically evaluate science programming
  • DOS can be used to assess a range of program types (e.g., community based, local affiliate of a larger organization, etc.)

  7. What does DOS offer?
  • Tool designed to be used by evaluators with varying levels of assessment experience
    • Experience in science/education highly recommended but not required
    • Evaluation experience highly recommended but not required
  • Supplemental materials
    • Detailed rating rubric
  • High reliability (94% initial inter-rater reliability, IRR)
  • Clear language based on observable phenomena
    • Curtails differences of interpretation
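  The "94% initial IRR" refers to inter-rater reliability. As an illustration only (the slides do not prescribe a particular formula), the sketch below computes simple percent agreement between two observers' dimension ratings; the dimension names and the 1-4 scale are invented for the example, not taken from the DOS rubric.

```python
# Minimal sketch of a percent-agreement inter-rater reliability (IRR) check
# between two observers' dimension ratings. The dimension names and the 1-4
# scale below are illustrative assumptions, not the actual DOS rubric.

def percent_agreement(ratings_a: dict, ratings_b: dict) -> float:
    """Fraction of commonly rated dimensions on which the observers agree."""
    shared = set(ratings_a) & set(ratings_b)
    if not shared:
        raise ValueError("No dimensions rated by both observers")
    matches = sum(ratings_a[dim] == ratings_b[dim] for dim in shared)
    return matches / len(shared)

# Hypothetical ratings from a single observed activity
observer_1 = {"engagement": 3, "inquiry": 4, "relevance": 2, "reflection": 3}
observer_2 = {"engagement": 3, "inquiry": 4, "relevance": 3, "reflection": 3}

print(f"Percent agreement: {percent_agreement(observer_1, observer_2):.0%}")
```

  Percent agreement is the simplest reliability check; a chance-corrected statistic such as Cohen's kappa is often reported alongside it when a rating scale has only a few categories.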

  8. NSF Framework Categories*
  • Awareness, knowledge, or understanding of STEM concepts, processes, or careers
  • Engagement or interest in STEM concepts, processes, or careers
  • Attitude toward STEM-related topics or capabilities
  • Behavior resulting from engagement in STEM activities
  • Skills based on engagement in STEM activities
  *Friedman (2008)

  9. New Domains
  • NSF Framework categories: Awareness, Knowledge or Understanding • Engagement or Interest • Attitude • Behavior • Skills
  • New DOS domains: Programmatic Features • Engagement/Interest • Content Knowledge & Competence and Reasoning • Career Knowledge/Acquisition & Attitude/Behavior

  10. Dimensions of Success: Assessment Domains and Dimensions

  11. Mechanics of DOS

  12. Rating Sheet
  • Qualitative descriptions
  • Quantitative rating
  • Type directly into the sheet
  • Space will expand automatically
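  To make the pairing of qualitative descriptions with quantitative ratings concrete, here is a minimal sketch of how one rating-sheet entry might be represented. The field names and the 1-4 scale are assumptions for illustration, not the layout of the actual DOS sheet.

```python
# Hypothetical structure for one DOS-style rating-sheet entry: each dimension
# pairs free-text qualitative evidence with a numeric rating. Field names and
# the 1-4 scale are assumptions, not the actual DOS sheet.
from dataclasses import dataclass, field

@dataclass
class DimensionRating:
    dimension: str            # illustrative name, e.g. "Engagement"
    qualitative_notes: str    # evidence drawn from field notes
    rating: int               # assumed 1-4 scale

@dataclass
class RatingSheet:
    program: str
    observer: str
    date: str
    entries: list = field(default_factory=list)

sheet = RatingSheet(program="Program A", observer="Observer 1", date="2009-06-16")
sheet.entries.append(DimensionRating(
    dimension="Engagement",
    qualitative_notes="Most youth stayed on task throughout the build activity.",
    rating=3,
))
print(sheet)
```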

  13. Recommendations for Using the Tool
  • At least two observers
    • Especially important during the start of assessment
    • Establish reliability of observer(s) (IRR)
  • Write field notes of the observation
    • Use field notes to be reflective about the observation
  • Do not rate the activity immediately after observing
    • Take time to reflect on the observation, then rate the activity
    • Complete the rating before the next observation
  • Each observer should rate the activity individually, then come together for a focus-group discussion to reach consensus
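  As a small sketch of the individual-rating-then-consensus step (again with invented dimension names and scale), the snippet below lists the dimensions on which two observers disagree so the consensus discussion can focus on those points.

```python
# Sketch: identify dimensions rated differently by two observers so the
# consensus discussion can focus on them. Names and scale are invented.

def disagreements(ratings_a: dict, ratings_b: dict) -> dict:
    """Map each dimension rated differently to the pair of ratings given."""
    return {
        dim: (ratings_a[dim], ratings_b[dim])
        for dim in set(ratings_a) & set(ratings_b)
        if ratings_a[dim] != ratings_b[dim]
    }

observer_1 = {"engagement": 3, "inquiry": 4, "relevance": 2}
observer_2 = {"engagement": 3, "inquiry": 3, "relevance": 2}

for dim, (r1, r2) in disagreements(observer_1, observer_2).items():
    print(f"Discuss '{dim}': Observer 1 rated {r1}, Observer 2 rated {r2}")
```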

  14. [Diagram: Observer 1 and Observer 2 across Program A, Program B, and Program C]

  15. [Diagram, continued: Observer 1 and Observer 2 across Program A, Program B, and Program C]

  16. Insert rubric

  17. Training

  18. Final Discussion

  19. References/Suggested Reading
  Campbell, P. B., Jolly, E., Hoey, L., & Perlman, L. K. (2002). Upping the numbers: Using research-based decision making to increase diversity in the quantitative disciplines. Report commissioned by the General Electric Foundation.
  Dahlgren, C. T., Noam, G. G., & Larson, J. D. (2008). Findings for year one data for the Informal Learning and Science Afterschool Study. Paper presented at the annual meeting of the American Educational Research Association, New York, NY.
  Friedman, A. (Ed.). (2008). Framework for evaluating impacts of informal science education projects. Arlington, VA: National Science Foundation.
  Jacobs, J. E., & Simpkins, S. D. (Eds.). (2005). Leaks in the pipeline to math, science, and technology careers. New Directions for Child and Adolescent Development, 110. San Francisco, CA: Jossey-Bass.
  National Science Foundation. (2008, January 10). Science education brings together government and corporate leaders. Retrieved March 20, 2008, from the National Science Foundation Web site: http://www.nsf.gov/news/news_summ.jsp?cntn_id=110966
  Yohalem, N., & Wilson-Ahlstrom, A., with Fisher, S., & Shinn, M. (2007, March). Measuring youth program quality: A guide to assessment tools. Washington, DC: The Forum for Youth Investment, Impact Strategies, Inc.

  20. Contact Information
  If you have any further questions, please contact: pear@mclean.harvard.edu
