Teaching Day 2013

Presentation Transcript


  1. Teaching Day 2013 How Are Our Students Measuring Up? Using Assessment to Align Our Practices and Inform Our Decisions Dr. Peggy Maki, Education Consultant Specializing in Assessing Student Learning

  2. Foci The Context of a Meaningful and Useful Assessment Process The Principles and Processes of a Collaborative Commitment to Assessment

  3. Coding Curricular Maps • Aligning for Learning and Assessing • Identifying or Designing Assessment Methods That Provide Evidence of Product and Process • Chronologically Collecting and Assessing Evidence of Student Learning • Reporting on Results of Scoring Evidence of Student Learning • Establishing Soft Times and Neutral Zones for Faculty and Other Professionals to Interpret Analyzed Results and Develop a Plan to Improve or Advance Student Learning

  4. Problem? Restructuring students' incorrect understanding of physics concepts became the work of physics faculty at the University of Colorado (the PhET project). That is, physics faculty became intellectually curious about how to answer this question in order to improve students' performance over the chronology of their learning.

  5. How well do your students…

  6. Within a course or module or learning experience? • Along the chronology of their studies and educational experiences? • From one subject or topic or focus or context to another one such as from an exercise to a case study or internship?

  7. Integrated Learning: cognitive, psychomotor, and affective forms of representation within contexts

  8. Some Research on Learning that Informs Teaching, Learning, and Assessment Learners Create Meaning

  9. Threshold Concepts • pathways central to the mastery of a subject or discipline that change the way students view a subject or discipline, prompting students to bring together various aspects of a subject that they heretofore did not view as related (Land, Meyer, & Smith).

  10. People learn differently and may hold onto folk or naive knowledge, incorrect concepts, misunderstandings, or false information • Deep learning occurs over time, enabling transference

  11. Learning Progressions • Knowledge-based, web-like networks of interrelated actions, behaviors, or ways of thinking, transitioning, and self-monitoring. These may not develop in a linear progression, thus necessitating formative assessment along the trajectory of learning. Movements toward increased understanding (Hess).

  12. Deep Learning Occurs When Students Are Engaged in Their Learning

  13. Learning Strategies of Successful Students in All Majors • Writing beyond what is visually presented during a lecture • Identifying clues to help organize information during a lecture • Evaluating notes after class • Reorganizing notes after class

  14. Comparing note-taking methods with peers • Using one's own words while reading to make notes • Evaluating one's understanding while reading • Consolidating reading and lecture notes Source: Calvin Y. Yu, Director of the Cook/Douglass Learning Center, Rutgers University

  15. What do you expect your students to demonstrate, represent, or produce by the end of their program of study or by the end of their undergraduate or graduate studies? • What chronological barriers or difficulties do students encounter as they learn--from the moment they matriculate? • How well do you identify and discuss those barriers with students and colleagues and then track students’ abilities to overcome them so that increasingly “more” students achieve at higher levels of performance?

  16. Research or Study Questions • Collaboratively developed • Open-ended • Coupled with learning outcome statements • Developed at the beginning of the assessment process

  17. The Seeds of Research or Study Questions

  18. Some Examples of Research/Study Questions

  19. See handout 2

  20. II. The Principles and Processes of a Collaborative Commitment to Assessment • A. Coding in Curricular Maps: Where and How Students Learn • Helps us determine coherence among our educational practices, which enables us, in turn, to design appropriate assessment methods (See handouts 3-4; a map-coding sketch follows below) • Identifies gaps in learning opportunities that may account for students' achievement levels • Provides a visual representation of students' learning journey
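
  If a program keeps its curricular map in data form, the gap check can be automated. Below is a minimal Python sketch, assuming the common Introduced/Reinforced/Mastered (I/R/M) coding convention; the outcomes, courses, and codes are hypothetical stand-ins for whatever scheme handouts 3-4 prescribe.

  ```python
  # A coded curricular map as a simple data structure, assuming the
  # common I/R/M convention: Introduced, Reinforced, Mastered.
  # Outcomes, courses, and codes here are hypothetical placeholders.
  curricular_map = {
      "written communication":  {"ENG 101": "I", "ENG 201": "R", "CAPSTONE": "M"},
      "quantitative reasoning": {"MATH 110": "I", "STAT 210": "R"},
      "information literacy":   {"ENG 101": "I"},
  }

  def find_gaps(cmap, required=("I", "R", "M")):
      """Report outcomes whose learning journey is missing a coded stage."""
      gaps = {}
      for outcome, courses in cmap.items():
          missing = [code for code in required if code not in courses.values()]
          if missing:
              gaps[outcome] = missing
      return gaps

  print(find_gaps(curricular_map))
  # {'quantitative reasoning': ['M'], 'information literacy': ['R', 'M']}
  ```

  Keeping the map as data rather than a static table lets a program re-run the gap check whenever the curriculum changes.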

  21. Cognitive Levels of Learning: Revised Bloom's Taxonomy (Anderson et al.)

  22. Helps students make meaning of the journey and holds them accountable for their learning over time • Helps students develop their own learning map—especially if they chronologically document learning through eportfolios

  23. B. Aligning for Learning and Assessing (See Handouts 5, 6-9, and 10)

  24. C. Identifying or Designing Assessment Methods that Provide Evidence of Product and Process

  25. Some Direct Methods to Assess Students’ Learning Processes • Think Alouds: Pasadena City College, “How Jay Got His Groove Back and Made Math Meaningful” (Cho & Davis) • Word edit bubbles • Observations in flipped classrooms • Students’ deconstruction of a problem or issue (PLEs in eportfolios can reveal this - tagging, for example)

  26. Student recorder’s list of trouble spots in small group work or students’ identification of trouble spots they encountered in an assignment • Results of conferencing with students • Results of asking open-ended questions about how students approach a problem or address challenges • Use of reported results from adaptive or intelligent technology

  27. Focus on hearing about or seeing the processes and approaches of successful and not so successful students • Analysis of “chunks of work” as part of an assignment because you know what will challenge or stump students in those chunks

  28. Some Direct Assessment Methods to Assess Students’ Products • Scenarios—such as online simulations • Critical incidents • Mind mapping • Questions, problems, prompts

  29. Problem with solution: Any other solutions? • Chronological use of case studies • Chronological use of muddy problems • Analysis of video • Debates • Data analysis or data conversion

  30. Some Indirect Methods that Probe Students' Learning Experiences and Processes • SALG (salgsite.org): Student Assessment of Their Learning Gains • Small Group Instructional Diagnosis • Interviews with students about their learning experiences: how those experiences did or did not foster desired learning, and the challenges they faced and continue to face. (See Handout 11)

  31. D. Chronologically Collecting and Assessing Evidence of Student Learning

  32. Your Method of Sampling Ask yourself what you want to learn about your students and when you want to learn it: • All students • Random sampling of students • Stratified random sampling based on your demographics—informative about patterns of performance that can be addressed for specific populations, such as non-native speakers (see the sampling sketch below)
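
  For programs that script their sampling, here is a minimal Python sketch of the stratified option, assuming a roster held as a list of records; the strata, sizes, and field names are hypothetical.

  ```python
  import random

  # Hypothetical roster; the stratum field stands in for whatever
  # demographic your institution tracks (e.g., non-native speakers).
  students = [
      {"id": i, "stratum": "non-native" if i % 4 == 0 else "native"}
      for i in range(1, 201)
  ]

  def stratified_sample(population, frac, seed=1):
      """Draw the same fraction from each stratum so small
      populations stay visible in the results."""
      rng = random.Random(seed)
      by_stratum = {}
      for s in population:
          by_stratum.setdefault(s["stratum"], []).append(s)
      sample = []
      for members in by_stratum.values():
          k = max(1, round(frac * len(members)))
          sample.extend(rng.sample(members, k))
      return sample

  picked = stratified_sample(students, frac=0.10)
  print(len(picked), sum(1 for s in picked if s["stratum"] == "non-native"))
  # 20 5 -- a plain 10% random sample could easily under-represent
  #         the smaller stratum; stratifying guarantees its share.
  ```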

  33. Scoring. Faculty: • Determine when work will be sampled. • Identify who will score student work (faculty, emeritus faculty, advisory board members, others?). • Establish a time and place to norm scorers for inter-rater reliability on an agreed-upon scoring rubric. (See Handout 12; an agreement-statistics sketch follows below)
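
  Norming sessions are usually checked with an agreement statistic. The sketch below computes Cohen's kappa, one standard measure of inter-rater reliability; the rubric scale and scores are hypothetical, and the norming protocol itself is the one in Handout 12.

  ```python
  from collections import Counter

  def cohens_kappa(scores_a, scores_b):
      """Agreement between two scorers, corrected for chance agreement."""
      n = len(scores_a)
      observed = sum(a == b for a, b in zip(scores_a, scores_b)) / n
      freq_a, freq_b = Counter(scores_a), Counter(scores_b)
      expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
      return (observed - expected) / (1 - expected)

  # Hypothetical rubric scores (1-4) from two scorers on ten papers,
  # e.g., after a norming session.
  rater1 = [4, 3, 3, 2, 4, 1, 2, 3, 4, 2]
  rater2 = [4, 3, 2, 2, 4, 1, 2, 3, 3, 2]
  print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")  # kappa = 0.72
  ```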

  34. E. Reporting on Results of Scoring Evidence of Student Learning Office of Institutional Research (IR), Director of Assessment, or “designated other”: • Analyzes and represents scoring or testing results that can be aggregated and disaggregated to represent patterns of achievement and to answer the guiding research or study question(s) (see the aggregation sketch below) • Develops a one-page Assessment Brief
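
  Where results live in a spreadsheet, the aggregate/disaggregate step is short in analysis code. A minimal sketch using pandas, with hypothetical column names and scores:

  ```python
  import pandas as pd

  # Hypothetical scoring results; column names are illustrative only.
  scores = pd.DataFrame({
      "student": range(1, 9),
      "population": ["native", "native", "non-native", "native",
                     "non-native", "native", "non-native", "native"],
      "rubric_score": [4, 3, 2, 4, 3, 3, 2, 4],
  })

  # Aggregated: the overall pattern of achievement.
  print(scores["rubric_score"].mean())  # 3.125

  # Disaggregated: achievement by representative population --
  # the comparison a one-page Assessment Brief might chart.
  print(scores.groupby("population")["rubric_score"].agg(["mean", "count"]))
  ```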

  35. The Assessment Brief Is organized around issues of interest, not the format of the data (the narrative or verbal part of the brief). Reports results using graphics and comparative formats (the visual part of the brief, such as trends over time or achievement across representative populations).

  36. F. Establishing Soft Times and Neutral Zones for Faculty and Other Professionals to Interpret Analyzed Results • Identify patterns against criteria and cohorts (if possible) • Tell the story that explains the results: triangulate with other reported data, such as results of student surveys. • Determine what you wish to change or revise, or how you want to innovate, and develop a timetable to reassess once changes are implemented. (See Handouts 13-14)

  37. A Problem-based Assessment Framework

  38. What if we…. Collaboratively use what we learn from this approach to assessment to design the next generation of curricula, pedagogy, instructional design, educational practices, and assignments to help increasingly more students successfully pass through trouble spots or overcome learning obstacles;

  39. and, thereby, collaboratively commit to fostering students’ enduring learning in contexts other than the ones in which they initially learned. (See Handout 15 to identify where you see the need to build your program’s assessment capacity.)

  40. Works Cited • Cho, J., and Davis, A. 2008. “How Jay Got His Groove Back and Made Math Meaningful.” Pasadena City College. http://www.cfkeep.org/html/stitch.php?s=13143081975303&id=18946594390037 • Hess, K. 2008. Developing and Using Learning Progressions as a Schema for Measuring Progress. National Center for Assessment. http://www.nciea.org/publications/CCSSO2_KH08.pdf • Land, R., Meyer, J.H.F., and Smith, J., Eds. 2010. Threshold Concepts and Transformational Learning. Rotterdam: Sense Publishers. • Anderson, L.W., Krathwohl, D.R., Airasian, P.W., and Cruikshank, K.A., Eds. 2001. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. Boston, MA: Allyn and Bacon.

  41. Maki, P. 2010. Assessing for Learning: Building a Sustainable Commitment Across the Institution. 2nd ed. Sterling, VA: Stylus Publishing, LLC. • National Research Council. 2002. Knowing What Students Know: The Science and Design of Educational Assessment. Washington, D.C.: National Academy Press. • Yu, C.Y. “Learning Strategies Characteristic of Successful Students.” In Maki, P. 2010, p. 139.
