
Research on Learning: Implications for Assessment in Higher Education



Presentation Transcript


  1. Research on Learning: Implications for Assessment in Higher Education Mark Wilson & Kathleen Scalise, UC Berkeley

  2. Overview of Today’s Presentation • Set the context • What are some problems with assessment in higher education? • Overview of the logic & findings from Knowing What Students Know (KWSK) • A conceptual scheme for assessment • How cognitive and measurement science together provide for major advances in assessment • Principles of assessment design and use • Give an example of an assessment system • BEAR Assessment System • Applied to undergraduate Chemistry (“ChemQuery”)

  3. An Example of Problems with Current Educational Assessment “One of the persistent dilemmas in education is that students spend time practicing incorrect skills with little or no feedback. Furthermore, the feedback they receive is often neither timely nor informative. For the less capable student, unguided practice can be practice in doing tasks incorrectly.” — NRC report “Knowing What Students Know,” p. 87

  4. An Example: CS 173¹ • Week 1: “Went to my first CS 173 discussion today. Went over what we are going to cover in the class. Sounds like some cool stuff…. The only thing bugging me right now is that I am not officially enrolled in CS 173 yet. I am fourth on the waitlist.” ¹ Not the real course number.

  5. Week 4 • “Damn homework…. I went to the lab to work on it. Everyone shortly there after came in and started working on their CS 173 homework. So I ended up staying around to help everyone the best I could since I was the only one to have finished the homework…. I have no clue if it is correct, though.”

  6. Week 8 • “Almost all of Monday was spent in the OCF working on CS 173 homework again. What made this severely frustrating is that I was unable to solve the problem that I spent all day on; no one I know was able to solve that problem…. Severely frustrated, I went on home.”

  7. Week 9 • “Midterm grade: 63/100. The mean was 55.5. Standard deviation was around 18. Would have liked mean + SD, but I will live. Still beat the mean which is what is really important.”

  8. Week 17 • “Went and took the exam. Kind of stupid. Bunch of short answer with some other stuff that was never covered in the homework. I found out afterwards that about a third to half of the test lifted from last year's final. So everyone who had it or had read it knew how to answer those questions perfectly. Of course I had not seen it, let alone had a copy for the test…. Needless to say the curve will be skewed.”

  9. Some concerns about Assessment in Higher Education • Assessments frequently fail to provide: • useful feedback to students • useful “feedforward” to instructors • useful “feedforward” to administrators • Narrowing of instruction by teaching to tests with restricted performance outcomes • Narrowing of student learning engagement when metacognitive needs are not satisfied

  10. The Assessment Triangle • Cognition: model of how students represent knowledge & develop competence in the domain • Observation: tasks or situations that allow one to observe students’ performance • Interpretation: method for making sense of the data • The three vertices must be coordinated!

  11. Scientific Foundations of Assessment • Advances in the Sciences of Thinking and Learning -- the cognition vertex • informs us about what observations are sensible to make • Contributions of Measurement and Statistical Modeling -- the interpretation vertex • informs us about how to make sense of the observations we have made

  12. Advances in Sciences of Thinking & Learning • The most critical implications for assessment are derived from study of the nature of competence and the development of expertise in specific curriculum domains. • Knowledge organization • Characteristics of expertise • Metacognition • Multiple paths to competence • Preconceptions and mental models • Situated knowledge and expertise

  13. Some Summary Points • Contemporary knowledge from the cognitive sciences strongly implies that assessment practices need to move beyond discrete bits and pieces of knowledge to encompass the more complex aspects of student achievement • Instructional programs and assessment practices based on cognitive theory exist for some areas of the curriculum • Further work is needed to: • translate research findings for practical use • develop models of learning for all areas of the curriculum

  14. Advances in Measurement: Beyond Models of General Proficiency • Three general sets of measurement issues that can be accommodated by various models • continua vs classes • single vs multiple attributes • status vs change • Report describes a progression of models and methods of increasing complexity
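(The report presents these models conceptually rather than as code. A minimal sketch, assuming a Rasch-type model as the simplest member of such a progression -- one continuous attribute, measured as status at a single time point; the names and values below are illustrative, not from the report:)

```python
import math

def rasch_prob(theta: float, delta: float) -> float:
    """Probability of a correct response under a Rasch (1PL) model,
    where theta is student proficiency and delta is item difficulty,
    both on the same logit scale (one continuum, status only)."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

# A student half a logit above an item's difficulty succeeds ~62% of the time.
print(round(rasch_prob(theta=0.5, delta=0.0), 2))  # 0.62
```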

  15. Assessment Design Principles Assessment design should always be based upon: • a model of student learning • well-designed and tested items • a clear sense of the inferences about student competence that are desired • for the particular context of use

  16. Implications for Assessment Practice In the classroom: • assessment should be an integral part of instruction • students should get information about particular qualities of their work and what they can do to improve • students must understand learning goals and landmark performances along the way • cognitive science findings need to be made user-friendly

  17. Assessment Practice, cont. • Report envisions systems of assessments that cut across contexts and that are: • comprehensive • coherent • continuous • We need to shift the emphasis toward the classroom where learning occurs • Example -- BEAR assessment system

  18. The BEAR Assessment System Example: The ChemQuery project

  19. Reminder… The Assessment Triangle: Cognition, Observation, Interpretation

  20. BEAR Assessment System (ChemQuery) • Cognition: 3 ChemQuery progress variables • Observation: ChemQuery items & scoring guides • Interpretation: multidimensional IRT model
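(The slide names the components but not their specification. A hedged sketch, assuming a between-item multidimensional structure in which each item taps exactly one of the three progress variables; the dimension names, difficulties, and profile values are illustrative assumptions, not the project's actual calibration:)

```python
import math

# Illustrative only: each item is assumed to tap one progress variable,
# so a student's profile is a vector of three proficiencies (in logits).
PROGRESS_VARIABLES = ("matter", "change", "stability")

def item_prob(profile: dict, item: dict) -> float:
    """Rasch-type response probability for an item keyed to a single
    progress variable (between-item multidimensionality)."""
    theta = profile[item["variable"]]  # proficiency on the tapped dimension
    return 1.0 / (1.0 + math.exp(-(theta - item["difficulty"])))

# Hypothetical student profile and item.
profile = {"matter": 0.8, "change": -0.2, "stability": 0.1}
item = {"variable": "change", "difficulty": 0.5}
print(round(item_prob(profile, item), 2))  # ~0.33
```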

  21. Student Model Progress Variables from ChemQuery • Matter is composed of atoms arranged in various ways: composition, structure, properties and amount of matter. • Change is associated with rearrangements of atoms: type, progression and conservation in change. • Stability is maintained unless change occurs with energy input: possibilities, influence and effort of stability.

  22. Major levels within each variable (read from bottom up): • Generation: Students use the models to generate new knowledge and to extend models. (~graduate school) • Construction: Students integrate scientific understanding into full working models of the domain. (~upper division) • Formulation: Students combine unirelational ideas, building more complex knowledge structures in the domain. (~lower division) • Recognition: Students begin to recognize normative scientific ideas, attaching meaning to unirelational concepts. (~high school) • Notions: Students bring real-world ideas, observation, logic and reasoning to explore scientific problem-solving. (~middle-school)
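(A hedged sketch of how these levels might be encoded as a level-keyed scoring guide for raters; the numeric codes and helper function are assumptions for illustration, not ChemQuery's actual rubric encoding:)

```python
# Illustrative scoring-guide lookup for the five ChemQuery levels (1 = lowest).
LEVELS = {
    1: "Notions: real-world ideas, observation, logic and reasoning",
    2: "Recognition: attaches meaning to unirelational normative ideas",
    3: "Formulation: combines ideas into more complex knowledge structures",
    4: "Construction: integrates ideas into full working models of the domain",
    5: "Generation: uses the models to generate and extend new knowledge",
}

def describe(score: int) -> str:
    """Return the progress-level description for a rater-assigned score (1-5)."""
    return LEVELS[score]

# e.g. the "different structural formulas" answer to the smell item (slide 25)
# would sit at level 2, Recognition.
print(describe(2))
```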

  23. ChemQuery Examples of items from our instrument: “Both of the solutions have the same molecular formula (C4H8O2), but butyric acid smells bad and putrid while ethyl acetate smells good and sweet. Explain why these two solutions smell differently.” [Structures shown: ethyl acetate (C4H8O2), butyric acid (C4H8O2)]

  24. ChemQuery Level One: Notions Response 1: I think there could be a lot of different reasons as to why the two solutions smell differently. One could be that they're different ages, and one has gone bad or is older which changed the smell. Response 2: Using chemistry theories, I don't have the faintest idea, but using common knowledge I will say that the producers of the ethyl products add smell to them so that you can tell them apart. Response 3: Just because they have the same molecular formula doesn't mean they are the same substance. Like different races of people: black people, white people. Maybe made of the same stuff but look different.

  25. ChemQuery Level Two: Recognition Response: "They smell differently b/c even though they have the same molecular formula, they have different structural formulas with different arrangements and patterns.”

  26. Interpretation: student profile

  27. Interpretation: track student over time

  28. ChemQuery To help ALL students increase understanding of chemistry. [Figure: Fall 2000 Student Gains, Grouped by Pretest Score (Low / Middle / High); pretest vs. post-test score levels.]
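(Only the chart's labels survive in this transcript. The sketch below shows the kind of pre/post comparison behind such a figure, grouping students into Low / Middle / High by pretest level and reporting mean gains; the scores are placeholders, not the Fall 2000 data:)

```python
from statistics import mean

def gains_by_pretest_group(scores, cuts=(-0.25, 0.75)):
    """Group (pretest, posttest) score pairs into Low/Middle/High by the
    pretest value and report the mean gain within each group."""
    groups = {"Low": [], "Middle": [], "High": []}
    for pre, post in scores:
        if pre < cuts[0]:
            groups["Low"].append(post - pre)
        elif pre < cuts[1]:
            groups["Middle"].append(post - pre)
        else:
            groups["High"].append(post - pre)
    return {g: round(mean(v), 2) for g, v in groups.items() if v}

# Placeholder (pretest, posttest) level scores; NOT the Fall 2000 results.
students = [(-1.0, 0.0), (-0.5, 0.5), (0.0, 0.5), (0.5, 1.0), (1.0, 1.5), (1.5, 1.5)]
print(gains_by_pretest_group(students))  # {'Low': 1.0, 'Middle': 0.5, 'High': 0.25}
```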

  29. To know what they know. And how to help. ChemQuery team: Jennifer Claesgens, Kathleen Scalise, Angelica Stacy, Rebecca Krystiniak, Sheryl Mebane, Karen Draney, Mark Wilson. Funding: NSF. Contact: mrwilson@socrates.berkeley.edu, kms@uclink.berkeley.edu
