Presentation Transcript


  1. Evolution of Institutional Capacity to Support the Assessment-Change-Effectiveness Cycle: An Undergraduate Science Program Case Study. Dr. Mary Spencer, Dr. Chris Lobban, Dr. María Schefter, Dr. Greg Witteman, University of Guam

  2. Introduction • Pre-assessment of attendees • Synthesis of 5 years of student outcomes assessment at UOG (Dr. Spencer) • Evaluation and the RISE Program • Background and framework (Dr. Schefter) • Example from the classroom (Dr. Lobban) • Assessment and information technology (Dr. Witteman) • Interactive sharing and discussion • Wrap-up • Post-assessment

  3. UOG’s NIH RISE Program • RISE program is a broad and flexible grant for student, faculty, and institutional development. • Long-term goal is more minority PhDs in biomedical research. • Short-term goal is to increase motivation and capacity for biomedical research.

  4. UOG’s NIH RISE Program • Evaluation is required. • Goals/objectives for NIH must be in terms of measurable outcomes for students, faculty, or the institution. • Present program includes student apprenticeships in research labs and a science technology classroom, plus faculty development opportunities.

  5. How reluctant scientists are getting involved in evaluation • Evaluation standards • Engaging scientists • Goals and measurable objectives • Student input • Closing the loop

  6. The “big picture” • In the present paradigm of biology, life is organized into “levels” (or systems), with each level having “emergent” properties not seen in the parts. • Analogous to an institution of higher ed.? • The “big picture” of our assessment efforts. • Assessment of RISE is multi-level as well as multidisciplinary.

  7. Levels of Organization • Organism (individual bat) • Body system (skeletal system) • Organ (leg bone) • Part of an illustration in Lobban & Schefter (1997)

  8. Courses and workshops • Factual/conceptual knowledge of course content • Lab skills appropriate to course • Specific training in (e.g.) computer skills • Specialized science reading/writing skills (e.g., lab reports) • Links between course objectives and program/institutional/gen. ed. outcomes [Diagram: Educational outcomes, “levels” for assessment: Community > University > Majors > Courses & workshops]

  9. Majors (Discipline-specific training/education) • Factual/conceptual knowledge of the field • Proficiency in using scientific literature • Ability to perform appropriate data collection/analysis • Apprenticeship experiences in research labs [Diagram: Educational outcomes, “levels” for assessment: Community > University > Majors > Courses & workshops]

  10. University education • Reading/Writing/Analytical skills (GRE) • Critical Thinking skills • Computer literacy • Presentation skills • General education outcomes [Diagram: Educational outcomes, “levels” for assessment: Community > University > Majors > Courses & workshops]

  11. Career/Community level • Career success (as PhD researcher or other) • # of scientific findings • # of PhD researchers • Community service: science ed., biota / (endangered) species survey work [Diagram: Educational outcomes, “levels” for assessment: Community > University > Majors > Courses & workshops]

  12. NIH RISE Program: Assessment-change-effectiveness cycle(s)? [Diagram: assessment and change cycling across the levels: Community, University, Majors, Courses & workshops]

  13. Grass roots • Learning objectives • 3 parts (Mager) • Observable behavior (esp. verb… Bloom) • Of what…? • Criteria, e.g., scoring rubric

  14. COGNITIVE PROCESS DIMENSION (columns): Remember, Understand, Apply, Analyze, Evaluate, Create. KNOWLEDGE DIMENSION (rows): Factual knowledge, Conceptual knowledge, Procedural knowledge, Metacognitive knowledge.

  15. Source: Anderson, L.W. & D.R. Krathwohl. 2001.

  16. Grass roots • Strengths and Weaknesses • Student self ratings • Faculty ranking of skills by courses

  17. Pooled results of detail questions

  18. Faculty assessment of skills for courses: A. Students need this skill as a prerequisite. B. Students need the basic skill and I help them with it. C. I teach students this. D. Could be helpful in the course but not necessary. Cross out the skill if it is not useful in your course. Also, please put a star by the number if you think students will need this skill in most graduate biomedical/behavioral programs.
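  A minimal sketch of how responses to this rating scheme might be coded and pooled across courses (in the spirit of the “Pooled results of detail questions” slide). This is not part of the original presentation: the skill names, course codes, and the pool_ratings function are invented for illustration; only the A/B/C/D letter codes and the star convention come from the slide above.

```python
# Hypothetical sketch: pooling faculty ratings from the skills-by-courses survey.
# Letter codes mirror the slide: A = prerequisite, B = basic skill, faculty help,
# C = taught in the course, D = helpful but not necessary.
# A trailing "*" marks skills judged necessary for most graduate programs.
from collections import Counter, defaultdict

# Example responses keyed by skill; the skills and course codes are invented.
responses = {
    "reading primary literature": {"BI157": "C*", "BI225": "A*", "BI302": "B"},
    "basic statistics":           {"BI157": "D",  "BI225": "C*", "BI302": "A*"},
}

def pool_ratings(responses):
    """Tally the letter codes and grad-school stars for each skill across courses."""
    pooled = defaultdict(Counter)
    for skill, by_course in responses.items():
        for rating in by_course.values():
            starred = rating.endswith("*")
            code = rating.rstrip("*")
            pooled[skill][code] += 1
            if starred:
                pooled[skill]["grad_star"] += 1
    return pooled

for skill, counts in pool_ratings(responses).items():
    print(skill, dict(counts))
```

  A tally like this makes it easy to see, for each skill, how many courses treat it as a prerequisite versus teach it outright, which is the kind of summary the workshop used to link course-level objectives to program-level outcomes.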

  19. Introduction • Pre-assessment of attendees • Synthesis of 5 years of student outcomes assessment at UOG (Dr. Spencer) • Evaluation and the RISE Program • Background and framework (Dr. Schefter) • Example from the classroom (Dr. Lobban) • Assessment and information technology (Dr. Witteman) • Interactive sharing and discussion • Wrap-up • Post-assessment

  20. Typical syllabus

  21. Revised syllabus

  22. Source: Anderson, L.W. & D.R. Krathwohl. 2001.

  23. To gain an understanding of Pacific Island environments and the ecological principles on which they operate: the ecosystems (reefs, forests, savanna, wetlands); the biological, physical, and chemical processes and interactions that regulate these systems; and the ways in which humans affect and are affected by the natural environment. • Your understanding will be tested through your skills in: • interpreting – e.g., changing classification diagrams into text or vice versa; reading graphs • exemplifying – e.g., giving an example of … • classifying – e.g., being able to classify the trophic level of an animal from a food web diagram • summarizing – e.g., being able to summarize the process by which Darwin arrived at his hypothesis of atoll formation • inferring – e.g., drawing a logical conclusion from presented information • comparing – e.g., determining how similar things are as a criterion for applying analogy • explaining – e.g., explaining the cause of drought during El Niño

  24. “Understand the scientific process Darwin used and how his hypothesis of atoll formation was tested” became… • Be able to summarize the process by which Darwin arrived at his hypothesis. (Do NOT state or explain his hypothesis.) • Be able to explain why Darwin’s model of atoll formation was a scientific hypothesis (i.e., not a belief/statement of faith, nor idle speculation). • Using Darwin’s hypothesis, be able to infer the relative ages of two oceanic islands given maps of them. • Be able to recall what was done to test Darwin’s hypothesis.

  25. Evaluation Plan to Determine Program Outcomes (NIGMS-MORE) • Describe formative evaluations: evaluations carried out during the course of implementing activities to assess their suitability for the need. • Describe summative evaluations: evaluations carried out at the end of the activity to assess the outcome. • Discuss the use of qualitative and quantitative data collection methods. • State when in the course of implementing the activity data will be collected. • State any plans to make a mid-course modification of activities if formative evaluations indicate a need to change. • Provide examples of questionnaires to be used to collect qualitative information such as perceptions of participants. • State how data will be analyzed and provide the types of statistical methods to be used, if any, to test the reliability of the data. • Identify who will collect and analyze the data and provide the credentials of the person(s) selected for collection and analysis of data. • Source: NIGMS-MORE Division

  26. Source: Anderson, L.W. & D.R. Krathwohl. 2001.

  27. Introduction • Pre-assessment of attendees • Synthesis of 5 years of student outcomes assessment at UOG (Dr. Spencer) • Evaluation and the RISE Program • Background and framework (Dr. Schefter) • Example from the classroom (Dr. Lobban) • Assessment and information technology (Dr. Witteman) [Click to continue slide show or to download next ppt file] • Interactive sharing and discussion • Wrap-up • Post-assessment
