
Closing the Loop


Presentation Transcript


1. Closing the Loop: What to do when your assessment data are in. Raymond M. Zurawski, Ph.D.

2. Cycle of Assessment
• Step 1: Identify Program Goals
• Step 2: Specify Intended Learning Outcomes (Objectives)
• Step 3: Select Assessment Methods
• Step 4: Implement: Data Collection
• Step 5: Analyze and Interpret the Data (Make Sense of It All)
• Step 6: Report Findings and Conclusions
• Step 7: Close the Loop (Use the Results)
• Step 8: Revise the Assessment Plan and Continue the Loop

3. Assessment Methods

4. Assessment Methods Used at SNC
• Examination of student work
  • Capstone projects
  • Essays, papers, oral presentations
  • Scholarly presentations or publications
  • Portfolios
• Locally developed examinations
• Major field or licensure tests
• Measures of professional activity
  • Performance at internship/placement sites
  • Supervisor evaluations
• Miscellaneous indirect measures
  • Satisfaction/evaluation questionnaires
  • Placement analysis (graduate or professional school, employment)

5. Other Methods Used
• Faculty review of the curriculum
• Curriculum audit
• Analysis of existing program requirements
• External review of curriculum
• Analysis of course/program enrollment and drop-out rates

6. What to Know About Methods
• Different assessment methods may yield different estimates of program success
• Measures of student self-reported abilities and student satisfaction may yield different estimates of program success than measures of student knowledge or student performance
• What are your experiences here at SNC?
• Good assessment practice involves the use of multiple methods; multiple methods provide greater opportunities to use findings to improve learning

7. What to Know About Methods
• Even if the question is simply: are students performing…
  • …way better than good enough?
  • …good enough?
  • …NOT good enough?
• The answer may depend on the assessment method used to answer that question

8. Implementation

9. Implementation
• Common problems
  • Methodological problems: instrument still in development; method misaligned with program goals
  • Human or administrative error
  • Response/participation rate problems: insufficient numbers (few majors; reliance on volunteers or convenience samples; poor response rate); insufficient incentives or motivation
  • High "costs" of administration
  • "Other" (no assessment, no rationale)
• NOTE: Document the problems; they provide one set of directions for "closing the loop"

10. Document Your Work!
• "If you didn't document it, it never happened…" (the clinician's mantra)

11. Analyzing and Interpreting Data

12. Analyzing and Interpreting Data
• General issues
  • Think about how information will be examined and what comparisons will be made, even before the data are collected
• Provide descriptive information (see the sketch below)
  • Percentages ("strongly improved", "very satisfied")
  • Means and medians on examinations
  • Summaries of scores on products and performances
• Provide comparative information
  • External norms, local norms, comparisons to previous findings
  • Comparisons to Division and College norms
  • Subgroup data (students in various concentrations within the program; year in program)
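
A minimal sketch of the descriptive and comparative summaries named above, using only the Python standard library. The ratings, exam scores, and external norm are invented placeholders for illustration, not SNC data:

    from collections import Counter
    from statistics import mean, median

    # Hypothetical senior-survey ratings (5-point scale) and exam scores.
    survey_ratings = [5, 4, 4, 3, 5, 4, 2, 5, 4, 4]
    exam_scores = [78, 85, 91, 67, 88, 74, 95, 81]

    # Descriptive information: percentage of responses in each category.
    counts = Counter(survey_ratings)
    for rating in sorted(counts):
        pct = 100 * counts[rating] / len(survey_ratings)
        print(f"Rating {rating}: {pct:.0f}%")

    # Descriptive information: central tendency on the examination.
    print(f"Exam mean = {mean(exam_scores):.1f}, median = {median(exam_scores):.1f}")

    # Comparative information: local mean against an assumed external norm.
    EXTERNAL_NORM = 80.0
    print(f"Difference from external norm: {mean(exam_scores) - EXTERNAL_NORM:+.1f}")

The same pattern extends to subgroup comparisons: filter the lists by concentration or year in program before summarizing.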

13. Interpretations
• Identify patterns of strength
• Identify patterns of weakness
• Seek agreement about innovations and changes in educational practice, curricular sequencing, advising, etc. that program staff believe will improve learning

14. Ways to Develop Targeted Interpretations
• What questions are most important to you? What's the story you want to tell?
  • This helps you decide how you want results analyzed
• Seek results reported against your criteria and standards of judgment so you can discern patterns of achievement

15. Interpreting Results in Relation to Standards
• Some programs establish target criteria
• Examples (see the sketch below)
  • If the program is effective, then 70% of portfolios evaluated will be judged "Good" or "Very good" in design
  • The average alumni rating of the program's overall effectiveness will be at least 4.5 on a 5.0-point scale
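
A minimal sketch of checking results against the two example target criteria above; the portfolio judgments and alumni ratings are invented for illustration:

    from statistics import mean

    # Hypothetical portfolio judgments and alumni ratings.
    portfolio_judgments = ["Good", "Very good", "Fair", "Good", "Very good",
                           "Good", "Poor", "Very good", "Good", "Good"]
    alumni_ratings = [4.0, 5.0, 4.5, 5.0, 4.0, 4.5]

    # Criterion 1: at least 70% of portfolios judged "Good" or "Very good".
    good_or_better = sum(j in ("Good", "Very good") for j in portfolio_judgments)
    pct = 100 * good_or_better / len(portfolio_judgments)
    print(f"Good or better: {pct:.0f}% -> {'met' if pct >= 70 else 'not met'}")

    # Criterion 2: average alumni rating of at least 4.5 on a 5.0-point scale.
    avg = mean(alumni_ratings)
    print(f"Average rating: {avg:.2f} -> {'met' if avg >= 4.5 else 'not met'}")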

16. Standards and Results: Four Basic Relationships
• Four broad relationships are possible (see the sketch below):
  • A standard was established that students met
  • A standard was established that students did not meet
  • No standard was established
  • The planned assessment was not conducted or not possible
• Some drawbacks to establishing target criteria
  • Difficulties in picking the target number
  • Results exceeding the standard do not justify inaction
  • Results not meeting the standard do not represent failure
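
The four relationships reduce to a small decision rule; a sketch, with hypothetical inputs:

    def standards_relationship(conducted, standard_set, met):
        """Classify one outcome into the four broad relationships above."""
        if not conducted:
            return "Planned assessment not conducted (or not possible)"
        if not standard_set:
            return "No standard was established"
        return "Standard established and met" if met else "Standard established but not met"

    # Example: the assessment ran, a standard existed, and students fell short.
    print(standards_relationship(conducted=True, standard_set=True, met=False))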

17. Reporting Results

18. Reporting Assessment Findings
• Resources
  • An Assessment Workbook (Ball State University)
  • Another Assessment Handbook (Skidmore College)
• An important general consideration: who is your audience?

19. Sample Report Formats
• Skidmore College
• Old Dominion University
• Ohio University
• George Mason University
• Montana State University (History)
• Other programs
• Institutional Effectiveness Associates, Inc.

20. Local Examples of Assessment Reports
• Academic Affairs Divisions
  • Division of Humanities and Fine Arts
  • Division of Natural Sciences
  • Division of Social Sciences
• Student Life
• Mission and Heritage

21. OK, but just HOW do I report…
• Q: How to report survey findings, Major Field Test data, performance on scoring rubrics, etc.?
• A: Don't reinvent the wheel; consult local Assessment and Program Review reports for examples

22. Closing the Loop: Using Assessment Results

23. Closing the Loop: The Key Step
• To be meaningful, assessment results must be studied, interpreted, and used
• Using the results is called "closing the loop"
• We conduct outcomes assessment because the findings can be used to improve our programs

24. Closing the Loop
• Where assessment and evaluation come together…
• Assessment: gathering, analyzing, and interpreting information about student learning
• Evaluation: using assessment findings to improve institutions, divisions, and departments
• (Upcraft and Schuh)

25. Why Close the Loop?
• To inform program review
• To inform planning and budgeting
• To improve teaching and learning
• To promote continuous improvement (rather than "inspection at the end")

26. Steps in Closing the Assessment Loop
• Briefly report the methodology for each outcome
• Document where students are meeting the intended outcome
• Document where they are not meeting the outcome
• Document decisions made to improve the program and the assessment plan
• Refine the assessment method and repeat the process after proper time for implementation

27. Ways to Close the Loop
• Curricular design and sequencing
• Restrictions on navigation of the curriculum
• Weaving more of "x" across the curriculum
• Increasing opportunities to learn "x"

28. Additional Ways to Close the Loop
• Strengthening advising
• Co-designing curriculum and co-curriculum
• Development of a new model of teaching and learning based on research or others' practice
• Development of learning modules or self-paced learning to address typical learning obstacles

  29. And don’t forget… • A commonly reported use of results is to refine the assessment process itself • New or refined instruments • Improved methods of data collection (instructions, incentives, timing, setting, etc.) • Changes in participant sample • Re-assess to determine the efficacy of these changes in enhancing student learning. Raymond M. Zurawski, Ph.D.

30. A Cautionary Tale
• Beware the Lake Wobegon effect…
• …where all the children are above average…

31. A Cautionary Tale
• When concluding that "no changes are necessary at this time"…
• Standards may have been met, but…
  • There may nonetheless be many students failing to meet expectations: how might they be helped to perform better?
  • There may nonetheless be ways to improve the program

32. Facilitating Use of Findings
• Laying appropriate groundwork
  • Assessment infrastructure
  • Conducive policies
• Linking assessment to other internal processes (e.g., planning, budgeting, program review, etc.)
• Establishing an annual assessment calendar

33. Factors that Discourage Use of Findings
• Failure to inform relevant individuals about the purposes and scope of assessment projects
• Raising concerns and obstacles over unimportant issues
• Competing agendas and lack of sufficient resources

34. What You Can Do (Fulks, 2004)
• Schedule time to record data directly after completing the assessment
• Prepare a simple table or chart to record results (see the sketch below)
• Think about the meaning of these data and write down your conclusions
• Take the opportunity to share your findings with other faculty in your area, as well as with those in other areas
• Share the findings with students, if appropriate
• Report on the data and what you have learned at discipline and institutional meetings
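
One way to keep the simple results table suggested above is a plain CSV file; a minimal sketch in which the column names, file name, and rows are all assumptions for illustration:

    import csv

    # Hypothetical outcome rows recorded right after an assessment.
    rows = [
        {"outcome": "Written communication", "method": "Portfolio rubric",
         "result": "72% Good or better", "target_met": "Yes"},
        {"outcome": "Quantitative reasoning", "method": "Locally developed exam",
         "result": "Mean 68/100", "target_met": "No"},
    ]

    # Write one row per outcome so the file can be shared and appended to later.
    with open("assessment_results.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["outcome", "method", "result", "target_met"])
        writer.writeheader()
        writer.writerows(rows)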

35. Group Practices that Enhance Use of Findings
• Disciplinary groups' interpretation of results
• Cross-disciplinary groups' interpretation of results (library and information resource professionals, student affairs professionals)
• Integration of students, TAs, internship advisors, or others who contribute to students' learning

36. External Examples of Closing the Loop
• University of Washington
• Virginia Polytechnic University
• St. Cloud State University
• Montana State University (Chemistry)

37. Closing the Loop: Good News!
• Many programs at SNC have used their results to make program improvements or to refine their assessment procedures

38. Local Examples of Closing the Loop
• See the HLC Focused Visit Progress Report narrative on the OIE website
• See Program Assessment Reports and Program Review Reports on the OIE website

39. Ask your colleagues in … about their efforts to close the loop
• Music
• Religious Studies
• Chemistry
• Geology
• Business Administration
• Economics
• Teacher Education
• Student Life
• Mission and Heritage
• Etc.

40. One Example of Closing the Loop: Psychology
• Added a capstone in light of a curriculum audit
• Piloting changes to course pedagogy to improve performance on the General Education assessment
• Established PsycNews in response to student concerns about career/graduate study preparation
• Replaced the pre-test Major Field Test administration with a lower-cost, reliable, and valid externally developed test

41. Conclusions

42. Conclusions
• Programs are relatively free to choose which aspects of student learning they wish to assess
• Assessing and reporting matter, but…
• Taking action on the basis of good information about real questions is the best reason for doing assessment

43. Conclusions
• The main thing… is to keep the main thing… the main thing! (Douglas Eder, SIU-E)

44. Conclusions
• It may be premature to discourage the use of any method
• It may be premature to establish a specific target criterion
• It may be premature to require strict adherence to a particular reporting format
• Remember that the sample reports discussed here are examples, not necessarily models

45. Oh, and by the way… Document Your Work!

46. Additional Resources
• Internet Resources for Higher Education Outcomes Assessment (at NC State)

47. Concluding Q & A: A One-Minute Paper
• What remains most unclear or confusing to you about closing the loop at this point?
