Closing the Loop

What to do when your assessment data are in

Raymond M. Zurawski, Ph.D.

Cycle of Assessment

  • Step 1: Identify Program Goals
  • Step 2: Specify Intended Learning Outcomes (Objectives)
  • Step 3: Select Assessment Methods
  • Step 4: Implement Data Collection
  • Step 5: Analyze and Interpret the Data (Make Sense of It All)
  • Step 6: Report Findings and Conclusions
  • Step 7: Close the Loop (Use the Results)
  • Step 8: Revise the Assessment Plan and Continue the Loop

Assessment Methods

Assessment Methods Used at SNC
  • Examination of student work
    • Capstone projects
    • Essays, papers, oral presentations
    • Scholarly presentations or publications
    • Portfolios
    • Locally developed examinations
  • Major field or licensure tests
  • Measures of professional activity
    • Performance at internship/placement sites
    • Supervisor evaluations
  • Miscellaneous Indirect Measures
    • Satisfaction/evaluation questionnaires
    • Placement analysis (graduate or professional school, employment)

Other Methods Used
  • Faculty review of the curriculum
      • Curriculum audit
      • Analysis of existing program requirements
  • External review of curriculum
  • Analysis of course/program enrollment, drop-out rates

What to Know About Methods
  • Notice that different assessment methods may yield different estimates of program success
      • Measures of student self-reported abilities and student satisfaction may yield different estimates of program success than measures of student knowledge or student performance
        • What are your experiences here at SNC?
      • Good assessment practice involves use of multiple methods; multiple methods provide greater opportunities to use findings to improve learning

What to Know About Methods
  • Even if the question is simply…
    • Are students performing…
      • …way better than good enough?
      • …good enough?
      • …NOT good enough?
    • The answer may depend on the assessment method used to answer that question

Implementation

Implementation
  • Common Problems
    • Methodological problems
      • Instrument in development; method misaligned with program goals
    • Human or administrative error
    • Response/participation rate problems
      • Insufficient numbers (few majors; reliance on volunteers or a convenience sample; poor response rate); insufficient incentives or motivation
    • High “costs” of administration
    • “Other” (no assessment, no rationale)
  • NOTE: Document the problems; doing so provides one set of directions for ‘closing the loop’

Document Your Work!
  • “If you didn’t document it, it never happened…”

The clinician’s mantra

Analyzing and Interpreting Data

Analyzing and Interpreting Data
  • General Issues
      • Think about how the information will be examined and what comparisons will be made, even before the data are collected
      • Provide Descriptive information
        • Percentages (‘strongly improved’, ‘very satisfied’)
        • Means, medians on examinations
        • Summaries of scores on products, performances
      • Provide Comparative information
        • External norms, local norms, comparisons to previous findings
        • Comparisons to Division, College norms
        • Subgroup data (students in various concentrations within program; year in program)

Interpretations
  • Identify patterns of strength
  • Identify patterns of weakness
  • Seek agreement about innovations, changes in educational practice, curricular sequencing, advising, etc., that program staff believe will improve learning

Ways to Develop Targeted Interpretations
  • What questions are most important to you? What’s the story you want to tell?
    • Answering these helps you decide how you want results analyzed
  • Seek results reported against your criteria and standards of judgment so you can discern patterns of achievement

Interpreting Results in Relation to Standards
  • Some programs establish target criteria
    • Examples
      • If the program is effective, then 70% of portfolios evaluated will be judged “Good” or “Very good” in design
      • The average alumni rating of the program’s overall effectiveness will be at least 4.5 on a 5.0-point scale

Standards and Results: Four Basic Relationships
  • Four broad relationships are possible:
      • A standard was established that students met
      • A standard was established that students did not meet
      • No standard was established
      • The planned assessment was not conducted or not possible
  • Some drawbacks to establishing target criteria
      • Difficulties in picking the target number
      • Results exceeding the standard do not justify inaction
      • Results not meeting the standard do not represent failure

Reporting Results

Reporting Assessment Findings
  • Resources
    • An Assessment Workbook
      • Ball State University
    • Another Assessment Handbook
      • Skidmore College
  • An important general consideration:
    • Who is your audience?

Sample Report Formats
  • Skidmore College
  • Old Dominion University
  • Ohio University
  • George Mason University
  • Montana State University (History)
    • Other programs
  • Institutional Effectiveness Associates, Inc.

Local Examples of Assessment Reports
  • Academic Affairs Divisions
    • Division of Humanities and Fine Arts
    • Division of Natural Sciences
    • Division of Social Sciences
  • Student Life
  • Mission and Heritage

OK, but just HOW do I report…
  • Q: How to report….
    • Survey findings, Major Field Test data, Performance on Scoring Rubrics, etc.
  • A: Don’t Reinvent the Wheel
    • Consult local Assessment and Program Review reports for examples

Closing the Loop: The Key Step
  • To be meaningful, assessment results must be studied, interpreted, and used
  • Using the results is called “closing the loop”
  • We conduct outcomes assessment because the findings can be used to improve our programs

Closing the Loop
  • Where assessment and evaluation come together…
    • Assessment:
      • Gathering, analyzing, and interpreting information about student learning
    • Evaluation:
      • Using assessment findings to improve institutions, divisions, and departments
      • Upcraft and Schuh

Why Close the Loop?
  • To Inform Program Review
  • To Inform Planning and Budgeting
  • To Improve Teaching and Learning
  • To Promote Continuous Improvement (rather than ‘inspection at the end’)

Steps in Closing the Assessment Loop
  • Briefly report methodology for each outcome
  • Document where the students are meeting the intended outcome
  • Document where they are not meeting the outcome
  • Document decisions made to improve the program and assessment plan
  • Refine assessment method and repeat process after proper time for implementation

Ways to Close the Loop
  • Curricular design and sequencing
  • Restriction on navigation of the curriculum
  • Weaving more of “x” across the curriculum
  • Increasing opportunities to learn “x”

Additional Ways to Close the Loop
  • Strengthening advising
  • Co-designing curriculum and co-curriculum
  • Development of a new model of teaching and learning based on research or others’ practice
  • Development of learning modules or self-paced learning to address typical learning obstacles

And don’t forget…
  • A commonly reported use of results is to refine the assessment process itself
      • New or refined instruments
      • Improved methods of data collection (instructions, incentives, timing, setting, etc.)
      • Changes in participant sample
  • Re-assess to determine the efficacy of these changes in enhancing student learning.

A Cautionary Tale
  • Beware the Lake Wobegon Effect
    • …where all the children are above average…

A Cautionary Tale
  • When concluding that ‘no changes are necessary at this time’…

  • Standards may have been met but…
    • There may nonetheless be many students failing to meet expectations
      • How might they be helped to perform better?
    • There may nonetheless be ways to improve the program

Facilitating Use of Findings
  • Laying Appropriate Groundwork
    • Assessment infrastructure
    • Conducive policies
    • Linking assessment to other internal processes
      • (e.g., planning, budgeting, program review, etc.)
    • Establish an annual assessment calendar

Factors that Discourage Use of Findings
  • Failure to inform relevant individuals about purposes and scope of assessment projects
  • Raising concerns and obstacles over unimportant issues
  • Competing agendas and lack of sufficient resources

What You Can Do (Fulks, 2004)
  • Schedule time to record data directly after completing the assessment.
  • Prepare a simple table or chart to record results.
  • Think about the meaning of these data and write down your conclusions.
  • Take the opportunity to share your findings with other faculty in your area as well as with those in other areas.
  • Share the findings with students, if appropriate.
  • Report on the data and what you have learned at discipline and institutional meetings.

Group Practices that Enhance Use of Findings
  • Disciplinary groups’ interpretation of results
  • Cross-disciplinary groups’ interpretation of results (library and information resource professionals, student affairs professionals)
  • Integration of students, TAs, internship advisors or others who contribute to students’ learning

External Examples of Closing the Loop
  • University of Washington
  • Virginia Polytechnic University
  • St. Cloud State University
  • Montana State University (Chemistry)

Closing the Loop: Good News!
  • Many programs at SNC have used their results to make program improvements or to refine their assessment procedures

Local Examples of Closing the Loop
  • See HLC Focused Visit Progress Report Narrative on OIE Website
  • See Program Assessment Reports and Program Review Reports on OIE Website

Ask your colleagues in … about their efforts to close the loop
  • Music
  • Religious Studies
  • Chemistry
  • Geology
  • Business Administration
  • Economics
  • Teacher Education
  • Student Life
  • Mission and Heritage
  • Etc.

One Example of Closing the Loop
  • Psychology
    • Added capstone in light of curriculum audit
    • Piloting changes to course pedagogy to improve performance on General Education assessment
    • Established PsycNews in response to student concerns about career/graduate study preparation
    • Replaced pre-test Major Field Test administration with a lower-cost, reliable, and valid externally developed test

Conclusions

Conclusions
  • Programs are relatively free to choose which aspects of student learning they wish to assess
  • Assessing and reporting matter, but . . .
  • Taking action on the basis of good information about real questions is the best reason for doing assessment
Conclusions
  • The main thing…
    • …is to keep the main thing…
      • …the main thing!

Douglas Eder, SIU-E

Conclusions
  • It may be premature to discourage the use of any method
  • It may be premature to establish specific target criteria
  • It may be premature to require strict adherence to a particular reporting format
  • Remember that the sample reports discussed here are examples, not necessarily models

Additional Resources
  • Internet Resources for Higher Education Outcomes Assessment (at NC State)

Concluding Q & A: A One-Minute Paper
  • What remains most unclear or confusing to you about closing the loop at this point?
