
Cycle of Assessment



Presentation Transcript


  1. Cycle of Assessment How to love assessment and learn more about your students….

  2. Dr. Sheri H. Barrett Director, Office of Outcomes Assessment Johnson County Community College Kay King Professor, Chair, Administration of Justice Johnson County Community College Introductions

  3. Is linked to decision making about the curriculum – Palomba & Banta • Measures real-life gaps in desired skills & performance – Swing et al. • Leads to reflection and action by faculty – Palomba & Banta What is Assessment?

  4. Assessment Maze When good intentions go wrong!

  5. Cycle of Assessment Getting on the cycle

  6. Good assessment starts with a good question! • Provides focus to the assessment design • Helps determine appropriate assessment instrument(s) • Questions to help faculty focus their assessment activities • What should students learn? • How well are they learning it? • What evidence do you have? • What are you doing with the evidence? That’s a really good question!

  7. Your research question should be: • Meaningful • Actionable • Relatable • Measurable • Manageable What makes a good research question?

  8. How many undergraduate students dropped out of our department in 2011? • What is wrong with this question? • How might it be improved in order to provide more useful information? • Which of the following variables predicted departmental student dropout in 2011: • GPA (grade point average) • Transfer to 4-year program • Financial Aid Concerns The Good, the Bad and the Ugly Research Questions

  9. What is the impact of test anxiety on test performance? • How might you revise this item if you wanted to determine whether study strategies are a mediating variable? • Do study strategies mediate the relationship between test anxiety and test performance among students in introductory biology classes? The Good, the Bad and the Ugly Research Questions

  10. Multiple stages to planning a good assessment • Define involvement of colleagues • Will you…? • Engage Full-time & Adjunct Faculty • Use a Pilot or full-scale roll-out • Benefits of a Pilot process • Who are your champions for the process? • It is important to identify someone in the department/division/program who will champion your project Planning makes perfect!

  11. Multiple stages to planning a good assessment • Choose an appropriate instrument to answer the question • Determine Dates/Timelines for collection • Set your Benchmarks Planning makes perfect!

  12. Embedded Assessment • Pre/Post Tests • Portfolios • Surveys • Performances • Capstone Experiences • Commercial Tests Assessment Toolbox

  13. Part II - Cycle of Assessment How to love assessment and learn more about your students….

  14. The goal of assessment is to gather data! • Statistical rigor is of primary importance when gathering assessment data. • The more data the better. • Once assessment data is analyzed, it will clearly show trends or relationships. • It will be clear what actions to take in response to the data. Myths of Assessment Data

  15. How will you organize the data? • Create a method to organize and store data • Be consistent with formatting • Keep a clean sheet of your original data (always work in a copy – trust me!) • Start a new sheet if you are making a chart or graph Collecting and Scoring Data
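
The "clean sheet" advice above translates directly into code. Below is a minimal sketch, assuming a hypothetical scores.csv file with a score column; pandas is an assumption here, since the slides name no particular tool:

```python
# Hedged sketch of "keep a clean sheet, always work in a copy".
import pandas as pd

original = pd.read_csv("scores.csv")               # untouched source data
original.to_csv("scores_backup.csv", index=False)  # keep a clean copy on disk

working = original.copy()                          # all edits happen on the copy
working.columns = [c.strip().lower() for c in working.columns]        # consistent formatting
working["score"] = pd.to_numeric(working["score"], errors="coerce")   # bad entries become NaN
```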

  16. Basic Tips and Tricks • Organizing your data • Frequently Used Functions • Average • Median • Mode • Frequency • Sorting But I don’t want to use Excel…
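
For anyone who would rather skip Excel, the same frequently used functions live in the Python standard library. A short sketch over hypothetical rubric scores (no real student data):

```python
import statistics
from collections import Counter

scores = [3, 4, 2, 4, 3, 3, 1, 4, 2, 3]         # hypothetical rubric scores, 1-4 scale

print("average:  ", statistics.mean(scores))     # 2.9
print("median:   ", statistics.median(scores))   # 3
print("mode:     ", statistics.mode(scores))     # 3
print("frequency:", Counter(scores))             # counts per score value
print("sorted:   ", sorted(scores))              # low to high
```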

  17. What is your assessment data saying to you? • What will you want to look for? • Patterns • Groups/clusters • Cut points • Changes • Spread (frequencies) Analyzing the Data
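
A quick text histogram is one way to see spread, clusters, and a cut point without building a chart. A sketch, again with hypothetical scores and an assumed benchmark of 3:

```python
from collections import Counter

scores = [3, 4, 2, 4, 3, 3, 1, 4, 2, 3]      # hypothetical rubric scores
benchmark = 3                                 # assumed cut point

freq = Counter(scores)
for value in sorted(freq):                    # spread: how the scores distribute
    print(value, "#" * freq[value])

share = sum(1 for s in scores if s >= benchmark) / len(scores)
print(f"{share:.0%} of students at or above the benchmark")   # 70%
```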

  18. Discuss Data with your Colleagues • Summarize Data and Process used to collect it • Record observations about the data, e.g. gaps, successes, relationships, anomalies, etc. • What do you know about the data? • What do you still want to know about the data? • How can the data be used for instructional decisions? Analyzing the Data

  19. The data show students are not understanding a concept • What is the benchmark of performance? • Curriculum mapping – where does the concept occur? • How is the concept taught (pedagogy)? • Where is the concept reinforced (scaffolding)? • What changes can we (faculty/department) make to the curriculum to help students understand and apply the concept? • How will we measure this curricular change to see if it is successful? Follow the data trail

  20. Case Study – Visual Communications • General Education Curriculum Assessment Cycle • Visual Communications • Mass Exodus of Visual Communications Classes • Faculty couldn’t come up with acceptable assignments to show how they were assessing visual communications • Focused discussion with department chairs/faculty groups on the issue • Faculty felt unqualified to “teach” visual communications • Professional Development, in-service workshops, teaching circles, etc. Follow the data trail

  21. Next Assessment Cycle – what difference did the change make? Was there a difference in performance? • Make sure appropriate time has elapsed for changes to be in effect • Make sure the measurement is parallel to the previous assessment • Report your results! Closing the loop

  22. When are we done with a learning outcome? • Did you see improvement? • Did you meet your benchmark performance? • Are you satisfied? • Do you see a greater need/question that needs to be asked? Are we done yet?

  23. Dr. Sheri H. Barrett sbarre13@jccc.edu Kay King kking05@jccc.edu Questions? Contact us
