
Compiling and Reporting Your Assessment Plan Data


Presentation Transcript


  1. Compiling and Reporting Your Assessment Plan Data
  Office of Institutional Assessment & Effectiveness, SUNY Oneonta, Spring 2012

  2. Providing a Context: Oneonta’s Process for Assessing Student Learning
  • Fall 2009: Approval by President’s Cabinet of APAC Guidelines for Academic Program Assessment
  • Spring 2009: Submission by programs of Step 1 (Establishing Objectives) of guidelines
  • December 2010: Submission by programs of Step 2 (Activities & Strategies) of guidelines
  • June 1, 2011: Submission of Steps 3 (Assessment) and 4 (Closing the Loop) [plans only]
  • All plans approved by academic deans by the end of Fall 2011
  • 2011-12 academic year: First round of data collection, with reports due June 1, 2012 as part of annual reports

  3. Emphases Throughout the Process
  • Establishing congruence among institutional goals, programmatic and course objectives, learning opportunities, and assessments
  • Linkages to disciplinary (and, as appropriate, accreditation/certification) standards
  • Using a variety of assessment measures, both quantitative and qualitative, in search of convergence
  • Value of course-embedded assessment
  • Course- vs. program-level assessment

  4. Course- vs. Program-Level Assessment
  • Focus of SUNY Oneonta assessment planning is programmatic student learning objectives, not individual students or faculty
  • For the most part, data collection described in approved assessment plans will take place in the context of the classroom (i.e., course-embedded assessment)
  • Focus at this point, however, is how to compile and aggregate data across courses and course sections, as appropriate, and to report these results in a meaningful way

  5. What You’ve Done So Far
  • Development of programmatic student learning objectives
  • Curriculum mapping
  • Development of a plan for collecting data across a three-year period
    • APAC guidelines recommend that programs assess 1/3 of SLOs every year
  • Development of a plan for reviewing data and using that information to revise courses and programs, as appropriate (i.e., “closing the loop”)

  6. Follow Your Curriculum Map – It’s Your Assessment Database! Designed correctly, your curriculum map tells you what to assess, when to assess, and how to assess it!
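To make the “assessment database” idea concrete, a curriculum map can be kept as a simple table and queried when planning each year’s assessments. The following is a minimal Python sketch, assuming a hypothetical file curriculum_map.csv with course, slo, level, and measure columns; the file name and column names are illustrative only, not part of the APAC guidelines.

```python
# Minimal sketch: a curriculum map as tabular data, assuming hypothetical
# columns course, slo, level, measure. Adjust to your program's own map.
import csv
from collections import defaultdict

def load_map(path="curriculum_map.csv"):
    """Read the curriculum map into a list of row dictionaries."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def courses_covering(rows, slo):
    """Answer: which courses cover this SLO, at what level, using what measure?"""
    coverage = defaultdict(list)
    for row in rows:
        if row["slo"] == slo:
            coverage[row["course"]].append((row["level"], row["measure"]))
    return dict(coverage)

if __name__ == "__main__":
    rows = load_map()
    # e.g., plan this year's assessment of a hypothetical "SLO 2"
    for course, details in courses_covering(rows, "SLO 2").items():
        print(course, details)
```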

  7. Collecting Data: Important Points from Spring 2011 Workshop
  • Collaborative work among faculty to assure:
    • SLOs are being assessed using direct measures
    • Measures are of sufficient quality, either by a review process or through the use of a checklist that includes good-practice criteria
    • A priori success indicators are in place for each measure, as well as common categories for results (e.g., exceeding, meeting, approaching, not meeting standards)
    • Decisions have been made on how the issue of multiple course sections will be addressed
  • Process in place for assuring:
    • Reliable scoring of qualitative measures
    • Courses (and students) to be included in assessment are representative

  8. Value of Assessment Data Will Be Maximized IF:
  • A variety of assessment measures is used
    • Quantitative and qualitative
    • Course-embedded and “stand-alone” measures
    • Benchmarking as available
  • For each SLO, data are collected from multiple courses/course sections
  • Criteria for scoring and for determining what meets standards are consistent across courses
  • Do measures have to be the SAME? No, but…

  9. And Perhaps Most Important There’s no point in collecting useless data!

  10. Compiling and Reporting Assessment Data Across Courses
  Aggregating and Documenting Your Assessment Results

  11. Approaching the Task Systematically
  • Start with your curriculum map, which depicts:
    • The courses in which you’re covering – and assessing – SLOs
    • And, depending on the detail of your map, the levels at which SLOs are being addressed and the measures being used to assess them
  • Select courses – and faculty – to be included in the assessment, communicate that expectation clearly, and establish a time frame for administering assessments and submitting results

  12. Approaching the Task Systematically (cont.)
  • Collect and compile data (categorized as agreed upon by the department) on a spreadsheet for review
  • Review results collectively, reach conclusions regarding strengths and weaknesses, and make recommendations for change as appropriate
  • Be sure to document all of this (i.e., write it down)

  13. A Step-by-Step Example

  14. Following Your Curriculum Map
  The Simplest Case: Which Courses Cover Which SLOs? To What Extent Do Courses Cover SLOs?

  15. Following Your Curriculum Map
  At What Level Do Courses Cover SLOs? How Do Courses Assess SLOs?

  16. Selecting Courses to Be Included in Assessment
  • Do all courses covering a particular SLO need to be included?
    • Ideally, yes – remember, each SLO is only assessed every 3 years
    • Especially important if SLOs are being assessed at different “levels” (in terms of proficiency and/or course levels)
  • Do all course sections covering a particular SLO need to be included?
    • No – but why not include them?
    • If not, it is important that:
      • An adequate number of students is included in the assessment
      • Sections are selected randomly (see the sketch below)
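If a program opts not to include every section, the two safeguards above (adequate numbers and random selection) can be made explicit and reproducible. A minimal sketch, assuming a hypothetical list of section identifiers and a program-chosen sample size:

```python
# Sketch of random section selection; section IDs and sample size are hypothetical.
import random

sections = ["BIOL 180-01", "BIOL 180-02", "BIOL 180-03",
            "BIOL 280-01", "BIOL 280-02"]   # sections covering the SLO this semester

random.seed(2012)      # fixing the seed lets the program document how sections were chosen
n_needed = 3           # set by the program so enough students are included
sampled = random.sample(sections, k=n_needed)
print("Sections included in this year's assessment:", sampled)
```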

  17. Selecting Courses to Be Included in Assessment (cont.)
  • Once courses are selected, communicate this fact quickly and clearly to faculty, being sure to emphasize:
    • SLOs to be assessed and categories to be used in reporting results
    • Mapping of assignments – and eventually assessment results – to specific SLOs, and the need for a priori criteria in categorizing student performance
    • Timeline for submitting results (late enough in the semester for students to demonstrate competence, but early enough for data to be compiled)
  • Must assignments and a priori criteria be the same?
    • No, but the more comparable they are, the more meaningful the results
    • Ultimately, it’s a programmatic decision

  18. A Few More Hints About Assignments
  • Final grades are NEVER an adequate measure for a specific SLO
  • Overall assignment grades are rarely – if ever – an adequate measure for a specific SLO
  • Often only minor adjustments are needed in existing assignments to gather information on specific SLOs
    • Limit assignments to one SLO
    • Break assignments into subparts, each of which addresses a specific SLO
    • Identify test questions that map to specific SLOs (see the sketch below)
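As an illustration of the last two hints, the sketch below maps hypothetical exam items to SLOs, totals each student’s points per SLO, and applies a priori percentage cutoffs to assign one of the common reporting categories (exceeding, meeting, approaching, not meeting standards). The item numbers, point values, and cutoffs are assumptions for illustration, not Oneonta requirements.

```python
# Hypothetical mapping of exam items to SLOs, plus a priori cutoffs for categories.
ITEM_TO_SLO = {"Q1": "SLO 1", "Q2": "SLO 1", "Q3": "SLO 2", "Q4": "SLO 2"}
MAX_POINTS = {"Q1": 5, "Q2": 5, "Q3": 10, "Q4": 10}
CUTOFFS = [(0.90, "Exceeding"), (0.75, "Meeting"),
           (0.60, "Approaching"), (0.0, "Not meeting")]

def categorize(item_scores):
    """Turn one student's item scores into a performance category per SLO."""
    earned, possible = {}, {}
    for item, score in item_scores.items():
        slo = ITEM_TO_SLO[item]
        earned[slo] = earned.get(slo, 0) + score
        possible[slo] = possible.get(slo, 0) + MAX_POINTS[item]
    result = {}
    for slo in earned:
        pct = earned[slo] / possible[slo]
        # first cutoff the student meets, scanning from highest to lowest
        result[slo] = next(label for cutoff, label in CUTOFFS if pct >= cutoff)
    return result

# Example: one student's scores on the mapped items
print(categorize({"Q1": 5, "Q2": 3, "Q3": 9, "Q4": 6}))
# -> {'SLO 1': 'Meeting', 'SLO 2': 'Meeting'}
```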

  19. Compiling Data for Review and Discussion
  • Important for all results to be submitted in an agreed-upon format, organized, and maintained in a single place (an Excel spreadsheet is perfect)
  • Different ways to organize results for review and deliberation (see the sketch after this list)
    • Best to start with all data for each SLO, showing each course and its overall results for student performance, organized by performance categories
    • The more you can aggregate, the better (i.e., by course, course level, level of proficiency), being attentive to variations within aggregating categories
  • Importance of removing identifiers
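As one way to picture the compilation step, the sketch below aggregates de-identified, categorized results by SLO and course using pandas. It assumes a hypothetical spreadsheet assessment_results.csv in which each row is one student’s result with slo, course, and category columns; the file name, column names, and category labels are illustrative assumptions.

```python
# Sketch of aggregating de-identified results by SLO and course, assuming a
# hypothetical spreadsheet with columns: slo, course, category (no student identifiers).
import pandas as pd

results = pd.read_csv("assessment_results.csv")

# Counts of students in each performance category, per SLO and course
by_course = (results.groupby(["slo", "course", "category"])
                    .size()
                    .unstack("category", fill_value=0))

# Program-level aggregate per SLO, reported as percentages
by_slo = by_course.groupby("slo").sum()
by_slo_pct = (by_slo.div(by_slo.sum(axis=1), axis=0) * 100).round(1)

print(by_course)     # table for faculty review, by course
print(by_slo_pct)    # aggregate chart data for the Annual Report
```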

  20. Sample Spreadsheet (for Internal Review by Program Faculty)

  21. Reviewing and Discussing Results and Closing the Loop
  • Important points of discussion:
    • How are the data consistent?
      • Do students at different course levels perform similarly?
      • Eventually, it will be possible to look at this issue over time
    • How are they distinctive?
      • Do students perform better on some objectives than others?
    • Are assessment results “acceptable” to the program’s faculty?
    • What strengths (and weaknesses) are revealed, and what explains them?
    • What should (and will) the program do to improve areas of weakness?

  22. Incorporating Assessment Data into Annual Report At present, directions for Annual Report include the following: “Descriptions of student learning outcome assessments conducted for your students during the year. These descriptions should include a summary of the assessment methods that were used and the results, including benchmarking as appropriate. Also, whenever possible, detail course-specific student learning outcomes and their relationship(s) to departmental or programmatic expectations.”

  23. Incorporating Assessment Data into Annual Report (cont.)
  • APAC’s recommendation to the Provost and Deans is that Annual Reports include two distinct components with respect to the assessment of student learning, effective June 2012:
    • A chart that summarizes the assessments that were done during the academic year and the results
    • A narrative that summarizes the program’s discussion of the assessment results, overall conclusions, and planned changes based on the assessment data
  • Programs may also report other relevant data (e.g., results from senior/alumni surveys)

  24. Sample Outcomes Assessment Chart – Aggregate Data: Simplest Case (and All That’s Required)

  25. Sample Outcomes Assessment Charts – More Detailed: By Course Level

  26. Sample Outcomes Assessment Charts – More Detailed: By Competency Level

  27. Issues to Be Addressed in Narrative
  • Overall conclusions about student performance on SLOs as revealed by the assessment
  • Description of strengths and weaknesses revealed by the data
  • Planned revisions for improvement, as appropriate
  • Planned revisions to the assessment process itself, as appropriate

  28. APAC Members
  • Paul French
  • Josh Hammonds
  • Michael Koch
  • Julie Licata
  • Richard Lee
  • Patrice Macaluso
  • Anuradhaa Shastri
  • Bill Wilkerson (Chair)
  • Patty Francis (ex officio)
