
Effective and Efficient Assessment


Presentation Transcript


1. Effective and Efficient Assessment
Assessment Workshop, October 27, 2006

2. Workshop Objectives
• Discuss some external pressures to assess student learning.
• Understand some ways we can respond to these pressures through meaningful and effective assessment at the department and program levels.
• Recognize the resource limitations associated with assessing student learning effectively at the program and department levels.
• Discuss the four basic areas of emphasis in assessment.
• Understand some ways to record or document the process in the interest of continuous enhancement of student learning and teaching.
• Understand where to get assistance.

3. External Pressure to Assess
• Federal pressures to assess: the NCLB Act, the Spellings Commission reports, etc.
• State pressures will not dissipate, given budget constraints.
• Regional accreditors (for us, the Middle States Association) require it.
• Local constituencies want it: they want to know how well universities and colleges are "adding value" to a student's education.
• The public increasingly expects it.
• Even grant funders want it: assessment systems are submitted with grant applications, assessment results are included with grant reports, etc.

  4. Example of Regional Accreditor’s Review: Middle States Association Suggestion for Binghamton University

5. Speaking Different Languages: Assessment Communities in Practice
[Diagram: three communities, each with its own expectations, all feeding a program's assessment plan]
• Regulatory Community: Federal Guidelines, State Guidelines, North-Central Association Guidelines, Public Expectations and Norms
• University Community: Mission Statement, General Education Objectives, Constituent Expectations and Norms, Faculty Expectations, Dispositional Expectations
• Program Community: Professional Standards, Advisory Board Expectations, Professional Organization Norms and Expectations, Employer Expectations

6. "Every publicly supported social services agency now has an outcome-based agenda." -- Trudy Banta

7. Despite External Pressures, There Are Real Advantages to Program Assessment
• Impacts student learning when linked to specified objectives.
• Enables programs to answer external requests for information quickly and without undue effort.
• Empowers faculty, not bureaucracies (when assessment is faculty-led), to make decisions about curriculum, instruction, and learning.
• Empowers students to defend their choice of a major; also helps them know why the curriculum and teaching are pointed in certain directions.
• Empowers faculty to reflect on teaching and student learning.
• Enhances faculty's ability to publish and obtain grants.

8. Effective and Efficient Assessment Is a Faculty-Based Process
[Diagram: a faculty-centered cycle]
Defined Student Learning Objectives → Meaningful Measurement (Qualitative, Quantitative, Triangulated) → Focused Reflection/Discussion → Evidence-Based Program Enhancements (Curriculum, Learning, Teaching, etc.), with Faculty at the center of the cycle.

9. Four Focus Areas
• What are the learning objectives a program expects students to achieve upon graduation?
• What methods are used to assess those objectives, and are the results summarized/aggregated?
• What processes are in place, and what efforts have occurred, that have led to faculty discussions about what those assessments say about students mastering those objectives?
• How has all of this been used, reflected upon, and acted upon with respect to student learning and teaching? How has it impacted student learning, in reference to the original student learning objectives?

10. Student Learning Objectives
• Need to be unambiguous.
• Should be agreed upon by faculty.
• Often the most difficult part of the process (especially producing unambiguous student learning outcomes); however, once agreed upon, objectives lend themselves to meaningful assessment.
• Can often be based on national associations' definitions of student learning objectives, but this assumes consensus among local, program-level faculty.

11. Step One: Defining Learning Objectives
• Knowledge: what do we want students to know when they graduate? (e.g., content knowledge)
• Skills: the ability to perform specific tasks, think in certain ways, etc.; what should a graduating student be able to do?
• Competencies: the ability to perform specific tasks "in real time," or "authentically" [knowledge + skills → competencies]; also, what values, attitudes, and behaviors do we feel are important for graduates to have?

12. Exercise: Create a visual representation of what you would like your students to achieve, and list those achievements as student learning objectives for your program.

13. Example: Ballroom Dance Program
• Knowledge Objective: Students will know the steps of the "Cha Cha" and "Waltz" from memory.
• Skills Objective: Students will know how to dance the steps of the "Cha Cha" and "Waltz" to levels of mastery.
• Competency Objective: Students will demonstrate mastery of the "Cha Cha" and "Waltz" with dance partners through the use of various styles and techniques, according to their own choices of expression; audiences will enjoy students' performances; students will have an appreciation for dance in its various multicultural and functional forms.

  14. Sample Ballroom Dance (B.A.) Student Outcomes Grid
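The outcomes grid itself is not reproduced in this transcript. As a minimal illustrative sketch only, assuming a common curriculum-mapping layout (objectives as rows, courses as columns, marked I/R/A for introduced/reinforced/assessed) and hypothetical course numbers, a program could tabulate and print such a grid like this:

```python
# Illustrative only: a hypothetical outcomes grid for the ballroom dance
# example. Course numbers (DANC 101/201/401) and I/R/A markings are invented.
courses = ["DANC 101", "DANC 201", "DANC 401"]

grid = {
    "Knowledge: Cha Cha and Waltz steps":          {"DANC 101": "I", "DANC 201": "R", "DANC 401": "A"},
    "Skill: dance both styles to mastery":         {"DANC 201": "I", "DANC 401": "A"},
    "Competency: authentic partnered performance": {"DANC 401": "A"},
}

# Print the grid on one page: I = introduced, R = reinforced, A = assessed.
print(f"{'Student Learning Objective':<46}" + "".join(f"{c:<10}" for c in courses))
for objective, coverage in grid.items():
    print(f"{objective:<46}" + "".join(f"{coverage.get(c, '-'):<10}" for c in courses))
```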

15. Please complete the first exercise: think of the knowledge, skills, and competencies you and your faculty would expect students to acquire by the time they graduate.

16. Discussion
• Why is it difficult to proceed with an assessment system if objectives are not well defined?
• Why is it sometimes difficult to identify student learning objectives that are clear and "assessable"?

17. Step Two: Selecting Meaningful Assessments
• At least one should be a "direct assessment" of student learning, meaning that the assessment should involve observations of actual student performance.
• Indirect assessments (e.g., student opinions about a program's ability to deliver on the student learning objectives) can also be very helpful.
• Using a combination of these, two or three, might provide meaningful information; "triangulation" is increasingly being required by evaluators (a sketch of the idea follows this slide).
• Departments and programs often feel at first that they never directly assess student learning, only to discover after discussion that they have been doing so all along, just informally.
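To make the triangulation idea concrete, here is a minimal sketch, assuming a four-point rubric as the direct measure and a five-point senior survey as the indirect one; every score below is invented for illustration:

```python
# Triangulation sketch: compare a direct measure (faculty-scored rubric) with
# an indirect measure (student self-report) for the same learning objective.
direct_rubric = [3, 4, 2, 3, 4]         # 1-4 scale, scored from student work samples
indirect_survey = [4.5, 4.2, 3.1, 4.8]  # 1-5 scale, "I can do X" self-ratings

def mean(xs):
    return sum(xs) / len(xs)

# Normalize both measures to 0-1 so they can sit side by side on one page.
direct = mean(direct_rubric) / 4
indirect = mean(indirect_survey) / 5

print(f"Direct (rubric):   {direct:.2f}")
print(f"Indirect (survey): {indirect:.2f}")
if abs(direct - indirect) > 0.15:
    print("The measures disagree; worth a faculty discussion before acting.")
```

When the two measures agree, faculty can act with more confidence; when they diverge, that divergence is itself useful input for the Step Three discussion.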

18. "Not everything that can be counted counts, and not everything that counts can be counted." -- Albert Einstein

19. "Data is the plural of anecdote." -- Ronald Coase

20. Tips
• The process of selecting appropriate assessments usually depends on a number of factors: experience with the process, resource limitations, etc. It is most important to link assessments to a program's objectives.
• While the preference is to move toward a comprehensive assessment system, in reality the organization of an assessment system is very much developmental; it takes time!
• Many program and regional accreditation organizations expect 3-5 years of periodically collected assessment information.
• When beginning, start with assessments that balance the need to conserve resources with the need to maximize the meaning gleaned from the process.
• The most important question to ask at this point: "How will this information provide faculty with legitimate information that will affect learning, teaching, and curriculum?"
• Assessment should be periodic (on a regular schedule), not episodic (in fits and starts); this way, results can be revisited on a normal cycle.
• A "shameless plug": ask the Assistant Provost for Curriculum, Instruction, & Assessment for assistance!

21. Assessment Methods and Their Communities of Focus
[Diagram: an assessment plan draws on methods aimed at each of the three communities]
• Regulatory Community examples: State Licensure Exams (passage rates), General Education Assessments, Comparative Standardized Exams
• University Community examples: Senior Surveys, Portfolio Assessments, Juried Panels, Student Work Samples (rubric-evaluated), Expert Panels
• Program Community examples: Advisory Board Input/Feedback, GRE Subject Exam Scores, Employer Surveys, Intern Supervisor Surveys, Alumni Surveys, Professional Feedback/"Assessment Day"

22. Exercise: Selection of Appropriate Assessments
• Using the grid and the list of assessments provided in the workshop materials, list a couple of assessments you might want to consider using.
• Feel free to rely on one another for ideas.

23. Discussion
• What are some assessments that take less effort than others?
• What direct assessments appear to be most beneficial?

24. Step Three: When Will Faculty Periodically Discuss Information?
• Faculty (or an assessment committee) should be given the information in advance to consider.
• The focus of the meeting is to discuss what the assessments say about the student learning objectives and any recommendations that might stem from that discussion; it should not be a "rubber stamping" of findings or solely an opportunity to complain. Instead, the focus should be on recommendations.
• It is important to focus on communicating results to faculty and on tracking the ways the information is used.

25. Step Four: Ways to Track the Effect of Faculty Discussions/Recommendations
• Encourage inclusion in the annual report (in the section on teaching effectiveness).
• Encourage submissions of faculty narratives: how were faculty discussions and recommendations used in course design, selection of courses, etc.?
• Track how recommendations were enacted: through the curriculum process, department/program initiatives, course sequencing, equipment requests, etc.
• There is no need to make an academic study of this or turn it into too large a process, but documentation is helpful when writing annual reports on assessment, curriculum, and instruction; a lightweight sketch of such a log follows.
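None of this requires special software. As a hedged sketch (the field names and the sample entry are hypothetical, loosely modeled on the construction management example that follows), even a small structured log is enough to reconstruct the story at annual-report time:

```python
# A lightweight recommendation log for Step Four. Fields and the sample entry
# are hypothetical illustrations, not a prescribed format.
from dataclasses import dataclass

@dataclass
class Recommendation:
    date: str            # when faculty made the recommendation
    objective: str       # the student learning objective it addresses
    recommendation: str  # what faculty recommended
    action_taken: str = "pending"  # how (or whether) it was enacted

log = [
    Recommendation("2004-10", "Oral communication",
                   "Give students more opportunities for oral presentations",
                   "Presentations added to two required courses, Fall 2005"),
]

for r in log:
    print(f"{r.date} | {r.objective}: {r.recommendation} -> {r.action_taken}")
```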

26. Example: Construction Management
• The program needed to respond to its accrediting body's new requirement to assess student learning.
• After brainstorming, faculty realized they already assessed student opinions and received periodic evaluations of student performance at internship sites; they chose the "Delphi method" because they had specific student work samples they could look at, and there were too many standards to design a simple rubric.
• They used the services of the assessment and accreditation office to place the results on a grid…

27. Assessment Summary: Student Performance in the Construction Management Program According to ACCE Criteria
• Discussed at a faculty meeting, October 2004
• Recommendations reviewed by faculty, October 2005
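The summary grid itself is not reproduced in the transcript. As a sketch of how Delphi-style panel ratings might be rolled up onto a single page (the criteria echo the results on the next slide, but the ratings and the five-point scale are invented for illustration):

```python
# Roll up hypothetical Delphi panel ratings (1-5 per panelist) into a
# one-page summary per criterion, flagging anything below the midpoint.
panel_ratings = {
    "Oral communication":             [2, 3, 2],
    "Sense of ethics":                [5, 4, 5],
    "Application of computer skills": [4, 5, 4],
}

print(f"{'ACCE criterion':<34}{'Mean':>6}  Flag")
for criterion, ratings in panel_ratings.items():
    mean = sum(ratings) / len(ratings)
    flag = "discuss at faculty meeting" if mean < 3.0 else ""
    print(f"{criterion:<34}{mean:>6.2f}  {flag}")
```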

28. Results
• Faculty looked at the assessment results on one page.
• Faculty were able to evaluate the relative worth of the assessments.
• Faculty found specific strengths and weaknesses in student learning, in teaching, and in the curriculum: oral communication skills were weak, but sense of ethics and application of computer skills were good.
• The program was able to communicate that students were doing well with respect to ethics and computer skills (a great marketing point), and to raise the oral communication issue through its representative on the general education committee as well; faculty also discussed how to address that weakness, such as giving students more opportunities to deliver oral presentations as part of their coursework.

29. Future Directions and Vision
• We will ask for a report on the four questions at the end of each academic year.
• The focus is not on "reporting for reporting's sake," but on how we can assist.
• The focus will be on serving faculty: helping them accomplish the last two of the four core questions.
• The focus is primarily on impact: how has the process impacted learning, teaching, and curriculum?
• The provost's office is currently working on a central data warehouse for program review documents, an assessment web page as a resource, freshman and senior surveys, enhanced internship and alumni surveys that assess general education outcomes, etc.

30. Conclusion: Four Foci
• Objectives
• Assessments
• Faculty Usage
• Impact

31. "All assessment is a perpetual work in progress." -- Linda Suskie, Vice President, Middle States Association

  32. In assessment, "the perfect is the enemy of the good." Let's keep striving for the good. -- Tom Angelo

  33. Questions? Comments?
