
Methods for Assessing Student Learning Outcomes




Presentation Transcript


  1. Methods for Assessing Student Learning Outcomes Beth Wuest Interim Director, Academic Development and Assessment Lisa Garza Director, University Planning and Assessment February 15, 2006

  2. Workshop Goals To become: • more aware of the importance of methods of assessment in relation to student learning outcomes and program improvement • more knowledgeable about direct and indirect assessment methods • more competent at developing methods for assessing student learning outcomes • more knowledgeable about using and adapting assessment methods that are currently in practice • more adept at reviewing methods for assessing effectiveness and efficiency

  3. Overview • For evidence of success and program improvement • All programs are requested to have 5-8 learning outcomes with two assessment methods for each outcome by March 31, 2006 • An assessment report of these outcomes will be due toward the end of the 2006-2007 academic year

  4. Linkages to Other University Assessment • Academic Program Review • College and University strategic planning • Program and University accreditations

  5. Definitions • Outcomes • Desired results expressed in general terms • Methods • Tools or instruments used to gauge progress toward achieving outcomes • Measures • Intended performance targets expressed in specific terms

  6. Focus At present we are focusing only on outcomes and methods. Although measures should be considered when developing these, they will not be specifically addressed until the first assessment cycle (2006-2007).

  7. Student Learning Outcomes • Describe specific behaviors that a student of your program should demonstrate after completing the program • Focus on the intended abilities, knowledge, values, and attitudes of the student after completion of the program

  8. Key questions to consider • What is expected from a graduate of the program? • What is expected as the student progresses through the program? • What does the student know? (cognitive) • What can the student do? (psychomotor) • What does the student care about? (affective)

  9. Why are Student Learning Outcomes So Important? • basis for program improvement • instruction, course design, curricular design • communicate instructional intent • increase awareness of learning (for students) • common language • advising materials • promotional materials • support accreditation

  10. Methods of Assessing Learning Outcomes • should provide an objective means of supporting the outcomes, quality, efficiency or productivity of programs, operations, activities or services • should indicate how you will assess each of your outcomes • should indicate when you will assess each outcome • provide at least two ways to assess each outcome

  11. Categories of Assessment Methods • student learning • direct assessments evaluate the competence of students • exam scores, rated portfolios • indirect assessments evaluate the perceived learning • student perception, employer perception • program or unit processes • direct assessments evaluate actual performance • customer satisfaction, error rates, time, cost, efficiency, productivity • indirect assessments evaluate the perceived performance • perceived timeliness, perceived capability • curriculum • methods used to check alignment of curriculum with outcomes • curriculum mapping

  12. Examples of Direct Methods • Samples of individual student work • Pre-test and post-test evaluations • Standardized tests • Performance on licensure exams • Blind-scored essay tests • Internal or external juried review of student work • Case studies/problems • Capstone papers, projects, or presentations • Project- or course-embedded assessment • Documented observation and analysis of student behavior/performance • Externally reviewed internship or practicum • Collections of work (portfolios) of individual students • Activity logs • Performances • Interviews (including videotaped)

  13. Examples of Indirect Methods • Questionnaires and Surveys • Students • Graduating Seniors • Alumni • Employers • Syllabi and curriculum analysis • Transcript analysis

  14. Describing Assessment Methods • What are you going to use? • presentation, assignment, test, survey, observation, performance rating • Of and/or by whom? • student, mentor, focus group, alumni • Context (e.g., where or when)? • point-of-service, capstone, throughout the year, end of program • For what purpose? • desired learning outcome • example: Test the students at the end of the program for their level of knowledge in XYZ

  15. Creating Assessment Methods

  16. Creating Assessment Methods

  17. Locally Developed Surveys • institutional level (contact: Joe Meyer, Director, Institutional Research) • alumni survey • academic advising survey • image survey • student satisfaction survey • program or department level • advisory board surveys • employer surveys • customer surveys • program-specific surveys • graduating senior survey

  18. Curriculum or Course-based • performance-based • capstone courses • capstone projects • case studies • classroom assessment • course-embedded assignments • course-embedded exam questions • portfolios • reflective essays

  19. Types of Examinations or Tests • standardized exams • national test • state test • juried competitions • recitals • shows or exhibitions • locally developed exams • pre-post tests • course-embedded exam questions • comprehensive exam • qualifying exam

  20. Assessment Matrix Can be Useful to Link the “Where” with the “Outcomes”
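One way to build such a matrix is a grid with outcomes as rows and the courses or experiences where they are addressed as columns, marking where each outcome is introduced, reinforced, and assessed. A sketch (the course numbers are hypothetical; the outcomes echo the examples used later in this workshop):

```
Outcome                          | Course 101 | Course 305 | Capstone | Internship
---------------------------------+------------+------------+----------+-----------
1. Competence in abc principles  |     I      |     R      |    A     |
2. Research design skills        |            |    I, R    |    A     |
3. Career preparedness           |            |            |    A     |     A
(I = introduced, R = reinforced, A = assessed)
```

Reading across a row shows where evidence for an outcome can be collected; reading down a column shows which outcomes a given course is responsible for.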

  21. Hints on Selecting Methods • match the assessment method with the learning outcome • Outcome: Students completing the BS in xyz will demonstrate competence in abc principles comparable to graduates of similar national programs • Students will be tested with a locally developed exam administered at the end of the program • Students' scores on the xyz principles section of the xyz national examination, administered twice a year, will be examined • the assessment results should be usable • Outcome: Students completing the BS in xyz will demonstrate competence in conducting research • Useful: Seniors at the end of their capstone course develop a research design to address the research question posed in a case study; a rubric will be used to assess their ability to construct a research design • Not useful: Graduating seniors will complete a senior research project, and completion of the project will be recorded (completion alone reveals nothing about competence)

  22. Hints on Selecting Methods • results should be easily interpreted and unambiguous • data should not be difficult to collect or access • information should be directly controllable by the unit or program • identify multiple methods for assessing each outcome • direct and indirect methods • qualitative and quantitative • passive or active methods • within different courses • conducted by different groups • identify subcomponents where other methods may be used that allow deeper analysis

  23. Hints on Selecting Methods • use methods that can assess both the strengths and weaknesses of your program • capstone or senior projects are ideal for student learning outcomes assessment • when using surveys, target all stakeholders • build on existing data collection • accreditation criteria • program review

  24. Exercise

  25. Selecting the “Best” Assessment Methods • relationship to assessment — provides the information you need • reliability — yields consistent responses over time • validity — appropriate for what you want to measure • timeliness and cost — preparation, response, and analysis time; opportunity and tangible costs • motivation — provides value to students; respondents are motivated to participate • other • results easy to understand and interpret • changes in results can be attributed to changes in the program

  26. After Identifying the Potential List of Assessment Methods You Need to… • select the “best” ones • try to identify at least two methods for assessing each outcome • consider possible performance targets for the future • balance stretch targets against achievable targets • Examples of methods • survey (using the Graduating Senior Survey) the students at the end of the program as to their intention to continue their education in a graduate program (indirect method) • students will rate their likelihood of attending a graduate program on a survey (using the Graduating Senior Survey) that they will complete at the end of the program • xyz graduates’ admission rate to xyz graduate programs in the State of Texas will be reviewed

  27. After Identifying the Potential List of Assessment Methods You Need to… • develop assessment instruments • surveys • exams • assignments • scoring rubrics • portfolios • ideally you want them to be reliable, valid, and cheap • approaches • use external sources • seek help from internal sources (e.g., Academic Development and Assessment Office) • do it yourself • the instrument may need to be modified based on assessment results

  28. Challenges and Pitfalls • one size does not fit all — some methods work well for one program but not others • do not try to do the perfect assessment all at once — take a continuous improvement approach • allow for ongoing feedback • match the assessment method to the outcome and not vice-versa

  29. Example • Outcome 1: Graduates will be satisfied that their undergraduate degree has prepared them to succeed in their professional career • xyz graduates will be surveyed in the annual alumni survey on their preparedness to succeed in their career • 95% of the xyz graduates surveyed in the annual alumni survey report that the xyz program enabled them to be “very prepared” or “extremely prepared” to succeed in their career (next phase) • on-site internship supervisors each semester will rate interns from the xyz program on the skills necessary to succeed in the workplace • 90% of on-site internship supervisors each semester rate interns from the xyz program as having the skills necessary to succeed in their career (next phase)

  30. students in their capstone course will be administered a locally developed, standardized exam regarding career preparedness • 95% of students in their capstone course are able to successfully answer 90% of the questions regarding career preparedness on a locally developed, standardized exam (next phase) • scores of graduates who have taken the state licensure exam in xyz within one year of graduating from the xyz program will be evaluated • 85% of the graduates are able to pass the state licensure exam in xyz within one year of graduating from the xyz program (next phase)

  31. senior portfolios will be examined annually using a locally devised rubric to show evidence of preparedness for success in related professional careers on three key measures: communication, leadership, and ethics • 90% of senior portfolios examined annually using a locally devised rubric will show evidence of preparedness for success in related professional careers on three key measures: communication, leadership, and ethics (next phase) • students will be observed performing basic technical lab skills necessary for successful employment in a senior laboratory course • 90% of students will be able to perform basic technical lab skills necessary for successful employment in a senior laboratory course (next phase)

  32. Re-Cap of Process Step 1: Define program mission Step 2: Define program goals Step 3: Define student learning outcomes Step 4: Inventory existing and needed assessment methods Step 5: Identify assessment methods for each learning outcome

  33. Questions and Comments
