Selecting Assessment Tools for Gathering Evidence of Learning Outcomes


Presentation Transcript


  1. Selecting Assessment Tools for Gathering Evidence of Learning Outcomes Lehman College Assessment Council February 24, 2010

  2. Timeline • Ongoing assessment • Spring 2011: • Middle States report due April 1 • Second completed assessment cycle of student learning goals • Analyze evidence • Report on how assessment results were used (May)

  3. Timeline: Spring 2010 • February 16: Curriculum Maps, Assessment Plans • April 16: Assessment Council workshop • May 31: Assessment Results (w/ supporting documents) • Ongoing: Evidence gathering, Meetings with ambassadors, Development opportunities, *** Syllabi ***

  4. What do we want our students to learn? • What knowledge, skills, abilities, and habits of mind do we expect graduates of our program to have?

  5. Assessment Toolbox: Assessment tools recommended by Suskie (2004) • Portfolios • Tests (blueprinted, i.e., mapped back onto objectives) • Focus Groups • Interviews • Assignment Scoring Guides/Rubrics • Surveys • See also: Suskie, Assessing Student Learning, 2nd ed., Ch. 2

  6. Direct vs. Indirect Evidence • Direct evidence of student learning is tangible, visible, self-explanatory evidence of exactly what students have and haven't learned. • Indirect evidence provides signs that students are probably learning, but evidence of exactly what they are learning is less clear and less convincing. • Indirect evidence (e.g., feedback and surveys) can be useful, but direct evidence gives the most concrete indication that students are learning what we intend them to learn.

  7. This Assessment Cycle: Direct Evidence • Embedded course assignments (written/oral) • Department-wide exams (blueprinted) • Standardized tests (blueprinted) • Capstone projects • Field experiences • Score gains, pre-test/post-test • Videotaped and audiotaped evaluations of one's own performance • Portfolio evaluation and faculty-designed examinations • Summaries and assessments of electronic class discussion threads • Student reflections on outcomes related to values, attitudes, and beliefs • See: Suskie (2009), Ch. 2, Table 2.1

  8. What Is a Scoring Guide or Rubric? • A list or chart describing the criteria used to evaluate or grade completed student assignments, such as presentations, papers, and performances • Includes guidelines for evaluating each of the criteria • Serves both as a means of evaluating student work and as a way of providing meaningful feedback to students

  9. Using Scoring Guides/Rubrics to Assess Program Goals • How can rubrics be used to assess program learning goals? • Embedded course assignments • Capstone experiences • Field experiences • Employer feedback • Student self-assessments • Peer evaluations • Portfolios

  10. Using a Scoring Guide/Rubric: Advantages • Clarify vague, fuzzy goal statements such as “Demonstrate effective writing skills” • Help students understand expectations • Help students self-improve (metacognition) • Make scoring easier and faster • Make scoring accurate, unbiased, and consistent • Reduce arguments with students • Help improve teaching and learning

  11. Developing a Scoring Guide/Rubric: Steps • Step I: Look for models • Step II: Define the traits or learning outcomes to assess (e.g., Structure, Content, Evidence, Presentation, Technical Accuracy) • Step III: Choose the scale/levels of performance (5-point or 3-point scale, letter grades, Excellent to Poor, etc.) • Step IV: Draw a table • Step V: Describe the characteristics of student work at each level, starting with the high end, then the low end, then the points in between • Step VI: Pilot test the rubric • Step VII: Discuss the results

  12. Developing a Scoring Guide/Rubric: List the Things You're Looking For • Why are we giving students this assignment? • What are its key learning goals? • What do we want students to learn by completing it? • What skills do we want students to demonstrate in this assignment? • What specific characteristics do we want to see in completed assignments?

  13. Using Scoring Guides/Rubrics: Rating Scale • Source: Assessing Student Learning: A Common Sense Guide by Linda Suskie

  14. Using a Scoring Guide/Rubric: Descriptive Rubric • Source: Assessing Student Learning: A Common Sense Guide by Linda Suskie

  15. How It All Fits Together: Course Embedded • EXAMPLE 1: Communication (any discipline) • Program Goal I: Students will be able to communicate effectively • Learning Objective IA: Express ideas in a clear and coherent manner in an oral presentation • Class: Speech 101 • Assignment: Students will make a persuasive argument (pro or con) on a current domestic or international issue (health care, the Afghan war, the financial crisis, etc.) • Assessment Technique: Using an oral presentation rubric, students will be evaluated on Organization, Content, and Style

  16. How It All Fits Together: Course Embedded Rubric • Source: Assessing Academic Programs in Higher Education by Mary J. Allen

  17. How It All Fits Together: Capstone • EXAMPLE 2: Research (any discipline) • Program Goal I: Students will understand how to conduct research • Program Goal II: Students will be able to demonstrate proficiency in writing mechanics • Program Goal III: Students will understand sociological concepts and theories • Objectives: Several objectives pertaining to each of the above goals will be assessed (see the program's learning goals and objectives) • Class: Sociology 495 • Assignment: Students will write a 20-page research paper on a topic in sociology. Students are expected to develop a thesis, gather and analyze information, synthesize this information to support their argument, and demonstrate proficiency in writing mechanics. • Assessment Technique: Using a research rubric developed and tested by department faculty, students' work will be evaluated on six criteria. The rubric is attached.

  18. How It All Fits Together: Capstone Rubric
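
The capstone rubric itself is attached to the original presentation and not reproduced in this transcript. As a purely illustrative aside, once papers have been scored against such a rubric, the criterion-by-criterion summary implied here can be tallied with a few lines of code. The sketch below assumes hypothetical criterion names, a 1-4 rating scale, and a benchmark of 3; none of these come from the department's actual rubric.

```python
# Hypothetical illustration: summarize rubric ratings per criterion and report
# how many papers meet an assumed benchmark. Criterion names, the 1-4 scale,
# and the benchmark are assumptions, not the rubric referenced in the slides.
from statistics import mean

CRITERIA = ["thesis", "evidence", "analysis", "synthesis", "organization", "mechanics"]
BENCHMARK = 3  # assumed "proficient" rating on a 1-4 scale

# Each dict holds one student's ratings on the six criteria (made-up data).
scores = [
    {"thesis": 4, "evidence": 3, "analysis": 3, "synthesis": 2, "organization": 4, "mechanics": 3},
    {"thesis": 3, "evidence": 2, "analysis": 3, "synthesis": 3, "organization": 3, "mechanics": 4},
    {"thesis": 2, "evidence": 2, "analysis": 2, "synthesis": 2, "organization": 3, "mechanics": 3},
]

for criterion in CRITERIA:
    ratings = [s[criterion] for s in scores]
    pct_at_benchmark = 100 * sum(r >= BENCHMARK for r in ratings) / len(ratings)
    print(f"{criterion:>12}: mean {mean(ratings):.1f}, "
          f"{pct_at_benchmark:.0f}% at or above benchmark")
```

A per-criterion summary like this (mean rating and percent of students at or above a benchmark) is one way to turn rubric scores into the evidence that the timeline slides ask programs to analyze and report.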

  19. Group Exercise Write a Rubric!

  20. Using Surveys as an Assessment: Potential Advantages • A measure of attitudes, dispositions, values, and habits of mind • A complement to other forms of assessment data • Triangulation of data from different perspectives on how well a goal or objective is met • An efficient way of gathering information from program completers or alumni in the workforce

  21. Example: Using Complementary Survey Data to Evaluate Program Outcomes • ECCE Undergraduate Program Goal: • Candidates must be able to plan instructional tasks and activities appropriate to the needs of students who are culturally diverse and those with exceptional learning needs in elementary schools. They must be able to teach the literacy skills of listening, speaking, reading, and writing to native English speakers and students who are English language learners at the childhood level, including methods of reading enrichment and remediation. (NYSDOE, ACEI) • Divisional surveys of education program completers conducted every semester showed that graduates overall do not feel they are well prepared to teach English Language Learners effectively.

  22. Example: Using Complementary Survey Data to Evaluate Program Outcomes • ECCE courses where the goal is addressed: ECE 300, ECE 301, ECE 431, ECE 432, ECE 433 • ECCE program assessment data: • Student portfolios (lesson plans demonstrating differentiated instruction for ELLs and effectiveness at having ELLs meet lesson objectives) • Student teaching evaluations (completed by students, teachers, and supervisors) • Online survey data at the focus school from graduates and “experts” (cooperating teachers, college supervisors), using a Likert rating scale • Recent changes in the program: • Explicit attention to working with ELLs during field placements attached to courses prior to final-semester student teaching • Program faculty are revisiting course content at program meetings (a return to curriculum mapping)
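
As a purely illustrative aside on mechanics: once Likert-scale responses such as the online survey data above have been collected (through SurveyMonkey or any other tool), tallying them takes only a few lines. In the sketch below, the survey item, response categories, and responses are invented for illustration and are not the division's data.

```python
# Hypothetical illustration: tally Likert-scale responses to one survey item.
# The item wording and the responses below are invented, not real survey data.
from collections import Counter

LIKERT = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]

# Made-up responses to "I feel well prepared to teach English language learners."
responses = ["Agree", "Neutral", "Disagree", "Agree", "Strongly agree",
             "Disagree", "Neutral", "Agree", "Disagree", "Strongly disagree"]

counts = Counter(responses)
total = len(responses)
for level in LIKERT:
    n = counts.get(level, 0)
    print(f"{level:<18}{n:>3}  ({100 * n / total:.0f}%)")

pct_positive = 100 * sum(counts.get(l, 0) for l in ("Agree", "Strongly agree")) / total
print(f"Agree or strongly agree: {pct_positive:.0f}%")
```

Reporting the share of positive responses alongside the direct evidence (portfolios, teaching performance evaluations) is exactly the kind of triangulation described on the next slide.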

  23. Example: Using Complementary Survey Data to Evaluate Program Outcomes • One goal, three assessments, triangulation: • Direct evidence: portfolio assessment • Direct evidence: teaching performance evaluations scored using rubrics • Indirect evidence: later survey of graduates and cooperating teachers regarding preparedness for real-world teaching

  24. Using Surveys as an Assessment: Potential Limitations • Self-report data from a survey may or may not accurately reflect student learning (an indirect measure of learning) • Responses might be influenced by participants' perceptions of what they believe the evaluator wants to hear (e.g., if the course instructor is conducting the survey) • If used as an “add-on” assessment, participation is voluntary and may require follow-up • Issues of sampling, validity, and reliability

  25. Assessment Council Membership • Salita Bryant (English) salita.bryant@lehman.cuny.edu • Nancy Dubetz (ECCE) nancy.dubetz@lehman.cuny.edu • *Robert Farrell (Lib) robert.farrell@lehman.cuny.edu • Judith Fields (Economics) judith.fields@lehman.cuny.edu • Marisol Jimenez (ISSP) marisol.jimenez@lehman.cuny.edu • Teresita Levy (LAPRS) teresita.levy@lehman.cuny.edu • Lynn Rosenberg (SLHS) lynn.rosenberg@lehman.cuny.edu • Robyn Spencer (History) robyn.spencer@lehman.cuny.edu • Minda Tessler (Psych) minda.tessler@lehman.cuny.edu • Janette Tilley (Mus) janette.tilley@lehman.cuny.edu • Esther Wilder (Soc) esther.wilder@lehman.cuny.edu *Committee Chair Administrative Advisor – Assessment Coordinator • Ray Galinski - raymond.galinski@lehman.cuny.edu

  26. References/Resources • Allen, M. J. (2004). Assessing academic programs in higher education. Bolton, MA: Anker Publishing. • Suskie, L. (2004). Assessing student learning: A common sense guide. Bolton, MA: Anker Publishing. • Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). San Francisco: Jossey-Bass. • SurveyMonkey: www.surveymonkey.com
