Assessment & Technology

Presentation Transcript


  1. Assessment & Technology UH-M COLLEGE OF EDUCATION COE Outreach & Technology

  2. Electronic Exhibit Room • Systematic assessment of evidence of student learning at multiple points in the program • Program assessment data compiled on an internal website • NCATE reviewers access the data at their leisure • Data remain available between reviews • Easy for program faculty to maintain

  3. Program Assessment: Assessing Programs by Assessing Candidates • Assessed by: • Candidates (exit surveys, course evaluations) • Alumni (surveys, focus groups) • Employers • Mentor Teachers • COPR Process • Learned Societies • Candidate Learning Outcomes Review

  4. Candidate Assessment • Evidence Collection = Portfolio • Grade Reports • Exam Scores • Faculty Observation Summaries • Students’ Work Samples • Candidate Portfolio Tools • PowerPoint (hyperlinks, external files, branching) • TaskStream (online) • CD-R

  5. Assessing Learning Outcomes • Define Program Objectives • Define Points of Measurement • Define Evidence for Objectives • Define Rubric for Assessing Evidence • Delineate Who Evaluates and When
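To make these five steps concrete, here is a minimal sketch, not taken from the deck, of how a program might encode its objectives, measurement points, evidence, and rubric as simple data structures. All names (Objective, Review, RUBRIC, MEASUREMENT_POINTS) are illustrative assumptions, not anything defined by the College.

```python
from dataclasses import dataclass, field

# Step 4: rubric scale from part 4 of the example (slide 11).
RUBRIC = {3: "Target", 2: "Acceptable", 1: "Unacceptable"}

# Step 2: measurement points from part 2 of the example (slide 8).
MEASUREMENT_POINTS = ["beginning", "middle", "end"]

@dataclass
class Objective:               # Step 1: a program-defined objective
    number: int
    knowledge: str             # "Candidates know ..."
    skill: str                 # "Candidates are able to ..."
    disposition: str           # "Candidates exhibit ..."
    evidence: list[str] = field(default_factory=list)  # Step 3: artifact types

@dataclass
class Review:                  # Step 5: who evaluates, and when
    objective: Objective
    point: str                 # one of MEASUREMENT_POINTS
    reviewer: str              # e.g., an assigned faculty member
    score: int                 # a RUBRIC key (1, 2, or 3)
```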

  6. Example (part 1: Program defines Objectives) • ABC Program • Objective 1: (Knowledge) Candidates know … (Skill) Candidates are able to … (Disposition) Candidates exhibit … • Objective 2: • Objective 3: • Objective 4:

  7. Secondary Program Objectives

  8. Example (part 2: Program Chooses Points of Measurement) Program will Assess Candidates • Beginning (defined: immediately upon admission) • Middle (defined: conclusion of EDUC XXXX course and/or prior to student teaching) • End (defined: conclusion of field experience XXXX)

  9. Example (part 3: Program assigns evidence to objectives) • Objective 1: Professional Legal Responsibilities. The teacher candidate demonstrates an understanding of (knowledge), the ability to apply (skill), and a commitment to model (disposition) the legal responsibilities expected of professional educators.

  10. Sample artifacts

  11. Example (part 4: Program defines rubric scale for evidence) • 3 Target: Evidence reflects in-depth knowledge and understanding of the standard; outstanding data and evidence of application • 2 Acceptable: Evidence indicates knowledge and understanding of the standard; satisfactory data and evidence of application • 1 Unacceptable: Evidence shows little or inadequate knowledge of the standard; limited data and evidence of application

  12. Example (part 5: Program states who will measure and when) Candidate Outcomes Review • Faculty assigned to review candidate outcomes • Review Committee determines program completion for each candidate • Candidate outcomes aggregated • Summary data on the cohort provided to the Associate Dean

  13. Example (part 6: Composite candidate scores defined, measured) • Summarize each candidate’s assessment • Mid-Point Assessment: • e.g. Overall Unacceptable: 1 or more unacceptable scores • e.g. Overall Acceptable: 0 unacceptables, fewer than 5 superior scores • e.g. Overall Superior: 0 unacceptables, 5 or more superior scores
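As a rough sketch, and assuming a “superior” corresponds to the top rubric score of 3, the composite rule above and the cohort aggregation from part 5 might look like the following. The function names and candidate data are hypothetical, not from the deck.

```python
from collections import Counter

def composite(scores: list[int]) -> str:
    """Composite rating from per-objective rubric scores (3/2/1):
    any 1 -> Unacceptable; otherwise 5 or more 3s -> Superior;
    else Acceptable (the rule stated in part 6)."""
    if min(scores) == 1:
        return "Unacceptable"
    if scores.count(3) >= 5:
        return "Superior"
    return "Acceptable"

def cohort_summary(cohort: dict[str, list[int]]) -> Counter:
    """Aggregate composite ratings across a cohort, as in part 5's
    summary data on the cohort for the Associate Dean."""
    return Counter(composite(scores) for scores in cohort.values())

# Illustrative usage with hypothetical candidates:
cohort = {
    "candidate_a": [3, 3, 3, 3, 3, 2],  # five 3s, no 1s -> Superior
    "candidate_b": [3, 2, 2, 2, 2, 2],  # no 1s, fewer than five 3s -> Acceptable
    "candidate_c": [3, 3, 1, 2, 2, 2],  # one 1 -> Unacceptable
}
print(cohort_summary(cohort))
# Counter({'Superior': 1, 'Acceptable': 1, 'Unacceptable': 1})
```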

  14. Example (part 7: Summary data submitted)

  15. NCATE Review Website (mock-up)

  16. Measurements: 5-year period

  17. Use of Technology • Candidates use it to collect and present evidence • Programs use it to assess candidate learning • Programs use aggregated data for program review • The College uses it to maintain data over time for accreditation purposes

  18. Challenges • Requires a shift in thinking: from grades to authentic assessment of learning outcomes • Program objectives must be made explicit • Faculty must agree on rubrics and scales • Ways to manage the process must be identified • Technology must be helpful, not burdensome
