
Classroom Assessment Tools: Next Steps

Strengthening Student Success, October 9, 2009. Joan Sholars, SLO Coordinator, Mt. San Antonio College. Assessment is the process of documenting, usually in measurable terms, knowledge, skills, attitudes and beliefs.


Presentation Transcript


  1. Classroom Assessment Tools: Next Steps Strengthening Student Success October 9, 2009 Joan Sholars, SLO Coordinator Mt. San Antonio College

  2. Assessment • Assessment is the process of documenting, usually in measurable terms, knowledge, skills, attitudes and beliefs. • To gather evidence of student learning, a variety of assessment methods (direct and indirect, qualitative and quantitative) should be used to give the program adequate feedback for identifying strengths and weaknesses (Maki, 2004).

  3. In CTE programs, letter grades and grade point averages characterize typical educational assessment but do little to persuade employers to hire students. • In CTE programs, a major outcome for students is that they will be hired in the field after they complete their studies.

  4. Capacity Matrix An Assessment Tool

  5. Mt. SAC’s Computer Graphics program has been using the “Capacity Matrix.” • The Capacity Matrix … focuses student performance on the quality of learning and understanding, not on testing.

  6. Capacity Matrix • The Capacity Matrix is a Quality Learning Tool designed and developed by David Langford and Dr. Myron Tribus. • It is a charting technique that breaks down Aims/Outcomes into specific competencies/capacities to be developed and shows the different levels of learning (knowledge) achieved.
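As a rough illustration of the charting idea described above (not the actual Langford/Tribus tool), a capacity matrix can be modeled as a table of competencies against levels of learning. All course, competency, and level names below are hypothetical examples:

```python
# Minimal sketch of a capacity matrix: an outcome is broken down into
# competencies, and each competency records the level of learning reached.
# Names and levels here are illustrative, not Mt. SAC's actual matrix.

LEVELS = ["Information", "Knowledge", "Know-how", "Wisdom"]  # illustrative levels

class CapacityMatrix:
    def __init__(self, outcome, competencies):
        self.outcome = outcome
        # Each competency starts with no level achieved yet.
        self.progress = {c: None for c in competencies}

    def record(self, competency, level):
        """Record the highest level of learning demonstrated so far."""
        if level not in LEVELS:
            raise ValueError(f"unknown level: {level}")
        self.progress[competency] = level

    def summary(self):
        """Map each competency to the level reached (None if not yet started)."""
        return dict(self.progress)

matrix = CapacityMatrix(
    outcome="Produce a short animated sequence",
    competencies=["Storyboarding", "Keyframing", "Rendering"],
)
matrix.record("Keyframing", "Know-how")
```

The point of the structure is visible even in this sketch: progress is tracked per competency and per level, rather than collapsed into a single grade.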

  7. Without the Learning Tool • Students attend lectures and demonstrations of techniques in classes. They are also given reading and writing assignments and skill-based instruction, preparing “projects” that reflect specific skills and practice. • Assessing student knowledge and competencies in this process is very subjective, and most students are encouraged to “paint by the numbers” or mimic assumed good examples.

  8. Without the Learning Tool • Students are not focused on why they chose a certain problem-solving strategy; they are focused instead on the overall grade. • The portfolios students assemble for job interviews are often collections of “A” class projects: static records of level/course achievement that do not actually indicate the student’s levels of learning and understanding.

  9. With the Learning Assessment Tool • At the course level, students evaluate the quality of their work and assess their progress toward mastery of knowledge and skills. • The matrix focuses students on the quality of their respective competencies, capacities, and abilities, not on grades; these in turn can then be compared with the requirements of prospective industry careers.

  10. With the Learning Assessment Tool • Student performance is driven by the quality of their overall learning and understanding. They work on projects to further and improve their capacities in many different areas. They control the effectiveness of the learning system. • Projects now focus on a student’s ability to put ideas together in ways that require original, creative thinking, and to study situations that weigh the consequences of applying knowledge or know-how.

  11. Assessment Methods • There are basically two types of assessment methods: • Direct Assessment • Indirect Assessment • Direct methods of assessment require students to produce work to see how well they meet the expectations. • Indirect methods of assessment provide opportunities for students to reflect on their learning experiences.

  12. Clicker Question • SLOs are broad statements about what students will think, know, feel or be able to do as a result of an educational experience. • Indirect assessments, such as surveys or focus groups, are a good way to gauge what students “feel” as a result of an educational experience. • Press 1 for “Yes” • Press 2 for “No”

  13. Example • In HSAnim1 (High School Video Editing 1) • SLO: • Students will be able to demonstrate video editing skills. • Assessment Method: • Students will develop and edit a video animation according to a faculty-developed rubric. • Criteria: • 75% of students will score at least 3.5 out of 5 on the faculty-developed rubric.

  14. Example • ANIM 108 (Principles of Animation) • SLO • Students will be able to render fluid “squash and stretch” movements for a bouncing ball. • Assessment • A question embedded in each semester’s final exam will require that the student create key poses for a ball that will “squash and stretch”. • Criteria • 80% of students taking the final exam will score at least 80% of the allowable points for the question, evaluated by the instructor, demonstrating a clear understanding of “squash and stretch” technique. A simple rubric will be used for assessment.
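The criterion statements in both examples reduce to the same computation: the share of students scoring at or above a rubric threshold, compared against a target rate. A minimal sketch of that check, with invented scores for illustration:

```python
def criterion_met(scores, threshold, target_rate):
    """Return (pass_rate, met) for a criterion of the form
    'target_rate of students will score at least threshold'."""
    passing = sum(1 for s in scores if s >= threshold)
    rate = passing / len(scores)
    return rate, rate >= target_rate

# Invented rubric scores (out of 5) for illustration only.
scores = [4.0, 3.5, 2.5, 5.0, 4.5, 3.0, 4.0, 3.5]
rate, met = criterion_met(scores, threshold=3.5, target_rate=0.75)
```

Here 6 of the 8 invented scores are at least 3.5, so the pass rate is exactly 0.75 and the 75% criterion is met. The same function applies to the ANIM 108 criterion with `threshold` and `target_rate` both set to 0.80 over percentage scores.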

  15. Embedded Assignments • Embedded assignments, like the previous example, are integrated into specific courses. • These assessments are typically graded by course instructors, and then pooled across sections to evaluate the course-level or program-level SLOs. • Students are motivated to show what they have learned since embedded assignments are tied to the grading structure of the course.

  16. Clicker Question • Embedded assignments are easy for faculty to develop and assess. • Rate your response to this question from 1- 5 with “1” being strongly disagree and “5” being strongly agree.

  17. Add-on Assignments • Add-on assignments are additional tasks that go beyond course requirements and are usually done outside of class. • An example of this is a standardized test. • These are not typically part of the course’s grading structure. • Because of this, students could be less motivated to perform well on these.

  18. Example • AIRC 20 (Refrigeration Fundamentals) • SLO • AIRC 20 course completers will properly handle refrigerants. • Assessment • EPA Exam • Criteria • 90% of AIRC 20 course completers will achieve an average score of 80%, with no subcategory lower than 72%, on the Section 608 EPA refrigerant handling exam.

  19. Completing the Cycle • AIRC 25 (Electrical Fundamentals for Air Conditioning and Refrigeration) • SLO • AIRC 25 course completers will understand the electrical sequence of operation for a 5-ton air conditioning system. • Assessment • 80% of course completers will develop a functional electrical schematic for a residential cooling/heating system. • 75% of the successful students will correctly wire the system on the first attempt. The system will include all the components of a standard residential cooling/heating system.

  20. Summary of Data • 94 of 121 students (78%) produced a functional schematic that followed the sequence of operation for a 48KLA. Of the 94 successful students, 90 (96%) wired the system properly on the first attempt, without notes. • Use of Results • Three hours of direct instruction were added on the sequence of operation of the secondary, low-voltage control system for the standard 60,000 BTU straight cooling system. The AIRC department will include 2 more hours of compressor instruction.
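The percentages on this slide follow directly from the raw counts, which is worth checking when closing the loop from data to criteria. A quick verification, rounding to whole percent:

```python
# Reproduce the slide's reported rates from the raw counts.
total = 121          # students assessed
schematic_ok = 94    # produced a functional schematic
wired_ok = 90        # of those, wired correctly on the first attempt

schematic_rate = round(100 * schematic_ok / total)       # 94/121 -> 78%
wiring_rate = round(100 * wired_ok / schematic_ok)       # 90/94  -> 96%
```

Measured against the criteria on the previous slide, the 78% schematic rate falls just short of the 80% target, while the 96% first-attempt wiring rate exceeds the 75% target, which is consistent with the department adding instruction time in response.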

  21. FASH 31 (Fashion Design and Product Development 2) • SLO • Students finishing FASH 31 will be able to produce and develop a collection of garments suited to the apparel industry. • Assessment • 70% of FASH 31 students will score a minimum of 10 out of 15 points on this project, using a faculty-developed rubric.

  22. Summary of Data • 100% of the students were rated at 2 or above. • Use of Results • The average rating is not acceptable for vocational students nearing the end of their technical studies. The projects in FASH 31 need to be re-evaluated, as do the core feeder classes. Overall performance levels need to be raised. • Resources Needed • Faculty training in portfolio development needs to be completed.

  23. Clicker Question • What is the hardest part of assessing SLOs? • Developing the assessment tool. • Evaluating the assessment. • Norming faculty evaluators. • Summarizing the data. • Use of results.

  24. Examples • In ANAT 10A – Introduction to Human Anatomy – faculty use “clicker” technology to aid in assessing and evaluating their SLOs. • Clicker classroom data on the accuracy of metric problem solving will be collected and stored automatically by technology embedded in the clicker system. • These data will tell instructors how many students understand various metric problems, how many guessed on the clicker questions, and how many felt they learned from the additional instruction that followed the clicker questions.

  25. Examples • In ARTS 30A – Ceramics – faculty administer a survey at the end of each semester to assess and evaluate their SLOs. • In PE 48 – Lifeguard Training – faculty observe students making the proper entry and approach to a drowning victim. • In CHEM 10 – Chemistry for Allied Health Majors – a math survey that assesses algebra skills was developed and given at different times in the semester. • Initial test – 38% correct • After 4 weeks – 66% correct • After 16 weeks – 58% correct

  26. In CHEM 10, the summary of the data included the comment that “students entering the course do not have adequate math skills. At the end of the semester, students’ math skills had improved, but still do not meet the criterion.” • The Chemistry faculty are exploring the possibility of a math entrance exam for this course and of a software/tutoring program for students who need math remediation.

  27. How do we get all faculty involved in the process?
