
Assessment Decisions in ABC: Types, Instruments, and Considerations

This chapter discusses the components of assessment decisions made by teachers, including identification, placement, and instruction. It covers the types of assessment instruments, measurement and evaluation, norm-referenced and criterion-referenced assessment, validity and reliability, administrative feasibility, and general guidelines for assessment. The chapter also explores factors of assessment, basic rules, and what to do with assessment data. It concludes with information on creating assessment instruments, authentic assessment, and using rubrics for evaluation.



  1. Chapter 7 Assessment

  2. 5 COMPONENTS of ABC

  3. Three Types of Assessment Decisions Teachers Make • Identification • Placement • Instruction

  4. Definitions • Assessment: the process teachers use to make decisions • Instruments: procedures teachers use to collect information • Measurement: the type of information collected by the instrument • Evaluation: comparing pre-assessment data to post-assessment data

  5. Matching Assessment Instrument to Assessment Decision • What do you want to measure? • What criteria are important to measure the level of proficiency of the skill? • Where do you look to determine what you will measure?

  6. Norm-Referenced (NRI) • Standardized • Ensures that all students being tested are tested under exactly the same conditions • Specifies all details of the testing situation • Almost exclusively measures “product” • PACER test, ACT, BEST, etc. • Student scores are compared to a set of standards provided with the instrument • Objective • Little or no training needed

  7. Criterion-Referenced (CRI) • Judged on performance criteria • Standardized or non-standardized • Designed to collect process measures • Teacher-designed; more adaptable and flexible; the teacher makes more decisions about how to conduct and score the test

  8. What Can the Student Do or Not Do? • A more accurate measure of what the student knows • Process • Training needed to record accurately • The test administrator must have knowledge of the skill • How, and how well, did the student perform the skill?

  9. Reference Standards • Data collected are interpreted according to standardized norms • Usually by gender and age • NRI scores are usually numerical • CRI scores usually identify criteria for performance

  10. Valid • Face validity • What can confound validity in physical education? • Complexity of the measure • Prerequisite skills • Complexity of instructions • Reliable • Intra-rater; inter-rater
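
The slide leaves the method open; as a minimal sketch, one common starting point for checking inter-rater reliability is percent agreement between two raters scoring the same students. The rater data below are hypothetical, invented for illustration.

    # Percent agreement between two raters scoring the same students
    # on the same 4-point rubric (hypothetical data).
    def percent_agreement(rater_a, rater_b):
        """Proportion of students both raters scored identically."""
        if len(rater_a) != len(rater_b):
            raise ValueError("Raters must score the same students")
        matches = sum(a == b for a, b in zip(rater_a, rater_b))
        return matches / len(rater_a)

    rater_a = [4, 3, 3, 2, 1, 4, 2]  # hypothetical rubric scores
    rater_b = [4, 3, 2, 2, 1, 4, 3]
    print(f"Inter-rater agreement: {percent_agreement(rater_a, rater_b):.0%}")  # 71%

Intra-rater reliability can be checked the same way, by having one rater score a recorded performance on two occasions.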

  11. Objective vs. Subjective • The more objective the measure, the easier it is to achieve reliability • Objective measures? • Subjective?

  12. Administrative Feasibility • Cost • Time to administer • Setting needed to administer • Training and Preparation of administrator

  13. General Guidelines for Assessment • Choose (or develop) the right instrument • Know how! • Set up in advance • Optimal view • Organization of the class • Quick and efficient checklist, recording procedure, etc.

  14. “ACE” Factors of Assessment • To make sure the data you collected actually represent the content the student has learned or the skill acquired, what must you make sure your students are doing?

  15. ACE • Attention • Comprehension • Effort • How do you know the students paid attention to the correct cues, understood the request, and performed to the best of their ability? • Are you measuring ability, or motivation to perform?

  16. Basic Rules • Make sure the students know • Be aware of audience effects • Don’t teach during assessment • General feedback only • If in doubt, don’t give credit • Make sure the testing situation doesn’t confound results • Be thoroughly familiar with the components of the skill; memorize the skill and the test procedure

  17. What Do We Do with the Data? • Decide the focus of the next lesson, unit, etc. • Decide how to group students • Formation and organization for instruction • Determine if students are progressing • Is your teaching effective? • Are the methods you chose appropriate for the content?

  18. Creating Assessment Instruments • How many assessments should an ABC K-12 curriculum have? • One per objective • An instrument may assess more than one objective • Your project only requires one per goal area (choose one particular objective); it can assess multiple skills in that one assessment instrument • Make sure it is clear which objective the assessment relates to! • When should the assessments be created/chosen? • During curriculum planning

  19. Authentic Assessment • Observing a student playing a game of tennis in a natural setting • Reduces audience effects • Actually measures the student’s ability to “play a functional game of tennis”

  20. Rubrics • Visible signs of agreed-upon items • Must be built for a specific evaluation • Can be flexible; may need to change • Students should be provided with the rubrics

  21. Authentic Tennis Forehand Assessment Rubric: placed on a checklist with multiple columns for forehand strokes, checked while watching the student actually play a game.
  4 – Hips perpendicular to net; level swing; ball contacted waist high just behind front foot; weight transferred from back to front at contact and on the follow-through
  3 – Hips slanted to net; swing exaggerated high-to-low or low-to-high; ball contact in front of front foot or behind back foot; weight-transfer step after contact; shortened follow-through
  2 – Hips open, some angle; swing from elbow rather than shoulder; reaches for ball rather than moving to the ball; hits off back foot, or little weight transfer
  1 – Hips square, no angle; choppy, uncoordinated swing; reaches, or reaches and misses; standing flat-footed, no attempt at weight transfer
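
To make a rubric like this quick to use courtside, it helps to think of it as a lookup table plus a tally. A minimal sketch in Python, with the level descriptors abbreviated from the slide; the function and variable names are invented for illustration.

    # Tennis forehand rubric as a lookup table; tally strokes observed
    # during game play (descriptors abbreviated from the slide).
    FOREHAND_RUBRIC = {
        4: "Hips perpendicular; level swing; waist-high contact; full weight transfer",
        3: "Hips slanted; exaggerated swing path; early/late contact; shortened follow-through",
        2: "Hips open; elbow swing; reaches for ball; little weight transfer",
        1: "Hips square; choppy swing; reach and miss; flat-footed",
    }

    def tally_strokes(observed_levels):
        """Count observed forehand strokes at each rubric level."""
        tally = {level: 0 for level in FOREHAND_RUBRIC}
        for level in observed_levels:
            tally[level] += 1
        return tally

    strokes = [3, 3, 4, 2, 3, 4, 3]  # hypothetical observations of one student
    print(tally_strokes(strokes))    # {4: 2, 3: 4, 2: 1, 1: 0}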

  22. Guidelines For Developing Rubrics • Determine Learning Outcome • Keep it short and simple! • Each Rubric Item should focus on a different skill • Evaluate only measurable criteria • Clearly state the criteria • One sheet of paper if possible • Multiple names; multiple skills

  23. Curriculum Embedded Assessment • Must continually assess! • Throughout the instructional unit, not just at the end • Do we do a good job of this in P.E.? • Must use instruments that directly align with the instruction • Must assess authentically when possible

  24. 1st: Clearly Define the Objective • To effectively manage continual authentic assessment: • Identify the key components of the skill • By task analysis • Must be at the appropriate developmental level

  25. 2nd: Clearly Define the Conditions for Administering the Assessment • Criteria should already be embedded from when we wrote the objective • Under what conditions: size of object, how it is thrown, what the catch should look like, how many times completed (3/5) • Want to make sure any changes are due to student learning, NOT a change in testing procedure!

  26. How could you assess throwing in an “authentic” setting? • What happens when we put students in a contrived testing situation in which they must perform individually? • What other factors must you consider if you do individual testing?

  27. What do you give up if you try to assess basketball skills in a regular game? • Would that be standardized and repeatable?

  28. Do the rules for your game elicit the skill or behavior that you want to assess? • What are some of the problems you might encounter if you are trying to assess dribbling skills, passing skills, shooting skills, offensive tactics (decisions), and defensive stance and positioning in a 5-on-5 game? • How could you organize the testing situation to make it more feasible and practical to obtain the data you want?

  29. 3rd: Must Determine How to Score the Assessment • What “number” or “score” will sufficiently describe the level of performance? • How will that score translate into a “grade”? • How can we design the instrument to be simple enough to use efficiently, but complex enough to detect any changes in student performance?
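
The slide poses the question without fixing an answer. As one hedged sketch, a mean rubric score across observed trials can be mapped onto a grade; the cut points below are hypothetical, not prescribed by the chapter.

    # Translate rubric scores into a summary score and a grade.
    def mean_score(scores):
        """Mean rubric score across observed trials."""
        return sum(scores) / len(scores)

    def to_grade(mean):
        """Map a mean 1-4 rubric score onto a letter grade (hypothetical cuts)."""
        if mean >= 3.5:
            return "A"
        if mean >= 2.5:
            return "B"
        if mean >= 1.5:
            return "C"
        return "Needs improvement"

    scores = [3, 3, 4, 2, 3, 4, 3]  # same hypothetical strokes as above
    m = mean_score(scores)
    print(f"Mean {m:.2f} -> grade {to_grade(m)}")  # Mean 3.14 -> grade B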

  30. 4th: Efficient Method of Recording Data • What are some ways to design an efficient/quick way to record the score on the instrument? • What items must be included on the actual score sheet? • How to use technology! • JOPERD 77(1), January 2006: “Integrating Assessment and Instruction: Easing the Process with PDAs”
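
The cited article discusses PDAs; a present-day analogue (an assumption on my part, not the article's method) is appending each scored trial to a simple CSV score sheet carrying the items a score sheet needs: date, student, skill, trial, and score. The file name below is hypothetical.

    # Append one scored trial to a class score sheet (CSV).
    import csv
    from datetime import date

    def record_score(path, student, skill, trial, score):
        """Append a single scored trial, stamped with today's date."""
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow(
                [date.today().isoformat(), student, skill, trial, score])

    record_score("scores.csv", "Student 01", "tennis forehand", 1, 3)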

  31. Concepts • Presence vs. absence • Complete vs. Incomplete • Many, Some, None • Major, minor • Consistent vs. inconsistent • Frequency: • Always, Usually, Sometimes, Rarely, Never

  32. Terms and Ranges • Needs Improvement, Satisfactory, Good, Exemplary • Beginning, Developing, Accomplished, Exemplary • Needs Work, Good, Excellent • Novice, Apprentice, Proficient, Distinguished • Pee Wee, Club Team, NCAA, Professional • Bogey, Par, Birdie, Eagle • Numeric

  33. Assessment Data Is Only Valuable If . . . • You use it! • It is easy enough to record, so you will use it • It is organized enough to manage, so you will use it! • Use it to . . . • Make instructional decisions • Evaluate instructional effectiveness • Determine if students are mastering objectives! • Determine if the program/curriculum is working
