
SECTION V: Research, Program Evaluation, and Appraisal


Presentation Transcript


  1. SECTION V: Research, Program Evaluation, and Appraisal • Chapter 12: Testing and Assessment • Chapter 13: Research and Evaluation

  2. Testing and Assessment Chapter 12

  3. Defining Testing and Assessment • Testing: a subset of assessment • Assessment includes: • Informal Assessment • Personality Testing • Ability Testing • The Clinical Interview • See Figure 12.1, p. 396

  4. Why Testing? • You will be administering and interpreting assessment instruments • You may consult with others on their proper use • You may use them in program evaluation and research • You will read about them in the professional literature • School counselors: Sometimes the only expert on assessment in the schools • Other counselors: Will likely be using them in your setting and consulting with others who use them • Why testing? Why not testing? Testing is an additional method of gaining information about your client

  5. A Little Background (History) • 2200 BCE: Chinese developed essay-type tests for civil service employees • Darwin set the stage for modern science and the examination of individual differences • Wundt and Fechner: first experimental labs to examine differences in people • Binet: hired by the Ministry of Public Education in France to develop an intelligence test • The Binet test later became the “Stanford-Binet,” revised by Terman

  6. History (Cont’d) • Spread of testing at the beginning of the 20th century: • Psychoanalysis spurred the development of objective and projective personality tests • The Industrial Revolution created a need for vocational assessment • WWI: Ability and personality tests used to determine placements of recruits • 1940s and 1950s: Advances in statistics led to better test construction • 1980s and on: Personal computers make tests easier to develop, analyze, use, administer, and interpret

  7. Types of Assessment Techniques • Ability Testing (Testing in the Cognitive Domain) (see Figure 12.2, p. 399) • Two types • Achievement Testing (What one has learned) • Aptitude Testing (What one is capable of learning) • Achievement Testing • Survey Battery Tests • Diagnostic Tests (see Box 12.1, p. 400: PL 94-142) • Readiness Tests

  8. Types of Assessment Techniques • Ability Testing (Testing in the Cognitive Domain) (see Figure 12.2, p. 399) (Cont’d) • Aptitude Tests (What one is capable of learning) • Intellectual and Cognitive Functioning Testing • Intelligence Tests • Neuropsychological Assessment • Cognitive Ability Tests • Special Aptitude Tests • Multiple Aptitude Tests

  9. Types of Assessment Techniques • Personality Assessment (Testing in the Affective Domain; see Figure 12.3, p. 399) • Objective Tests • Projective Tests • Interest Inventories • Informal Assessment (see Figure 12.4, p. 399) • Observation • Rating Scales (see Box 12.2, p. 404) • Classification Systems (see Box 12.3) • Environmental Assessment • Records and Personal Documents • Performance-Based Assessment

  10. Types of Assessment Techniques • The Clinical Interview • Sets a tone for the types of information that will be covered during the assessment process • Allows the client to become desensitized to information that can be very intimate and personal • Allows the examiner to assess the client’s nonverbals while he or she is talking about sensitive information • Allows the examiner to learn about problem areas firsthand • Gives client and examiner an opportunity to study each other’s personality styles to ensure they can work together

  11. Norm-Referenced and Criterion-Referenced Assessment • Norm-referenced tests: • Your results are compared to those of your peer group • Criterion-referenced tests: • Preset learning goals are established • Examinee has increased time to meet educational goals • Often used for individuals with learning disabilities • Norm-referenced and criterion-referenced tests can be standardized or non-standardized • Standardized: Given exactly the same way each time • Non-standardized: Vary in how they are administered; generally not as rigorously researched as standardized tests (e.g., teacher-made tests) • See Table 12.1, p. 407

  12. Test Statistics • Relativity and Meaningfulness of Scores • Raw scores hold little meaning until you do something with them • By comparing raw scores to those of an individual’s peer group, you can: • See how the individual did in comparison to similar people • Allow test takers who took the same test but are in different norm groups to compare their results • Allow an individual to compare his or her results on two different tests

  13. Test Statistics • Some statistics help us make meaning of test scores • Measures of Central Tendency • Mean • Median • Mode • Measures of Variability • Range • Interquartile Range • Standard Deviation • See Figure 12.5, p. 409 • See Figures 12.6 and 12.7, pp. 410 and 411
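
These statistics are simple to compute with any statistics package. Below is a minimal sketch using Python’s standard-library statistics module on a hypothetical set of raw test scores; the numbers are invented for illustration and do not come from the text.

```python
import statistics

# Hypothetical raw test scores for one group of examinees (illustration only).
raw_scores = [72, 85, 85, 90, 78, 64, 85, 70, 93, 81]

# Measures of central tendency
mean = statistics.mean(raw_scores)      # arithmetic average
median = statistics.median(raw_scores)  # middle score when ordered
mode = statistics.mode(raw_scores)      # most frequently occurring score

# Measures of variability
score_range = max(raw_scores) - min(raw_scores)     # highest minus lowest
q1, _, q3 = statistics.quantiles(raw_scores, n=4)   # quartile cut points
iqr = q3 - q1                                       # interquartile range
sd = statistics.stdev(raw_scores)                   # sample standard deviation

print(f"mean={mean:.1f}, median={median}, mode={mode}")
print(f"range={score_range}, IQR={iqr:.2f}, SD={sd:.2f}")
```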

  14. Derived Scores (Converted Raw Scores) • Types of Derived Scores: • Percentile Rank • T-Scores • Deviation IQ • SAT/GRE Type Scores • ACT Scores • Normal Curve Equivalents (NCEs) • Stanines • Sten Scores • Grade Equivalent Scores • Idiosyncratic Publisher-Derived Scores
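
Most of these derived scores are linear transformations of the z-score (raw score minus the norm-group mean, divided by the norm-group SD). The sketch below applies the standard textbook conversions to one hypothetical raw score; the norm-group mean and SD are assumptions made up for the example, and grade equivalents and publisher-specific scales are omitted because they are not simple formulas.

```python
from statistics import NormalDist

# Hypothetical norm-group statistics and examinee raw score (illustration only).
norm_mean, norm_sd = 50.0, 10.0
raw = 63.0

z = (raw - norm_mean) / norm_sd               # z-score: distance from the mean in SD units
t_score = 50 + 10 * z                         # T-score (mean 50, SD 10)
deviation_iq = 100 + 15 * z                   # deviation IQ (mean 100, SD 15)
sat_type = 500 + 100 * z                      # SAT/GRE-type score (mean 500, SD 100)
nce = 50 + 21.06 * z                          # normal curve equivalent (mean 50, SD 21.06)
stanine = min(9, max(1, round(2 * z + 5)))    # stanine: nine-point standard scale
percentile_rank = NormalDist().cdf(z) * 100   # percentile rank, assuming a normal distribution

print(f"z={z:.2f}  T={t_score:.0f}  DIQ={deviation_iq:.0f}  SAT-type={sat_type:.0f}")
print(f"NCE={nce:.1f}  stanine={stanine}  percentile rank={percentile_rank:.0f}")
```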

  15. Correlation Coefficient • A basic statistic not directly related to the interpretation of a test but crucial in test construction • Ranges from -1.0 to +1.0 • The closer to -1.0 or +1.0, the stronger the relationship between the variables • Positive correlation: tendency for two sets of scores to be related in the same direction • Negative correlation: tendency for two sets of scores to be related in opposite directions • 0 = no relationship between variables • See Figure 12.8, p. 413
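
As a concrete illustration, the short sketch below computes a Pearson correlation coefficient from first principles for two hypothetical sets of scores (e.g., the same examinees measured twice); the data are invented for the example.

```python
import math

# Hypothetical paired scores (e.g., the same examinees on two measures).
x = [10, 12, 14, 15, 18, 20, 22]
y = [11, 13, 13, 16, 17, 21, 24]

n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n

# Pearson r = covariance of x and y divided by the product of their spreads.
numerator = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
denominator = (math.sqrt(sum((xi - mean_x) ** 2 for xi in x)) *
               math.sqrt(sum((yi - mean_y) ** 2 for yi in y)))

r = numerator / denominator  # falls between -1.0 and +1.0; 0 means no linear relationship
print(f"r = {r:.2f}")
```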

  16. Test Worthiness • Four Types • Validity: Is the test measuring what it’s supposed to measure? • Reliability: Is the test accurate (consistent) in its measurement? • Practicality: Is this a practical test to use? • Cross-Cultural Fairness: Has the test been shown to be fair across different cultures?

  17. Test Worthiness: Validity • Three types • Content • Criterion-Related • Concurrent • Predictive • Construct • Experimental • Convergent • Discriminant • Factor Analysis • Face validity • Not a “real” type of validity: Does the test, on the surface, seem to measure what it’s supposed to measure? • Some tests may be valid but may not seem to measure what they’re supposed to measure

  18. Test Worthiness: Cross-Cultural Fairness • Is bias removed, as much as possible? • Does the test predict well for all cultural groups? • Griggs v. Duke Power Company: Tests must be shown to predict job performance • A number of ethical and legal issues have been addressed (see later under “Ethical, Professional, and Legal Issues”) • See Table 12.2, p. 417: Summary of Types of Validity and Reliability

  19. Test Worthiness: Reliability • Four Types: • Test-Retest • Alternate (Parallel; Equivalent) Forms • Split-Half (Odd-Even) • Internal Consistency • Cronbach’s Coefficient Alpha • Kuder-Richardson
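
As one concrete example of an internal-consistency estimate, the sketch below computes Cronbach’s coefficient alpha from its standard formula on a small made-up item-response matrix (rows are examinees, columns are items); the data and the 0/1 scoring are assumptions for illustration only.

```python
from statistics import variance

# Hypothetical item responses: 5 examinees x 4 items, scored 0/1.
responses = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
]

k = len(responses[0])                                          # number of items
item_variances = [variance(col) for col in zip(*responses)]    # variance of each item across examinees
total_variance = variance([sum(row) for row in responses])     # variance of examinees' total scores

# Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / total-score variance)
alpha = (k / (k - 1)) * (1 - sum(item_variances) / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```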

  20. Test Worthiness: Practicality • Is this a realistic test to give? • Based on: • Cost • Time to administer • Ease of administration • Format of test • Readability of test • Ease of interpretation

  21. Where to Find Tests and Assessment Techniques • Over 4,000 assessment procedures • How do you find them? • Publisher resource catalogs • Journals • Source Books and On-Line Source “Book” Information • Buros Mental Measurements Yearbook • Tests in Print • Books on Testing and Assessment • Experts • The Internet

  22. Writing Assessment Reports • Info usually included: • Demographic information • Reason for referral • Family background • Other relevant information (e.g., legal, medical, vocational) • Behavioral observations • Mental status • Test results • Diagnosis • Recommendations • Summary

  23. Writing Assessment Reports • Usually a few pages long • Problems with: • Overuse of jargon • Focusing on assessment procedures and downplaying the person • Focusing on the person and downplaying assessment results • Poor organization • Poor writing skills • Failure to take a position

  24. Multicultural Issue/Social Justice Focus • Caution in Using Assessment Procedures • Cultural bias continues to exist in testing • Standards and ethical codes have been developed to help us: • Understand the cultural bias inherent in tests • Know when a test should not be used due to bias • Know what to do with test results when a test does not predict well for minorities • Standards for effective use of assessment instruments • Association for Assessment in Counseling’s Standards for Multicultural Assessment • Code of Fair Testing Practices in Education • ACA Ethics Code

  25. Multicultural Issue/Social Justice Focus • Take A Stand—Do Something! • Our duty and moral responsibility to do something when • Tests have been administered improperly • Tests are culturally biased and the bias is not addressed • Cheating has taken place • Tests were used with limited validity or reliability

  26. Ethical, Professional, and Legal Issues • Ethics • Guidelines for use of assessment instruments (see bottom of p. 420) • Informed consent • Invasion of privacy and confidentiality • Competence in the use of tests • Levels A, B, and C • Technology and Assessment • Sometimes a counselor is not involved when computer-generated reports are used • Issues of confidentiality and privacy • Knowing laws relative to the impact of online technology • Adequate training in technology

  27. Ethical, Professional, and Legal Issues • Ethics (Other Issues) • Proper release of test results • Selecting tests • Administering, scoring, and interpreting tests • Keeping tests secure • Picking up-to-date tests • Proper test construction • Professional Issues • Computer-driven assessment reports: Can be very good; make sure they reflect “you” • Professional Association: Association for Assessment in Counseling and Education (AACE), a division of ACA

  28. Ethical, Professional, and Legal Issues • Legal Issues • Americans with Disabilities Act: Accommodations must be made when tests are used for employment • FERPA (Buckley Amendment): Right to access school records, including test records • Carl Perkins Act (PL 98-524): Right to vocational assessment, counseling, and placement for the disadvantaged • Civil Rights Act (1964) and Amendments: Tests must be shown to be valid for the job

  29. Ethical, Professional, and Legal Issues • Legal Issues (Cont’d) • Freedom of Information Act: Right to access federal records, including test records • PL 94-142 and IDEIA: Right of students to be tested, at the school’s expense, for a suspected disability that interferes with learning • Section 504 of the Rehabilitation Act: Instruments must measure a person’s ability, not be a reflection of his or her disability • HIPAA: Right of privacy of records, including test records

  30. The Counselor in Process • Assessment of clients is not just giving a test • Use multiple methods and be wise • Remember, people can and will change over time • Don’t view them as “stagnant” and always the same
