
Cultural Validity in Assessment Practices


Presentation Transcript


  1. Cultural Validity in Assessment Practices. Guillermo Solano-Flores, American Institutes for Research. Mid-Atlantic Equity Center Annual Regional Conference, Washington, DC, March 12, 2004

  2. Paradigms in testing Traditional: • Classifies individuals into large categories • Focuses on differences between groups • Uses adaptation and accommodation of tests • Centralized • Deficit-model view Alternative: • Defines a socio-cultural context • Focuses on score dependability • Emphasizes the process of test development • Community-based • Multidisciplinary perspective

  3. Overview • The need for interdisciplinary approaches to test design • Understanding how culture and language influence test taking • Implications: new paradigms in testing

  4. First NSF study: Assessing the cultural validity of assessment • Participants: Students from twelve sociocultural contexts (ethnicity, language background, SES, geographical area, locale, type of school) • Students were given items used in standardized testing that are supposed to be “culturally sensitive” • Students were asked to explain how they interpreted the items and how they related their content to their personal experience

  5. Second NSF study: Sociolinguistic perspectives in testing • Participants: Students from different geographical areas who are assumed to speak different dialects of the same language; teachers who are familiar with the dialect spoken by their students • Teachers from different sites adapt the same set of items based on their knowledge of their students’ dialects • Students take the tests in both dialect versions

  6. Pressing issues in the testing of linguistic and cultural minorities • Accountability based on standardized test scores • Bilingual and multicultural education under attack

  7. Limitations of current approaches to testing linguistic and cultural minorities • Lack of theories of language and culture • Testing practices driven by erroneous assumptions about language and culture • Erroneous assumptions about the effectiveness of current testing practices • Recent cognitive approaches to testing overlook important cultural influences on cognition

  8. Examining the linguistic demands posed by test items

  9. A measurement of 60 inches is equal to how many feet?
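For reference, the arithmetic this item calls for is a single unit conversion; the slide's point is the item's wording, not its computational difficulty:

$$60 \text{ inches} \div 12 \tfrac{\text{inches}}{\text{foot}} = 5 \text{ feet}$$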

  10. The Lunch Money item

  11. What is the least number of $1.00 bills that his mother should give him so he will have enough money to buy lunch for 5 days?
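The full item stem, including the daily lunch prices, is not reproduced in this transcript. As a sketch of the intended reasoning, assuming a hypothetical daily cost of $1.75, the answer is the smallest whole number of one-dollar bills that covers the five-day total:

$$5 \times \$1.75 = \$8.75 \;\Rightarrow\; \lceil 8.75 \rceil = 9 \text{ one-dollar bills}$$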

  12. Major (yet preliminary) finding: The syntactical structure of some test items is unnecessarily complex

  13. Understanding sociocultural influences on test taking

  14. Assessing others: The Kayak experience

  15. Socio-cultural activity questions How do you see [this item] as part of... • …what you do when you are not at school? • …what you do for fun when you are at school? • ...your school day in the classroom? • ...any traditions that you have?

  16. World views: a low-income student’s response to the Lunch Money item Interviewer: What is this item about? Student: It’s about Sam, trying to get her lunch, but her mom only has one dollar, and she needs more for five days, so I think she should give her a dollar ninety-five.

  17. Cultural factors relevant to assessment • Epistemologies -- ways of constructing knowledge and making sense of experience • Teaching and learning styles -- ways of transmitting and acquiring knowledge • Discourse styles -- ways of expressing ideas

  18. Cultural influences in assessment Different cultural backgrounds produce different ways in which students: • interpret what a test item is about • use their knowledge and experience to solve problems • demonstrate their knowledge

  19. Understanding the students’ linguistic proficiency

  20. Language dominance: Traditional, simplistic view

  21. Patterns of language dominance diversity: A more realistic view

  22. Assessing linguistic minority students: Common mistakes • Underestimating students’ English proficiency • Overestimating students’ proficiency in their native language • Lowering academic standards and expectations

  23. Dimensions of item design and review: data sources, methods

  24. Dimensions of item design and review: focus

  25. Difficulty reading “$1.00 bills” in “His mother has only $1.00 bills.” • Identified by linguists: Yes • Anticipated by at least 20% of teachers: No • Observed in at least 20% of students: Yes • Statistically significant differences between groups: Yes

  26. Difficulty reading “least” in “What is the least number of $1.00 bills…?” • Identified by linguists: Yes • Anticipated by at least 20% of teachers: Yes • Observed in at least 20% of students: No • Statistically significant differences between groups: No

  27. Why we should pay more attention to the process of test development

  28. Validity and high-stakes testing Information on the validity of a test for a given population of students usually becomes available only after decisions affecting those students have already been made on the basis of that test's scores.

  29. Validity and high-stakes testing

  30. Language: Tolerance to error

  31. What about English-language learners?

  32. Inadequate test adaptation: An example

  33. Thinking about language

  34. Disciplines relevant to test development Traditional: • Psychometrics • Cognitive psychology Other: • Cultural anthropology • Sociolinguistics • Structural linguistics • Reading

  35. Conclusions Current approaches to testing English language learners do not effectively address the fact that assessments are extremely sensitive to wording

  36. Implications for assessment • New paradigms in testing • Multidisciplinary approaches • Combining quantitative and qualitative methods

  37. Workshop exercises

  38. Exercise 1: Gumball machine The gum ball machine has 100 gum balls; 20 are yellow, 30 are blue, and 50 are red. The gum balls are well mixed inside the machine. Jenny gets 10 gum balls from this machine. What is your best prediction of the number that will be red? 
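As a check on the intended answer (a minimal worked step, since the exercise itself is about how students read the item): half of the gum balls are red, so the best prediction for 10 draws is

$$10 \times \frac{50}{100} = 5 \text{ red gum balls}$$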

  39. Exercise 2: Metals

  40. Exercise 3: Mountains

  41. References (1) • Solano-Flores, G., & Trumbull, E. (2003). Examining language in context: The need for new research and practice paradigms in the testing of English-language learners. Educational Researcher, 32(2), 3-13. • Solano-Flores, G. (2003). The multidimensionality of test review and test design: A conceptual framework for addressing linguistic and cultural diversity in testing. Paper presented at the 10th Biennial Conference of the European Association for Research on Learning and Instruction, Padova, Italy, August 26-30, 2003. • Solano-Flores, G., Trumbull, E., & Nelson-Barber, S. (2002). Concurrent development of dual language assessments: An alternative to translating tests for linguistic minorities. International Journal of Testing, 2(2), 107-129.

  42. References (2) • Solano-Flores, G., & Nelson-Barber, S. (2001). On the cultural validity of science assessments. Journal of Research in Science Teaching, 38(5), 553-573. • Solano-Flores, G., Lara, J., Sexton, U., & Navarrete, C. (2001). Testing English language learners: A sampler of student responses to science and mathematics test items. Washington, DC: Council of Chief State School Officers.
