
Session 5


Presentation Transcript


  1. Session 5 Item Analysis

  2. Item Analysis • Subjective items (Rasch model) • Objective items (classical) • Facility value (pass rate): the ratio of the number of testees who pass the item to the total number of testees • Difficulty (Δ, delta) • 15%: difficult • 70%: medium • 15%: easy • Discriminability (rbis)
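As a rough illustration of the facility value just defined, here is a minimal Python sketch, assuming dichotomous 0/1 item scoring; the function name and the toy data are illustrative, not taken from the slides.

```python
def facility_value(item_responses):
    """Facility value (pass rate): the number of testees who pass the item
    divided by the total number of testees."""
    return sum(item_responses) / len(item_responses)

# Toy data: 7 of 10 testees answer the item correctly -> 0.7 (70%), a fairly easy item
print(facility_value([1, 1, 1, 0, 1, 0, 1, 1, 0, 1]))
```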

  3. Discrimination • Simple: • Rank students according to their total scores • Divide them into three groups (top and bottom have equal numbers of students) • Count how many students in the top group get an item right, and how many in the bottom group • Formula: • Example

  4. Statistical computation • D.I. = .047619 • Facility value = 44/62 ≈ 71%
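The formula itself appears on the slides only as an image. One common formulation of the upper/lower-group discrimination index is D = (U − L) / n, where U and L are the numbers of testees in the top and bottom scoring groups who get the item right and n is the size of one group; the sketch below assumes that formulation and a split into thirds.

```python
import numpy as np

def discrimination_index(item_scores, total_scores, group_fraction=1/3):
    """Upper/lower-group discrimination index, D = (U - L) / n.

    Rank testees by total score, take equal-sized top and bottom groups
    (thirds here; the grouping convention is an assumption), and divide the
    difference in correct answers on the item by the group size."""
    order = np.argsort(total_scores)               # lowest total score first
    n = round(len(total_scores) * group_fraction)  # size of one group
    item = np.asarray(item_scores)
    bottom, top = order[:n], order[-n:]
    return (item[top].sum() - item[bottom].sum()) / n
```

Under these assumptions the slide's figures are consistent: 62 testees give top and bottom groups of 21, and one more correct answer in the top group than in the bottom group yields 1/21 = .047619, the value shown; the facility value for the same item is 44/62 ≈ 71%.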

  5. Statistical computation • The same upper/lower-group procedure as on slide 3, plus: • Point biserial correlations (Rbis)

  6. Item analysis • Item facility values (p) and biserial correlations (rbis) of the items
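For the point-biserial correlations listed there, a short sketch, assuming 0/1 item scores; for dichotomous data the point-biserial coefficient is numerically the ordinary Pearson correlation between item score and total score (scipy.stats.pointbiserialr returns the same coefficient together with a p-value).

```python
import numpy as np

def point_biserial(item_scores, total_scores):
    """Point-biserial correlation between a dichotomous item (0/1) and the
    total test score, computed as a plain Pearson correlation."""
    item = np.asarray(item_scores, dtype=float)
    total = np.asarray(total_scores, dtype=float)
    return float(np.corrcoef(item, total)[0, 1])

# Toy data: an item that higher-scoring testees tend to get right
item = [1, 1, 1, 0, 1, 0, 0, 0]
total = [38, 35, 33, 30, 29, 24, 20, 15]
print(round(point_biserial(item, total), 2))  # about 0.79
```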

  7. Program • Data editing

  8. Number of items in each section • Names of the different parts • Total number of items • Name of the test • Number of digits for the Ss number

  9. Missing data • Keys: the first key should align with the column following the Ss number • Students’ responses

  10. GITEST – Item Analysis

  11. GITEST – Item Analysis • Save the data to the directory where you put the program.

  12.–22. GITEST – Item Analysis (program screenshots)

  23.–33. GITEST – Item Analysis (program screenshots)

  34. Testing listening comprehension • discourse structure • spoken language • lectures • registers • conversation • talk / monologue (scripted talk) • unscripted talk (lectures) • non-verbal signals • Speech rate

  35. Testing listening comprehension: speech rates by text type
  • Radio monologues: 160 words/minute, 250 syllables/minute, 4.17 syllables/second, 1.6 syllables/word
  • Conversations: 210 words/minute, 260 syllables/minute, 4.33 syllables/second, 1.3 syllables/word
  • Interviews: 190 words/minute, 250 syllables/minute, 4.17 syllables/second, 1.3 syllables/word
  • Lectures to NNS: 140 words/minute, 190 syllables/minute, 3.17 syllables/second, 1.4 syllables/word
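The columns of this table are linked by simple arithmetic, sketched below; the table's own figures are rounded independently, so the products do not reproduce them exactly.

```python
def syllables_per_second(syllables_per_minute):
    """The syllables/second column is syllables/minute divided by 60."""
    return syllables_per_minute / 60

print(round(syllables_per_second(190), 2))  # 3.17, as listed for lectures to NNS

# words/minute x syllables/word approximates syllables/minute:
print(140 * 1.4)  # 196, against the 190 listed for lectures to NNS
```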

  36. Taxonomies of listening sub-skills

  37. Approaches to assessing listening • The discrete-point approach • A. Phonemic discrimination tasks • e.g. the test-taker hears: I hear they have developed a better vine near here. • The test-taker reads: I hear they have developed a better vine/wine near here. • B. Paraphrase recognition • e.g. the test-taker hears: John ran into a classmate on his way to the library. • The test-taker reads: (a) John ran to the library. (b) John exercised with his classmate. (c) John unexpectedly met a classmate. (d) John injured his classmate with his car.

  38. The integrative approach • A. reduced redundancy • i) noise tests • Although the script underwent constant revisions, //the chemistry between the two stars was //apparent from the first day of filming.// Bergman and Bogart were unforgettable.// “We had a very poor script //and they never knew what we are going to do the following day //and we never knew how to end the movie which was a bigger worry to me// because I didn't know which man I really loved. //I asked them, 'Do I love my husband, or do I love Humphrey Bogart,//because Paul Henry played my husband?’ They said they don't know because we hadn't made up our minds.// Play in between. So very often when I see that movie I can see my face that tells no expression whatsoever.”

  39. ii) Listening cloze • shown to have high validity • difficulty of making the recordings

  40. Gap-filling • i) listening recall test (Henning, Gary, & Gary, 1983) • Similar to a cloze test, but the content words are deleted. • Claimed to have good reliability, validity, and discrimination. • No clear evidence of comprehension. • ii) gap-filling on summaries • Listen to a passage and fill in blanks in a passage that summarizes the listening passage. • Shown to be reliable and valid. • Not easy to make, and requires pre-testing to make sure students cannot fill in the blanks without listening to and understanding the listening passage (e.g. ask some people to fill in the blanks without hearing the passage).

  41. Dictation • Listen to the passage twice. • Scoring • By words: spelling mistakes are ignored when it is obvious that the error is purely one of spelling. • By chunks: chunks should be long enough; they are usually shorter at the beginning and gradually get longer. • Also used as a means of language teaching.
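As a sketch of word-by-word scoring with tolerance for spelling, the snippet below credits a word if it matches the target exactly or is similar enough to count as a mere spelling slip; the difflib-based similarity test and the 0.8 threshold are illustrative assumptions, not the scoring rule on the slide, and a real scorer would also have to align omitted or inserted words.

```python
from difflib import SequenceMatcher

def score_dictation_by_words(expected, transcribed, spelling_tolerance=0.8):
    """Word-level dictation score: (words credited, words expected).
    A word is credited on an exact match or when it is close enough to be
    treated as a simple spelling mistake (the threshold is an assumption).
    zip() pairs words positionally, so insertions/omissions are not handled."""
    expected_words = expected.lower().split()
    written_words = transcribed.lower().split()
    credited = 0
    for target, written in zip(expected_words, written_words):
        if SequenceMatcher(None, target, written).ratio() >= spelling_tolerance:
            credited += 1
    return credited, len(expected_words)

# "developped" is credited as a spelling slip (sentence adapted from slide 37).
print(score_dictation_by_words(
    "I hear they have developed a better wine near here",
    "I hear they have developped a better wine near here"))  # (10, 10)
```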

  42. Translation (Buck, 2001) • Steps: • A native speaker is asked to record a short description of a city or some other topic in English. • The recording is cut into a number of short sections, say 20, at natural pauses. • Sufficient time is allowed for the test-takers to write down what they have heard, not in English, but in their native language. • Test-takers listen to the recording again. • Test-takers write down in their native language a translation of what the speaker has said, for the benefit of a person who is considering a trip to the city. • Scoring is based on ideas.

  43. The communicative approach • Authentic texts • Spontaneous recording • Listening to songs • by asking, e.g. these questions: • How is the singer feeling? • Why does she feel like this? • What color are her eyes? • Authentic tasks • First students read a passage. • Then students listen to a passage related to the reading passage. • Students are to find the differences between the two passages.

  44. Guides to writing listening test items • Four principles • Test points should be based on information from the listening materials.

  45. The information should be unknown to listeners before they listen. • The information should come from various levels and categories. • Test points should cover the whole of the listening material.

  46. Testing reading comprehension • Reading skills: • 1. recalling word meanings • 2. drawing inferences about the meaning of a word in context • 3. finding answers to questions answered explicitly or in paraphrase • 4. weaving together ideas in the content • 5. drawing inferences from the content • 6. recognising a writer’s purpose, attitude, tone and mood • 7. identifying a writer’s techniques • 8. following the structure of a passage • (Davis, 1968)

  47. Variables that affect the nature of reading • 1. Reader variables (Alderson, 2000) • Background knowledge • knowledge of the language • knowledge of genre (text type) • metalinguistic knowledge • knowledge of subject matter/topic • knowledge of the world • cultural knowledge • Reader skills and abilities (experienced vs. new readers)
