
The Impact of Literacy Software on Reading Content Area Text


Presentation Transcript


  1. The Impact of Literacy Software on Reading Content Area Text Cindy Okolo (okolo@msu.edu, okolo.wiki.educ.msu.edu), Ira Socol, Shani Feyen, Summer Ferreri

  2. This project was supported by grants to the University of Oregon and to Michigan State University from the Office of Special Education Programs, United States Department of Education

  3. Literacy Software • Operating systems that “read” text • Text readers or screen readers • Add-ons and extensions • Multipurpose software • WYNN • Kurzweil • Crick Software • Don Johnston Solo • Read&Write Gold
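
To make concrete what the simplest of these tools does, here is a minimal text-reader sketch in Python, using the open-source pyttsx3 library as a stand-in; the commercial packages above have their own engines and interfaces, so this illustrates the idea rather than any product's API.

```python
# Minimal text-reader sketch: speak a passage with synthesized speech.
# pyttsx3 is an open-source stand-in for the commercial engines above.
import pyttsx3

def read_aloud(text: str, words_per_minute: int = 150) -> None:
    engine = pyttsx3.init()
    engine.setProperty("rate", words_per_minute)  # speaking rate in wpm
    engine.say(text)
    engine.runAndWait()  # blocks until the passage has been spoken

if __name__ == "__main__":
    read_aloud("Literacy software can turn any digital text into spoken text.")
```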

  4. Literacy software = accessible text? Photography by D Sharon Pruitt

  5. Literacy software makes it possible to transform text The Machine is Us/ing Us http://www.youtube.com/watch?v=NLlGopyXT_g

  6. Literacy software lets us create supported digital text

  7. Supported Text = Text + Resources* • Presentational *Anderson-Inman & Horney, 2007 Photo by Jack Chan

  8. Supported Text = Text + Resources • Navigational Photo taken by Carrie Ansell

  9. Supported Text = Text + Resources • Translational

  10. Supported Text = Text + Resources • Instructional • Photo by Fixlepix

  11. Supported digital text • UDL • Accessible instructional materials • Textbook adoption policies • Cross-platforms

  12. Research Support for Supported Digital Text?

  13. Studies of computer-based text go back to the 1980s • Text on computer or text on page: doesn’t really matter (Okolo, Cavalier, Ferretti, & MacArthur, 2000) • Text-to-speech technology seems most helpful to students who can’t access the text in other ways (e.g., visual impairments) or who have poor word recognition skills and/or low fluency (Elkind, Black, & Murray, 1996) • Some studies favor simultaneous presentation of text (audio and visual); others don’t (Balajthy, 2005; Strangman & Hall, 2003)

  14. Literacy software may improve reading rate and comprehension (Dimmitt, Hodapp, Judas, Munn, & Rachow, 2006) • Impact increases over time (13 weeks) • Some studies show improvements on other literacy skills, not only on text read during intervention • Homophone & spelling error detection, word comprehension, reading comprehension (Lance, McPhillips, Mulhern, & Wylie, 2006) • Problem-solving skills (but not text comprehension) (Twyman & Tindal, 2006)

  15. Unanswered Question: Access and Outcomes in Content Area Text? • Does literacy software improve access to the general education curriculum? • In content areas • In middle and high school • Textbook based instruction remains most common instructional practice • If students could read their textbooks with literacy software, would it matter?

  16. Research Questions • What is the difference in comprehension of social studies content when students read text orally versus when they hear text read aloud by a literacy software program? • Which features of a literacy software program do students choose to use, and for what purposes, when reading social studies text?

  17. Research Questions • Do students prefer reading social studies text with a literacy software program? • Are there demographic factors that affect comprehension of social studies text, software use, and preferences?

  18. Design & Instruments • Multi-element design • Randomly alternated between hearing social studies text read aloud by literacy software and reading printed text aloud • Reading comprehension • Text retelling • Multiple choice questions (1 main idea, 2 details, 1 vocabulary; 3 “logical” choices, 1 “illogical” choice) • Technology survey (start of study) • Student preference interview (after each session) • Which did you prefer? Which text was more interesting?
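
As a purely hypothetical illustration (the slide specifies only the counts of question and choice types, not any data format), one passage's comprehension items might be organized like this:

```python
# Hypothetical structure for one passage's multiple-choice items:
# 1 main-idea, 2 detail, and 1 vocabulary question per passage,
# each offering 3 "logical" choices and 1 "illogical" choice.
from dataclasses import dataclass, field

@dataclass
class Question:
    kind: str                    # "main_idea", "detail", or "vocabulary"
    stem: str
    logical_choices: list[str]   # 3 plausible options, one of them correct
    illogical_choice: str        # 1 option inconsistent with the passage
    answer: str                  # must appear in logical_choices

@dataclass
class PassageItems:
    passage_id: str
    questions: list[Question] = field(default_factory=list)  # 4 per passage
```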

  19. Participants • Seven 8th grade students • All had an identified disability in reading • Three “at risk,” four “special education” • FSIQ 78 to 104 • Reading at least 2 years below grade level • Cs or lower in adapted social studies class • Rural/suburban school district • Middle school • Low use of technology or literacy software in school • Most students used technology at home • 5 of 7 had broadband access at home • None had used literacy software or digital books

  20. Procedures • Prepared text samples • World History text used by many local schools • 500 words, self-contained topics • 40 samples • Rewritten to 8th grade readability level (grade-appropriate text) • Included at least 1 main idea, 2 details, and 1 vocabulary word that could be defined by context • Samples created by 1 author and reviewed by the other 2 • Prepared instruments • Also at 8th grade readability level
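
The slides do not name the readability measure used to bring passages to an 8th grade level; assuming the common Flesch-Kincaid grade-level formula, a rough check might look like this (the syllable counter is a crude heuristic):

```python
# Rough readability check, assuming the Flesch-Kincaid grade-level formula:
# grade = 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count vowel groups, discount a trailing silent 'e'.
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def fk_grade_level(text: str) -> float:
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

# A 500-word sample rewritten "to 8th grade readability" should score near 8.
```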

  21. Procedures • WYNN literacy software • http://www.freedomscientific.com/lsg/products/wynn.asp • Features typical of literacy software programs • Word by word and full sentence highlighting • Link words to definitions • Variable speed control for synthesized voice • Choice of voices • Control panel simplicity (e.g. read/stop toggle) • Rate set at 113 words per minute • Reading rate considered appropriate for 5th grade readers
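
Word-by-word highlighting of the kind listed above is driven by word-boundary events from the speech engine. A sketch of the idea, with pyttsx3's callback standing in for WYNN's proprietary engine:

```python
# Sketch of word-by-word highlighting driven by speech-engine callbacks,
# with pyttsx3 standing in for WYNN's proprietary engine.
import pyttsx3

PASSAGE = "The empire expanded its borders along the river valley."

def on_word(name, location, length):
    # A real reader would move a visual highlight; here we print the word
    # being spoken. (Boundary offsets depend on the platform's speech driver.)
    print(PASSAGE[location:location + length])

engine = pyttsx3.init()
engine.setProperty("rate", 113)          # 113 wpm, as in the study
engine.connect("started-word", on_word)  # fires at each word boundary
engine.say(PASSAGE)
engine.runAndWait()
```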

  22. Procedures • Individual sessions, computer lab • Researcher-implemented • Two training sessions in use of WYNN software • Condition (print versus software; text passage; time of day) randomized across students • Read at own pace (stop reading/listening to pause or conclude; review as needed)
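
A hypothetical sketch of that randomization (the number of sessions per student and the within-student balancing are assumptions; the slide says only that condition, passage, and time of day were randomized):

```python
# Hypothetical session schedule: each student gets a random, balanced
# order of print/software conditions, each paired with a passage drawn
# at random and never repeated for that student.
import random

def build_schedule(student_ids, passages, sessions_per_student=10):
    schedule = {}
    for student in student_ids:
        conditions = ["print", "software"] * (sessions_per_student // 2)
        random.shuffle(conditions)                       # random alternation
        drawn = random.sample(passages, sessions_per_student)
        schedule[student] = list(zip(conditions, drawn))
    return schedule

passages = [f"sample_{i:02d}" for i in range(40)]  # the 40 prepared samples
schedule = build_schedule(list("ABCDEFG"), passages)
```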

  23. Procedures • Take text away • Retell • Tell me everything you can remember about the passage • Multiple choice (read aloud by the researcher) • Preference: • Which did you like better? • Which was more interesting? • Field notes taken during sessions about spontaneous comments, approaches to reading, and software use

  24. Paper versus software • Paper condition: • Text samples randomly assigned to condition, student, time of day • Text typed on piece of paper • Asked to read aloud • Encouraged to read at own pace, stop, review as needed • Retelling • Multiple choice read aloud • Software condition: • Text samples randomly assigned to condition, student, time of day • Text displayed on a single page on the computer screen • Asked to listen to text by toggling “read” button • Encouraged to use features to read at own pace, stop, review as needed • Retelling • Multiple choice read aloud

  25. Data analysis • Text samples divided into idea units • Retells audiotaped & transcribed • Number of idea units counted • Interrater reliability per sample ranged from 85% to 100%, average 93% • Multiple choice questions scored 1 to 4 • Looked for differences in accuracy by question type and by logical/illogical options
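
The slides report the agreement range but not the formula; assuming the common point-by-point convention of agreements / (agreements + disagreements), per-sample reliability could be computed like this:

```python
# Per-sample interrater agreement, assuming the point-by-point convention
# agreements / (agreements + disagreements); the presentation states only
# the resulting range (85%-100%, average 93%), not the formula.
def percent_agreement(rater1: set, rater2: set) -> float:
    agreements = len(rater1 & rater2)     # idea units both raters credited
    disagreements = len(rater1 ^ rater2)  # units credited by only one rater
    return 100.0 * agreements / (agreements + disagreements)

# Example: both raters credit 13 idea units, rater 1 credits one extra.
shared = {f"idea_{i}" for i in range(13)}
print(round(percent_agreement(shared | {"senate_established"}, shared), 1))  # 92.9
```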

  26. Results: Overall Comprehension • Nearly identical across conditions • 4.2 facts recalled with print; 3.9 with software • Multiple choice accuracy about 56% in each condition

  27. Results: Question Type • Detail questions somewhat better in paper condition (55% versus 48%) • Vocabulary questions somewhat better in software condition (69% versus 52%) • Larger differences in illogical-choice responses favoring software (chosen in 5% of responses in the software condition versus 14% in the traditional condition)

  28. Results: Observations and Preferences • Took about 3 sessions for students to seem comfortable using software • e.g., to change features such as read/stop or access definitions • In terms of time, this is less than one hour • Students preferred software condition in 90% of sessions • Students rated text more interesting in about 84% of software sessions • Boys more likely than girls to prefer software condition and to rate text more interesting in software condition • Boys somewhat more likely to use different software features

  29. Example of Results, Student A: Retells

  30. Example of Results, Student A: Multiple Choice

  31. Who is Student A? • Male • Receives services for reading, reading about 3 years below grade level • FSIQ = 78, “at risk” services • Cs in adapted social studies classroom • Reported using audiobooks in 5th grade, no other formal school experience with assistive technology • Reads slowly, uses finger, mouths words • Experimented with software features (definitions, pace, critiqued software informally) • Military history buff • Strong preference for software condition

  32. Example of Results, Student B: Retells

  33. Example of Results, Student B: Multiple Choice

  34. Who is Student B? • Female • Receives services for LD, reading 2 years below grade level • FSIQ = 104 • Cs in adapted social studies classroom • Used Read 180 in prior school district in elementary school • Read at a consistent, moderate speed • Looked up 2 definitions, little other interaction • Preferred software 75% of the time but rated the software passage more interesting only 50% of the time

  35. Conclusions • Just because content-area text is available through a literacy software program doesn’t mean students will understand it better • However, most students, in most cases, prefer to read text via software • Many students rate text more interesting when read by literacy software

  36. Conclusions • Students in this study did not take full advantage of software features • Even when accessing websites of interest to them (mid-intervention session), they did not use many of the software features • Use of features increased over sessions • Took about 3 sessions (i.e., about one hour) to begin to personalize the software experience • Some differences between males and females in preferences and use of software features

  37. Conclusions • Difficult text is difficult text • Would results be the same for fiction? For other content? • Would results be the same for skill-appropriate text? • It takes time for students to take control of their reading experience with literacy software • Students need experience • Students may need guidance • Girls may need more time/guidance than boys

  38. Conclusions • Does literacy software help students succeed in content area classes? • We don’t know • Probably (with time, experience, different texts) • How important are student preferences? • If students find text more interesting and prefer reading text by software—will they read more? participate more? learn more over time?

  39. Anderson-Inman, L., & Horney, M. A. (2007). Supported eText: Assistive technology through text transformations. Reading Research Quarterly, 42(1), 153-160.
Balajthy, E. (2005, January/February). Text-to-speech software for helping struggling readers. Reading Online, 8(4). Accessed April 15, 2011 at: http://www.readingonline.org/articles/balajthy2/
Dimmitt, S., Hodapp, J., Judas, C., Munn, C., & Rachow, C. (2006). Iowa Text Reader Project: Impacts on student achievement. Closing the Gap, 24(6), 12-13.
Elkind, J., Black, M., & Murray, C. (1996). Computer-based compensation of adult reading disabilities. Annals of Dyslexia, 46, 159-186.
Lance, A. A., McPhillips, M., Mulhern, G., & Wylie, J. (2006). Assistive software tools for secondary-level students with literacy difficulties. Journal of Special Education Technology, 21(3), 13-22.
Okolo, C. M., Cavalier, A. R., Ferretti, R. P., & MacArthur, C. A. (2000). Technology, literacy, and disabilities: A review of the research. In R. Gersten, E. P. Schiller, & S. Vaughn (Eds.), Contemporary special education research: Syntheses of the knowledge base on critical instructional issues (pp. 179-250). Mahwah, NJ: Erlbaum.
Strangman, N., & Dalton, B. (2005). Technology for struggling readers: A review of the research. In D. Edyburn et al. (Eds.), The handbook of special education technology research and practice (pp. 454-569). Whitefish Bay, WI: Knowledge by Design.
Twyman, T., & Tindal, G. (2006). Using a computer-adapted, conceptually-based history text to increase comprehension and problem-solving skills of students with disabilities. Journal of Special Education Technology, 21(2), 5-16.
All images from Pics4Learning: http://pics.tech4learning.com or Flickr Creative Commons: http://www.flickr.com/creativecommons/
