
Computer-based Writing Assessment: Secondary Students’ Satisfaction and Motivation Indicators


Presentation Transcript


  1. Computer-based Writing Assessment: Secondary Students’ Satisfaction and Motivation Indicators • Randall Boone & Kyle Higgins • University of Nevada, Las Vegas

  2. The software: Criterion • When students used the Criterion essay-scoring software, they typed in their essays and the software immediately provided feedback on (a) structure, (b) grammar, and (c) spelling, along with a holistic writing score (1-6).
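
A hedged illustration of this feedback loop follows. Criterion is a commercial ETS product and its scoring internals are not public, so nothing below reflects its actual implementation; the function name and every heuristic are invented. The sketch only mimics the shape of the feedback the slide describes: per-category notes for structure, grammar, and spelling, plus a 1-6 holistic score.

```python
# Hypothetical sketch only -- not Criterion's actual algorithm.
import re

def score_essay(text: str) -> dict:
    """Return toy per-category feedback and a 1-6 holistic score."""
    sentences = [s for s in re.split(r"[.!?]+\s*", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    return {
        # (a) structure: flag very short essays
        "structure": "OK" if len(sentences) >= 5 else "too few sentences",
        # (b) grammar: crude check for an accidentally doubled word ("the the")
        "grammar": "doubled word" if re.search(r"\b(\w+)\s+\1\b", text, re.IGNORECASE) else "OK",
        # (c) spelling: placeholder; a real checker would consult a dictionary
        "spelling": "not checked in this sketch",
        # Holistic score keyed to length alone; real AES models use many features.
        "holistic_score": min(6, max(1, len(words) // 100 + 1)),
    }

print(score_essay("The cat sat on the the mat. " * 8))
```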

  3. The Research Question • What do students who use the Automated Essay Scoring software (Criterion) think of it as an evaluative component in their writing classes?

  4. Research Method • Semi-structured interviews were conducted with 23 randomly selected students from three English classes at an ethnically diverse high school in a large city in the Southwest. Participating students were from Grades Nine, Ten, and Eleven.

  5. Data Analysis & Results • Interview responses were analyzed using domain analysis, a qualitative method, to identify themes and establish a coding structure for comparing responses across students and across themes. • Ten distinct themes were identified.
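
The per-theme figures on the following slides (e.g., "28 unique responses from 16 students") come from tallying coded responses. The sketch below uses invented data, not the study’s, and assumes each response has already been hand-coded to a single theme, which is a simplification of the domain-analysis step.

```python
# Hypothetical tally of hand-coded interview data; the study's codes are not public.
from collections import defaultdict

# (student_id, theme) pairs as they might emerge from domain-analysis coding
coded = [
    (1, "fix mistakes"), (1, "immediate feedback"),
    (2, "fix mistakes"), (2, "fix mistakes"),   # one student, two responses
    (3, "spelling and grammar"), (3, "fix mistakes"),
]

students = defaultdict(set)   # theme -> students who raised it
responses = defaultdict(int)  # theme -> count of unique responses

for student, theme in coded:
    students[theme].add(student)
    responses[theme] += 1

# Yields figures in the shape reported on the theme slides,
# e.g. "28 unique responses from 16 students".
for theme in sorted(responses):
    print(f"{theme}: {len(students[theme])} students, {responses[theme]} responses")
```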

  6. Theme 1 The computer knows all (more effective than the teacher) • Five students indicated significant trust in, and reliance on, the computer. Comments included: “If the computer says you got a 6 (top score), it must be good.”

  7. Theme 2 Software affirmations (positive remarks about the software) • Six students offered a total of nine unsolicited, positive remarks regarding the Criterion software. Remarks indicated that the software instilled confidence, helped with mechanics, and was “good to use.”

  8. Theme 3 Fairness (the computer is not biased) • One student mentioned this twice during a single interview.

  9. Theme 4 Fix mistakes (grammar, spelling, etc.) • With 28 unique responses from 16 (70%) of the students interviewed, this theme ranked second in frequency of discussion. The students overwhelmingly talked about making and fixing mistakes when discussing Criterion.

  10. Theme 5 Immediate feedback (quick error detection and grading) • Ten students mentioned speed or immediacy, contributing 15 unique responses. Although much of the professional literature focuses on this aspect, it was not the students’ top priority.

  11. Theme 6 Revise or resubmit (resubmitting a revised paper) • Thirteen students provided 16 unique responses that referred to the process of revising, editing, or resubmitting an essay to the Criterion program. Many of these focused on the opportunity to “keep doing it and doing it again to fix it.”

  12. Theme 7 Spelling and grammar (focus on mechanics of writing) • This theme captured the highest number of responding students (17) and the highest number of unique responses (29). Responses were focused both inward (“I suck at spelling”) and toward the software (“it said that I use lots of small words”).

  13. Theme 8 Teacher gives more information (is the preferred source) • This theme is significant despite its relatively low response numbers (5 students with 7 unique responses). These students seemed to be looking for something more than what they perceived as a “surface level” response to their writing.

  14. Theme 9 Topic choice (student selection of topics is important) • Nine students mentioned writing topics or story prompts in the interview discussions. Almost all dealt with ownership of the topic (“If I know about it, I can do a good job”; “I’m a good writer unless it is a boring subject”). Nothing specific to Criterion’s prompts was mentioned.

  15. Theme 10 Typing (handwriting vs. keyboarding) • A small number (5) of students expressed a preference for typing essays into the computer rather than having to worry about penmanship or rewriting by hand during the revision phase.

  16. Conclusion It seems clear that... • Students were, at the very least, satisfied with, and in many respects enthusiastic about, using the automated essay scoring (AES) software. • Student focus in using the AES software was almost completely on the mechanics of writing (spelling & grammar).

  17. Conclusion It seems clear that... • Students would prefer to continue using the AES software if given the opportunity. Only 4 of the 23 students said they would prefer not to be in an English class in which the AES software would be used the next year.

  18. Trait Analysis and Holistic Score Report

  19. Organization Feedback Report

  20. Support for ELL Students in Spanish

  21. Class Score Summary for the Teacher
