Assessing the Assessments:



  1. Assessing the Assessments: • 2001-2008: Nine assessment projects of introductory composition • What do these add up to? • What ideological currents within the department and college shape these assessments? • Three general phases

  2. Phase I • Focus on student self-assessment of learning • Understanding if and how students use peer review • Support and use of a social scientist from the “Office of Planning and Research”

  3. Phase II • English 2010: a stable curriculum and outcomes • Focus on evaluating actual student writing • Informed by outcome goals developed over two years (similar to WPA outcome goals) • No support from or trust in an institutional research office • Accreditation results = Financial funding

  4. Phase III • Much broader focus (anxiety and retention studies) • A refocusing: what had we really learned? Have we been asking the right questions? • Getting the lay of the land: deeper issues and our students • Response to curricular changes in English 1010 • Institutional support through grants and a more robust IR

  5. Knots, nooses, and loops: Why does everybody want a piece of me? • Local Goals and External Pressures • Phase II: Institutional Pressures and Accreditation • The Catch-22 of assessment • Finding meaning in spite of institutional pressure • Walking the institutional line by mimicking past assessments • The self-assessment debacle • Unpopular questions • But wait, maybe there’s something to be gained • Phase III: An audience assessment • Conclusions: • Discourse is what it’s about

  6. Local Goals and External Pressures Section 1b of the CCCC position statement on writing assessment, “Guiding Principles for Assessment,” states: “Best assessment practice is undertaken in response to local goals, not external pressures. Even when the external forces require assessment, the local community must assert control of the assessment process, including selection of the assessment instrument and criteria.”

  7. Phase II: Institutional Pressures During its visit in October 2004, the Northwest Commission recommended that SLCC: • develop processes for assessing its educational programs. • identify and publish the expected learning outcomes for each of its degree and certificate programs and demonstrate, through regular and systematic assessment, that students who complete their programs have achieved these outcomes. • provide evidence that its assessment activities lead to the improvement of teaching and learning.

  8. Phase II: Fallout from accreditation—a rush to cliché • Outcome goals are our friends • A “culture of assessment” • Closing the “assessment loop” • “Whip us up an assessment by morning”

  9. English 2010: A Catch-22 Discussion of the data. In terms of the overall assessment, we felt that the evidence, even given its limitations, suggests that the curriculum does a pretty good job of helping students to meet the outcomes of English 2010. Ninety-one percent of the portfolios we read achieved average, above-average, or high proficiency at the course outcomes. Even if we only include scores of eight or above as having achieved proficiency, we found that seventy-seven percent, or 135 portfolios, were at this level (2010 report, 2006).

  10. English 2010: Finding meaning in spite of institutional pressure Kathleen Blake Yancey argues in “Looking Back as We Look Forward: Historicizing Writing Assessment as Rhetorical”: “When writing assessment is located within practice, its validity is enhanced, to be sure. But equally important, it reflects back to us that practice, the assumptions undergirding it, the discrepancy between what it is that we say we value and what we enact…[making] that practice visible and thus accessible to change” (494).

  11. English 1010: Walking the institutional line by mimicking past assessments The rhetoric of “Stats at a Glance” • 78.6%: the magic number • Only 64 third reads, which demonstrates a 94% accuracy rate!!! Would you like a toaster oven with that?

  12. English 1010: The self-assessment debacle • 46% of students scored below average • Supporting the rhetoric of assessment: An assault on self-assessment writing practices and training (see action plan from report) • Resisting assessment ideology: Who is the enemy?

  13. English 1010: Unpopular questions • How did full-time faculty, all on the English 1010 committee for several years, come to understand such a vital part of the course in wildly different ways? • An aberrant student essay or two? Or… • Idiosyncrasies and/or deficiencies of one or two department members? Or… • A much deeper rift in the ways in which department members understand the writing process and the evaluations of it?

  14. English 1010: But wait, maybe there’s something to be gained “Hammering out an agreeable compromise.” Peter Elbow argues that our “communal assessment” is both more realistic and productive: “The more we grade with others, the more attuned we become to community standards … we can then realize that all of us need to rethink and perhaps adjust our standards. And the greatest benefit of all comes when we return to our classrooms enriched by new ways of commenting on student texts.”

  15. Phase III: A new direction through audience assessment • Non-linear: Pulling back because we want to and getting the lay of the land • Representing student and adjunct voices • Finding other funding: Faculty Teaching and Learning Center (FTLC) grants • Using or being used by the CCSSE survey on student engagement

  16. Conclusions • The culture of assessment generally encourages researchers to downplay deficiencies in design and results • Assessments of student writing and process often tell us as much, or even more, about that which we are not specifically investigating • The discourse of institutional assessment privileges clear cut analyses of courses and programs • Institutions want to use assessments to merely prove that we are indeed doing well by meeting our outcomes and goals

  17. Discourse is what it’s about • Yancey uses Pamela Moss’s “Response: Testing the Test of the Test” to ask how students understand themselves in light of our assessments: “‘it is important to study the actual discourse that occurs around the products and practices of testing—to see how those whose lives a testing program impacts are using the representations (interpretations) it produces’ (119). Writing assessment here, then, is rhetorical: positioned as shaper of students and as means of understanding the effects of such shaping” (498).

  18. And, identity “[E]ducation ultimately and always is about identity formation, and this is not less true for writing assessment than for any other discipline… [it] wields so much power, plays a crucial role in what self, or selves, will be permitted—in our classrooms; in our tests; ultimately, in our culture” (498).
