
The Collegiate Learning Assessment (CLA) Project


Presentation Transcript


  1. The Collegiate Learning Assessment (CLA) Project Roger Benjamin RAND Corporation’s Council for Aid to Education October 10, 2003

  2. Themes • Why Measure Educational Outcomes? • Obstacles to Overcome • The CLA Approach in Context • Feasibility Study Results • An Opportunity to Participate

  3. Why Measure Educational Outcomes? • Improve educational programs • Demand for accountability • Rising costs • Reduced budgets • Competition from distance learning

  4. Changing Context for CLA (1) • Accountability drive continues to mount • Bush administration likely to place performance measures in Higher Education Reauthorization Act • Tension between higher education leaders and state leaders appears to be increasing • Strong interest in assessment among private higher education institutions • Participation/attainment gap between ethnic/racial groups continues to widen

  5. Changing Context for CLA (2) • Budget crisis • Private colleges: Endowments have declined significantly • Public colleges: 43 states exhibit medium to severe deficits, totaling $78 billion • Tuition rising sharply: 10% during ‘02–‘03; ‘03–‘04 increases could be higher

  6. The State Has A Critical Role in Higher Education • The state provides the instructional budget and infrastructure support • The state sets objectives for • Educational levels to be achieved by entering students • Participation rates by minority groups • Minimum passing scores for professional school graduates

  7. Basic Methodological Hurdles to Overcome • Direct comparisons between states problematic • Comparing aggregated scores of institutions at the state level flawed • Use of proxy measures problematic because of selection bias

  8. Are State-Based Comparisons Possible? • States may conduct comparisons over time within their states • States may wish to establish minimum performance levels and benchmark them against the same measures in states judged most similar to them.

  9. Institutional Barriers to State-Based Accountability Movement • Structure of higher education governance not conducive to top-down policy strategies • In particular, state-based strategies confront norms that cede decision making regarding pedagogy and curriculum, including assessment, to the faculty

  10. The Link Between Productivity, Accountability and Assessment • There must be a metric against which to evaluate the productivity concepts • The quality of student learning outcomes is the only serious candidate • Moreover, one cannot introduce accountability until standards of performance are set • However, unless the assessment strategy is acceptable to faculty, little progress can be expected

  11. Competing Visions • Faculty use assessments aimed at improving curriculum and pedagogy; these tend to focus on the department or institution, with little interest in inter-institutional comparisons • State-based approaches focus on accountability, aggregate data to the state level, and use proxy measures

  12. Issues to Solve • Performance measures may offer an opportunity to reconcile the goals and approaches of the state and institutions of higher education, but the rules of engagement need to be worked out • Consensus on measures, approach, and what is to be reported must be reached

  13. Current Approaches • Accreditation Review (inputs) • Actuarial indicators (graduation rates & access) • Faculty surveys (US News & World Report) • Student surveys (NSSE & CIRP) • Direct measures of student learning

  14. Problems with Direct Measures • No common core curriculum • Too many academic majors • Course grades are professor/school specific • Gen Ed skills tests have limited sensitivity to instruction • Graduate/Professional school admission tests are not appropriate because: • Too few students take them • Selection bias in who takes them • Not focused on educational outcomes

  15. Sample CLA Performance Measure: “Crime Reduction”

  16. Sample CLA Performance Measure “Crime Reduction”

  17. The Task “Jamie Eager is a candidate who is opposing Pat Stone for reelection. Eager critiques the Mayor’s solution to reducing crime by increasing the number of police officers. Eager proposes the city support a drug education program for addicts because, according to Eager, addicts are the major source of the city’s crime problem.” “Mayor Pat Stone asks you to do two things: (1) evaluate the validity of Eager’s proposal and (2) assess the validity of Eager’s criticism of the mayor’s plan to increase the number of officers.”

  18. The Documents “Mayor Stone provides you with various documents related to this matter, but warns you that some of them may not be relevant. Your task is to review these materials and respond to the mayor’s request in preparation for tomorrow night’s public debate with Eager.”

  19. Memo

  20. Newspaper Article

  21. Crime Statistics

  22. Crime and Drug Use Tables

  23. Crime Statistics

  24. Research Brief

  25. Crime Rates Chart

  26. Research Abstracts

  27. Feasibility Study Measures • Six 90-minute CLA Performance Measures • Two types of GRE writing prompts • NSSE questionnaire • SAT (or converted ACT) score • Cumulative GPA • Task evaluation form

  28. Sample • 14 Schools varied greatly in: • Size • Type • Location • Student characteristics • About 100 students/school (total N = 1360) • Roughly equal N’s per class within a school • Not a random sample, participation optional
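
A minimal sketch of how a balanced campus sample along these lines could be drawn in principle, assuming a hypothetical roster DataFrame with student_id and class_year columns (the feasibility study itself relied on voluntary participation, so this is illustrative only):

```python
# Sketch: draw ~100 students per school, roughly balanced across class years.
# Column names (student_id, class_year) are illustrative assumptions.
import pandas as pd

def draw_campus_sample(roster: pd.DataFrame, n_total: int = 100,
                       seed: int = 0) -> pd.DataFrame:
    """Sample about n_total students with roughly equal numbers per class year."""
    class_years = roster["class_year"].unique()
    per_class = n_total // len(class_years)
    samples = []
    for year in class_years:
        pool = roster[roster["class_year"] == year]
        samples.append(pool.sample(min(per_class, len(pool)), random_state=seed))
    return pd.concat(samples, ignore_index=True)

# Toy example: a roster with four class years.
roster = pd.DataFrame({
    "student_id": range(4000),
    "class_year": ["freshman", "sophomore", "junior", "senior"] * 1000,
})
sample = draw_campus_sample(roster, n_total=100)
print(sample["class_year"].value_counts())
```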

  29. Small but Significant Class Effects • After controlling for SAT scores and school • Mean test battery scale score increase relative to freshmen (sd = 150): • Sophomores: 10 pts • Juniors: 27 pts • Seniors: 38 pts
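
A hedged sketch of the kind of regression that could produce such adjusted class-year gains: scaled score on SAT plus school and class-year fixed effects, with freshmen as the reference group. The column names (cla_score, sat_total, school, class_year) are assumptions, not the study's actual variable names:

```python
# Sketch: class-year effects on the scaled CLA score after controlling
# for SAT and school. Column names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

def class_effects(df: pd.DataFrame) -> pd.Series:
    # Freshmen are the reference category; school enters as fixed effects.
    model = smf.ols(
        "cla_score ~ sat_total + C(school) "
        "+ C(class_year, Treatment(reference='freshman'))",
        data=df,
    )
    result = model.fit()
    # Coefficients on the class-year dummies are the adjusted gains
    # (e.g., seniors vs. freshmen) in scale-score points (sd = 150).
    return result.params.filter(like="class_year")
```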

  30. School Effects (chart: average scaled task score vs. total scaled SAT score)
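
One way to read a chart like this: plot each school's average scaled task score against its average total SAT score and treat deviations from the fitted line as a simple school effect. A sketch under that assumption, reusing the hypothetical column names above:

```python
# Sketch: school-level means and deviations from the SAT-predicted score.
# This is one illustrative reading of the chart, not the CLA's actual model.
import numpy as np
import pandas as pd

def school_effects(df: pd.DataFrame) -> pd.DataFrame:
    by_school = df.groupby("school")[["cla_score", "sat_total"]].mean()
    # Fit a line predicting mean task score from mean SAT score.
    slope, intercept = np.polyfit(by_school["sat_total"], by_school["cla_score"], 1)
    by_school["expected"] = intercept + slope * by_school["sat_total"]
    by_school["school_effect"] = by_school["cla_score"] - by_school["expected"]
    return by_school.sort_values("school_effect", ascending=False)
```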

  31. Feasibility Study Conclusions • General approach is sound for measuring school (as distinct from individual student) effects • Computer scoring of answers to GRE prompts works reasonably well and saves money • An acceptable 3-hour test package would contain one 90-minute task and two GRE prompts • Some tasks may interact with academic major

  32. CLA Administration: CAE Will… • Provide information on assembling the sample • Provide templates for letters to use in recruiting students • Provide guidelines for proctoring the session(s)

  33. Campus Representatives Have Flexibility In… • Scheduling the sessions • Campus representatives will need to • Collect registrar data • Collect IPEDS data

  34. Two Approaches • Cross-Sectional Studies • Longitudinal Studies

  35. Cross-Sectional Studies • During spring term, 100 seniors and 100 sophomores sampled. Analyses will permit value-added comparisons between institutions. • If subsequent fall term freshmen/first-year students also sampled, analyses will provide more sophisticated information about value-added within institution.

  36. Longitudinal Studies • All fall semester freshmen/first-year students sampled. • Students can be sampled through follow-up administrations during spring terms of their sophomore and senior years. Provides for most detailed analysis of value-added because individual variance can be controlled for.
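
In either design, "value-added" is essentially the gain beyond what entering characteristics predict; the longitudinal case is the most direct because each student supplies their own baseline. A minimal sketch, assuming matched freshman and senior scores per student (column names are illustrative):

```python
# Sketch: longitudinal value-added as the senior-year gain beyond what the
# freshman score predicts. Column names are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

def value_added_by_school(panel: pd.DataFrame) -> pd.Series:
    # panel: one row per student with freshman_score, senior_score, school.
    fit = smf.ols("senior_score ~ freshman_score", data=panel).fit()
    panel = panel.assign(residual_gain=fit.resid)
    # The average residual gain per school is a simple value-added estimate;
    # using each student's own baseline controls for individual variance.
    return panel.groupby("school")["residual_gain"].mean()
```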

  37. CLA Institutional Reports • Combining the results from the CLA measures with registrar data (students’ SAT/ACT scores and GPAs) and IPEDS data allows for analyses of patterns and trends across institutions.
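
A sketch of the data assembly such a report implies: per-student CLA results joined to registrar records (SAT/ACT, GPA) and to institution-level IPEDS data, then summarized by institution. All file and column names here are hypothetical:

```python
# Sketch: join CLA results with registrar and IPEDS data for reporting.
# File and column names are hypothetical.
import pandas as pd

cla = pd.read_csv("cla_results.csv")        # student_id, school_ipeds_id, cla_score
registrar = pd.read_csv("registrar.csv")    # student_id, sat_total, gpa
ipeds = pd.read_csv("ipeds.csv")            # school_ipeds_id, sector, enrollment

merged = (
    cla.merge(registrar, on="student_id", how="left")
       .merge(ipeds, on="school_ipeds_id", how="left")
)

# Institution-level summary suitable for cross-institution comparisons.
report = merged.groupby("school_ipeds_id").agg(
    mean_cla=("cla_score", "mean"),
    mean_sat=("sat_total", "mean"),
    mean_gpa=("gpa", "mean"),
    n_students=("student_id", "count"),
)
print(report.head())
```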

  38. CLA Institutional Report: Sample Page

  39. Motivation Strategies • Appeal to the importance of doing well for the sake of the institution • Create incentives for students to perform well • Develop incentives for the institution and the student • Align tests with general education and capstone courses • Create seminars aligned with the tests

  40. Important Characteristics for a Successful Missouri Pilot Project • Emphasis on improvement • Useful information for improvement • Legislative support • Cost effectiveness • Contextual understanding of data • Long-term commitment – focus on trends • Multiple comparative measures • Controls for differential student characteristics • Clear understanding of consequences • Integrated within existing assessment activity • Faculty access to illustrations of assessment tasks and feedback reports • Incentives for participation • Diagnostic information for individual students
