Assessing General Education: Options, Choices and Lessons Learned




  1. Assessing General Education: Options, Choices and Lessons Learned Jo-Ellen Asbury, Ph.D. • Rebecca Kruse • Office of Institutional Research and Assessment, Stevenson University

  2. Assessing General Education • We don’t have all the answers • We invite audience input and insights • We are not cheerleaders for national tests; a nationally normed test was simply the decision we made at the time • No, we get no kick-back from ETS! * We are not here to advocate use of the MAPP / ETS Proficiency Profile or any specific test or assessment. We want to share our experience and generate a conversation.

  3. SU Core Curriculum Requirements (Bachelor’s Degree) (General “Cafeteria” Style) Minimum of 16 academic courses in the liberal arts and sciences and 1 course in physical education. All students must complete the following: • Skills Courses: • Three writing courses • One communication course • One physical education course • Computer literacy requirement • Distribution Courses: • One fine arts course • Two social science courses • Three math and science courses (at least one lab) • Four humanities courses • Core Electives (2 courses, 6 credits) • Foreign Language (Bachelor of Arts only): 2 courses

  4. The problem: How to assess the General Education program • Unlike a major (psychology, math, etc.), general education does not have: • A firm, fairly prescribed list of requirements. • A faculty member (or group of faculty members) who takes sole responsibility for oversight. • A capstone project/paper/experience that could be used to assess student learning outcomes. • Student learning outcomes for gen ed were still evolving. • Currently, no centralized oversight.

  5. Possible General Education Assessment Approaches Individual Course-Based Approach • Information collected about learning in individual courses. Faculty demonstrate that students are acquiring the knowledge, skills, and values associated with one or more gen ed goals. Assignments, exams, portfolios, etc. Multicourse (Theme-Based) Approach • Focus on faculty from a number of disciplines rather than individual courses. Review of syllabi, focus groups. Noncourse-Based Approach • Campuswide, focusing on individual or groups of students rather than courses. Gen ed assessment given to all or a sample of students. Standardized testing, student and alumni surveys, transcript analysis. • Source: Palomba, C. A., & Banta, T. W. (1999). Assessing general education. In Assessment essentials: Planning, implementing, improving (pp. 239-268). San Francisco: Jossey-Bass.

  6. Selecting a Gen Ed Assessment Method • Method(s) used need to match learning goals • Because gen ed programs include a broad range of learning goals and objectives (critical thinking, communication, values, attitudes…), we need to be careful that the methods used will address all of these objectives • May need more than one method • We settled on some type of nationally normed instrument.

  7. Use of Published Tests / Assessments ~ from the paper “The Role of Published Tests and Assessments in Higher Education,” March 2006, by Linda Suskie, MSCHE Vice President Pros • Developed by testing professionals (test design, quality of questions better) • Can provide comparison data • Provide detailed, diagnostic feedback • Variety of published tests to reflect diversity among schools and programs • Confidence in longitudinal data

  8. Use of Published Tests / Assessments A published test should reflect the distinct set of knowledge, skills, and competencies your institution seeks to instill, and should be used in combination with other evidence of student learning. “The Role of Published Tests and Assessments in Higher Education,” Linda Suskie, Middle States Commission on Higher Education, March 25, 2006

  9. Use of Published Tests / Assessments Cons • If there is no compelling incentive, students may not give their best effort. Challenge to get students to take the test and to give their best effort. • Published tests for higher ed have less evidence of quality than K-12 tests. Smaller number of students, may not be representative, less funding, etc. • Certain published tests may not yield enough useful feedback. from “The Role of Published Tests and Assessments in Higher Education,” Linda Suskie, Middle States Commission on Higher Education, March 25, 2006

  10. Use of Published Tests / Assessments Chosen Assessment Should: • Match goals for student learning set by the institution • Specific content must correspond with the institution’s concepts (how does the institution define critical thinking, for example?) • Provide rich, detailed feedback that can be used to identify areas for improvement • Have evidence of validity and reliability • Provide some incentive for students to do their best from “The Role of Published Tests and Assessments in Higher Education,” Linda Suskie, Middle States Commission on Higher Education, March 25, 2006

  11. ETS Proficiency Profile / MAPP Test We selected the MAPP by ETS: Measure of Academic Proficiency and Progress (now called the ETS Proficiency Profile). • Corresponds well with the university core and measures what we want to measure • Several different formats to choose from (online, standard, abbreviated) • Can add up to 50 of our own supplemental questions • Rich reporting features including comparative data and diagnostic feedback, norm-referenced scores, and criterion-referenced scores • SU has changed so rapidly and is still changing – important for us to be able to do comparisons, benchmarking, see differences between cohorts, etc.

  12. More on ETS Proficiency Profile Measure of Academic Proficiency and Progress (now called ETS Proficiency Profile…) • Assesses four core skill areas – critical thinking, reading, writing and mathematics at three levels • Measures academic skills developed, as opposed to subject knowledge taught, in general education courses

  13. Results and Reporting • Multitude of reporting options available • Comparison between cohorts/subgroups (separate out specific groups – majors, schools within the University, commuters vs. noncommuters, etc. Can ask different cohorts different supplemental questions.) • Identify specific proficiency level (1-3) of core skill deficiencies (ETS has specific definitions at each level) • External and internal benchmarking • Value-Added – compare against other metrics such as GPA, SAT, etc. • Identify patterns (e.g., do students do better in certain areas if certain courses are taken in a certain order?)

  14. Next Issue: Schedule for Administration

  15. Pre-Post Test • Test students when they enter, then test again at a later point in their Stevenson career. • WHEN should the second testing take place? • Internal validity threats: • History • Maturation • Mortality • Selection • Testing

  16. Cohort-Sequential Design • Compensates for (most of) the internal validity threats • Provides both between-subject and within-subject data.

  17. How the cohort-sequential design is being used at SU
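In a cohort-sequential design, each entering cohort is tested as freshmen and again as sophomores, with a new cohort starting each year. As a generic illustration (the years and cohort labels here are hypothetical, not SU’s actual schedule):

```
              Year 1      Year 2      Year 3
Cohort A      Freshman    Sophomore
Cohort B                  Freshman    Sophomore
Cohort C                              Freshman
```

Reading across a row gives within-subject (pre/post) comparisons; reading down a column (e.g., Cohort A sophomores vs. Cohort B freshmen in Year 2) gives between-subject comparisons, which helps separate growth from cohort differences.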

  18. Next Issue: Planning for Administration

  19. Our Plan • Administer to incoming freshmen • Test the same students again at the end of sophomore year

  20. Freshmen • How do we get a large number of freshmen to take the test? • Commitment from the Director of First Year Experience to administer in First Year Seminars (all incoming freshmen take an FYS) • Goes on the syllabus • Peer leaders (not us) administer

  21. Freshmen – Issues/Challenges • Test version? (long, abbreviated, online) • 2007 – used the long version (2 hrs); switched to the abbreviated version (40 mins) • Cost (tests, materials) • Student leader instructions for administering • Very specific instructions / script • Customize instruction book • Materials to and from student leaders • Tests, pencils, instructions, ID cards, calculators

  22. Retesting as Sophomores 384 freshmen took the test in fall 2008 • Where and how can we test that many students now as sophomores? • Do we test all 384 at the same time on the same day in the same location? Do we have the room on campus? • Do we have enough supplies to test all at one time? • What’s the best time during the semester? • Who would proctor the tests? • How do we get sophomores to volunteer to take the test? No way to capture them – no one class that all sophomores take.

  23. Recruiting Sophomores • Used to use scholarship hours • Pizza lunch • Gift card drawings • Offered choice of two different days • Marketed through emails, plasma screens in student union, faculty

  24. Recruiting Sophomores • A week before, response was still not great • Added more gift cards • Opened it up to ALL sophomores, not just ones who took it as freshmen • 46 students out of 384 signed up • 27 showed up, split between the two days
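The turnout above is easier to judge as rates. A quick sketch of the arithmetic, using only the counts reported on this slide:

```python
# Sophomore retest turnout, from the counts on the slide:
# 384 freshmen tested in fall 2008, 46 signed up to retest, 27 showed up.
tested_as_freshmen = 384
signed_up = 46
showed_up = 27

signup_rate = signed_up / tested_as_freshmen   # share of original cohort that signed up
retest_rate = showed_up / tested_as_freshmen   # share of original cohort actually retested
follow_through = showed_up / signed_up         # share of sign-ups who attended

print(f"Sign-up rate: {signup_rate:.1%}")      # 12.0%
print(f"Retest rate: {retest_rate:.1%}")       # 7.0%
print(f"Follow-through: {follow_through:.1%}") # 58.7%
```

So even among students who volunteered, only about six in ten attended, and the retested group was roughly 7% of the original cohort, small enough to raise representativeness concerns.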

  25. Other Recruiting Ideas: • Gift certificates or pay for all students who take the test • Change test format – use online format • Reward those with high scores so test is taken seriously and they do their best • ETS reports that most effective is combination of extrinsic and academic reward – something to get them there and something to get them to take it seriously • Put high scores on an honor roll • Make it a requirement for registration for junior year • Withhold grades until test is taken

  26. Latest Plan • Try the online non-proctored version. • Recruit 100 random students from the 384 tested as freshmen in 2008 who didn’t retake it in the spring. • Give each one a $10 gift card to take it online

  27. Data Received / Closing the Loop / Going Forward

  28. Cohort 1: Summary of Stevenson University Proficiency Classifications (national comparison in parentheses)

  29. Cohort 1: Distribution of Individual Student Scores and Subscores

  30. Closing the Loop/Going Forward • Determine the mechanism for internal decision-making and the process used for identifying deficiencies and implementing change • Share results • Other measures of same core skills • Content mapping

  31. Other Ideas for…. • assessing general education? • recruiting students? • using data and closing the loop? • other?

  32. References Suskie, L. (2006, March 25). The role of published tests and assessments in higher education [Report]. Middle States Commission on Higher Education. Retrieved from http://www.msche.org/publications/published-instruments-in-higher-education.pdf ETS® Proficiency Profile case studies. (2008). Educational Testing Service. Retrieved from http://www.ets.org/proficiencyprofile/case_studies/ ETS® Proficiency Profile content. (n.d.). Educational Testing Service. Retrieved from http://www.ets.org/proficiencyprofile/about/content/ Walvoord, B. E. (2004). For general education. In Assessment clear and simple: A practical guide for institutions, departments, and general education (pp. 67-79). San Francisco: Jossey-Bass. Palomba, C. A., & Banta, T. W. (1999). Assessing general education. In Assessment essentials: Planning, implementing, improving (pp. 239-268). San Francisco: Jossey-Bass.
