
Using Grades over 26 Years to Evaluate a Career Course: Two Studies


Presentation Transcript


  1. Using Grades over 26 Years to Evaluate a Career Course: Two Studies Robert C. Reardon, Ph.D. Stephen J. Leierer, Ph.D. Donghyuck Lee, M.Ed. Florida State University

  2. Course History • 1973: The career course developed • 1980: Team teaching by 3 or 4 instructors introduced • 1981: Changed from the quarter to the semester system • 1984: Plus/minus grading system adopted • 1984 – 1988: Catalyst materials introduced.

  3. Course History • 1994: Cognitive Information Processing (CIP) theory introduced. • 1996 – 2000: Once per week sections offered. • 1998: Friday section offered. • 1999: CIP text and student manual introduced. • 1999: Internet used in students’ research

  4. Course Objectives • Perceive the individual as purposefully responsible and active in the life/career planning process, and develop skills for increasing such behavior in others and oneself. • Understand how personal characteristics influence career development. • Become oriented to the socioeconomic world of work as it impacts individual and family career systems. • Identify appropriate academic major and/or occupational alternatives in relation to personal characteristics.

  5. Course Objectives • Learn about and use a variety of information resources to explore academic major or occupational options. • Understand career development theories and use decision-making skills for life/career planning and management. • Learn about and use job-hunting skills needed for employment. • Formulate action plans and strategies for moving oneself or other persons toward implementation of life/career goals.

  6. Course Activities

  7. Course Structure • Unit I: “Career Concepts and Applications” • Focuses on self-knowledge, knowledge about options, and decision making • Assignments: Writing an autobiography, completing the Self-Directed Search, and completing a skills assessment activity.

  8. Course Structure • Unit II: “Social Conditions Affecting Career Development” • Focuses on current social, economic, family, and organizational changes affecting the career planning process and the need for students to develop more complex cognitive schemas to solve career problems. • Assignments: Completing the autobiography, the Career Field Analysis (CFA) paper, and two information interview reports

  9. Course Structure • Unit III: “Implementing a Strategic Career Plan” • Focuses on employability skills and strategies for implementing academic/career plans. • Assignments: Two information interview reports, the completion of a resume and cover letter, and a strategic/academic career plan paper

  10. Instructional Methods • A mixture of lecture, panel presentations, and small and large group activities. • Each instructor works with a small group of students in breakout sessions and evaluates their work. • Instructors meet individually with the students to discuss their assessments and progress.

  11. Course Grading Procedures • Grades are based on the successful execution of a performance contract (PC) by the student. • The PC includes 16 different graded activities spread across the three units of the course. • 28 different activities are graded in the 3-credit version of the course.

  12. Population • 6,176 undergraduate students who completed the course, “Introduction to Career Development.” • 15% to 25% of each class was composed of students with officially undeclared majors. • 60% were unsure about, dissatisfied with, or undecided about their current career situation. • 75 academic periods (semesters/quarters) studied.

  13. Gender Distribution

  14. Grade Distribution

  15. Ethnic Distribution

  16. Satisfaction with Current Career Situations

  17. Data Collection Procedures • Study #1: Archived course grade data 1978-2004 by academic term obtained from the university registrar. • Study #2: Archived course grade data and student evaluation of teaching (SET) data 1999-2004 by course section obtained from registrar and instructors.

  18. Working with Archival Data • Retrieving electronic grades by historical term • Aggregating group records by quarter/semester (sketched below) • Aggregation precluded analysis of individual student characteristics • Issues in retrieving SET data from faculty • Confidentiality and security • Course section data • Large dataset: 6,176 students over 75 terms
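
A minimal sketch of the term-level aggregation described above, written in Python with pandas. The file name, column names, and plus/minus point values are illustrative assumptions, not the authors' actual pipeline or the university's official grade scale:

```python
# Sketch only: "registrar_grades.csv", "term", "grade", and the point values
# below are hypothetical, not the study's real data or official scale.
import pandas as pd

GRADE_POINTS = {"A": 4.0, "A-": 3.75, "B+": 3.25, "B": 3.0, "B-": 2.75,
                "C+": 2.25, "C": 2.0, "C-": 1.75, "D+": 1.25, "D": 1.0,
                "D-": 0.75, "F": 0.0}

records = pd.read_csv("registrar_grades.csv")            # hypothetical registrar export
records["points"] = records["grade"].map(GRADE_POINTS)

# Collapse individual records into one aggregate row per academic term;
# individual characteristics (gender, ethnicity) are lost at this step.
term_gpa = (records.groupby("term", sort=False)["points"]
                   .agg(mean_gpa="mean", n_students="count")
                   .reset_index())
print(term_gpa.head())
```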

  19. Research Questions Study #1 • Did grades provide evidence of students meeting the course learning objectives? • How did changes in course structures and procedures affect student learning? • Did grades differ by semester? • Did the mean grade point average earned in the course change over time?

  20. Results • Evidence of Students Meeting the Course Learning Objectives • 74% of the students earned a B+ or better • Mean GPA was 3.44 (SD = .84) • Negative correlation between mean semester GPA and the semester identification number (r = -.38, p = .002)
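
The reported correlation between mean term GPA and the sequential semester number can be checked directly from the aggregated table. The sketch below assumes the hypothetical term_gpa table from the aggregation example under slide 18, with rows in chronological order:

```python
# Illustrative re-computation of the term-order correlation (assumed data).
from scipy.stats import pearsonr

term_number = range(1, len(term_gpa) + 1)        # semester identification number
r, p = pearsonr(list(term_number), term_gpa["mean_gpa"])
print(f"r = {r:.2f}, p = {p:.3f}")               # the slide reports r = -.38, p = .002
```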

  21. Results

  22. Results • The eight course objectives are connected to the 16 graded course activities. • The 3% of students who received an ‘F’ grade provides evidence of the course’s demands.

  23. Results • Course Structures and Procedures

  24. Results • Course Structures and Procedures

  25. Results • The changes in the semester system and the grading system did not produce a difference in course grades. • The intensive infusion of the work-family life balance materials was associated with higher grades. • The infusion of CIP theory and the textbook made the course more challenging and lowered grades. • Using Internet-based sources to research occupations resulted in lower grades on the Career Field Analysis (CFA) research paper.

  26. Results • Grade Distribution by Semester • Significant difference in the aggregated GPA by semester (F = 6280.86, p < .0005)
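
One plausible form of this test (an assumption, not the authors' exact analysis) is a one-way ANOVA on aggregated GPA grouped by semester. The sketch again uses the hypothetical term_gpa table, here assumed to carry a Fall/Spring/Summer label:

```python
# Assumed one-way ANOVA of aggregated GPA across Fall, Spring, and Summer terms.
from scipy.stats import f_oneway

groups = [g["mean_gpa"].to_numpy() for _, g in term_gpa.groupby("semester")]
F, p = f_oneway(*groups)
print(f"F = {F:.2f}, p = {p:.4f}")   # the slide reports F = 6280.86, p < .0005
```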

  27. Results • Course grades varied by semester. • Grades in the summer were significantly higher than in other terms. • The summer term provides a more intensive course experience (intensive class schedule; students’ study load). • Grades in the fall were significantly lower than in other semesters. • A four-month intermission after registration (perhaps lowered motivation).

  28. Results • Grade Distribution over Time • Significant difference in the aggregated GPA by time period (F = 23.69, p < .0005) • Time periods: 1 = Fall 1978 – Spring 1981; 2 = Fall 1981 – Summer 1985; 3 = Fall 1985 – Summer 1990; 4 = Fall 1990 – Summer 1995; 5 = Fall 1995 – Summer 2000; 6 = Fall 2000 – Summer 2004

  29. Results • Students in the latest time period (Fall 2000 through Summer 2004) had significantly lower grades than those in any other time period. • Grade inflation was not evident in this career course.

  30. Conclusions • The career course investigated in this study appears to be an effective intervention, as evidenced by student grades. • Grade inflation was not evident in this course. • However, grades were affected by historical events, temporal conditions, and course modifications.

  31. Limitations • Using aggregate grades across academic terms rather than individual student grades. • This precluded an examination of ethnicity, gender, or other learner characteristics in this research.

  32. Research Questions Study #2 • What was the nature of students’ evaluation of teaching (SET) in this career course? • Were earned or expected grades in class sections different across semesters or class meeting times? • Were earned or expected grades related to SET ratings?

  33. Sample for Study #2 • Fall 1999 – Summer 2004 • 62 course sections led by 12 different instructors, each of whom taught from 1 to 10 sections • 74% of sections reported expected grades (no bias pattern was found in the missing data) • 92% of sections reported SET ratings

  34. Results • Overall Student Evaluation of Teaching (SET) ratings were very positive

  35. Data Analysis • The ideal design for examining the research questions about SET, earned grade point average (EGPA), and expected grade point average (XGPA) would be a split-plot design. • Because the data were archival, there were no observations for some Semester x Days combinations. There are two possible solutions to this problem: • A split-plot design using ANOVA for unbalanced data (paper) • Creating a 7-level Semester-Days variable (NCDA presentation; see the sketch below)
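
A sketch of the second workaround, under assumed section-level column names (semester, days_per_week, earned_gpa) and a hypothetical file: collapsing semester and class-meeting days into a single factor reduces the analysis to a one-way layout, which sidesteps the empty Semester x Days cells:

```python
# Assumed, not the authors' code: build a combined Semester-Days factor and
# fit a one-way ANOVA on section-level earned GPA.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

sections = pd.read_csv("section_level_data.csv")     # hypothetical archive
sections["semester_days"] = (sections["semester"] + "-"
                             + sections["days_per_week"].astype(str))

model = smf.ols("earned_gpa ~ C(semester_days)", data=sections).fit()
print(anova_lm(model, typ=2))   # single-factor layout tolerates the unbalanced cells
```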

  36. Student Evaluation of Teaching (SET) Results: Students’ Evaluation of Teaching

  37. Student Evaluation of Teaching (SET) Results: Earned Grades across Class Semesters and Times

  38. Student Evaluation of Teaching (SET) Results: Expected Grades across Class Semesters and Times

  39. Student Evaluation of Teaching (SET) Results: Earned-Expected Grades across Class Semesters & Times

  40. Student Evaluation of Teaching (SET) Results: Earned and Expected Grades with SET

  41. Student Evaluation of Teaching (SET) Summary

  42. Student Evaluation of Teaching (SET) Summary • The aggregated earned grade in the Summer semester was significantly higher than the aggregated earned grade during the Fall or Spring semesters. • The aggregated expected grade was significantly different across semesters: Summer > Fall-Spring 2 > Fall-Spring 1, 3 • The aggregated expected grade was significantly higher than the aggregated earned grade. • The difference between earned and expected grades was influenced by the Semester-Days variable: Summer > Fall-Spring 2 > Fall-Spring 1, 3

  43. Student Evaluation of Teaching (SET) Summary • SET ratings on the difference between aggregated earned and expected grades vary according to semester and class times: Summer > Fall-Spring 2 > Fall-Spring 1, 3 • The difference between EGPA and XGPA was influenced by SET

  44. Student Evaluation of Teaching (SET) Conclusions • When controlling for SET, there was no significant difference between earned GPA and expected GPA. • Aggregated EGPA and XGPA were influenced by variables related to the process of teaching, e.g., the number of class meetings per week (1, 2, or 3) and the length of the semester (6 vs. 17 weeks). • A significant amount of the variability between earned and expected grades can be traced to Student Evaluation of Teaching (SET).
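
One way such a "controlling for SET" comparison might be expressed (an assumption, not the authors' exact model) is an ANCOVA-style regression of section-level GPA on grade type (earned vs. expected) plus the SET rating; the within-section pairing is ignored here for brevity, and all column names are hypothetical:

```python
# Assumed ANCOVA-style sketch: does earned vs. expected GPA differ once
# SET is in the model?
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

sections = pd.read_csv("section_level_data.csv")     # hypothetical archive, as above
long = sections.melt(id_vars=["section_id", "set_rating"],
                     value_vars=["earned_gpa", "expected_gpa"],
                     var_name="grade_type", value_name="gpa")
fit = smf.ols("gpa ~ C(grade_type) + set_rating", data=long).fit()
print(anova_lm(fit, typ=2))   # grade_type effect = earned vs. expected, adjusted for SET
```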

  45. Student Evaluation of Teaching (SET) Limitations • Archival data analysis can become complex. • Most of the variables in these studies are aggregate measures. • Using aggregated grades as individual performance indicators can be misleading. • SET data are voluntarily collected, making for poor coverage and probably poor utility for an ongoing archival project. • It is often unclear whether SET came as a result of the outcome (high or low expected and earned grades) or whether it led to the outcome. • These data are based on archived records collected by the University in multiple classes for the purpose of grading students in a class.

  46. Student Evaluation of Teaching (SET) Contact Information Presentation available at: www.career.fsu.edu/techcenter Robert Reardon, PhD rreardon@admin.fsu.edu Steve Leierer, PhD sleierer@memphis.edu Donghyuck Lee, MEd ryan_dhl@yahoo.com
