
Using NSSE to Answer Assessment Questions


Presentation Transcript


  1. Using NSSE to Answer Assessment Questions Regional User’s Workshop October 2005 Shimon Sarraf Center for Postsecondary Research, Indiana University Bloomington

  2. “NESSIE” Overview • Why should “engagement” be assessed? • Assessment Techniques with NSSE data • Group Exercise and Discussion

  3. Why should engagement be assessed? Because individual effort and involvement are the critical determinants of college impact, institutions should focus on the ways they can shape their academic, interpersonal, and extracurricular offerings to encourage student engagement. Pascarella & Terenzini, How College Affects Students, 2005, p. 602

  4. Who says engagement is important? Quality of Effort (Pace) Student Involvement (Astin) Social and Academic Integration (Tinto) Good Practices in Undergraduate Education (Chickering & Gamson) Student Engagement (Kuh)

  5. Assessment Approaches • Normative – compares your students’ responses to those of students at other colleges and universities • Criterion – compares your results against a predetermined value or level appropriate for your students, given your institutional mission, size, curricular offerings, funding, etc. • Longitudinal – compares your average scores over time
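
A minimal pandas sketch of the three approaches, purely for illustration: the column name sfi_scale, the criterion target, and every score below are invented, not actual NSSE variables or norms.

```python
import pandas as pd

# Hypothetical student-level scale scores (illustrative values only).
inst = pd.DataFrame({"sfi_scale": [45, 52, 61, 38, 57]})      # your institution
norm = pd.DataFrame({"sfi_scale": [50, 48, 55, 60, 42, 47]})  # comparison-group students

# Normative: compare your mean to the comparison group's mean.
print("Normative gap:", inst["sfi_scale"].mean() - norm["sfi_scale"].mean())

# Criterion: compare your mean to a target chosen for your mission and context.
TARGET = 55  # illustrative criterion value
print("Criterion gap:", inst["sfi_scale"].mean() - TARGET)

# Longitudinal: compare your own means across survey years.
panel = pd.DataFrame({"year": [2003, 2003, 2004, 2004],
                      "sfi_scale": [44, 50, 53, 58]})
print(panel.groupby("year")["sfi_scale"].mean())
```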

  6. Assessment with NSSE Data • Descriptive displays of engagement patterns by any number of student characteristics • Use individual items and/or scales • Year-to-year tracking of student engagement • Multivariate models for retention, degree attainment, grades, other outcomes • Special peer comparisons with aspirational, regional, and mission-related institutions

  7. Descriptive Analysis • Comparisons by Student Background • Minority Students • First-Generation College Students • Comparisons by Enrollment Characteristics • Greek • Athletes • College and/or Department

  8. Approaches to Descriptive Analysis • Most valued activities: What is most valued at your institution and in its departments, and what do the data show? • Investigate “Nevers”: Work on reducing or eliminating reports by students of never doing specific engagement activities. • How much variation? Use box-and-whisker plots.
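
A short sketch of two of these tactics, assuming a hypothetical item facgrade coded 1 = Never through 4 = Very often and an invented department column; pandas computes the “never” rate and draws the box-and-whisker plot via matplotlib.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Invented respondent data; facgrade is coded 1=Never, 2=Sometimes, 3=Often, 4=Very often.
df = pd.DataFrame({
    "dept":     ["Eng", "Eng", "Bio", "Bio", "Hist", "Hist"],
    "facgrade": [1, 3, 2, 4, 1, 1],
})

# Share of students in each department reporting "never".
never_rate = df.assign(never=df["facgrade"].eq(1)).groupby("dept")["never"].mean()
print(never_rate)

# Box-and-whisker plot to show how much responses vary by department.
df.boxplot(column="facgrade", by="dept")
plt.show()
```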

  9. Descriptive Analysis Responses of Seniors by Major

  10. Descriptive Analysis Responses of Seniors by Major

  11. Descriptive Analysis

  12. Descriptive Analysis: Seniors’ Scale Scores by Transfer Status (t-test: p < .001; effect size: -.29)
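
A hedged sketch of how a comparison like the one on this slide might be computed with SciPy; the native and transfer scores are invented, and Welch’s t-test plus a pooled-SD Cohen’s d stand in for whatever procedure actually produced the figures above.

```python
import numpy as np
from scipy import stats

# Invented senior scale scores by transfer status.
native   = np.array([58, 62, 55, 60, 67, 59, 63])
transfer = np.array([52, 49, 57, 50, 54, 48, 55])

# Welch's t-test (does not assume equal variances).
t, p = stats.ttest_ind(native, transfer, equal_var=False)

# Cohen's d using the pooled standard deviation.
pooled_sd = np.sqrt(((native.size - 1) * native.var(ddof=1) +
                     (transfer.size - 1) * transfer.var(ddof=1)) /
                    (native.size + transfer.size - 2))
d = (transfer.mean() - native.mean()) / pooled_sd
print(f"t = {t:.2f}, p = {p:.4f}, d = {d:.2f}")
```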

  13. Variations in Student-Faculty Interaction by Discipline

  14. Data Consideration: Disaggregating Results • Experience indicates that survey results are most likely to be used when the results are disaggregated by specific program or unit (e.g., college or department). • Targeted oversamples of specific units may be warranted. • Sampling error statistics may not be a good indicator of data quality with smaller units.
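
One way to disaggregate by unit and flag cells too small to report, sketched with pandas on an invented respondent file; the department and sfi_scale columns and the 10-respondent threshold are illustrative only.

```python
import pandas as pd

# Invented respondent file with a unit identifier appended from the population file.
df = pd.DataFrame({
    "department": ["Civil", "Civil", "Mech", "Mech", "Elec"],
    "sfi_scale":  [48, 55, 61, 58, 40],
})

# Disaggregate: mean, respondent count, and standard error by department.
summary = df.groupby("department")["sfi_scale"].agg(["mean", "count", "sem"])
print(summary)

# Flag units with too few respondents to report with confidence (threshold is illustrative).
small = summary[summary["count"] < 10].index.tolist()
print("Units with fewer than 10 respondents:", small)
```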

  15. Comparisons Across Years FY Student Responses to Stu-Fac Items by Year

  16. Comparisons Across Years FY and Senior Stu-Fac Scale Scores by Year

  17. Comparisons Across Years FY Scores on Four Scales by Year

  18. FY Student t-test Comparisons 2003 and 2004 at Nesseville State

  19. Regression on Student-Faculty Interaction with Year

  20. Multivariate Modeling Regression model predicting grades at the end of the first year.
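
A minimal statsmodels sketch of such a model; the variables fy_gpa, hs_gpa, sfi, and transfer are placeholders for an institution’s actual records, and a real model would include more controls and far more cases.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Invented first-year records (placeholder variable names).
df = pd.DataFrame({
    "fy_gpa":   [3.1, 2.5, 3.7, 2.9, 3.4, 2.2, 3.8, 3.0],
    "hs_gpa":   [3.4, 2.8, 3.9, 3.0, 3.5, 2.6, 3.8, 3.1],
    "sfi":      [40, 25, 60, 35, 50, 20, 65, 45],   # student-faculty interaction scale
    "transfer": [0, 1, 0, 0, 1, 1, 0, 0],
})

# Does engagement predict first-year grades net of pre-college preparation?
model = smf.ols("fy_gpa ~ hs_gpa + sfi + transfer", data=df).fit()
print(model.summary())
```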

  21. Multi-equation Modeling A structural equation model explaining the longitudinal relationships that lead to FY grades (diagram: Pre-college → Engagement → Outcome)
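
Full SEM estimation requires dedicated software, but the pre-college → engagement → outcome logic can be roughed out as a recursive pair of OLS regressions; the variables and the product-of-coefficients indirect effect below are purely illustrative, not the model shown on the slide.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Invented data for a simplified path analysis.
df = pd.DataFrame({
    "hs_gpa":     [3.4, 2.8, 3.9, 3.0, 3.5, 2.6, 3.8, 3.1],
    "engagement": [52, 40, 65, 47, 58, 35, 70, 50],
    "fy_gpa":     [3.1, 2.5, 3.7, 2.9, 3.4, 2.2, 3.8, 3.0],
})

path_a = smf.ols("engagement ~ hs_gpa", data=df).fit()            # pre-college -> engagement
path_b = smf.ols("fy_gpa ~ engagement + hs_gpa", data=df).fit()   # engagement -> outcome

# Indirect effect of pre-college preparation on FY grades through engagement (a * b).
indirect = path_a.params["hs_gpa"] * path_b.params["engagement"]
print(f"Indirect effect: {indirect:.3f}")
```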

  22. Special Peer Comparisons: Selecting a Peer Group • By mission • By size • By department • By race • By locale • Current or aspirant peers

  23. Special Peer Comparisons Standard Frequency Report with Selected Peer Group

  24. Special Peer Comparisons: Carnegie Group (living on-campus vs. commuters)

  25. Special Peer Comparisons Student Level Benchmark Report

  26. Special Peer Comparisons: Student Distributions • First-year academic challenge scores • Are these two schools the same? • Same median benchmark score • Different range of scores
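
A small NumPy sketch of the point on this slide: two hypothetical schools with the same median benchmark score but very different spread (all scores invented).

```python
import numpy as np

# Invented first-year academic challenge scores for two schools.
school_a = np.array([48, 50, 52, 53, 55, 57, 58])
school_b = np.array([30, 40, 50, 53, 60, 70, 80])

for name, scores in [("A", school_a), ("B", school_b)]:
    q1, med, q3 = np.percentile(scores, [25, 50, 75])
    spread = scores.max() - scores.min()
    print(f"School {name}: median = {med:.0f}, IQR = {q3 - q1:.0f}, range = {spread:.0f}")
```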

  27. Data Considerations • Standard error of the mean (precision of the estimate) • Non-response bias • Weighting your sample to look like the population • Comparability of survey items year-to-year • Use other assessment techniques (e.g., focus groups, other surveys) to validate your findings; NSSE is but one source of assessment information
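
Two of these considerations, the standard error of the mean and simple post-stratification weighting for non-response, sketched with pandas on invented data; the gender categories, population shares, and scale scores are placeholders for whatever population information an institution actually has.

```python
import numpy as np
import pandas as pd

# Invented respondents; the population file says the student body is 50% F / 50% M.
resp = pd.DataFrame({"gender": ["F", "F", "F", "M", "M"],
                     "scale":  [55, 60, 50, 40, 45]})

# Standard error of the mean: how precise is the estimate?
sem = resp["scale"].std(ddof=1) / np.sqrt(len(resp))
print(f"Unweighted mean = {resp['scale'].mean():.1f} (SE = {sem:.2f})")

# Post-stratification weights so the respondent mix matches the population mix.
population_share = {"F": 0.50, "M": 0.50}                      # illustrative proportions
sample_share = resp["gender"].value_counts(normalize=True)
resp["weight"] = resp["gender"].map(lambda g: population_share[g] / sample_share[g])
weighted_mean = np.average(resp["scale"], weights=resp["weight"])
print(f"Weighted mean = {weighted_mean:.1f}")
```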

  28. NSSE Consortium • Six or more institutions sharing comparative data • A great way to add value to participation • Often mission-specific • Ability to ask additional questions

  29. Sample Consortium questions

  30. Assessment Exercise: Department-Level Analysis • Scenario • Nesseville State University is preparing for an upcoming accreditation review of its engineering program • The college was encouraged to incorporate more “student voice” into its educational outcomes assessment • The University Provost and College Dean have worked to increase buy-in for using NSSE to collect this information

  31. Assessment Exercise: Department-Level Analysis • Concerns to Address • Faculty are concerned that the Engineering College places too little emphasis on challenging and engaging pedagogical practices • The Dean is concerned that some departments are not preparing their students for life after graduation as well as others • The Provost would like to know how NSU engineering students compare with engineering students nationwide • In previous campus surveys, Engineering students have voiced dissatisfaction with their undergraduate experience

  32. Assessment Exercise: Department-Level Analysis • Building the Analysis • In submitting its population file, Nesseville State University included an extra variable to identify Engineering students and their departments within the College • Nesseville State indicated that it wished to oversample all Engineering seniors not identified for the random institutional sample • NSU constructed several NSSE student-level scales to use as the basis for its analysis and requested a special analysis from NSSE to obtain normative data

  33. Assessment Exercise: Department-Level Analysis • What patterns are evident in these results? • Were the expressed stakeholder concerns confirmed? • What differences are notable among departments? • What other data sources would help shed light on these results? • What additional analyses would you want to conduct?

  34. Using NSSE to Answer Assessment Questions Shimon Sarraf Research Analyst Indiana University Center for Postsecondary Research 1900 East 10th Street Eigenmann Hall, Suite 419 Bloomington, IN 47406 Ph: 812-856-2169 ssarraf@indiana.edu www.nsse.iub.edu
