Updating the National Survey of Student Engagement: Analyses of the NSSE 2.0 Pilots




Presentation Transcript


  1. Updating the National Survey of Student Engagement: Analyses of the NSSE 2.0 Pilots Allison BrckaLorenz Bob Gonyea Angie Miller

  2. Goals and Purposes • To continue in our core purpose of assessing student engagement in effective educational practices to inform improvement efforts; • To stay current with movements and trends in higher education; • To improve the clarity, consistency, and applicability of the survey; • To improve the properties of existing measures; and • To incorporate new measures relevant to effective teaching and learning

  3. Pilot Instruments • 2011: new items about quantitative reasoning, effective teaching practices, collaborative learning, technology, global awareness, diverse perspectives, learning strategies, and reading comprehension • 2012: 24 items were deleted from the original NSSE instrument and 36 were new. Of the items that remained, a third did not change, a third had minor changes, and a third had major changes

  4. Pilot Administrations • Institutions were selected to represent a range of Carnegie types, sizes, selectivity, minority-serving status, religious affiliation, urban status, geographic region, and online instruction • 2011 • 19 institutions; 20,000 students • Institutional response rate average of 35% • 2012 • 55 institutions; 50,000 students • Institutional response rate average of 28%

  5. Pilot Samples • Two-thirds women • Mostly under 24 years old • Half earning mostly “A” grades • Two-thirds White • Nearly all full-time enrolled • Half first-generation • More men in business and engineering; more women in education, social sciences, and other professions • 57% of seniors were transfers in 2011 compared to 45% in 2012

  6. Methods: Qualitative • Qualitative information • In 2011 and 2012, 120 students in cognitive interviews, 79 students in 10 focus groups at 12 different campuses, phone interviews for specific questions, write-in responses from students completing the pilots, feedback from outside sources and institutional users • Using Cognitive Interviews to Improve Survey Instruments, Tuesday 1:55

  7. Methods: Individual Items • Item descriptives included frequencies, means, standard deviations, standard errors, skewness, kurtosis, and percent missing • Calculated by class level, gender, and major • Comparisons between pilots, pilot to the institution’s last standard administration, and co-administration at 7 institutions in 2012
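
The per-item descriptives listed above can be sketched as a small function. This is a minimal illustration, not NSSE's actual analysis code, and the sample responses below are made up for demonstration; `None` stands in for a missing response.

```python
import math

def item_descriptives(responses):
    """Summarize one Likert-type item; None marks a missing response."""
    valid = [r for r in responses if r is not None]
    n = len(valid)
    mean = sum(valid) / n
    var = sum((x - mean) ** 2 for x in valid) / n            # population variance
    sd = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in valid) / (n * sd ** 3)
    kurt = sum((x - mean) ** 4 for x in valid) / (n * sd ** 4) - 3  # excess kurtosis
    pct_missing = 100 * (len(responses) - n) / len(responses)
    freqs = {v: valid.count(v) for v in sorted(set(valid))}  # response frequencies
    return {"n": n, "mean": mean, "sd": sd, "skew": skew,
            "kurtosis": kurt, "pct_missing": pct_missing, "freq": freqs}

# Ten hypothetical students answering a 4-point item, one nonresponse
stats = item_descriptives([4, 3, 3, 2, 4, 1, 3, 4, 2, None])
```

In practice these summaries would be computed separately by class level, gender, and major, as the slide describes.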

  8. Methods: Content Areas • Standard NSSE: Level of Academic Challenge; Active and Collaborative Learning; Student-Faculty Interaction; Enriching Educational Experiences; Supportive Campus Environment; Deep Approaches to Learning; Self-Reported Student Gains • Updated NSSE: Academic Challenge; Deep Approaches to Learning; Collaborative Learning; Experiences with Faculty; Diverse Interactions; High-Impact Practices; Campus Environment; Self-Reported Gains

  9. Methods: Indicators • Exploratory factor analysis • Confirmatory factor analysis • Aggregate descriptives • Validity differences by groups (2011) • Concurrent validity (2011) • Predictive validity (2011) • Reliability • Item response theory • Generalizability theory (2012) • The Dependability of the New NSSE: A Generalizability Study, Monday 2:15
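
One of the reliability checks named above can be illustrated with Cronbach's alpha for a multi-item indicator. This is a hedged sketch, not the project's actual procedure, and the three-item, five-student scale data below are invented for the example.

```python
def cronbach_alpha(items):
    """items: list of per-item score lists, each with one entry per respondent."""
    k = len(items)
    n = len(items[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance
    # Total scale score per respondent
    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(var(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

alpha = cronbach_alpha([
    [3, 4, 2, 4, 3],   # item 1 responses for five students
    [3, 4, 3, 4, 2],   # item 2
    [2, 4, 2, 3, 3],   # item 3
])
```

Values near or above 0.7 are conventionally read as acceptable internal consistency for a scale like an engagement indicator.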

  10. Results: Content Areas & Indicators • Academic Challenge: Quantitative Reasoning; Learning Strategies • Deep Approaches to Learning: Higher Order Learning; Reflective and Integrative Learning • Collaborative Learning: Collaborative Learning • Experiences with Faculty: Student-Faculty Interaction; Good Teaching Practices • Diverse Interactions: Interactions with Diverse Others • Campus Environment: Quality of Interactions; Campus Support • Student-Reported Gains: Student-Reported Gains • High-Impact Practices: individual items

  11. Academic Challenge: Quantitative Reasoning, Learning Strategies • Writing, reading, quantitative reasoning, use of learning strategies, perception of challenging coursework, time spent preparing for class • Indices of writing and challenge are planned for the future • Generalizability issues emphasize the importance of looking within

  12. Deep Approaches to Learning: Higher Order Learning, Reflective and Integrative Learning • Integrating diverse perspectives, reflection on understandings, higher-order tasks such as application or evaluation • Content area likely to merge with Academic Challenge in the future

  13. Collaborative Learning • Working with peers, helping peers, receiving help from peers • Results from the 2011 pilot showed large differences for online students • 2012 results showed that, although online students collaborate less with peers, these items are still appropriate for them

  14. Experiences with Faculty: Student-Faculty Interaction, Good Teaching Practices • Instructors’ use of clear teaching behaviors, faculty mentoring, working with faculty outside of class, in-class interactions with faculty • Online students report fewer experiences with faculty, but items are still appropriate for online learners • Part-time and full-time students had some issues answering “In how many of your courses,” so these items will be reframed in 2013

  15. Diverse Interactions • Having serious discussions with people who are different from you • Qualitative Issues: Using Cognitive Interviews to Improve Survey Instruments, Tuesday 1:55 • Items rewritten for clarity in 2013

  16. High-Impact Practices • Students’ participation in, or plans to participate in, a variety of high-impact educational experiences: • Learning community • Internship • Study abroad • Research with faculty • Culminating senior experiences • Service learning • Formal leadership experiences

  17. Campus Environment: Quality of Interactions, Campus Support • Perceptions of the quality of interactions with various people on campus, perceptions of different ways their institution supports success or encourages beneficial activities • Small differences for online students but items are still appropriate

  18. Self-Reported Gains • Students’ general perception of their learning in a variety of areas • Diverse grouping of items should not be interpreted as a unidimensional construct • An item from the 2011 pilot about becoming an active and informed citizen was removed in 2012 but added to the 2013 survey

  19. Looking Ahead • Updated survey content with both new and modified items • New groupings of items to serve as indicators of engagement • New items within optional modules • Academic Advising, Civic Engagement, Development of Transferable Skills, Experiences with Diverse Perspectives, Learning with Technology, Experiences with Writing

  20. Questions? Paper, presentation, and more information about NSSE at nsse.iub.edu Special thanks to our research team: Jim Cole, Yiran Dong, Kevin Fosnacht, Kevin Guidry, Heather Haeger, Amber D. Lambert, Thomas Nelson Laird, Wen Qi, Amy Ribera, Louis Rocconi, Shimon Sarraf, Rick Shoup, Malika Tukibayeva abrckalo@indiana.edu rgonyea@indiana.edu anglmill@indiana.edu