
Enterprise-wide Survey Software Management Systems and Response Rates Analysis and Strategies: Results from membership survey. Presenter: Victoria Porterfield Office of Institutional Research and Academic Planning Rutgers – The State University of New Jersey.


Presentation Transcript


  1. Enterprise-wide Survey Software Management Systems and Response Rates Analysis and Strategies: Results from membership survey Presenter: Victoria Porterfield Office of Institutional Research and Academic Planning Rutgers – The State University of New Jersey

  2. Overview of Declining Response Rates • Much of the research on survey response rates, including research on higher education surveys, has documented a decline in response rates. • Fosnacht, Sarraf, Howe & Peck (2013) • National Research Council (2013) • National Survey of Student Engagement (2014) • Sax, Gilmartin, & Bryant (2003) • Web surveys have particularly suffered in recent years • Dillman, Phelps, Tortora, Swift, Kohrell, Berck, & Messer (2009) • Manfreda, Bosnjak, Berzelak, Haas, & Vehovar (2008) • Petchenik & Watermolen (2011)

  3. Why was this study done? • Overall, response rates for SERU have declined over time • Two possible contributors: • General survey fatigue • Enterprise-wide survey software, which has become increasingly available at higher education institutions in recent years

  4. General survey fatigue

  5. Enterprise-wide survey software • There are positives to enterprise-wide survey software • Easier and cheaper to conduct social science research • Many institutions use the same enterprise-wide survey software products (e.g., Qualtrics), which makes it easier to collaborate on surveys with other institutions • But… • With increased access to survey software, there are more surveys, which can contribute to survey fatigue

  6. Enterprise-wide survey software • What are the response rate differences before and after the survey software became more available?

  7. Enterprise-wide survey software • But can’t these differences be attributed to general survey fatigue? Certainly! • Therefore, access to each institution’s enterprise-wide survey software was evaluated. • Three schools have “high coordination” efforts • Clear and prominent terms-of-use policy • Formal survey coordination committee • Limits on access • Three schools have “low coordination” efforts • May require IRB approval, but otherwise anyone can gain access

  8. Enterprise-wide survey software • Two-sample t-tests revealed significant differences between the high- and low-coordination groups
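The comparison on this slide can be sketched as a two-sample (Welch's) t-test. The response-rate figures below are illustrative placeholders, not the study's actual data, and the function is a minimal stdlib-only implementation rather than the presenter's analysis code.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and degrees of freedom
    (does not assume equal group variances)."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

# Hypothetical SERU response rates (%) -- illustrative only
high_coordination = [38.2, 35.9, 41.0]  # schools with survey governance
low_coordination = [27.4, 30.1, 25.8]   # schools with open access

t, df = welch_t(high_coordination, low_coordination)
print(f"t = {t:.2f}, df = {df:.1f}")
```

In practice one would compare the resulting t statistic against a t distribution with the computed degrees of freedom (e.g., via `scipy.stats.ttest_ind(..., equal_var=False)`) to obtain a p-value.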

  9. Conclusions • For the most part, SERU response rates are declining • General survey fatigue is hurting response rates • Enterprise-wide survey software may be hurting response rates if coordination/restrictions are not employed across campus • Further research with more institutions is recommended to learn more about the intertwined relationship among survey fatigue, enterprise-wide software, and response rates.

  10. Questions? Comments? • Special thanks to Iowa, Minnesota, Pittsburgh, Texas A&M, and Washington for participating in this study • Contact information: Victoria Porterfield Email: porterfield@instlres.rutgers.edu

  11. References

Dillman, D. A., Phelps, G., Tortora, R., Swift, K., Kohrell, J., Berck, J., & Messer, B. L. (2009). Response rate and measurement differences in mixed-mode surveys using mail, telephone, interactive voice response (IVR) and the Internet. Social Science Research, 38, 1-18.

Fosnacht, K., Sarraf, S., Howe, E., & Peck, L. (2013). How important are high response rates for college surveys? Paper presented at the annual forum of the Association for Institutional Research, Long Beach, CA.

Manfreda, K. L., Bosnjak, M., Berzelak, J., Haas, I., & Vehovar, V. (2008). Web surveys versus other survey modes: A meta-analysis comparing response rates. International Journal of Market Research, 50(1), 79-104.

National Research Council. (2013). Nonresponse in social science surveys: A research agenda. Washington, DC: The National Academies Press.

National Survey of Student Engagement. (2014). NSSE 2014 U.S. response rates by institutional characteristics.

Petchenik, J., & Watermolen, D. J. (2011). A cautionary note on using the Internet to survey recent hunter education graduates. Human Dimensions of Wildlife, 16(3), 216-218.

Sax, L. J., Gilmartin, S. K., & Bryant, A. N. (2003). Assessing response rates and nonresponse bias in web and paper surveys. Research in Higher Education, 44(4), 409-432.
