
Off-the-Shelf or Homegrown? Selecting the Appropriate Type of Survey for Your Assessment Needs



Presentation Transcript


  1. Off-the-Shelf or Homegrown? Selecting the Appropriate Type of Survey for Your Assessment Needs Jennifer R. Keup, Director National Resource Center for The First-Year Experience and Students in Transition keupj@mailbox.sc.edu

  2. Institutional data are meaningless without a comparison group.

  3. My institution is unique in its programs and goals.

  4. The main outcome of interest on my campus is student development.

  5. Goals for Today
  • Introduce & discuss the Ory (1994) model for comparing and contrasting local vs. commercially developed instruments
  • Identify elements of your institutional culture & structure that would influence the decision
  • Discuss myths with respect to survey administration
  • Share examples of:
    • …the most prominent national surveys for first-year assessment
    • …software and services available to facilitate institutional assessment

  6. What Do We Mean? "Off-the-Shelf" vs. "Homegrown" (a continuum)
  "Off-the-Shelf"
  • Often commercially developed
  • Scope includes multiple institutions
  • Primarily pre-set content
  • Examples: CIRP, NSSE, EBI
  "Homegrown"
  • Developed locally
  • Focused on the institution
  • Content developed and adapted by the campus/unit
  • Examples: program review; utilization/satisfaction surveys for specific programs

  7. Questions to Ask
  • Who needs to see these data?
  • What are my analytical capabilities?
  • What is my budget?
  • Who needs to make decisions with these data?
  • How will this fit with my other responsibilities?
  • What is my timeline?

  8. The Ory (1994) model for comparing and contrasting local vs. commercially developed instruments

  9. Six-Factor Comparison
  • Purpose
  • Match
  • Logistics*
  • Institutional Acceptance
  • Quality
  • Respondent Motivation to Return the Instrument

  10. Purpose
  Why are we doing this study, and how will the results be used?
  "Off-the-Shelf"
  • Allows for comparison to a national norm group
  • Examples:
    • Comparison to a peer or aspirant group
    • Benchmarking
    • Contextualizing a broad higher education issue or mandate
  "Homegrown"
  • Allows for thorough diagnostic coverage of local goals and interests
  • Examples:
    • Satisfaction with a campus program
    • Achievement of departmental goals
    • Program review

  11. Match
  • What are the program/institutional goals, outcomes, and areas of interest?
  • Does an existing instrument meet my needs?
  • Does the survey address my purpose?
  • Can the existing instrument be adapted to meet my needs?
  "Off-the-Shelf"
  • May provide incomplete coverage of local goals & content
  "Homegrown"
  • Tailored to local goals and content
  Local questions!

  12. Institutional Acceptance (IRB, politics)
  • How will the results be received by the intended audience?
  • Who needs to make decisions with these data?
  • What is the assessment culture?
  "Off-the-Shelf"
  • Professional quality & national use may enhance acceptance
  • Failure to completely cover local goals and content may inhibit acceptance
  "Homegrown"
  • Local development can encourage local ownership and acceptance
  • Concerns about quality may interfere with acceptance

  13. Quality
  • What is the track record of quality?
  • What is the psychometric soundness of the instrument?
  "Off-the-Shelf"
  • Tends to have better psychometrics
  • Professional quality may compensate for incomplete coverage of local goals and objectives
  "Homegrown"
  • Must fully test psychometric properties
  • Must create a professional appearance
  • Lack of professional quality may affect results and institutional acceptance

  14. Respondent Motivation to Return the Instrument
  • What will yield the highest response rate? (Incentives)
  "Off-the-Shelf"
  • Can create instant credibility
  • Sometimes provides institutional or individual incentives
  "Homegrown"
  • Local specificity may yield greater respondent "buy-in"
  • Local instruments may not "impress" people
  • Can create a student perception of immediate impact

  15. Logistics (10 considerations)
  • Availability
  • Preparation time
  • Expertise
  • Cost
  • Scoring
  • Testing time
  • Test & question types
  • Ease of administration
  • Availability of norms
  • Reporting
  The devil is in the details!

  16. Logistics (continued)
  Availability (Does a survey currently exist for our needs?)
  • OTS: If you can afford it, the survey is available
  • HG: "If you build it, they (i.e., data) will come"; takes time & resources to develop
  Preparation time (What is the survey timeline? Is it feasible? Have you considered administration planning?)
  • OTS: Short
  • HG: Can take considerable time

  17. Logistics (continued)
  Expertise
  • OTS: A fully developed protocol allows one to administer after reading the manual
  • HG: Takes content, measurement, and administrative experience; psychometrics!
  Scoring (related to expertise)
  • OTS: Can be delayed if scoring occurs off campus; need to adhere to the administration cycle
  • HG: Can be immediate

  18. Logistics (continued)
  Testing time (If administering in class, do you have faculty buy-in?)
  • OTS: Fixed, based upon content and administration protocol
  • HG: Flexible, as long as the survey meets institutional & programmatic needs
  Test & question types
  • OTS: Type of test and questions are predetermined
  • HG: Allows for flexibility in type of test (objective/open-ended) and type of question (multiple choice, rank ordering, etc.)

  19. Logistics (continued)
  Ease of administration (IRB)
  • OTS: Requires standardized administration; special training for testers
  • HG: Allows for greater flexibility
  Norms
  • OTS: National & inter-institutional comparison
  • HG: Intra-institutional comparison
  Reporting
  • OTS: Standard formats that don't always relate to the institution
  • HG: Institutional tailoring of results and reporting

  20. Logistics (continued)
  Cost
  • OTS: Primary costs associated with purchase price
    • Other costs: scoring, data, specialized reporting, human resources to coordinate campus administration
    • Recurring cost
  • HG: Primary costs associated with development
    • Instrument development, ensuring psychometric properties, scoring & recording data, reporting findings
    • Other costs: software/hardware, training
    • Primarily a one-time investment

  21. Purpose • Match • Logistics • Acceptance • Response • Quality

  22. OTS vs. HG Myths
  • You can only gather comparison data from national (OTS) surveys
  • It is cheaper to develop and administer a homegrown survey
  • Off-the-shelf surveys don't require any work
  • Homegrown surveys are hard
  • You don't need IRB approval for local assessment
  • Off-the-shelf surveys study all the important topics

  23. FYE Assessment Examples
  "Off-the-Shelf"
  • CIRP: Freshman Survey; Your First College Year (YFCY) Survey
  • NSSE
  • Educational Benchmarking Incorporated (EBI)
  "Homegrown"
  • Services: Eduventures, Student Voice
  • Software: Zoomerang, Survey Monkey

  24. Continuum of Assessment
  Off-the-Shelf (CIRP, NSSE, EBI) ↔ services (Eduventures, Student Voice) ↔ Homegrown tools (Survey Monkey, Zoomerang)
