
To Survey or not to Survey? That’s a Really Good Question!



  1. It is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail. ~ Abraham Maslow, 1966 To Survey or not to Survey? That’s a Really Good Question! SARA Assessment Brown Bag February 10, 2011

  2. Why is assessment important? • Helps determine if you are meeting your educational objectives • Helps ensure that you have the resources you need • Helps prioritize efforts • Can contribute to our understanding of student learning and development

  3. Key steps - PDCA • Identify outcomes • Identify appropriate measures • Choose an appropriate assessment method • Choose an appropriate research design • Collect the data • Analyze the data • Disseminate the findings • Take action • Wash, Rinse, Repeat!

  4. Presentation outline • Why do you ask? • Review types of outcomes and measures • Benefits and drawbacks to surveys • Other types of assessments • It’s your turn - practice makes perfect!

  5. Why not use a survey? • Ease of web-based surveys has led to their proliferation • Result is that surveys are the “go-to” research tool, but… • Not every question is a nail • Survey fatigue is a significant threat • Multiple types of evidence (assessment triangulation) build a stronger case

  6. Starting questions • What type of outcome do you want to measure? • Cognitive (knowledge) • Affective (attitudes) • What type of data do you want to collect? • Psychological (personal traits) • Behavioral (observable activities) • Time frame? • Short-term • Long-term

  7. Taxonomy of Student Outcomes (Astin, 1993)

  8. Time: Examples of Short- and Long-term Outcomes (Astin, 1993)

  9. Types of measures • Direct measures • Indirect measures • Norm-referenced • Criterion-referenced • Self-referenced • Although direct measures are typically preferred, practically speaking, your overall assessment plan should contain a mix of these

  10. Your question should guide your choice of assessment tool, and not the other way around!

  11. Surveys “Surveys represent one of the most common types of quantitative, social science research. In survey research, the researcher selects a sample of respondents from a population and administers a standardized questionnaire to them.” http://writing.colostate.edu/guides/research/survey/index.cfm

  12. Surveys can be a good tool if you are interested in: • Perceptions • Beliefs • Motivations • Future plans • Past behavior • Private behavior

  13. What about student learning? • Research indicates that aggregate self-reports of learning can provide a reasonable estimation of actual learning. * • **Self-reported data are most valid when: • the information is known to the respondents, • the questions are unambiguous and refer to recent activities, • the respondents take the questions seriously, and • responding has no adverse consequences and does not encourage socially desirable, rather than truthful, answers. * (Anaya, 1999; Kuh, Kinzie, Schuh, Whitt, & Associates, 2005; Laing, Sawyer, & Noble, 1989; Pace, 1985; Pike, 1995). **(Kuh et al., 2005 & Pike, 1995)

  14. Validity and reliability Surveys tend to be weak on validity and strong on reliability. The artificiality of the survey format puts a strain on validity. Since people's real feelings are hard to grasp in terms of such dichotomies as "agree/disagree," "support/oppose," "like/dislike," etc., these are only approximate indicators of what we have in mind when we create the questions. Reliability, on the other hand, is a clearer matter. Survey research presents all subjects with a standardized stimulus, and so goes a long way toward eliminating unreliability in the researcher's observations. Careful wording, format, content, etc. can reduce significantly the subject's own unreliability. http://writing.colostate.edu/guides/research/survey/index.cfm
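  As a rough illustration of the reliability idea on the slide above, the sketch below computes Cronbach's alpha, one common internal-consistency statistic for multi-item scales; the slides do not name a specific statistic, and the data here are invented.

```python
# A minimal sketch (not from the slides): Cronbach's alpha as one way to
# quantify the internal-consistency reliability of a set of Likert items.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """item_scores: rows = respondents, columns = survey items (e.g., 1-5 Likert)."""
    k = item_scores.shape[1]                         # number of items
    item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: six respondents answering four items on a 1-5 scale.
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 2, 3, 3],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```

  Higher alpha (conventionally above about 0.7) suggests the items hang together as a scale; it says nothing about validity, which is the weaker side of surveys noted above.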

  15. Representativeness • If your respondents are not representative of your population, then your results may be misleading • Example: You want to survey all undergraduate students about their attitudes towards a student honor code. Your friend is in charge of the FYS program and offers to have your survey passed out in all first-year seminars. You’re excited to get this direct push for your survey, but how might this affect your results?
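  One simple way to act on the representativeness example above is to compare the makeup of your respondents against known population figures before interpreting the results. The sketch below uses hypothetical class-year counts and a chi-square goodness-of-fit test; the slides do not prescribe this particular check.

```python
# Hypothetical check: is the respondent pool's class-year mix consistent with
# the known class-year mix of all undergraduates?
from scipy.stats import chisquare

# Population proportions by class year (invented registrar figures).
population_share = {"first-year": 0.28, "sophomore": 0.26, "junior": 0.24, "senior": 0.22}

# Respondent counts from a survey distributed only in first-year seminars.
respondents = {"first-year": 410, "sophomore": 35, "junior": 20, "senior": 15}

total = sum(respondents.values())
observed = [respondents[year] for year in population_share]
expected = [share * total for share in population_share.values()]

stat, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.1f}, p = {p:.3g}")
# A tiny p-value confirms the sample is heavily skewed toward first-year students,
# so campus-wide conclusions about honor-code attitudes would be misleading.
```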

  16. Strengths of surveys • Relatively inexpensive • Can reach large numbers of people • Large numbers allow for multivariate analyses • Can ask many questions relatively quickly • Standardized instruments (like NSSE) allow for comparisons between groups • Can be confidential or anonymous • Can have high reliability

  17. Strengths continued • Can provide student/alumni/employer perspectives on the institution/program. • Can make respondents feel that their opinions matter. • Ease of response can provide information from hard-to-reach individuals • Results easily understood

  18. Weaknesses of surveys • Questions have to be general enough to apply to all or most respondents • Inflexible - forced-choice responses may not allow respondents to express their true opinions • Require good response rates to achieve representative results • May be hard for respondents to recall information or answer truthfully • Can seldom deal with “context”

  19. Weaknesses continued • Validity can be questionable – results tend to be highly dependent on wording of items, salience of survey, and organization of the instrument • Socially desirable responses • Indirect evidence which may have less legitimacy with stakeholders • Better for measuring and comparing the responses of groups rather than individuals

  20. Finally, it might not be a survey if… • The questions you want to ask don’t have a limited number of known, well-defined possible answers • You want to be able to ask about relationships rather than inferring them • Your population of interest differs in culture or language from the majority

  21. Some other types of assessments • Standardized exams • Test of abilities or knowledge • Simulation or performance appraisals • Interviews and focus groups • External examiners • Archival records and transcript analysis • Portfolios • Behavior observations • Student self-evaluations • Reflective writing • Minute papers/muddiest point

  22. Final thoughts on choosing an assessment tool • Go back to your assessment question(s) • What do you want to know? • What are the resource limitations? (e.g., time, money, staff) • One shot or longitudinal? • Experimental design? • What type of analysis is appropriate?

  23. Now it’s your turn

  24. Think about a program you want to assess… • What do you want to know? • What type of measure? • How many students are involved? • What type of evidence do you already have (if any)? • What type of evidence is most effective with your intended audience?

  25. Sample assessment questions • Do participants in an alternative spring break (n=20) develop an increased awareness of social injustice and subsequently a greater commitment to working for social justice? • Do students (n=5,000) who go through an alcohol intervention drink less as a result? • Do participants in the PRCC’s Learning Circle (n=15) exhibit improved ability to engage in positive cross-racial interaction with other participants?
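  For the second sample question above (does an alcohol intervention reduce drinking?), one possible analysis is sketched below, assuming a simple pre/post design with matched self-reports; the design, numbers, and choice of test are illustrative assumptions, not part of the slides.

```python
# Hedged sketch: paired comparison of self-reported drinks per typical week
# before and after an intervention, for the same (hypothetical) students.
from scipy.stats import ttest_rel

pre  = [8, 12, 5, 10, 7, 9, 14, 6, 11, 8]   # invented pre-intervention reports
post = [6,  9, 5,  8, 7, 7, 10, 5,  9, 6]   # invented post-intervention reports

stat, p = ttest_rel(pre, post)
print(f"paired t = {stat:.2f}, p = {p:.3f}")
# Caveat: without a comparison group, any drop could reflect maturation or
# socially desirable responding rather than the intervention itself.
```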

  26. Resources • Astin, A. W. (1993). Assessment for excellence: The philosophy and practice of assessment and evaluation in higher education. American Council on Education Series on Higher Education. Phoenix, AZ: Oryx Press. • Improving Educational Programming online training module: https://www.sa.psu.edu/workshops/edprogram/index.htm • Pope, R. L., Reynolds, A. L., Mueller, J. A., & Cheatham, H. E. (2004). Multicultural competence in student affairs. San Francisco: Jossey-Bass. • SARA Assessment Resources website: http://studentaffairs.psu.edu/assessment/resources.shtml • Writing Guide: Survey Research, Colorado State University: http://writing.colostate.edu/guides/research/survey/index.cfm • Yin, A. C., & Volkwein, J. F. (2009). Assessing general education outcomes. In J. F. Volkwein (Ed.), Assessing student outcomes: Why, who, what, how? New Directions for Institutional Research Assessment Supplement (pp. 79-100). San Francisco: Jossey-Bass.

  27. Questions?
