Developing Surveys for the Outcomes Assessment Process

Kim Anderson

Course Evaluation Subcommittee Chair

Summer 2009


What is a Survey?

NOUN: pl. sur·veys

A detailed inspection or investigation.

A general or comprehensive view.

A gathering of a sample of data or opinions considered to be representative of a whole.

The process of surveying.

(From the American Heritage Dictionary)

An assessment instrument that measures a characteristic or attitude ranging across a continuum of values, or that identifies a value or belief on a rating scale. Surveys typically use a sample of data intended to be representative of the whole being studied.


“Surveying for Surveying’s Sake” Is Problematic

Distinct and Practical Purpose

  • If the purpose is not distinct, the survey will change frequently

  • If the purpose is not practical, everyone involved becomes fatigued and resources are poorly spent

    • Survey fatigue

    • Bureaucratic fatigue

    • Assessment fatigue

    • Audience fatigue

  • Low quality

  • Indirect vs. direct assessment issue

  • “One shot” assessments are less valuable than continuous assessments

    If any of these issues arise or persist, do not use a survey.


Purpose: Preliminary Planning

  • Begin with the need for information (the questions the survey should answer)

  • Be as specific, clear-cut, and unambiguous as possible about the needed information (focus)

  • Determine the best possible way to ascertain the desired information

  • Write as few questions as possible to obtain that information

  • Trade-offs exist


Survey Development

  • Step 1: Decide to whom and how the survey will be administered.

  • Step 2: Determine the content and wording of each question.

  • Step 3: Determine the structure of response to each question.

  • Step 4: Establish the sequence of questions.

  • Step 5: Design the format and appearance.


Step 1: Decide to whom and how the survey will be administered.

  • Sample Size

    • General population/All

    • Sample = a portion of the population of interest, scientifically chosen (e.g., randomly selected) so that it projects reliably to the whole (see the sampling sketch after this list)

  • Collection of Data

    • In-person, mail, e-mail, or phone (paper surveys assume literacy and are time-consuming to manage)

    • Optical Mark Reader, e.g. use a No. 2 pencil (requires a machine to read answer sheets)

    • Web-based, e.g. Survey Gizmo (online surveys are convenient, but often assume respondents have access to a computer, are technologically literate, and feel comfortable responding in an electronic format)

  • Professional & financial resources available
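
A minimal sampling sketch in Python, assuming a hypothetical roster of student e-mail addresses is already available as a list; the roster, seed, and sample size below are illustrative only, not from the presentation:

    import random

    # Hypothetical roster of the population of interest
    # (e.g., all students enrolled in the program being assessed).
    roster = [f"student{i:03d}@college.edu" for i in range(1, 501)]

    random.seed(2009)   # fixed seed so the draw can be reproduced later
    sample_size = 60    # sized to fit the analysis plan and available resources

    # Simple random sample: every student has an equal chance of selection,
    # and no one is selected twice.
    respondents = random.sample(roster, sample_size)

    print(f"Inviting {len(respondents)} of {len(roster)} students to take the survey")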


Step 2: Determine the content and wording of each question.

  • Appropriateness based on purpose

  • Eliminate unnecessary questions

    • Is this a double-barreled question that should be split or eliminated?

    • Can the respondent answer this question? (the event may be too long ago to recall, or the wording may sway the answer)

    • Will the respondent answer this question? (it may be too personal)

  • Appropriate wording

    • Not too vague or confusing

    • Avoid double negatives

    • Avoid unfamiliar terminology (lingo)

    • Avoid loaded terms (sensitive or controversial questions)


Step 3: Determine the structure of response to each question.

  • Open-ended: no single definite answer; respondents answer in their own words; answering requires time and effort; yields quotable material; difficult to analyze, so factor in the time and effort needed for data compilation

  • Closed-ended: a finite set of answers to choose from; easy to standardize; the data gathered lend themselves to analysis; more difficult to write (the answer choices must cover all possible responses)


Closed-Ended Types

  • Likert scale: how closely the respondent's feelings match a statement, expressed on a rating scale (an encoding sketch follows this list)

  • Multiple choice: pick the best answer(s) from a finite set of options

  • Ordinal: rank-order all possible answers; each is rated in relation to the others

  • Categorical: Possible answers are in categories; respondent must fall into exactly one

  • Numerical: Answer must be a real number
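
A minimal encoding sketch in Python for a Likert-scale item; the item wording, 5-point labels, and responses below are hypothetical, not from the presentation. Closed-ended answers coded this way feed directly into later analysis:

    # Hypothetical 5-point Likert scale and its usual integer coding.
    LIKERT_SCALE = {
        "Strongly disagree": 1,
        "Disagree": 2,
        "Neutral": 3,
        "Agree": 4,
        "Strongly agree": 5,
    }

    item = "The course improved my ability to analyze data."

    # Raw answers as respondents would mark them on the form.
    raw_responses = ["Agree", "Strongly agree", "Neutral", "Agree", "Disagree"]

    # Convert the labels to numeric codes so the responses can be summarized.
    coded = [LIKERT_SCALE[answer] for answer in raw_responses]
    mean_rating = sum(coded) / len(coded)

    print(item)
    print(f"  mean rating: {mean_rating:.2f} on a 1-5 scale")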



Responses to Questions: General Suggestions

  • Scale Point Proliferation: too many points on a rating scale (more than 5) are confusing and invite hairsplitting

  • Order of Categories: it is better to list a progression from a lower level to a higher one

  • Category Proliferation: minor distinctions among categories are not useful; keep the list brief

  • “Other”: With a few exceptions, avoid this option


Step 4: Establish the sequence of questions.

  • First Part = easier questions (gains cooperation)

  • Middle Series = most important topics

  • End of the survey = demographic and other classification questions

  • Conclude with a thank you.


Step 5: Design the format and appearance.

  • Attractive, clearly printed, and well laid out

  • Appealing and simple to complete

  • Quality engenders better response

  • The survey represents the program and the college


No Survey is Perfect

  • Fallacy of Perfection

    • Ask for feedback in each step of the development process

    • Ask colleagues both in and out of the program or discipline for reactions and suggestions

    • Beta test

    • Many GREAT surveys have “crashed and burned” in prior revisions; just be patient

  • Administration

    • Use a cover letter or script to provide consistency

    • Address protection of confidentiality


Surveys and Outcomes Assessment

  • Survey creation is the beginning of the process

  • Consider analysis requirements (statistical or otherwise) during survey development

    Types of statistical analyses:

    ✓ Descriptive statistics (means, medians, etc.)

    ✓ Correlation analysis

    ✓ Regression and logistic regression

    ✓ Graphs (bar charts, boxplots) and tests such as ANOVA

  • Keep it simple: present basic, descriptive data regularly (a minimal analysis sketch follows this list);

    more nuanced analysis is possible if there is a need to

    ✓ demonstrate differences (ANOVA, t-test)

    ✓ demonstrate correlation (e.g., basic Spearman correlations)

    ✓ explain causation (regression)

  • Resources needed and available

  • Respondent anonymity and data confidentiality

  • Key findings = a presentation plan for improving services and student learning
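
A minimal analysis sketch in Python showing descriptive statistics, a two-group comparison, and a Spearman correlation; the column names and ratings below are hypothetical, and pandas/SciPy are simply one common toolset, not one named in the presentation:

    import pandas as pd
    from scipy import stats

    # Hypothetical coded survey responses (1-5 Likert ratings) from two sections.
    df = pd.DataFrame({
        "section":      ["A", "A", "A", "A", "B", "B", "B", "B"],
        "satisfaction": [4, 5, 3, 5, 2, 3, 4, 2],
        "preparation":  [4, 4, 3, 5, 2, 2, 3, 3],
    })

    # Basic, descriptive data to present regularly.
    print(df[["satisfaction", "preparation"]].describe())

    # Demonstrate differences between two groups (t-test; ANOVA generalizes to more groups).
    group_a = df.loc[df["section"] == "A", "satisfaction"]
    group_b = df.loc[df["section"] == "B", "satisfaction"]
    t_stat, t_p = stats.ttest_ind(group_a, group_b)
    print(f"t-test: t = {t_stat:.2f}, p = {t_p:.3f}")

    # Demonstrate correlation (Spearman's rank correlation).
    rho, rho_p = stats.spearmanr(df["satisfaction"], df["preparation"])
    print(f"Spearman: rho = {rho:.2f}, p = {rho_p:.3f}")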


Surveys and SLOs

  • Survey assessments are considered indirect assessments. Therefore, it is best to compare findings with direct assessments of student learning.

  • Survey assessments can be very useful for observing

    ✓ what students believe they are learning,

    ✓ what alumni feel they have learned,

    ✓ how well employers feel graduates have been prepared.

  • Survey assessments create very useful findings if a program is concerned about the quality of student preparation (i.e., employer, mentor, or work experience surveys).

  • Closed-ended questions are derived from the content knowledge being assessed.

  • Open-ended questions lead to qualitative analysis that can be compared with closed-ended responses.

  • Open-ended survey responses can also be analyzed systematically to detect trends or concerns; this systematic analysis yields additional useful information.


Final Thoughts

  • Well-crafted surveys are a means of describing opinions, or even changes in perceptions and attitudes

  • More work is involved in creating surveys and managing the survey process than usually anticipated

  • No survey is perfect; it is often best to combine a survey-based assessment with a direct assessment of process or performance

  • That said, survey information can be useful

  • Questions? Thank you.

