
Using Surveys To Assess Student Learning



  1. Using Surveys To Assess Student Learning A Simple Guide Office of Assessment and Accreditation Division of Academic Affairs Indiana State University

  2. What Is A Survey? • A survey is a measurement method by which respondents offer their opinions in response to specific questions • Answers are usually limited to a fixed set of questions and are measured on a specific scale • Surveys can also contain open-ended questions that yield qualitative data, which can then be compared with closed-ended responses

  3. Why Use Surveys to Assess Student Learning? • It is usually convenient • Allows departments to limit questions to certain objectives, allowing for more efficient analysis • Several surveys can be comparatively analyzed to detect themes (triangulation) • Results can be easily summarized • Results can be compared with open-ended responses and/or interviews to detect themes • Can be used with direct assessments of student learning to comparatively analyze for themes (surveys are not considered methods of direct assessment of student learning)

  4. Outline of This Presentation • Survey Types • Knowing Purpose of Survey • Designing Structure • Designing Scale(s) • Writing Questions • Defining Respondents • Identifying Audience • Analyzing Results • Presenting Results • Defining Periodicity • Triangulating • Plan for Effective Use

  5. Types of Surveys • Student Surveys • Alumni Surveys • Employer Surveys • Performance Surveys • Field Experience Surveys • Internship Surveys

  6. Example of Employer Survey Demographic and program information Questions are designed on a four-point scale. Notice that the questions begin with verbs. The faculty expressed concerns in an assessment retreat about the four-point scale, preferring instead to insert a neutral point between 2 and 3

  7. Know Your Purpose • Is there a reason for performing the survey? • Given that the use of surveys can be time-consuming and expensive, is this the best method of assessing student learning? • Can survey questions be aligned with program objectives? • Can survey questions be aligned with student learning objectives? • Can survey questions be aligned with accreditation standards?

  8. Designing Survey Structure • Know the general format. Is it convenient, or is it an eyesore? • Location of question items • Location of scales • Location of boxes/sections

  9. Designing Scales Choose Type of Scale • Likert or Likert-type scale • Rankings • Importance scale • Link scale choice to the kind of analysis desired
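The choice of scale determines how responses can be coded and analyzed later. As a minimal sketch (the labels, mapping, and data here are hypothetical; the slides do not prescribe a particular coding), a five-point Likert-type scale might be coded numerically like this:

```python
# Hypothetical coding of Likert-type responses as numbers so that
# averages, rankings, and other analyses become possible.
likert_map = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly Agree": 5,
}

responses = ["Agree", "Strongly Agree", "Neutral", "Agree"]
scores = [likert_map[r] for r in responses]  # -> [4, 5, 3, 4]
mean_score = sum(scores) / len(scores)
print(mean_score)  # average rating for this question
```

A ranking or importance scale would be coded the same way, only with a different mapping; deciding this up front is what "link to what kind of analysis is desired" means in practice.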

  10. Write Survey Questions • Usually good to begin with active verbs, or at a minimum words other than “the,” “a,” etc. • Beware of general concerns associated with survey question writing • Double-barreled questions • Double negatives • Loaded terms • Assumed knowledge • Vagueness • Lingo/use of unfamiliar terminology

  11. Example: Survey Prior to Revision Questions 1 through 6 ask respondents to share information about themselves; analysis can then focus on question averages with respect to each group. Faculty suggested inserting “elementary education” in #2, and deleting some questions altogether. Faculty also suggested adding a question asking students to identify how far they had progressed through the program. Several faculty were concerned about the 4-point scale and suggested the insertion of a middle point.

  12. Example: Survey After First Revision Elementary education was included in #2, question #4 now asked about on-line courses, and #5 asked how far students had progressed through the graduate program. Note the insertion of a midpoint and the changes in the meaning of 1 through 5 (in the black box). Also note the use of -ing words to begin each question.

  13. Define Respondents • Who are the respondents? • How will they be contacted? • How can the response rate be maximized? Answers to these questions will provide guidelines for problem solving. Identifying problems after attempting assessment might provide for “lessons learned,” but will not satisfy the need to generate useful assessment data.

  14. Identify Audience • Who will use this information? • Will questions on survey enable audience to create meaning? • In what ways will the survey facilitate focus on findings, and thereby encourage discussion about how to improve student learning in respect to program objectives?

  15. Analyze Results • Survey questionnaires often pre-determine the kinds of analysis that can be conducted • Types of analyses: • Descriptive statistics (means, medians, etc.) • Correlation analysis • Regression and logistic regression • ANOVA • Graphs: bar charts, boxplots, means plots (ANOVA), etc.
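As a small illustration of one analysis type from the list above, here is a correlation check on hypothetical question scores. The presentation itself used Excel and SPSS; pandas is substituted here purely for illustration:

```python
import pandas as pd

# Hypothetical scores for two survey questions (the data are invented).
df = pd.DataFrame({
    "Q1": [1, 2, 3, 4, 5],
    "Q2": [2, 2, 3, 5, 5],
})

# Pearson correlation between the two questions; values near 1 suggest
# respondents who rate one item highly tend to rate the other highly too.
r = df["Q1"].corr(df["Q2"])
print(round(r, 3))
```

The same data frame supports the other listed analyses (e.g., `df.describe()` for descriptive statistics), which is why getting responses into a clean tabular form is usually the first step.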

  16. Example of Tabular Information More than 180 surveys were entered into Excel. Results show minimum and maximum scores, means, and standard deviations for faculty review. Questions with a median of 4 (the highest value on the survey) were noted with an asterisk (*); all others had a median of 3. Results were shared with faculty, who then discussed the findings.
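A summary table of this kind (min, max, mean, standard deviation, plus an asterisk flag for questions whose median equals 4) can be sketched as follows. The data are hypothetical and pandas stands in for the Excel workflow described above:

```python
import pandas as pd

# Hypothetical responses on a 1-4 scale, one column per survey question.
df = pd.DataFrame({
    "Q1": [4, 4, 3, 4, 4],
    "Q2": [2, 3, 3, 4, 3],
    "Q3": [4, 4, 4, 3, 4],
})

# Build the per-question summary, flagging median-4 questions with "*".
summary = pd.DataFrame({
    "min": df.min(),
    "max": df.max(),
    "mean": df.mean(),
    "std": df.std(),
    "median_is_4": df.median().eq(4).map({True: "*", False: ""}),
})
print(summary)
```

Presenting a compact table like this, rather than raw responses, is what makes the faculty review discussion feasible.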

  17. Example of Graphical Information When respondents identify themselves by year, question averages can be plotted on a graph. This is especially helpful when faculty are interested in viewing trends over the years. In this case, surveys were originally entered into an Excel spreadsheet, then imported into SPSS, where a means plot (under ANOVA) was run.
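The per-year averages behind such a means plot can be computed with a simple group-by. This is a hedged sketch with invented data; the original analysis was done in SPSS, with pandas substituted here for illustration:

```python
import pandas as pd

# Hypothetical survey scores tagged by respondent year.
df = pd.DataFrame({
    "year": [2019, 2019, 2020, 2020, 2021, 2021],
    "Q1":   [3.0,  4.0,  4.0,  4.0,  3.0,  4.0],
    "Q2":   [2.0,  3.0,  3.0,  4.0,  4.0,  4.0],
})

# Average each question within each year -- the values an SPSS
# "means plot" would graph as a trend line.
yearly_means = df.groupby("year").mean()
print(yearly_means)
# yearly_means.plot() would then draw the trend lines (via matplotlib).
```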

  18. Present Results • Limit results to interest of your audience • Limit results to significant findings; often statistical analysis will help with identification of these findings, although familiarity with institutional history and program objectives is helpful • Focus on both strengths and weaknesses revealed from findings • Keep in mind that purpose of presentation is not to discuss statistics and methodology, but to encourage discussion about what to do about findings • Keep it simple!

  19. Define Periodicity • It is very important that faculty, program directors, and department chairs manage assessment on a regular basis • Collect, summarize, present, and produce action items (based on assessment) on a regular schedule

  20. Triangulate • It is often helpful to use several different sources of information (surveys and other assessments) so that common themes can be detected across them • Triangulation means detecting these common themes across different sources (preferably in faculty meetings)

  21. Example: Summaries Graduate student, alumni, and employer survey results in respect to a specific standard are listed. When placed on a grid, general comparative themes may be detected by faculty.

  22. Example: Summary Statements Based on Triangulation These general themes are then listed on a chart of strengths and weaknesses and shared with program or department faculty, who then discuss actions that might be taken to further enhance student learning outcomes and/or program objectives.

  23. Creating Meaning and Closing the Loop! • Ideally, faculty play a role in designing surveys, reviewing their use, analyzing data, and producing findings • Important note: Survey findings can only be useful in assessment if they lead to general findings about student learning and program objectives • Important also: Any assessment can only be useful if it leads to documented action that leads to enhancement of student learning and/or program objectives

  24. Concluding Remarks • Survey assessments are considered indirect assessments. Therefore, it is best to compare findings with direct assessments of student learning • Survey assessments can be very useful for observing what students believe they are learning, what alumni feel they have learned, and how well employers feel graduates have been prepared • Survey assessments create very useful findings if a program or department is concerned about the quality of student preparation (i.e., employer, mentor, or internship surveys) • Open-ended survey responses can also be analyzed to detect trends or concerns. Although not a focus of this presentation, useful information can be gained through the systematic analysis of open-ended questions.
