
Service Learning 2008 - 2010


Presentation Transcript


  1. Service Learning 2008 - 2010 Dr. Albrecht Research Team EXAMPLE of EVALUATION RESEARCH

  2. Background Information about the Project 18 public school districts in the state of Texas were involved in Service Learning projects. The State of Texas Service Learning Center hired Dr. Carol Albrecht and her students to evaluate this program. These PowerPoint slides outline the steps the research team took to complete the evaluation. We met with the Center to identify its objectives: it wanted to know how the program impacted public school children, teachers, community partners and parents of students.

  3. Using a map of Texas and information from the Service Learning Center about the schools, we identified a group of schools that were diverse in terms of school size, school location, racial composition and socioeconomic status of students.
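To make this sampling step concrete: below is a minimal Python/pandas sketch of how a subsample of schools diverse on size, locale, racial composition and socioeconomic status might be drawn. The file name (schools.csv) and every column name are hypothetical placeholders, not the actual Service Learning Center data.

    import pandas as pd

    # Hypothetical roster of participating schools; the columns here are
    # illustrative placeholders, not the actual Center data.
    schools = pd.read_csv("schools.csv")  # district, enrollment, locale,
                                          # pct_minority, pct_low_income

    # Bin the continuous characteristics so each school falls into a stratum.
    schools["size_bin"] = pd.qcut(schools["enrollment"], 3,
                                  labels=["small", "medium", "large"])
    schools["ses_bin"] = pd.qcut(schools["pct_low_income"], 3,
                                 labels=["low", "mid", "high"])

    # Draw one school per stratum (size x locale x SES) so the evaluation
    # sample spans the full range of school types.
    sample = (
        schools.groupby(["size_bin", "locale", "ses_bin"], observed=True)
        .sample(n=1, random_state=42)
    )
    print(sample[["district", "enrollment", "locale", "pct_low_income"]])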

  4. Using multiple methods, we measured how the following groups felt about the Service Learning Program and the impact they believed it had on students.

  5. We used the following six-step process.

  6. Texas A&M students were sent out all over Texas to administer the pre-test. Five months later they were sent out to administer the post-test. Click here to see the surveys and codebooks.

  7. Texas A&M students also did telephone or face to face interviews with teachers/ parents/community partners. Finally, they conducted 10 to 12 focus groups with a small convenience sample of students and with teachers and community partners. were sent out all over Texas to administer the pre-test. Click here to see the surveys and codebooks.

  8. An Old Chinese Proverb States: I HEAR, I FORGET I SEE, I REMEMBER I DO, I UNDERSTAND By “doing” this project we learned and really understood some important components of valid and reliable evaluation research.

  9. Timing of the Pre-test • Many programs are ongoing, and this can have a major impact on the pre-test. • In our study, many of the students had already participated in a Service Learning activity at some point in their school years. So, we didn’t have a true “pre” test. The “pre” test scores were contaminated by prior participation.
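One way to check for this kind of contamination is to compare pre-test scores of students with prior Service Learning experience against first-timers. The sketch below is illustrative only; the flag prior_sl and the other names are hypothetical placeholders.

    import pandas as pd
    from scipy import stats

    # Hypothetical pre-test data with a flag for prior Service Learning
    # participation; column names are illustrative only.
    pre = pd.read_csv("pretest.csv")  # student_id, prior_sl, pre_score

    # If the pre-test were truly "pre", prior participants and first-timers
    # should look alike. A significant gap signals contamination.
    veterans = pre.loc[pre["prior_sl"] == 1, "pre_score"]
    first_timers = pre.loc[pre["prior_sl"] == 0, "pre_score"]
    t, p = stats.ttest_ind(veterans, first_timers, equal_var=False)
    print(f"prior participants: {veterans.mean():.2f}, "
          f"first-timers: {first_timers.mean():.2f}, p = {p:.4f}")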

  10. Timing of the Post-Test • The “May” Effect • Outside environmental/social factors need to be considered. • In our study, we discovered that both teachers and students were more negative about almost EVERYTHING related to school at the end of the year. This may be explained by two factors: First, they were simply TIRED of school and looking forward to vacation. Second, they had just taken the TAKS (Texas Assessment of Knowledge and Skills) standardized tests, which were stressful for both teachers and students.

  11. Selecting the Right Indicators • Head Start Program • In the 1960s, the Head Start program was launched. The objective was to increase the IQ scores of underrepresented populations, including children living in poverty. Early research showed that standardized IQ test scores increased for several years and then decreased, until there was no difference between the experimental and control groups. While some felt this was evidence for discontinuing the program, parents came forward arguing that the researchers weren’t using the right measurements.

  12. Selecting the Right Indicators Head Start Program A group of researchers called the Perry Preschool Consortium, with the input of teachers and parents, identified (1) social, (2) educational and (3) socioeconomic indicators that differentiated preschool participants from a control group up to 19 years after participation in the program. The differences were compelling.

  13. Accurate Interpretation of Indicators Furthermore, this group argued that the decreasing IQ scores actually provided evidence that environmental factors CAN influence IQ – both positively and negatively. Thus, being in an “enriched” environment (i.e., the Head Start program) can increase IQ, but then being transferred to an impoverished environment (i.e., public schools in poor neighborhoods) can decrease IQ.

  14. Schooling Success: • High School Graduation or Equivalent • College or Vocational Training • Functional Competence • Ever Classified as Mentally Retarded • Time Spent in Special Education. Social Responsibility: • Ever Detained or Arrested • Teen Pregnancies • Employed • Receiving Welfare

  15. Selecting the Right Indicators • Using focus groups and intensive interviews, we looked to (1) teachers, (2) parents and (3) student participants, as well as past research, to help us identify valid and accurate indicators. • Analysis of this qualitative data indicated that (1) some students did experience the desired impact, and (2) we needed “control” variables to accurately assess the impact of Service Learning.

  16. Selecting the Right Control Variables The following control variables were all strongly and significantly related to EVERY outcome measure.
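A minimal sketch of how an outcome can be regressed on program participation while holding such control variables constant, using statsmodels. All variable names here (speaking_confidence, sl_hours, grade, female, prior_sl) are hypothetical placeholders, not the study’s actual variables.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical analysis file; placeholder variable names throughout.
    df = pd.read_csv("posttest.csv")

    # Regress an outcome on Service Learning participation WITH controls,
    # so the program effect is not confounded with who selects into it.
    model = smf.ols(
        "speaking_confidence ~ sl_hours + grade + female + prior_sl",
        data=df,
    ).fit()
    print(model.summary())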

  17. Student Success Our three indicators of student success were all related to the control variables. Results from focus groups with students and intensive interviews with teachers indicated that these were valid indicators, and that both the quality and quantity of participation were related to outcomes.

  18. Click here to see the Power Point Presentation.

  19. CHART 1. High School Students’ Perception of How Good They are at Speaking in Front of Groups by Whether or Not They Made Decisions about Service Learning Projects (p < 0.0001)

  20. CHART 2. High School Students’ Perception of How Good They are at Finding Resources by Whether or Not They Made Decisions about Service Learning Projects (p < 0.0001)
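Significance levels like those reported in these charts are commonly obtained from a chi-square test on the cross-tabulation. Below is a minimal SciPy sketch; the columns made_decisions and speaking_rating are hypothetical stand-ins for the actual survey items.

    import pandas as pd
    from scipy.stats import chi2_contingency

    # Hypothetical survey extract; made_decisions (yes/no) and
    # speaking_rating (e.g., poor..excellent) are placeholder columns.
    df = pd.read_csv("hs_survey.csv")

    # Cross-tabulate self-rated speaking ability by whether students
    # made decisions about their projects, then test for association.
    table = pd.crosstab(df["made_decisions"], df["speaking_rating"])
    chi2, p, dof, expected = chi2_contingency(table)
    print(table)
    print(f"chi2 = {chi2:.1f}, df = {dof}, p = {p:.6f}")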

  21. Beware of Social Desirability In Evaluation Research, Participants Often Evaluate a Program Positively – EVEN when the Program is “poor” and Ineffective. It may not be seen as socially acceptable to do otherwise.

  22. Social Desirability • Why did we become concerned? • What are the “danger” signs? • How did we attempt to alleviate it? • How did we modify the construction of our surveys, our research design and analysis of the data to deal with this problem?

  23. Social Desirability • Danger Signs • Type of research • Past literature indicates that respondents tend to be very positive when asked about their participation in a program even when it is a poor program. They don’t want to believe they wasted their time, and they often feel an obligation to express appreciation for those who implemented the program.

  24. Social Desirability • Danger Signs • Self selection into the program • Students and teachers were not required to participate in the program. Therefore the program was more likely to attract participants who already had positive attitudes toward these “types” of activities.

  25. Social Desirability • Danger Signs • Consistently high scores on every aspect of the program – no variation • Response Set can occur. This is where respondents give you the same response (usually positive) without seriously considering the question. • The “ceiling” effect is a similar problem. This is when you get consistently highly positive scores on the pre-test. In this case, there is little room for improvement in scores.
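Both danger signs can be screened for directly in the data. A minimal sketch, assuming five hypothetical 1-5 Likert items named q1 through q5 (placeholders, not the actual survey items):

    import pandas as pd

    # Hypothetical pre-test with five Likert items (1-5); names are
    # placeholders only.
    items = ["q1", "q2", "q3", "q4", "q5"]
    pre = pd.read_csv("pretest.csv")

    # Response set: a respondent whose answers never vary across items
    # may be answering without seriously considering the questions.
    flat = pre[items].nunique(axis=1) == 1
    print(f"possible response set: {flat.mean():.1%} of respondents")

    # Ceiling effect: if most respondents already sit near the top of
    # the scale on the pre-test, there is little room for improvement.
    at_ceiling = (pre[items].mean(axis=1) >= 4.5).mean()
    print(f"at or near ceiling on the pre-test: {at_ceiling:.1%}")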

  26. Dealing with Social Desirability when Constructing your Survey/Questionnaire Check List • Make participation voluntary and make answers anonymous or confidential. • Vary negative/positive statements in your index • Avoid misleading/biased questions • Make statements or questions very specific

  27. Dealing with Social Desirability when Constructing your Survey/Questionnaire Check List – continued • Put “sensitive” questions at the end • Ask how they would change the program “under ideal circumstances” • Avoid yes/no answers – ask for “degrees” of positive or negative • Ask for their input in improving the program rather than simply evaluating it. For instance, ask not “Is this a successful program?” but rather “What factors increase or decrease the success of this program?”
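The advice on slide 26 to vary negative and positive statements in an index implies a reverse-coding step at analysis time, so that a high score always means the same thing. A minimal sketch, again with hypothetical item names:

    import pandas as pd

    # Hypothetical 1-5 Likert items; q2 and q4 stand in for negatively
    # worded statements. Placeholder names and file only.
    df = pd.read_csv("survey.csv")
    negative_items = ["q2", "q4"]

    # Reverse-code the negative statements (1<->5, 2<->4) so a high
    # score always means a favorable view before items are averaged.
    df[negative_items] = 6 - df[negative_items]
    df["attitude_index"] = df[["q1", "q2", "q3", "q4", "q5"]].mean(axis=1)
    print(df["attitude_index"].describe())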

  28. Dealing with Social Desirability in Your Research Design Check List • If possible, don’t evaluate your own program • An “outsider” would tend to be more objective and participants would be more likely to provide unbiased answers. • Have a variety of participants evaluate the program so you can look for consistencies/inconsistencies in answers. • Students • Teachers • Parents of participants • Community Partners

  29. Dealing with Social Desirability in Your Research Design Check List - continued • Use multi-methods so you can compare results across methods, check whether you get similar results and look for additional insights. These could include: • Focus groups • Participant observation • Surveys • Intensive interviews • Content analysis Content analysis is especially important for researchers who identify tangible products (e.g., bushels of grain) as their outcomes.

  30. Dealing with Social Desirability when Analyzing the Data Check List • Compare your program with other programs • Compare across different levels of participation within your sample to see if there are variations • Compare across different types of participation within your sample (i.e., in our study, we compared across types of Service Learning projects).

  31. Dealing with Social Desirability when Analyzing the Data The most important thing to remember here is NOT to ask just whether the program was successful, but rather HOW and WHEN it is most successful. Check List • Compare across different “types” of participants (this would include males vs. females, parents vs. children, rural vs. urban dwellers). • Compare scores across questions – especially questions that measure the same outcomes. • Compare answers across time (e.g., fall vs. summer participants).
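A minimal sketch of the comparisons in this checklist: mean outcome scores broken out by participant type and by time of participation. The grouping columns and the outcome index below are hypothetical placeholders.

    import pandas as pd

    # Hypothetical combined response file; gender, locale, project_type,
    # semester and outcome_index are placeholder column names.
    df = pd.read_csv("all_responses.csv")

    # Compare mean outcomes across "types" of participants and across
    # time. Uniform highs everywhere are a social-desirability warning
    # sign; meaningful variation suggests real signal.
    for group in ["gender", "locale", "project_type", "semester"]:
        print(df.groupby(group)["outcome_index"]
                .agg(["mean", "std", "count"]), "\n")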

  32. [Slide graphic: Laboratory Experiments • Intensive Interviews]

  33. Evaluation of the Service Learning Program • Examples of data showing that the program is producing the desired outcomes • Collected from teachers • Using telephone surveys • Using focus groups

  34. From Focus Groups: How Do Teachers Evaluate Service Learning? As One Teacher Stated, “Service Learning is the most powerful and impactful thing I ever did in the classroom as a teacher. It hooked me, and I am a believer in the power.” Another Teacher Claimed, “I think this program has transcended anything that anyone expected when they began the program. It has extended beyond what they thought it could achieve.”

  35. Focus Groups - What do the Teachers Think Their Students Learned? One Teacher Argued, “I could have never ever taught the lessons they learned about human nature.” While Another Claimed, “It teaches kids the skills that are not book skills….skills like how to think, how to plan, how to organize, how to manage - stuff you can read about in a book, but until you do it, you don’t know you have the ability to do it.”

  36. Focus Groups – What do Teachers Think Their Students Learned? One Teacher Stated, “school is not as…engaging as when they learn through these projects…they are learning all of these things by action – their great public speaking skills, their writing skills, their marketing…” Another Teacher Explained, “in the writing TAKS, we had to write with a prompt so it kind of helped with the writing and the reading TAKS too.”

  37. Evaluation of the Service Learning Program • Examples of data showing that the program is producing the desired outcomes • Collected from parents and community partners • Using telephone surveys • Using focus groups

  38. From Focus Groups: Quotes by Community Partners One Community Partner Described Their Relationship with the School: “We actually came to the schools…and we were looking for assistance. It’s a great marriage. We are still married.” And when Describing the Benefits for Students: “…we’ve watched students mature into more socially aware students - much more mature. It’s amazing. It’s just amazing.”
