
School Counselors and Program Evaluation

Washington School Counselors Association

John Carey

National Center for School Counseling Outcome Research

UMass Amherst

www.cscor.org


Data-Driven School Counseling Programs

  • Implement comprehensive programs based on national design and local need

  • Use data to determine directions (data driven decision making, needs assessment)

  • Measure results (program evaluation)

  • Share successes


What Data Do We Use?

The ASCA National Model identifies three broad categories of data sources:

  • Student Achievement Data

  • Achievement-Related Data

  • Standards and Competency Data


Student Achievement Data

1. Norm-Referenced Standardized Tests

  • Scores referenced to national average

  • PSAT, SAT, ACT, Iowa, Metropolitan

  • Predictive Validity

    2. Criterion-Referenced Standardized Tests

  • Scores referenced to performance standards

  • State achievement tests (AIMS)

  • Content related to state curriculum frameworks

  • Content Validity


Student Achievement Data

3. Performance tests or changes in achievement levels (advancement in Math or English, for example)

4. Portfolios

5. Course grades and GPA

6. Completion of college-prep requirements

7. Drop-out rate


Achievement-Related Data

  • Attendance rates

  • Behavioral problems

  • Student attitudes

  • Discipline referrals

  • Suspension rates

  • Drug, Tobacco, and Alcohol use patterns

  • Parent involvement

  • Extracurricular activities


Standards and Competency Related Data

1. College Placements

2. Financial Aid Offers

3. Vocational Placements

4. Percentage of students who:

  • Have 4- or 6-year plans

  • Participate in job shadowing

  • Have completed career interest inventories

ASCA National Standards


Data Sources vs. Data Types

The ASCA National Model identifies three data types for use in program evaluation:

  • Process Data – What was done for whom?

  • Perception Data – Attitudes, opinions, beliefs - generally self-report data

  • Results Data – Objective and measurable student outcomes such as academic achievement, attendance, and disciplinary interventions


Program Evaluation: Process Data

  • Process Data: What was done for whom?

    • Who received services?

      • Ninth graders? Students at risk of failing math?

    • What did they receive?

      • Curriculum intervention? Small-group intervention?

    • When did they receive it?

      • All year? Twice? For 30 minutes?

    • Where and How was it provided?

      • In the classroom? After school?


Program Evaluation: Process Data

  • Process data alone does not tell us whether or not the student is different (in behavior, attitude or knowledge) as a result of this activity.

  • Coupled with results data, process data can help identify what factors may have led to success in an intervention.


Program Evaluation: Perception Data

  • Perception data measures how students are different as a result of an intervention

    • Did students gain competencies?

      • Every 10th grade student completed a career interest inventory.

      • 85% of 10th graders identified the 4 steps in the career decision making process.

    • Did they gain knowledge?

      • 87% of 9th graders demonstrated knowledge of graduation requirements.

    • Were there changes in their attitudes or beliefs?

  • 86% believe that pursuing a career that is non-traditional for their gender is acceptable.


Program Evaluation: Perception Data

Differences in student knowledge, competency and attitudes are measured through:

  • Pre-post tests

    • What do students know/believe before and after the intervention?

  • Completion of an activity

    • Completion of a 4-year plan

  • Surveys

    • What do students say they believe or know?


Program Evaluation: Results Data

  • Results data is the proof that the intervention has or has not influenced behavior.

    • An intervention may occur (process data), students may know the information (perception data), but the final question is whether or not the students are able to utilize the knowledge, attitudes and skills to affect behavior (results data).


Program Evaluation: Results Data

  • Results data can be complex because many factors impact behavior change.

    • An increase in enrollment at a vocational high school may be due to an intervention implemented for MS students. Conversely, finding no changes in results data does not mean that an intervention has necessarily been unsuccessful.


Clarifying Terms

  • Research

  • Action Research

  • Program Evaluation


Program Evaluation Process

  • Program Evaluation allows practitioners to evaluate programs and interventions in their specific contexts.

  • Practices can change immediately and in an ongoing manner as data are collected and analyzed.


Program Evaluation Process

1. Identify the construct(s)

2. Review what is known

3. Develop specific hypotheses or questions you would like to answer and plan the program evaluation accordingly

4. Gather the data

5. Analyze the data

6. Interpret results and disseminate and use findings

7. Evaluate the process


Conducting Program Evaluation: Identify the Construct(s)

1. Review the mission and goals of your program to identify the constructs you would like to look at. Ask yourself:

  • How are people different as a result of the school counseling program?

  • What is a question I want to answer?

  • What do I wish was different?


Conducting Program Evaluation: Identify the Construct(s)

Constructs often have “sub-constructs”

Define the construct/question in clear, specific language. The more specific you are, the easier it will be to measure later.

You can use the ASCA National Model to help you define your construct.


Conducting Program Evaluation: Identify the Construct(s)

The ASCA National Model domains

  • Personal/Social Development Domain

  • Academic Domain

  • Career Domain


Conducting Program Evaluation: What is Already Known?

2. Review what is known about your questions.

  • Has anyone in your organization asked this question before?

  • Who might have information?

  • What is the relevant research in professional journals?

  • What does an Internet search find on this topic?


Conducting Program Evaluation: Develop Hypotheses

3. Develop hypotheses or questions you would like to answer and plan the research process accordingly.

  • Ask yourself what you think the answer(s) to your question(s) will be.

  • Identifying the hypothesis helps you identify your biases, which may impact your process.

  • What is the opposite of your hypothesis (the “null hypothesis”)? What would the data look like if you were wrong?


Considerations

  • What are your biases/Mental Models? How are they impacting the questions you’re asking and the places you’re looking?

  • Evaluating findings from research literature and internet searches

    • What is the source? How reliable is it?

    • What are the strengths/weaknesses of the research design, sampling, effect size, measures used, treatment fidelity, researcher bias, instrument reliability and validity?


Considerations

  • Data:

    • How accurate is the data you’ve chosen to use?

    • What’s missing?

  • Instruments:

    • Reliability and validity

    • Just because it exists doesn’t mean it’s well done


Considerations

  • Sampling and Research Design:

    • Size of sample

    • Comparability of sample and control

    • Matching vs. random assignment

    • Assuring fidelity of treatment

      • Doing same thing across different groups?


Considerations

  • Ethical Considerations:

    • Consent

    • Human Subjects Review

    • Denying access to interventions - remediation?

  • Data Analysis

    • Do you have the capacity to analyze the data?


Conducting Program Evaluation: Develop Hypotheses

EXAMPLE:

Project Explorers – an after-school program at a vocational high school for middle school (MS) students

Hypotheses/Questions:

Does participation in Project Explorers lead to:

  • Increased enrollment?

  • Increased awareness of the offerings of the VHS?

  • Career exploration?

  • Improved career decision making abilities?


Activity #1

With your SC program in mind, think about these questions to help you identify constructs:

  • How are people different after participating in my program?

  • What is a question I want to answer?

  • What do I wish was different?

  • Use these questions to develop a specific, measurable question you would like to answer, and place the answer on the planning tool


    Conducting Program Evaluation: Gather Data

    4. Gather the data.

    • What information do you need in order to answer your question?

    • Use multiple sources of data, or multiple outcome measures, wherever possible (triangulation).

    • Decide whether you need to consider student achievement data, psychosocial data, career data, school data, process data, perception data, and/or results data.


    Conducting Program Evaluation: Triangulation

    [Diagram, built up over four slides: “THE QUESTION” sits at the center, linked to three nodes – Process Data, Perception Data, and Results Data. For the Project Explorers example the nodes are filled in as: Results Data → “Increased Enrollment”; Process Data → “Participation: # of sessions, # of students”; Perception Data → “Survey Data.”]


    Project Explorers Example

    • Results Data

      • Did the total number of students enrolling in vocational programs increase when compared to the last three years?

    • Perception Data

      • Survey using combination of validated items from pre-existing survey (MCGES), and “hand written” items


    Project Explorers Example (Perception Data)

    Circle the correct response for questions 1 - 3.

    1) Which school(s) prepares its graduates for skilled employment, such as work as a mechanic?

    VHS My local high school Both schools

    2) Which school(s) prepares its graduates for 2-year colleges, such as Middlesex Community College?

    VHS My local high school Both schools

    3) Which school(s) prepares its graduates for 4-year colleges, such as UMASS?

    VHS My local high school Both schools

    4) List as many of the shops at VHS as you are aware of:


    Project Explorers Example (Perception Data)

    5) Check all that apply.

    I have used the following resources to learn about careers:

    ___Internet ___ Expert or person working in the field

    ___Counselor/Teacher ___Other (please specify)_____________

    ___Family member ___ I have not researched any career

    6) Circle the answer that indicates the level of your confidence for each item.

    (copyright laws prohibit duplication of these items – Missouri Comprehensive Guidance Evaluation Survey (MCGES), Lapan)

    7) Please check the 3 areas of employment that you are most interested in at this time.

    ____ Agriculture ____ Education

    ____ Arts & Communication ____ Business and Computer Services

    ____ Construction ____ Health and Hospitality Services

    ____ Manufacturing ____ Transportation


    Project Explorers Example (Process Data)

    • Process Data documents “what was done for whom”

      • Number of students participating (overall)

      • Attendance at each shop


    Activity #2

    • Identify the PROCESS, PERCEPTION, and RESULTS data needed to answer your question / hypotheses

    • Use the planning tool!


    Conducting Program Evaluation: Gather Data

    • Where is the data?

      • Does it already exist?

        • School records, intake forms, test results

      • Will you generate your own data?

        • Surveys

        • Interviews

        • Observations

    • Multiple data sources help you more accurately get at the complexity of a situation, whereas one measure or data source will give you a snapshot view.


    Conducting Program Evaluation: Gather Data

    • Select and/or develop the instruments you will use to gather the data. Possibilities include: (more later)

      • Surveys

      • Tests (of achievement, aptitude, attitude, etc.)

      • Behavioral checklists or observations

      • School records

      • Performance assessments

      • Interviews


    Conducting Program Evaluation: Gather Data

    • Identify and follow ethical and legal standards:

      • No participant should be exposed to physical or psychological harm.

      • Permission to use confidential data must be obtained.

      • Participation in a study is always voluntary.

      • Participants may withdraw from the study at any time.

      • Participants’ privacy rights must be respected.


    Conducting Program Evaluation: Gather Data

    • Identify the group/sample to be studied:

      • Ideally, either the entire population is involved in the study (the class, grade, or school) or the group studied is a random sample of the population.

      • Stratified sampling uses a smaller sample that has the same proportions as the larger population.

      • Systematic random sampling is when every x number of students is chosen from the whole population (every 4th student on the attendance list, for example).
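The two sampling schemes above can be sketched in a few lines of Python (one possible tool; the roster, strata, and 20% sampling rate are hypothetical, not from the presentation):

```python
import random

# Hypothetical 100-student roster
roster = [f"student_{i:03d}" for i in range(1, 101)]

# Systematic random sampling: every 4th student, starting at a random offset
start = random.randrange(4)
systematic_sample = roster[start::4]  # 25 of the 100 students

# Stratified sampling: each grade contributes in proportion to its size
strata = {"grade9": roster[:40], "grade10": roster[40:70], "grade11": roster[70:]}
stratified_sample = []
for grade, students in strata.items():
    k = round(len(students) * 0.20)  # sample 20% of each stratum
    stratified_sample.extend(random.sample(students, k))
```

Either approach gives every student a known chance of selection, which is what lets you generalize from the sample back to the whole group.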


    Conducting Program Evaluation: Gather Data

    • Once data sources and measures are identified, ethical standards are considered, and the sample is identified, data can be gathered!

    • Ways to collect data include:

      • Questionnaires or surveys

      • School records

      • Interviews

      • Observation


    Collecting Data: Demographics

    • Regardless of what other types of data you collect and analyze, demographic data is a critical first step

    • Collect the types of demographic data that are important for making sense of your results, and for describing who contributed to the data you collected


    Collecting Data: Demographics

    • Questions to consider:

      • What information will I need to adequately describe my sample?

        • Think of the information you will need to provide to let unfamiliar people know who your program served

      • What information will I need to analyze the data the way I want to?

        • Think of the various ways you can describe your results . For example, the impact on different ethnic groups, special education vs. regular education, etc


    Demographic Variables

    • Ethnicity

    • Gender

    • Class (Parent Educational Level)

    • Language Level (Limited English Proficient)

    • Low Income (Free or Reduced School Lunch)

    • Acculturation, Migration (Mobility)

    • Special Needs

    • School Performance (GPA, Achievement Quartile)


    Demographic Variables (school)

    • Student Participation in School Programs

    • Student Participation in Extracurricular Activities

    • Grade Level

    • Age

    • Number of Years in Current School

    • Parent Participation Level


    Identifying Effective Surveys

    • Key questions to consider:

      • How does the survey relate specifically to my program’s goals?

      • Is the survey appropriate for my age group of students?

      • How reliable and valid is the survey?


    Creating Your Own Surveys

    1. Define Constructs

    2. Select/Write Items

    3. Write Instructions

    4. Test Survey

    5. Edit Items and Instructions


    Developing Your Own Surveys: Writing Items

    Sometimes pre-existing surveys and measures are not able to adequately capture the attainment of the goals of your program, or the question you would like to answer

    Open-ended vs. closed questions:

    Open-ended questions can provide rich data, but are hard to summarize:

    “What do you think about Project Explore?”

    “Why is career planning important?”

    Closed questions are most common in surveys because the results are easy to summarize


    Developing Your Own Surveys: Writing Items

    Characteristics of Good Likert Items

    • The Likert technique presents a set of attitude statements. Subjects are asked to express agreement or disagreement on a five-point scale. Each degree of agreement is given a numerical value from one to five. Thus a total numerical value can be calculated from all the responses or for subsets of responses.

    • 1 = Strongly Disagree, 2 = Disagree, 3 = Neither Disagree nor Agree, 4 = Agree, 5 = Strongly Agree
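The Likert scoring rule above (sum the 1–5 values across items) can be sketched in Python; note that reverse-stated items, discussed on the next slide, must be flipped before totaling. The five-item survey and the choice of which item is reversed are hypothetical:

```python
# Hypothetical five-item Likert survey; q3 is reverse-stated,
# so its value must be flipped (1<->5, 2<->4) before totaling.
responses = {"q1": 4, "q2": 5, "q3": 2, "q4": 4, "q5": 3}
reverse_items = {"q3"}

def item_score(item, value, scale_min=1, scale_max=5):
    """Flip reverse-stated items so a higher score always means more agreement."""
    return (scale_max + scale_min - value) if item in reverse_items else value

total = sum(item_score(item, v) for item, v in responses.items())
print(total)  # 4 + 5 + 4 + 4 + 3 = 20
```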


    Developing Your Own Surveys: Writing Items

    Characteristics of Good Likert Items

    • Simple words—readability.

    • Use precise words—avoid ambiguity.

    • Reverse stated items—response set.


    Developing Your Own Surveys: Writing Items

    • ISOMORPHISM – Each statement should clearly map onto one construct definition

    • SINGULARITY – Each statement should contain one idea – avoid “double-barreled” items: “I work hard in school because I have high expectations for myself”


    Developing Your Own Surveys: Writing Items

    • SOCIAL DESIRABILITY MANAGEMENT – Avoid questions that may have more socially appropriate responses

      “Teachers are good people to go to for help”

    • KNOWLEDGE LIABILITY – Each statement should be answerable by the potential respondents

      “My career decision making abilities have increased because of Project Explore”


    Demographic Items

    • “Closed” format whenever possible

    • Meaningful Categories

    • Within knowledge of respondents


    Developing Your Own Surveys: Writing Instructions

    • Should be clearly defined, easy to understand, and as brief as possible

      • Include the following:

        • Purpose of survey

        • How to answer items

        • Whether responses are anonymous or not

        • Honest answers are requested

        • Answer all items

        • Survey is not a test


    Developing Your Own Surveys: Testing the Survey

    Why test the survey?

    • Identify possible problems

    • Evaluate wording of items

    • Ensure clarity

    • Assess amount of time required to complete survey


    Developing Your Own Surveys: Testing the Survey

    • Pre-testing items (AKA “pilot testing”) – not the same as pre/post testing

    • Survey is administered to small number of “readily available” people

    • Respondents answer items as though part of the study

    • Feedback is given about the survey and items themselves


    Developing Your Own Surveys: Revising the Survey

    • Make changes to survey items based on “pilot test” information and participant feedback

    • Administer surveys to participants


    Activity #3

    • Identify the source of each type of data

      • Process data

      • Perception data

      • Results data

    • What demographic data do you need to collect?

      • Is the impact of your program different for different groups?

      • Will the results of your evaluation vary based on demographic characteristics?


    Conducting Program Evaluation: Analyze the Data

    • 5. Analyze the data:

      • Before data can be analyzed it may need to be edited, coded and organized for analysis.

      • Data is often input into a software program such as Excel or SPSS.

      • EZAnalyze is an inexpensive (free for you) data analysis program designed specifically for educators – for more information, visit www.ezanalyze.com


    Conducting Program Evaluation: Analyze the Data

    • Descriptive statistics describe the data and can provide information about how a group has changed over time:

      • Measures of central tendency

        Mean, median, mode

      • Measures of variability

        Variance, standard deviation, range

      • Measures of relative standing

        Percentile rank
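Every descriptive statistic listed above is available in Python's standard library (one possible alternative to Excel or EZAnalyze); the test scores here are hypothetical:

```python
import statistics

scores = [72, 85, 85, 90, 64, 78, 85, 95, 70, 88]  # hypothetical posttest scores

mean = statistics.mean(scores)      # central tendency: 81.2
median = statistics.median(scores)  # central tendency: 85
mode = statistics.mode(scores)      # central tendency: 85
sd = statistics.stdev(scores)       # variability: sample standard deviation
rng = max(scores) - min(scores)     # variability: range, 95 - 64 = 31

def percentile_rank(score, data):
    """Relative standing: percent of scores at or below the given score."""
    return 100 * sum(s <= score for s in data) / len(data)
```

For example, `percentile_rank(85, scores)` is 70.0 here, because 7 of the 10 scores fall at or below 85.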


    Conducting Program Evaluation: Analyze the Data

    • Inferential statistics provide additional information:

      • Looking carefully at how a group changes over time

        • Use t-tests or Chi-Square

      • Looking at differences between control and intervention groups

        • Use t-tests or Chi-Square

      • Looking at differences among more than 2 groups

        • Use Analysis of Variance (ANOVA)

    • Finding a member of the evaluation team already comfortable with data input and analysis can make this part much less intimidating!


    Statistics 101

    Using data requires an understanding of some basic statistics – nothing too fancy is needed!

    Having a handle on some common terms will allow you to make sense of all the numbers and increase your ability to use data


    Statistics 101 – Common Terms

    • N - Number of participants

    • Mean – the “average” score – all scores are added up and divided by the N

    • Standard Deviation – how far, on average, a single score deviates from the mean score


    Statistics 101 – Common Terms

    A frequency histogram can be used to show how people scored on a variable – this is useful for demonstrating how several of these concepts work.

    [Histogram: number of people (y-axis) by score on an IQ test (x-axis, 55–145).]


    Statistics 101 – Common Terms

    [Histogram: IQ scores, Mean = 100, SD = 15.]


    Statistics 101 – Common Terms

    [Histogram: IQ scores, Mean = 100, SD = 30 – a wider, flatter distribution.]


    Statistics 101 – Common Terms

    [Histogram: IQ scores, Mean = 100, SD = 10 – a narrower, taller distribution.]


    Statistics 101 – Common Terms

    • Median – the “middle” number. Obtained by putting all the observed values on a line and finding the one that lands in the middle. Useful for describing “skewed” distributions

    • Mode – the most frequent number


    Statistics 101 – Common Terms

    [Histogram: IQ scores, Mean = 100, Median = 100, Mode = 100 – a symmetric distribution.]


    Statistics 101 – Common Terms

    [Histogram: IQ scores, Mean = 100, Median = 90, Mode = 80 – a skewed distribution.]


    Statistics 101 – Common Terms

    [Histogram: household income in thousands, Mean = 62, Median = 47, Mode = 40 – a skewed distribution, where the median describes the typical household better than the mean.]


    Statistics 101 – Common Terms

    • Z-Score – a “standardized score”: the distance of a person’s score from the mean, divided by the standard deviation

    • Percentile Rank – tells you the relative position of a person’s score, compared to other people’s scores


    [Normal curve diagram, from http://www.webenet.com/bellcurve2.gif]


    Statistics 101 – Common Applications

    • Categorical Variable – a variable that divides data into groups; has little or no numeric meaning

    • Dependent Variable – a variable that contains information you are interested in that has numeric value

    • Disaggregation – sorting a dependent variable by a categorical variable (or variables)

    • Correlation – a number between -1 and +1 used to describe the relationship between two variables
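Disaggregation and correlation can be sketched in a few lines of Python (the records, variable names, and values are hypothetical, not from the presentation):

```python
from collections import defaultdict

# Hypothetical records: "group" is the categorical variable,
# "gpa" and "absences" are dependent variables.
records = [
    {"group": "grade9",  "gpa": 2.8, "absences": 9},
    {"group": "grade9",  "gpa": 3.4, "absences": 4},
    {"group": "grade10", "gpa": 3.0, "absences": 6},
    {"group": "grade10", "gpa": 3.8, "absences": 1},
]

# Disaggregation: sort the dependent variable by the categorical variable
by_group = defaultdict(list)
for r in records:
    by_group[r["group"]].append(r["gpa"])
group_means = {g: sum(v) / len(v) for g, v in by_group.items()}

def pearson(xs, ys):
    """Pearson correlation: a number between -1 and +1."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson([d["gpa"] for d in records], [d["absences"] for d in records])
# r is strongly negative here: more absences go with lower GPA
```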



    Statistics 101 – Common Applications

    • T-Tests

      • one sample – compare a group to a known value

        • For example, comparing the IQ of convicted felons to the known average of 100

      • paired samples – compare one group at two points in time

        • For example, comparing pretest and posttest scores

      • independent samples – compare two groups to each other

    • ANOVA - compare two or more groups, OR, compare at two or more points in time (repeated measures)
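As a concrete illustration of the paired-samples case, the t statistic can be computed by hand in Python (statistics packages such as SPSS or EZAnalyze do this for you; the pre/post scores and the 2.365 critical value for 7 degrees of freedom at the .05 level are an illustrative sketch):

```python
import math
import statistics

# Paired-samples t-test sketch: the same students before and after an intervention
pre  = [52, 60, 45, 70, 58, 63, 49, 55]
post = [58, 64, 50, 72, 66, 70, 48, 61]

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)

# t = mean difference divided by the standard error of the differences
t = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Compare |t| to the two-tailed critical value for n - 1 = 7 degrees of
# freedom (about 2.365 at the .05 level, from a standard t table)
significant = abs(t) > 2.365
```

With these hypothetical scores the mean gain is 4.625 points and t comes out well above the critical value, so the pre-to-post change would be called statistically significant.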





    Statistics 101 – Common Applications

    [Histogram: number of male and female students by days absent – overlapping distributions, non-significant t-test.]


    Statistics 101 – Common Applications

    [Histogram: number of male and female students by days absent – separated distributions, significant t-test.]


    Statistics 101 – Common Applications

    [Histogram: number of male and female students by days absent – non-significant t-test when the standard deviations increase.]






    Analyzing Data: Creating the Data Template

    • For the data to work with EZAnalyze, it needs to be structured a certain way

      • The first row MUST contain variable labels

      • The remaining rows MUST contain the data from the surveys, one row for each person responding

      • A few rules can be applied to help you properly structure your data template


    Analyzing Data: Creating the Data Template

    • Each survey should have a method for coding that will allow you to match what is entered into the Excel file with the paper-pencil survey

      • Number surveys

      • Use coding system

      • If identifying information is available, take steps to ensure confidentiality of respondents is maintained


    Analyzing Data: Creating the Data Template

    • Each POSSIBLE RESPONSE should be given its own column in the data template

      • For each question that allows only one response, each question will become one column

      • For each question that has multiple responses, each possible response will require a column
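The template rules above – variable labels in the first row, one row per respondent, and one 0/1 column per option of a multi-response question – can be sketched with Python's csv module (the column names and responses are hypothetical):

```python
import csv
import io

# First row holds variable labels; each following row is one respondent.
# The multi-response "resources used" question gets one 0/1 column per option.
template = io.StringIO()
writer = csv.writer(template)
writer.writerow(["id", "grade", "q1_school", "res_internet", "res_counselor", "res_family"])
writer.writerow([1, 10, "VHS", 1, 0, 1])   # respondent 1 checked Internet and Family
writer.writerow([2, 10, "Both", 0, 1, 0])  # respondent 2 checked Counselor only

template.seek(0)
rows = list(csv.reader(template))
```

A file laid out this way opens directly in Excel with one column per variable, which is the structure EZAnalyze expects.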


    Analyzing Data: Creating the Data Template

    Sample of what data will look like when only one response is possible for each question


    Analyzing Data: Creating the Data Template

    Question: Who do you trust in school?

    trust1 = counselor
    trust2 = teacher
    trust3 = principal
    trust4 = cafeteria
    trust5 = custodian


    Analyzing Data: Creating the Data Template

    • If you have PRETEST and POSTTEST data for a group of students, you will want to have both the pretest and the posttest for each student in the same row


    Analyzing Data: Checking for Accuracy

    • Once all of your data are entered, you need to check to make sure the data were entered accurately

      • If you have a lot of data, you can select a sample (10%) to spot check for accuracy

      • You can use EZAnalyze’s Descriptive Statistics function to get the range of scores contained in your dataset, then use the SORT function of Excel to find problem data

      • This is where having the ID number on BOTH the survey and in the Excel file comes in handy!
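A simple range check like the one described above can be sketched in Python (the item names, entered values, and the 1–5 Likert range are hypothetical):

```python
# Spot-check entered Likert data: every value should fall in the 1-5 range.
# An out-of-range value (like 55 below) usually means a data-entry typo.
data = {"q1": [4, 5, 3, 2, 5], "q2": [1, 3, 55, 4, 2]}

problems = {
    item: [v for v in values if not 1 <= v <= 5]
    for item, values in data.items()
}
problems = {item: bad for item, bad in problems.items() if bad}
print(problems)  # {'q2': [55]}
```

Once a bad value is flagged, the ID number on both the paper survey and the spreadsheet row lets you go back and correct it.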


    Analyzing Data

    This is initially difficult to get a handle on if you have not done this sort of thing before

    Using the EZAnalyze manual and tutorials combined with your own data will make the process more concrete


    Activity #4

    • How will you analyze the data to answer your questions?

      • Will statistical significance be determined?

      • Will disaggregating the results be useful? If so, how?

      • Remember to use data analysis procedures that will actually test your hypotheses.

    • Key terms and statistics (usually true)

      • Relationship = correlation

      • Increase = improvement from pre to posttest, a paired samples t-test

      • Differences among groups (based on demographic characteristics or treatment/control groups) = independent samples t-test, ANOVA, or Chi Square


    Conducting Program Evaluation: Interpret Results

    6. Interpret your results, and disseminate and use findings to inform practice.

    • What do the results of your analyses mean?

    • What did you find out?

    • Were your hypotheses correct?


    Conducting Program Evaluation: Interpret Results

    The goal of program evaluation is to use the results to inform practice. Some options include:

    • Make recommendations that will resolve a problem.

    • Make plans and decisions about interventions based on the findings.

    • Make program plans based on the findings.

    • Develop action plans based on the findings.


    Conducting Program Evaluation

    • Report conclusions (the accountability part)

      • Decide on audience(s)

      • Structure report/presentation so that the most relevant information is presented to audience

      • Don’t exclude important information

      • Present all relevant results even if they don’t support your hypotheses

      • Relate findings to purposes of evaluation, hypotheses and previous research

      • Make recommendations and decisions based on conclusions


    Thank You!

    Center for School Counseling Outcome Research
    www.cscor.org

    Visit the “Resources” section of the website for copies of the materials

