
Assessment and Evaluation

Diane M. Bunce

Chemistry Department

The Catholic University of America

Washington, DC 20064

Bunce@cua.edu

Project Evaluation (NSF 1997)

What is it?

  • Systematic investigation of the worth of a project
    • Answers the questions:
      • What have we accomplished?
      • Are the project goals being met?
      • How effectively are the individual parts of the project working?
    • Provides information to communicate to stakeholders
      • NSF
      • Administrators
      • Faculty
      • Students
      • Others not involved in the project
        • Faculty peers
        • Other students
      • General Audience
Role of Evaluation in Project
  • Evaluation should be an integral part of the project
    • Address each goal of project.
    • Provide information to help improve project.
    • Offer necessary information to different groups of stakeholders.
    • Need not be adversarial.
      • Evaluation should include the criteria of proof accepted by each group of stakeholders.
  • Funded at a level consistent with the work that needs to be accomplished.
Two Stages of Evaluation
  • Formative Evaluation
    • Opportunity to measure progress and suggest modifications during the project
      • Implementation Evaluation
        • Is the project being conducted as planned?
      • Progress Evaluation
        • Measures impact of project components at various stages.
  • Summative Evaluation
    • Assesses the quality and impact of the full project at its completion.
Stakeholders
  • Who are the people who have an interest in the results of your project?
    • Funding Agency
      • Judge the success of your individual project.
      • Judge the impact your project results have on the broader goals of the program.
    • Administration of the participating institutions
      • Are the results compelling enough to make changes in the staffing, scheduling or facilities of the institution?
    • Faculty
      • Are the results compelling enough to change the way they teach and the curricular materials they choose?
    • Students
      • Does the experience support/challenge their academic/career choice?
        • Are more students attracted to science, mathematics, engineering majors?
    • Professional Peers
      • Will your project affect the way others teach?
Evaluation Questions

Questions must address the overall goals of the project.

  • Goal: To provide an authentic research experience for beginning undergraduates
    • Evaluation Question:
      • What constitutes authentic research?
      • Do students experience authentic research in this project as judged by science experts, faculty, administrators, and students themselves?
Evaluation Questions
  • Goal: To increase the number of students who choose to major in science, mathematics, engineering, and technology.
    • Evaluation Questions:
      • How many students declare/change their majors from non-SMET to SMET?
      • Does the % of students who successfully graduate with SMET majors increase as a result of this project?
      • Does the % of students who go on to SMET careers increase as a result of the project?
Evaluation Questions
  • Goal: To change the way introductory chemistry is taught to students.
    • Evaluation Question
      • How have faculty changed the process or curricula of these courses as a result of this project?
Measurement and Results of Evaluation Questions
  • What proof is needed to answer the questions asked?
    • Quantitative: Statistics (illustrated in the sketch after this list)
      • Significant Difference between groups
        • Difference unlikely to occur by chance
      • Effect Size
        • Change measured against standard deviation
    • Qualitative
      • Observation of specific behavior of groups
      • Perceptions of Stakeholders
    • Combination
  • Can results be applied to policy or program that motivated the study?
  • Will the study have direct implications for action?
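To make the two statistical measures concrete, here is a minimal Python sketch using made-up scores (nothing here is from the original slides) that computes an independent-samples t-test and Cohen's d:

```python
# Minimal sketch: significance test and effect size for two groups.
# The scores below are invented for illustration.
import numpy as np
from scipy import stats

treatment = np.array([78, 85, 82, 90, 74, 88, 81, 79])
control = np.array([72, 75, 80, 70, 77, 73, 76, 71])

# Significant difference: is the gap unlikely to occur by chance?
t_stat, p_value = stats.ttest_ind(treatment, control)

# Effect size (Cohen's d): the change measured against the pooled
# standard deviation of the two groups.
n1, n2 = len(treatment), len(control)
pooled_sd = np.sqrt(((n1 - 1) * treatment.var(ddof=1) +
                     (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2))
cohens_d = (treatment.mean() - control.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")
```

A small p suggests the difference is unlikely to have occurred by chance; d expresses how large the difference is in standard-deviation units.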
Matching Evaluation Methodology to Evaluation Questions
  • What, How or Why something occurs?
    • Qualitative
  • Does what occurs make a difference?
    • Quantitative
  • How important is it?
    • Mixed Methods
      • What occurs and why?
      • What does it mean?
      • How is it valued by stakeholders?
Comparison of Qualitative and Quantitative Evaluation Designs
  • Qualitative
    • Assumptions: Variables are complex, interwoven, and difficult to measure.
    • Purpose: Interpretation of the reality of the situation.
    • Approach: Ends with a hypothesis; the researcher is the tool.
  • Quantitative
    • Assumptions: Variables can be identified and relationships measured.
    • Purpose: Generalizability and prediction.
    • Approach: Starts with a hypothesis; the researcher manipulates formal tools.
Qualitative Evaluation Design
  • Use when investigating “What”, “How”, or “Why”
    • Results can be used
      • To answer the question directly or
      • To help develop a quantitative study.
  • Types of Qualitative Studies
    • Observer Study: observations of participants augmented by opportunities to talk with stakeholders.
    • Case Study: In-depth analysis of a limited number of participants.
Analysis of Qualitative Data (Miles and Huberman, 1994)
  • Data Reduction
    • Process of selecting, focusing and transforming data
    • Initial categorizations shaped by pre-established evaluation questions.
  • Data Display
    • Diagram, chart, or matrix used to organize data to facilitate drawing conclusions (see the sketch after this list).
  • Conclusion Drawing and Verification
    • Consideration of what the data mean.
    • Revisiting data to verify emerging conclusions.
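A data display need not be elaborate. The sketch below (the sources and coding categories are hypothetical, not from the slides) tallies coded excerpts into a category-by-source matrix of the kind Miles and Huberman describe:

```python
# Minimal sketch of the data-display step: tally coded excerpts into
# a category-by-source matrix to support conclusion drawing.
from collections import Counter

# Each coded excerpt: (source, category assigned during data reduction)
coded_excerpts = [
    ("student_3", "authentic research"),
    ("student_3", "mentoring"),
    ("faculty_1", "authentic research"),
    ("student_7", "career interest"),
    ("faculty_1", "mentoring"),
    ("student_7", "authentic research"),
]

counts = Counter(coded_excerpts)
sources = sorted({s for s, _ in coded_excerpts})
categories = sorted({c for _, c in coded_excerpts})

# Print the matrix: rows = sources, columns = categories.
print("source".ljust(12) + "".join(c.ljust(20) for c in categories))
for s in sources:
    row = "".join(str(counts[(s, c)]).ljust(20) for c in categories)
    print(s.ljust(12) + row)
```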
Quantitative Evaluation Design
  • Quantitative studies report on the aggregate (the average across students in the sample)
    • (Qualitative studies report on a limited number of specific cases)
  • Types of Quantitative Designs
    • Pure experimental design: assumes random assignment of subjects to treatment and control groups
    • Quasi-experimental design: random assignment is not assumed
Pure Experimental Design (Abraham & Cracolice, 1993-94)
  • Pretest/Post-test Design (analysis sketched after this list)
    • Each student randomly assigned to treatment or control groups.
    • Use pretest to measure variables of interest.
    • Implement treatment.
    • Use a post-test (same as or equivalent to the pretest).
  • Problems
    • Possible interaction between the pretest and the post-test (taking the pretest can itself affect post-test performance).
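A minimal sketch of one reasonable analysis of this design, assuming invented scores and a simple gain-score comparison between the randomly assigned groups:

```python
# Minimal sketch of a pretest/post-test analysis: random assignment,
# then compare gain scores (post minus pre) between groups.
# All data are simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
students = np.arange(40)
rng.shuffle(students)
treatment_ids, control_ids = students[:20], students[20:]

pre = rng.normal(70, 8, size=40)        # pretest scores
post = pre + rng.normal(5, 4, size=40)  # general improvement
post[treatment_ids] += 4                # simulated treatment effect

# Gain scores control for individual differences on the pretest.
gains_t = post[treatment_ids] - pre[treatment_ids]
gains_c = post[control_ids] - pre[control_ids]
t_stat, p_value = stats.ttest_ind(gains_t, gains_c)
print(f"mean gain: treatment = {gains_t.mean():.1f}, "
      f"control = {gains_c.mean():.1f}, p = {p_value:.3f}")
```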
Pure Experimental Design
  • Post-test-Only Design
    • Randomly assign students to treatment or control groups.
    • Present treatment.
    • Administer post test on variables of interest.
  • Potential problems
    • Treatment and control groups may not be equivalent even though randomly assigned.
    • If there is a differential drop out from treatment vs. control groups, a biased sample may result.
    • Without a pretest, students cannot be analyzed by subgroup (e.g., low- vs. high-achieving students).
Quasi-Experimental Design
  • Students are not randomly assigned to groups.
  • Pertinent covariables are identified, measured, and compared between treatment and control groups to establish that the groups are equivalent (a balance check is sketched after this list).
    • Gender, age, mathematical aptitude, logical reasoning ability, academic accomplishment, science vs. nonscience major, etc.
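A minimal sketch of such a balance check, with hypothetical covariable measurements for two intact groups:

```python
# Minimal sketch: compare pertinent covariables between intact
# treatment and control groups before a quasi-experiment.
# The measurements are simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
covariables = {
    "math_aptitude": (rng.normal(60, 10, 30), rng.normal(58, 10, 28)),
    "logical_reasoning": (rng.normal(22, 4, 30), rng.normal(23, 4, 28)),
}

# A significant difference on any covariable means the groups are not
# equivalent, and that variable must be accounted for in the analysis.
for name, (treat, ctrl) in covariables.items():
    _, p = stats.ttest_ind(treat, ctrl)
    flag = "  <-- groups differ" if p < 0.05 else ""
    print(f"{name}: p = {p:.3f}{flag}")
```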
Quasi-Experimental Design (Jaeger, 1988, p. 592)
  • Interrupted Time Series
    • Data are recorded for the pertinent variable at many consecutive points in time, both before and after the treatment is introduced.
    • Pattern of graph is studied before and after treatment.
      • If the graph shows an abrupt shift in level or direction at the point of treatment, the treatment is the likely cause (see the sketch after this list).
  • Advantage
    • Useful with a small number of subjects
  • Problem
    • Separating out other possible causes of the shift in graph can be difficult.
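The "abrupt shift" can also be estimated formally. A minimal sketch of a segmented regression on simulated data (the treatment point and effect size are invented):

```python
# Minimal sketch of an interrupted time series: estimate the level
# shift at the point the treatment is introduced, over and above the
# pre-existing trend. All data are simulated.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(24)                  # 24 consecutive observation points
treated = (t >= 12).astype(float)  # treatment introduced at point 12
y = 50 + 0.3 * t + 6 * treated + rng.normal(0, 1.5, size=24)

# Regress y on an intercept, the underlying trend, and a level shift.
X = np.column_stack([np.ones_like(t, dtype=float), t, treated])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"trend = {coef[1]:.2f} per point, level shift = {coef[2]:.2f}")
```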
Quasi-Experimental Design
  • One-classroom design
    • Covariables of interest identified and measured.
      • e.g., logical reasoning ability, gender, academic aptitude
    • Treatment applied.
    • Achievement/Attitude measures administered.
    • Analysis
      • Differences in the achievement/attitude of subgroups, identified through the covariables, are compared (sketched below).
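A minimal sketch of that subgroup comparison, using invented data and a median split on one covariable:

```python
# Minimal sketch of the one-classroom design: after the treatment,
# compare achievement across subgroups defined by a covariable.
# All data are simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
reasoning = rng.normal(20, 5, size=30)  # covariable, measured beforehand
achievement = 60 + 1.2 * reasoning + rng.normal(0, 6, size=30)

# Median split on logical reasoning ability defines the subgroups.
is_high = reasoning >= np.median(reasoning)
high, low = achievement[is_high], achievement[~is_high]
t_stat, p = stats.ttest_ind(high, low)
print(f"high reasoners = {high.mean():.1f}, "
      f"low reasoners = {low.mean():.1f}, p = {p:.3f}")
```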
Mixed Evaluation Methods (NSF 1997)
  • Approach combining both qualitative and quantitative methods
  • Advantages
    • Triangulation
      • Validity of results is strengthened by using more than one method to investigate a question.
    • Improved instrumentation
      • Qualitative open-ended survey results can be used to design a Likert-scale survey.
    • Mixed results may lead investigators to modify or expand research design.
Mixed Methods: Surveys
  • Open-ended question surveys (qualitative).
    • Analysis of responses can lead to development of a validated Likert Scale Survey.
  • Likert Scale Survey (quantitative)
    • Subjects respond to a series of questions/statements by indicating their amount of agreement/disagreement on a scale of 1 to 5 (a scoring sketch follows this list).
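A minimal sketch of scoring such a survey and estimating its internal consistency with Cronbach's alpha (the responses are made up):

```python
# Minimal sketch: score a 1-to-5 Likert survey and estimate internal
# consistency with Cronbach's alpha. Responses are invented.
import numpy as np

# rows = respondents, columns = survey items, values = 1..5 agreement
responses = np.array([
    [4, 5, 4, 3],
    [2, 2, 3, 2],
    [5, 4, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
])

k = responses.shape[1]
item_vars = responses.var(axis=0, ddof=1)
total_var = responses.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"mean item score = {responses.mean():.2f}, alpha = {alpha:.2f}")
```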
Mixed Methods: Interviews
  • Structured interviews
    • Fixed sequence of pre-determined questions
    • Interviewer does not deviate from script.
  • Semi-structured interviews
    • Interviewer is free to ask spontaneous questions of interviewee
      • Think Aloud interview
        • Present subjects with a demonstration or problem.
        • Interviewee solves and thinks aloud.
        • Interviewer asks questions such as
          • “What are you thinking?”
Mixed Methods: Interviews
  • Unstructured Interviews
    • Interviewer asks subject questions as they arise.
      • Not all subjects are asked the same questions.
Summary
  • Project Evaluation
    • Tied to project goals
    • Includes responses of stakeholders in project
    • Includes both formative and summative evaluation
  • Methodology
    • Match methodology to evaluation questions
      • Mixed Methods uses both qualitative and quantitative methods.
        • Validity of evaluation strengthened by using multiple methods (triangulation).
  • Results
    • Inform stakeholders
    • Lead to action
References
  • Abraham, M. R. and Cracolice, M. S. (1993-94). Doing Research on College Science Instruction: Realistic Research Designs. Journal of College Science Teaching, Dec. 93/Jan. 94, pp. 150-153.
  • Jaeger, R. M. (Ed.) (1988). Complementary Methods for Research in Education. 2nd Ed. Washington, DC: American Educational Research Association.
  • Miles, M. B. and Huberman, A. M. (1994). Qualitative Data Analysis, 2nd Ed. Newbury Park, CA: Sage.
  • The 2002 User-Friendly Handbook for Project Evaluation. (2002). Washington, DC: National Science Foundation (02-057).
  • User-Friendly Handbook for Mixed Methods Evaluations. (1997). Washington, DC: National Science Foundation (97-153).