Evaluation and Assessment Strategies for Web-Based Education

Paula Frew, MA, MPH
Associate Director of Programs
Behavioral Sciences and Health Education
Emory University
Atlanta, Georgia, USA

Overview
  • Why Evaluate?
  • Types of Assessment
  • Theoretical Framework
  • Evaluation Research Questions
  • Methods & Data Collection Strategies
  • Frew (2001) Formative Evaluation Study of Web-Based Course Initiatives
  • Additional Resources
Introductions
  • About us
    • About the workshop leader
    • Ice-breaking exercise - getting to know each other
Evaluation Design Framework: A Process Exercise
  • Evaluation Purposes
  • Theory
  • Evaluation Questions
  • Sampling & Data Collection Strategies
  • Methods

Adapted from Robson, 2000 (Fig. 5.1, p. 80)
Why Evaluate?
  • Evaluation is the systematic acquisition and assessment of information to provide useful feedback about some object (program, technology, activity, etc.) - William Trochim, Cornell University (http://trochim.human.cornell.edu/kb/intreval.htm)
  • Learner evaluation: assess the worth or quality of learning and instruction in a web-based environment (Hanna et al., 2000)
Why Evaluate?
  • To improve a program/course
  • To assess outcomes and efficiency - provide “useful feedback” to administrators, faculty, sponsors, and other relevant constituencies
  • To find out how a program operates and understand why it works (or does not) - empirically driven feedback can aid decision-making and policy formulation
Why Evaluate?
  • Other reasons:
    • To generate knowledge - assess the academic/applied value, which includes topics such as:
      • Testing theory
      • Distinguishing types of interventions
      • Learning about measurements
      • Developing policy
Why Evaluate Learners?
  • To identify content areas that are unclear or confusing
  • To recognize content areas that need revision
  • To gather evidence to support revisions
  • To assess the effectiveness of your course
          • (Hanna et al., 2000)
Reflective Exercise: Your reasons to evaluate
  • Workshop exercise - what are your reasons for conducting an evaluation?
    • Detail specifics related to your own experience
    • Who are your stakeholders?
    • What is the political dynamic at work?
    • What is your relationship to the stakeholders and others involved in the process?
    • Conclude exercise: share with others at the workshop
Types of Assessment
  • Improvement-oriented evaluations:
    • Formative evaluation - strengthen or improve the object being evaluated
      • Scriven (1967) - form or develop programs
      • Patton (1994) - “developmental evaluation”
      • Focus on processes - what is going on
    • Quality-enhancement and continuous improvements
    • Local adaptation (for different cultures)
Types of Assessment
  • Knowledge-oriented evaluations:
    • Effectiveness
    • Theory-building
    • Policy making
Types of Assessment
  • Judgement-oriented evaluations:
    • Summative evaluation - examine the effects or outcomes of some object
      • Robson (2000) - an “end-of-term report”: what goals have been achieved?
      • Were needs met, were target audiences reached, and was the program implemented as planned?
    • Audit: accountability and quality control
    • Cost-benefit decisions
Program/Technology/Courses: Formative Evaluation Distinctions
  • Needs assessment - who needs the program, how great is the need, what might work to meet the need?
  • Structured conceptualization - helps define the program/technology, the target audience, and possible outcomes
  • Implementation evaluation - monitors the fidelity of the program or technology delivery
  • Process evaluation - investigates the process of delivering the program or technology, including alternative delivery procedures (Trochim, 1999)
Formative Evaluation: Learning & Instructional Focus
  • Helps to identify the knowledge and skills learners have gained in the course to date
    • Allows you to determine whether or not to introduce new content
    • Gives you feedback on the learners’ learning processes
    • Signals whether learners need additional practice or work in certain areas
    • Refocuses the learners’ attention
Program/Technology/Courses: Summative Evaluation Distinctions
  • Outcomes evaluation - investigate whether the program or technology caused demonstrable effects on specifically defined target outcomes
  • Impact evaluation - assesses the overall or net effects of a program or technology as a whole
  • Cost-effectiveness/Cost-benefit analysis - addresses questions of efficiency by standardizing outcomes in terms of their dollar costs and values (see the sketch after this list)
  • Meta-analysis - integrates the outcomes estimates from multiple studies to arrive at an overall summary judgement on an evaluation question (Trochim, 1999)
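To make the cost-effectiveness/cost-benefit idea concrete, here is a minimal Python sketch that standardizes an outcome in dollar terms as cost per successful learner for two delivery options. Every name and figure is hypothetical, invented for illustration rather than drawn from any study cited here.

```python
# Hypothetical cost-per-outcome comparison for two delivery options.
# All figures are invented for illustration.
options = {
    "web_based_course": {"total_cost": 48_000.0, "successful_learners": 160},
    "classroom_course": {"total_cost": 60_000.0, "successful_learners": 150},
}

for name, data in options.items():
    # Standardize the outcome in dollar terms: cost per successful learner.
    cost_per_outcome = data["total_cost"] / data["successful_learners"]
    print(f"{name}: ${cost_per_outcome:,.2f} per successful learner")
```

A full cost-benefit analysis would go one step further and assign a dollar value to each outcome so that costs and benefits can be compared directly.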
Summative Evaluation: Learning & Instruction Focus
  • Measures what learners have learned
  • Finalizes decisions about grades
  • Reviews new knowledge and skills the learners have gained in taking the course.
Reflective Exercise: What type of evaluation do you propose?
  • Given the perceived need of the stakeholders, what type of evaluation would you pursue? Why?
  • What challenges do you perceive in conducting this type of evaluation?
  • What would be gained in doing this type of evaluation?
  • Exercise conclusion: share with others
Theoretical Framework
  • Three approaches (Patton, 1994):
    • Deductive approach - scholarly theory guides the inquiry
    • Inductive approach - generate theory from fieldwork
    • User-focused approach - work with others in the evaluation context to extract and specify their implicit theory of action
Theoretical Framework
  • Deductive Approach Examples:
    • Diffusion of Innovations Theory (Rogers, 1995) - e.g., examine the adoption of educational innovations in academic settings
    • Adult Learning Theory (Knowles, 1980) - e.g., how learners acquire their skills, knowledge, understandings, attitudes, values and interests
Reflective Exercise: What Theoretical Approach will you take?
  • What is your theoretical orientation, if any, in conducting the evaluation?
  • Your reasons for this decision
  • Potential blind spots in your theoretical orientation
  • Alternative approaches
  • Exercise conclusion: share with others
Evaluation Research Questions
  • Formative Research Examples:
    • How should the program or technology be delivered to address the problem?
    • How well is the program or technology delivered?
    • What is the definition and scope of the problem/issue, or what is the question?
    • Where is the problem and how big or serious is it? (Trochim, 1999)
Evaluation Research Questions
  • Learning & Instruction: Formative Evaluation Examples:
    • How well is the instruction likely to work?
    • What obstacles are learners encountering, and how can they be overcome?
    • Are the selected instructional methods, media, and materials effective in helping learners learn?
    • What improvements could be made in the instruction for future use?
          • (Hanna et al., 2000)
Evaluation Research Questions
  • Summative Evaluation Research:
    • What was the effectiveness of the program or technology?
    • What is the net impact of the program? (Trochim, 1999)
Evaluation Research Questions
  • Learning & Instruction: Summative Evaluation Examples:
    • What did the learners learn?
    • Did the learners find the instruction interesting, valuable, and meaningful?
    • What instructional methods contributed to learner motivation?
          • (Hanna et al., 2000)
Reflective Exercise: Your research questions
  • Write up to 5 questions that will guide your evaluation work - what do you hope to learn from these questions?
  • Once answered, what impact do you see this information having on your intended audience (the stakeholders, institution, etc.)?
  • Conclusion: share with others
Program/Course/Technology Evaluation: Methods & Data Collection Strategies
  • Most used:
    • Informational interviews
    • Formal, open-ended interviews
    • Formal, questionnaire-based interviews
    • Focus groups
    • Participant observations
    • Document review & analysis
    • Statistical modeling/analysis
Learner Evaluation: Data Collection Strategies
  • Most used:
    • Pre-test/post-test quizzes of knowledge and skills (see the scoring sketch after this list)
    • Essays
    • Portfolios
    • Performance evaluations/learner self-assessment
    • Interviews
    • Journals
    • Reflective papers
    • Website development
    • Learner participation figures
    • Peer assessment
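As a concrete illustration of the pre-test/post-test strategy above, the following Python sketch scores paired quiz results. The scores are invented, and the normalized-gain formula (gain achieved divided by gain possible, often attributed to Hake) is one common choice rather than the method of any study cited here.

```python
# Sketch of pre-test/post-test scoring, assuming paired scores on a
# 0-100 scale for each learner; the values below are invented.
pre = [55, 60, 42, 70, 65]
post = [75, 82, 60, 85, 80]

raw_gains = [b - a for a, b in zip(pre, post)]
# Normalized gain: the fraction of the possible improvement achieved.
norm_gains = [(b - a) / (100 - a) for a, b in zip(pre, post) if a < 100]

print(f"mean raw gain:        {sum(raw_gains) / len(raw_gains):.1f} points")
print(f"mean normalized gain: {sum(norm_gains) / len(norm_gains):.2f}")
```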
Reflective Exercise: Your methods/data collection approach
  • What methods would you employ in your evaluation study? Why?
  • How much time and resources do you think are necessary to conduct your evaluation?
  • How would you propose to address bias in your data collection approach?
  • Conclusion: share ideas about your methods with others
Quick Tips on Questionnaire/Survey Design
  • A popular method for gathering data in evaluation studies
  • Tips on writing survey questions (handout distributed to participants)
  • Statistical implications - how questionnaire data are turned into statistics, and how to avoid bias when writing questions (see the sketch after this list)
  • Other issues
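As a small illustration of how questionnaire data become statistics, the sketch below summarizes hypothetical 5-point Likert responses; the item wording and the numbers are invented. Reporting the median alongside the mean hedges against treating ordinal ratings as interval data, and collapsing the top two categories yields the kind of "percent agree" figure often seen in evaluation reports.

```python
import statistics

# Hypothetical 5-point Likert responses (1 = strongly disagree ... 5 = strongly agree)
# to an item such as "The website enhanced the course curriculum."
responses = [4, 5, 3, 4, 5, 2, 4, 4, 5, 3]

agree = sum(1 for r in responses if r >= 4)  # "agree" or "strongly agree"
print(f"median rating:    {statistics.median(responses)}")
print(f"mean rating:      {statistics.mean(responses):.2f}")
print(f"percent agreeing: {100 * agree / len(responses):.0f}%")
```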
Reflective Exercise: Write a brief questionnaire for your evaluation study
  • Write 5 good questionnaire items for your evaluation survey, based on your research questions
  • Who is your intended audience for this instrument (students? faculty? administrators?) - why does this matter?
  • Conclusion: share questions with others; why is it difficult to write good questions?
If Time Permits: Exploring Other Methods
  • Role-play exercise: conducting a focus group
  • Role-play exercise: how to interview participants
  • Basic descriptive statistics: how to analyze user statistics for web-based courses and programs (a brief sketch follows)
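For the user-statistics item above, a minimal sketch like the following shows the kind of descriptive summary intended; the student IDs and page-view counts are invented, standing in for whatever a course website's access log actually records.

```python
from statistics import mean, median

# Hypothetical per-student page-view counts from a course website's access log.
page_views = {"s01": 42, "s02": 7, "s03": 55, "s04": 0, "s05": 23, "s06": 31}

counts = list(page_views.values())
print(f"students with no visits:   {sum(1 for c in counts if c == 0)}")
print(f"mean visits per student:   {mean(counts):.1f}")
print(f"median visits per student: {median(counts)}")  # robust to a few heavy users
```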
Frew (2001): Formative Evaluation Study of Web-Based Course Initiatives
  • Why evaluate: Are we progressing toward meeting Teaching Strategic Plan (TSP) objectives?
  • Type of evaluation: formative - focus on improving the course website to meet specific TSP goals
  • Research question (example): To what extent do participants agree that the website enhanced the course curriculum?
  • Theoretical orientation: Diffusion of Innovations (DOI) and Adult Learning Theory (ALT)
Frew (2001): Formative Evaluation Study of Web-Based Course Initiatives (continued)
  • Study population: faculty and medical students (MEDI 605 course, Fall 2000)
  • Methods/data collection:
    • Faculty: user statistics, questionnaire
    • Students: user statistics, questionnaires, focus group
Frew (2001): Formative Evaluation Study of Web-Based Course Initiatives (continued)
  • Results (an example) - curriculum enhancement question:
    • 72.2% of faculty reported increased enthusiasm for the instructional resource
    • 94% of students reported that the website improved the quality of the course
  • Recommendations for future educational development (examples):
    • Greater faculty involvement in the diffusion process
    • Offer pre-course training
    • Improve access to computing resources
Additional Resources
  • Hanna, D.E., Glowacki-Dudka, M., & Conceição-Runlee, S. (2000). 147 Practical Tips for Teaching Online Groups: Essentials of Web-Based Education. Madison, WI: Atwood Publishing.
  • Knowles, M.S. (1980). The Modern Practice of Adult Education: From Pedagogy to Andragogy. Englewood Cliffs, NJ: Cambridge/Prentice Hall.
  • Patton, M. (1994). Utilization-Focused Evaluation (3rd ed.). London: SAGE Publications.
  • Robson, C. (2000). Small-Scale Evaluation. London: SAGE Publications.
  • Rogers, E.M. (1995). Diffusion of Innovations (4th ed.). New York: The Free Press.
  • Trochim, W. (1999). The Research Methods Knowledge Base (1st ed.). Cincinnati, OH: Atomic Dog Publishing.