
Planning Engineering Education Research



Presentation Transcript


  1. Planning Engineering Education Research Facilitator: Matthew W. Ohland

  2. Workshop Objective • Describe steps in the planning and assessment of engineering education research • Give examples of each step • Small-group discussion to implement the step using a common example

  3. Workshop Agenda 4:00-4:05 Introduction 4:05-4:20 Defining the Purpose of the Evaluation (What questions are we asking?) 4:20-4:40 Clarify Project Objectives (What do we expect?) 4:40-5:00 Create a Model of Change (How will our efforts lead to the objectives?) 5:00-5:15 Select Criteria and Indicators (What data do we need to measure our progress?) 5:15-5:25 Identify Data Sources (Where will we get that data?) 5:25-5:45 Design Evaluation Research (How will we analyze that data?) 5:45-5:50 Monitor and Evaluate (What actually happened?) Use and Report Results (Who needs to know?) 5:50-6:00 Wrap-up and evaluations

  4. Step 1: Purpose of the Evaluation • Formative: Can we improve it? • Summative: Is there a benefit to doing this? • Efficacy: Does that benefit lead to better retention / grades? • Effectiveness: Can we achieve the same benefits on a larger scale? • Cost: Do the benefits justify the program cost?

  5. Step 1 Action Steps • Decide purpose(s) • Primary questions? Order? • Primary audience(s)? • How will the results be used?

  6. Step 1: Purpose of the Evaluation • Do we have a learning community? • Is it leading to better course retention / grades? • Can we achieve the same benefits cost-effectively on a larger scale?

  7. Step 2: Project Objectives • Impact: improve graduation rate by some % • Outcome: increase course passing rate by some %; student or faculty opinions / behaviors change • Process: reserve any facilities needed

  8. Step 2 Action Steps • Write down project objectives • Impact • Outcome • Process

  9. Step 2: Project Objectives • Increase graduation rate from 50% to 75% • Increase first-time Calculus passing rate from 50% to 70% • Students have positive opinion of group work • Secure dormitory and classroom space

  10. Step 3: Create a Model of Change • Link what you do to what you expect to happen • Identify assumptions you can assess • Choose relationships to test based on resources, where you anticipate problems, and where you have control / can make improvements • Example: common residence / classes → (in between, we need a sound theory of why this will happen) → more will graduate

  11. Step 3 Action Steps • Build a model of change, as specific / complete as needed • Review model assumptions • Use criteria to prioritize: resources, relevance, control

  12. Step 3: Create a Model of Change • Common residence / classes → affiliation • Affiliation → support • Support → performance • Performance → future performance • Future performance → more will graduate

  13. Step 4: Criteria and Indicators • Desirable properties: validity, reliability, sensitivity, ease of interpretation, usefulness • Example: define graduation rate, course pass rate, affiliation, life-long learning, enrollment

  14. Step 4 Action Steps • Define a set of indicators and criteria • Impact • Outcome • Process

  15. Step 4: Criteria and Indicators • Graduation rate • # graduated / # in original cohort • Calculus pass rate • # A, B, C / # in original cohort • D’s are no good – must be retaken • Affiliation • # in study group at end of sophomore year / # students • Number of students enrolled in program
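To make the indicator definitions above concrete, here is a minimal sketch (not part of the workshop materials) of how the graduation-rate, Calculus pass-rate, and affiliation indicators could be computed from per-student cohort records; the function name, record fields, and data are illustrative assumptions.

```python
# Hypothetical sketch: compute the Step 4 indicators from cohort records.

def indicator_rates(cohort):
    """cohort: list of dicts, one per student in the original cohort.
    Assumed keys: 'graduated' (bool), 'calc_grade' (str), 'in_study_group' (bool)."""
    n = len(cohort)
    graduation_rate = sum(s["graduated"] for s in cohort) / n
    # Pass = A, B, or C; a D must be retaken, so it does not count as passing.
    calc_pass_rate = sum(s["calc_grade"] in ("A", "B", "C") for s in cohort) / n
    affiliation_rate = sum(s["in_study_group"] for s in cohort) / n
    return {
        "graduation_rate": graduation_rate,
        "calculus_pass_rate": calc_pass_rate,
        "affiliation_rate": affiliation_rate,
    }

# Example with made-up data:
cohort = [
    {"graduated": True,  "calc_grade": "B", "in_study_group": True},
    {"graduated": False, "calc_grade": "D", "in_study_group": False},
    {"graduated": True,  "calc_grade": "C", "in_study_group": True},
    {"graduated": False, "calc_grade": "F", "in_study_group": False},
]
print(indicator_rates(cohort))  # each rate is 0.5 for this toy cohort
```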

  16. Step 5: Data Sources • Possible sources: exams, surveys, observations, student records, SAT scores • Considerations: frequency, resources, guidance, change • Examples: Institutional Research office (annual), course records (each semester), Registrar, Admissions, Assessment office; process measures are monitored until achieved

  17. Step 5 Action Steps • Define data sources • Define frequency of measurement

  18. Step 5: Data Sources • Grad / retention rate – Inst. Res. – annual • Calculus performance – Math dept. – includes course grades and common exam results – end of semester • Dorm space allocated – Housing – monitor until achieved • Survey of program participants • Observer evaluations of class interaction • SAT scores – Admissions
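One lightweight way to record the data-source and frequency decisions above is a simple measurement-plan structure. The sketch below mirrors the slide's examples, but the dictionary layout and any frequencies not stated on the slide (e.g., for the survey, observations, and SAT scores) are assumptions.

```python
# Hypothetical measurement plan: indicator -> data source and collection frequency.
measurement_plan = {
    "graduation_retention_rate": {"source": "Institutional Research", "frequency": "annual"},
    "calculus_performance": {"source": "Math dept. (course grades + common exam)", "frequency": "each semester"},
    "dorm_space_allocated": {"source": "Housing", "frequency": "monitor until achieved"},
    "participant_survey": {"source": "Assessment office", "frequency": "each semester (assumed)"},
    "class_interaction": {"source": "Observer evaluations", "frequency": "each semester (assumed)"},
    "sat_scores": {"source": "Admissions", "frequency": "once, at entry (assumed)"},
}

for indicator, plan in measurement_plan.items():
    print(f"{indicator}: {plan['source']} ({plan['frequency']})")
```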

  19. Step 6: Design Evaluation Research • Threats to address: selection, mortality (placebo is not an issue) • Qualitative: interviews, focus groups, systematic observation • Quantitative: non-experimental (posttest only; pretest-posttest), quasi-experimental (time series; nonequivalent control), experimental (pretest-posttest control; multiple intervention)

  20. Step 6 Action Steps • Design evaluation research studies for key questions

  21. Step 6: Example Designs • Example (interviews): student expectations – invitation process modified; resource utilization – approaches to motivate attendance • Example (efficacy): non-equivalent control group; test for selection bias using baseline measures; matched pairs / groups; Calculus grades and overall GPR
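As an illustration of the efficacy design above, the sketch below checks a non-equivalent control group for selection bias on a baseline measure before comparing outcomes. The data are made up, and the two-sample t-test (via scipy.stats) is just one plausible choice, not the workshop's prescribed analysis.

```python
# Illustrative sketch: check baseline equivalence, then compare outcomes.
from scipy import stats

# Hypothetical baseline measure (e.g., SAT math scores) for each group.
participants_sat = [620, 580, 640, 600, 610, 590]
comparison_sat = [630, 570, 600, 615, 585, 605]

# A baseline difference would suggest selection bias rather than program effect.
base = stats.ttest_ind(participants_sat, comparison_sat)
print(f"Baseline difference: t = {base.statistic:.2f}, p = {base.pvalue:.3f}")

# Hypothetical outcome measure (e.g., Calculus grade points) for each group.
participants_calc = [3.0, 2.7, 3.3, 2.3, 3.0, 2.7]
comparison_calc = [2.3, 2.0, 2.7, 2.3, 1.7, 2.3]

out = stats.ttest_ind(participants_calc, comparison_calc)
print(f"Outcome difference:  t = {out.statistic:.2f}, p = {out.pvalue:.3f}")
```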

  22. Step 7: Monitor and Evaluate • Establish project information system • Budget for evaluation • Evaluation meetings • Review and revise evaluation plan • Carry out studies

  23. Step 8: Use and Report Results • Report results – to everyone • Use results to make improvements

  24. Conclusions • Planning and assessment are essential • Start small if resources are limited • Develop a plan before starting • NSF and other agencies support well-designed educational research • CCLI—EMD / A&I, ASA, and other programs • Seek appropriate partners from education, psychology, sociology, statistics, etc.
