Presentation Transcript

What Works In Teaching Science:

A Meta-Analysis of Current Research

Carolyn Schroeder, Ph.D.

Center for Math & Science Education

Texas A&M University

Texas A&M University Project Staff
  • Timothy P. Scott, Ph.D., Project Director
  • Carolyn Schroeder, Ph.D., Senior Research Associate
  • Homer Tolson, Ph.D., Senior Analyst
  • Yi-Hsuan Lee, Ph.D., Analyst
  • Tse-Yang Huang, Ph.D., Analyst
Advisory Board
  • Carol L. Fletcher, Ph.D., Texas Regional Collaboratives, UT Austin
  • Ginny Heilman, Region VI ESC
  • Anna McClane, Region IV ESC
  • Sandra S. West, Ph.D., Texas State University
  • Jo Ann Wheeler, Region IV ESC
Criteria for Selection of Studies
  • Dates: 01/01/1980 – 12/31/2004
  • Dealt with K-12 science education in the U.S.
  • Used student achievement (success, performance, etc.) as dependent variable
  • Used science education teaching strategies as independent variables
  • Was experimental or quasi-experimental
  • Reported effect size (ES) or statistics necessary to calculate it
  • Could not be totally correlational
  • Could not deal exclusively with special populations
  • Could not be included more than once (e.g., same study reported in a dissertation and journal article)
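To make the screening procedure concrete, here is a minimal sketch of the inclusion test written as a single predicate; the function and its parameter names are hypothetical illustrations of the criteria above, not the project's actual screening tool.

```python
from datetime import date

def meets_inclusion_criteria(
    pub_date: date,
    grade_band: str,                    # e.g., "K-12"
    country: str,                       # e.g., "U.S."
    dependent_variable: str,            # e.g., "student achievement"
    design: str,                        # "experimental", "quasi-experimental", or "correlational"
    reports_effect_size_or_stats: bool, # ES reported, or statistics needed to calculate it
    special_populations_only: bool,
    already_included: bool,             # same study already counted from another publication
) -> bool:
    """Screen one candidate study against the selection criteria listed above."""
    return (
        date(1980, 1, 1) <= pub_date <= date(2004, 12, 31)
        and grade_band == "K-12"
        and country == "U.S."
        and dependent_variable == "student achievement"
        and design in ("experimental", "quasi-experimental")
        and reports_effect_size_or_stats
        and not special_populations_only
        and not already_included
    )

# e.g., a 1995 quasi-experimental U.S. K-12 study reporting an effect size passes
print(meets_inclusion_criteria(date(1995, 5, 1), "K-12", "U.S.",
                               "student achievement", "quasi-experimental",
                               True, False, False))  # True
```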
Acquisition of Studies
  • Broad search conducted
  • Over 400 potential sources identified
    • Journal articles
    • Conference papers
    • Books
    • Dissertations
    • Government reports
    • Unpublished papers
Search Methods
  • Electronic searches
    • Web of Science
    • ERIC (EBSCO, First Search, CSA)
    • Academic Search Premier
    • PsycInfo
    • ProQuest Dissertations and Theses
  • Reference lists from previous meta-analyses, books & other articles, electronic sources (e.g., government sites)
  • Request to NARST listserv
  • Requests to specific developers of instructional packages for product studies
Coding of Studies
  • Study attributes coded:
    • Citation
    • Publication type (refereed journal, dissertation, etc.)
    • Study type (experimental, quasi-experimental, correlational)
    • Dependent variable (describe test used to measure achievement)
    • Independent variable (describe treatment & control or alternate treatment)
    • Length of treatment/study
    • Setting & characteristics
      • Schools (#, how selected, public/private, rural/urban, size, % free lunch)
      • Students (#, how selected, how assigned, gender, grade, ethnicity, SES)
      • Teachers (#, how selected, experience, gender, certification)
    • Study results (ES, p, t, F, eta squared, omega squared)
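As a rough illustration of how such a coding sheet might be represented in software, the sketch below captures the attributes listed above as a record; all type and field names are hypothetical and chosen only to mirror the list.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SchoolSetting:
    """Setting characteristics at the school level."""
    n_schools: int
    how_selected: str
    public_or_private: str
    rural_or_urban: str
    size: Optional[int] = None
    pct_free_lunch: Optional[float] = None

@dataclass
class CodedStudy:
    """One row of the coding sheet, mirroring the attribute list above."""
    citation: str
    publication_type: str        # refereed journal, dissertation, ...
    study_type: str              # experimental, quasi-experimental, correlational
    dependent_variable: str      # description of the achievement measure
    independent_variable: str    # treatment vs. control or alternate treatment
    treatment_length: str
    schools: SchoolSetting
    students: dict = field(default_factory=dict)   # n, selection, assignment, gender, grade, ethnicity, SES
    teachers: dict = field(default_factory=dict)   # n, selection, experience, gender, certification
    results: dict = field(default_factory=dict)    # ES, p, t, F, eta squared, omega squared
```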
Intercoder Objectivity
  • 3 randomly selected articles were coded independently by the senior analyst and 2 researchers
  • Degree of objectivity was 90% for two of the articles
  • The third article was identified as correlational and therefore was not coded
  • The senior analyst read and coded all articles and resolved any differences in coding values
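The 90% figure is a simple percent-agreement measure; a minimal sketch of that calculation, with made-up coding values rather than the project's data, is below.

```python
def percent_agreement(coder_a, coder_b):
    """Share of coded fields on which two independent coders recorded the same value."""
    assert len(coder_a) == len(coder_b)
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

# illustrative coding values for one article, 5 fields each
coder_1 = ["quasi-experimental", "journal", "grade 8", "10 weeks", "ES reported"]
coder_2 = ["quasi-experimental", "journal", "grade 8", "12 weeks", "ES reported"]
print(percent_agreement(coder_1, coder_2))  # 80.0
```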
Study Design Classification
  • True random assignment of schools/students to treatment and control groups
  • Quasi-experimental with match of schools/students to achievement and demographics of comparison school/group
  • Quasi-experimental with covariate adjustment for prior achievement differences
  • Quasi-experimental comparison of schools/subjects based on a claim of “similarity”
  • Quasi-experimental comparison of schools/subjects to region, state, or national data
  • Quasi-experimental single group pre-post comparison
  • Quasi-experimental treatment vs. control pre-posttest
  • Quasi-experimental multiple group ANOVA
Treatment Category Classification
  • Modified from Wise, 1996
    • Questioning strategies
    • Manipulation strategies
    • Enhanced materials strategies
    • Testing strategies (changed to Assessment strategies)
    • Inquiry strategies
    • Enhanced context strategies
    • Instructional media strategies (changed to Instructional technology strategies)
    • Focusing strategies (not used)
    • Collaborative learning strategies (added)
Effect Sizes
  • Obtained or calculated for all studies that met criteria
    • n = 62
    • one removed later as an extreme outlier
  • Internal & external validity influences on effect sizes calculated
  • Regression analysis for moderator variables & dependent variable effect sizes (n = 61)
  • Failsafe N calculated for all categories
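For readers unfamiliar with the quantities named above, here is a minimal sketch of a standardized-mean-difference effect size (Cohen's d with a pooled standard deviation) and Rosenthal's file-drawer failsafe N, using textbook formulas and illustrative numbers rather than the project's data.

```python
from math import sqrt

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def failsafe_n(z_values, z_crit=1.645):
    """Rosenthal's file-drawer N: how many null-result studies would be needed
    to pull the combined one-tailed p back above .05."""
    k = len(z_values)
    return (sum(z_values) ** 2) / (z_crit ** 2) - k

# illustrative numbers only
print(cohens_d(78.0, 72.0, 10.0, 11.0, 30, 32))   # ≈ 0.57
print(failsafe_n([2.1, 1.8, 2.5, 3.0, 1.2]))      # ≈ 36.5
```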
Analysis of Effect Size
  • Comprehensive Meta-Analysis® software from BioStat
  • Outputs
    • Cohen’s d
    • Hedges’s g
    • Q value
    • confidence intervals
    • fixed and random effects estimates
    • heterogeneity testing results
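The project relied on the Comprehensive Meta-Analysis package for these outputs; as a rough guide to what two of them mean, the sketch below applies the standard Hedges small-sample correction and fixed-effect inverse-variance pooling with Cochran's Q, using textbook formulas and illustrative values, not the package's internals.

```python
def hedges_g(d, n_t, n_c):
    """Hedges' small-sample correction applied to Cohen's d."""
    df = n_t + n_c - 2
    return d * (1 - 3 / (4 * df - 1))

def fixed_effect_summary(effects, variances):
    """Inverse-variance weighted mean effect and Cochran's Q heterogeneity statistic."""
    weights = [1 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    return pooled, q

# illustrative per-study effects and variances, not the project's data
print(hedges_g(0.57, 30, 32))                                            # ≈ 0.56
print(fixed_effect_summary([0.45, 0.62, 0.30, 0.71], [0.02, 0.03, 0.015, 0.05]))
```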
Figure 1. Mean Effect Sizes for Treatment Categories and Total Data (C1 = Questioning, C2 = Manipulation, C3 = Enhanced Material, C4 = Assessment, C5 = Inquiry, C6 = Enhanced Context, C7 = Instructional Technology, C8 = Collaborative Learning)

What teaching strategies have been shown to improve student achievement in science?
  • All of the innovative strategies have a positive influence on student achievement.
  • Innovative science instruction is a mixture of teaching strategies.
  • Teaching strategies are tools, and the right tool must be selected for the job at hand.
Most Powerful – Enhanced Context Strategies
  • Make learning relevant to students
  • Use real-world examples and problems
    • Problem-based learning
    • Case-based learning
  • Use technology to bring the real world into the classroom
  • Take students out of the classroom into the real world
  • Use multiple contexts to teach a concept
Future Research – Meta-Analysis
  • Examine studies included in the meta-analysis to determine how many of them meet the “strong” or “possible” evidence of effectiveness standards of the DOE Institute of Education Sciences (see Identifying and Implementing Educational Practices Supported by Rigorous Evidence, available at http://www.ed.gov/rschstat/research/pubs/rigorousevid/rigorousevid.pdf)
  • Broaden scope of meta-analysis to include:
    • International studies
    • Correlational studies (data on two variables collected and summarized, showing the relationship between the variables)
    • Studies dealing with attitudinal and motivational changes in students and teachers
    • Studies dealing with special populations (English-language learners, special education, under-represented populations, etc.)
    • Studies dealing with teacher professional development
Products
  • Research-based Teaching Strategies for Effective Science Instruction
  • Rubric for Analyzing Science Products
  • Combined in booklet – Effective K-12 Science Instruction: Elements of Research-based Science Education
Rubric Design Based on Meta-Analysis
  • Science content
    • Accuracy and alignment
    • Safety
  • Organization and structure
    • Format of materials
    • Coherency
  • Meaningful assessment
    • Alignment
    • Formative
    • Summative
    • Metacognitive
  • Effective instructional practices
    • Enhanced context strategies
    • Inquiry strategies
    • Instructional technology strategies
    • Collaborative learning strategies
    • Manipulation strategies
    • Questioning strategies
  • Equity and practicality
    • Equity
    • Practicality
Rubric Development
  • Draft created using criteria
  • Sent to advisory board and stakeholders for comment
  • Revision
  • Discussion with science teachers/supervisors
  • Further revisions, clarifications, & weighting of categories
  • Field test
  • Statistical validation (Interrater reliability = .945 using Cronbach’s alpha)
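Cronbach's alpha for interrater reliability treats each rater as an “item” and each rated product as a case; a minimal sketch of that calculation with illustrative scores (not the field-test data) is below.

```python
from statistics import pvariance

def cronbach_alpha(ratings):
    """Cronbach's alpha, treating each rater as an 'item' and each product as a case.
    `ratings` is a list of per-rater score lists, all the same length."""
    k = len(ratings)                                 # number of raters
    item_vars = sum(pvariance(r) for r in ratings)   # variance of each rater's scores
    totals = [sum(scores) for scores in zip(*ratings)]
    return (k / (k - 1)) * (1 - item_vars / pvariance(totals))

# illustrative rubric scores from 3 raters across 5 products
print(cronbach_alpha([
    [18, 22, 15, 25, 20],
    [17, 23, 14, 26, 19],
    [19, 21, 16, 24, 21],
]))  # ≈ 0.98 for these made-up scores
```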
Booklets may be ordered for $1.50 each + shipping
  • Dr. Carolyn Schroeder
  • Texas A&M University
  • Center for Mathematics and Science Education
  • 3257 TAMU
  • College Station, Texas 77843-3257
  • 979-458-4450
  • cschroeder@science.tamu.edu
  • http://www.science.tamu.edu/cmse/tsi