
Strategies for the Evaluation of Adult Science Media

Saul Rockman - Saul@rockman.com
Jennifer Borland - Jennifer@rockman.com
Kristin Bass - Kristin@rockman.com
Monnette Fung - Monnette@rockman.com

www.rockman.com

San Francisco, CA • Bloomington, IN


Retrospective Review of Evaluations of Adult Science Media

Saul Rockman
ROCKMAN ET AL

www.rockman.com

April 14, 2009

Commissioned Paper
Media-Based Learning Science in Informal Environments (2007)

(http://tinyurl.com/REA_NRC_Paper)

Learning Science in Informal Environments: People, Places, and Pursuits (2009)

(http://tinyurl.com/NRC_Book)

Supported by NSF, NRC, National Academy of Sciences, Board on Science Education

A comprehensive synthesis of research on science learning in informal environments.

Review of the Literature

More than 50 studies from 1999 – 2006

Covers both adult and children's programming

Much of it is fugitive literature, not readily accessible

Categorized by media form

Research Themes
  • Adult vs. Youth: Differences between adult learning and youth learning.
  • Formal vs. Informal: Differences between formal and informal learning.
  • Media vs. Non-mediated: Learning from media vs. learning from non-mediated formats.
Media for Children vs. Adults

Children's media:
  • Series
  • Repetition
  • Iterative
  • Curriculum design
  • Intentional

Adult media:
  • One-offs
  • Content not consistent
  • Informational
  • News focus

Adult Audience

Older, wealthier, whiter, more educated

Science, news, arts (persistent)

Increasingly using multiple media

Autonomous, self-directed, practical, looking for respect and relevance

Differences between formal and informal learning

Two Key Areas of Difference:

Context for learning:

  • When and why the learning is taking place
  • Locus of responsibility for learning (teacher-directed or learner-directed)

Potential or desired learning outcomes:

  • The goal of ISE is to enable ideas and information to be integrated more fully into ways of thinking or behaving
  • ISE not geared toward formal assessments
Differences between learning in mediated and non-mediated formats
  • Mediated content can promote more self-directed learning and therefore deeper processing
  • Pacing varies in mediated formats (pros and cons)

[Diagram: Instructional Design • Cognitive Science • Communication Theory]

Media
  • Television/video (including video files on the Internet)
  • Radio/audio (including podcasts and streamed audio)
  • Film
  • Large Format Film (e.g., IMAX)
  • Planetarium shows
  • Not: Websites, print, brief videos in museums, etc.
Accessibility Continuum

Most research has been done at either end of the spectrum; we are starting to see more research in the middle.

[Continuum diagram: Internet and Home Video Distribution (Highly Accessible) → Broadcast Media → Location-Based (Limited Accessibility)]

The Lay of the Land

More than PBS / NPR

Attributes of science and nature programming:

"Voice of God" narration

Why are Programs Like This?

Schedule drives design and production

Media requires significant funding

The money is in production

Built on values, not theory

Review process focuses on media, not outcomes


There would be no bucks without Buck Rogers.

– Old NASA adage



Framework for Adult Learning From Media

  • Media Production Context
  • Individual-level Inputs
  • Activities
  • Outcomes:
    • Short-term
    • Mid-term
    • Longer-term
  • Program Goals
  • External Influences
Summary of Outcomes

Limited range of outcomes

Methodological weaknesses

Limited generalizability (selection bias)


The difference between outputs and outcomes is like the difference between "what is so" and "so what?"

– Michael Scriven


Getting to “So What?”

Policy and practice

More focus on research in RFPs

Enhanced funding

More creative research approaches

More powerful research methodologies

Interactive multiple media strategies

Four Main Categories of Evaluation Outcomes

Learn • Feel • Think • Do

Methods Mapped to Outcomes

Learning: self-reports, recall, little application of learning

Engagement: self-reports, appeal associated with regular viewing/listening

Attitude change: short term, increased interest, rarely a control group

Behavior: information seeking, discussions

Learning
  • Self-rated level/amount of learning
    • Most common
    • Obvious limitations
  • Subjects answer questions related to content learning
    • Recall tests more than knowledge/procedural questions
    • Rarely pre-post testing
  • Observable learning outcomes (applied knowledge)
    • Control/treatment groups
    • Different time/place

Methods to Consider: pre-post testing, assessments of higher-order thinking skills, control/treatment groups, better sampling, transfer tasks, more longitudinal assessment
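As a minimal sketch of what the suggested pre-post testing might look like analytically, the snippet below compares hypothetical pre- and post-viewing knowledge scores for the same respondents with a paired t-test; the scores, scale, and sample size are illustrative assumptions, not data from any study in the review.

```python
# Hypothetical pre/post knowledge scores (0-10) for the same ten viewers.
# A paired t-test asks whether the average post-viewing gain differs from zero.
from scipy import stats

pre  = [4, 5, 3, 6, 5, 4, 7, 5, 6, 4]
post = [6, 6, 4, 7, 7, 5, 8, 6, 7, 6]

result = stats.ttest_rel(post, pre)
mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)

print(f"Mean gain: {mean_gain:.2f} points")
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```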

Engagement/Enjoyment
  • Measures and indicators of attention and appeal:
    • Attention: time-sampled observations
    • Appeal: instruments to assess enjoyment
    • Self-reported attention and appeal (most common)
  • Why more/less attention and appeal? (causes and correlates)
    • Personal interest in subject
    • Age/Gender
    • Style of program
    • Level/newness of content

Methods to Consider: Measures of physiological responses, better sampling and instrument construction to facilitate multivariate analysis
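One way to read the multivariate analysis suggested here is a regression that predicts self-reported appeal from several viewer characteristics at once. The sketch below assumes hypothetical survey variables (appeal, age, prior interest, perceived novelty of content); these are placeholders, not an instrument from these evaluations.

```python
# Hypothetical survey data: which viewer characteristics best predict
# self-reported appeal (1-5)? An OLS regression weighs them jointly.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "appeal":   [4, 5, 3, 4, 2, 5, 3, 4, 5, 2],
    "age":      [34, 52, 28, 61, 45, 39, 58, 47, 33, 26],
    "interest": [4, 5, 3, 4, 2, 5, 2, 4, 5, 1],  # prior interest in the topic
    "novelty":  [3, 2, 4, 3, 5, 2, 4, 3, 2, 5],  # perceived newness of content
})

model = smf.ols("appeal ~ age + interest + novelty", data=df).fit()
print(model.summary())
```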

Attitude Change

  • Self-reported change (limited reliability/validity)
  • Examples of attitudes towards science in general:
    • Science plays a positive, change-making role in our society
    • Science does more good than harm
  • Examples of attitudes towards the science content in the program:
    • Greater appreciation for the natural world
    • Better understanding of the impact we've had on the environment
    • Sense of being able to understand scientific concepts

Methods to Consider: More in-depth assessments and longitudinal studies

Behavior
  • Intended behavior: "What do you plan to do?"
    • Self-report (reliability/validity?)
    • Real action occurs in a different time/place
    • Examples: talk to others; explore the topic further (books/web); change behaviors related to the topic (e.g., water conservation, exercising more)
  • Actual behavior: "What did you do?"
    • Self-report or observed (already done/seen)
    • Ranges from shallow/casual to deep/meaningful; level of depth varies

Methods to Consider: More observed behavioral change, longitudinal behavioral change, randomized control/treatment studies
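For the randomized control/treatment studies suggested here, one simple analysis is a two-proportion comparison of whether viewers report a follow-up behavior more often than non-viewers. The counts below are hypothetical, purely to show the shape of the comparison.

```python
# Hypothetical randomized design: 100 people assigned to watch the program
# (treatment) and 100 not (control); how many in each group later reported
# any follow-up action, e.g., looking up the topic or discussing it?
from statsmodels.stats.proportion import proportions_ztest

followed_up = [38, 22]    # respondents reporting follow-up: treatment, control
group_sizes = [100, 100]  # respondents randomly assigned to each group

z_stat, p_value = proportions_ztest(count=followed_up, nobs=group_sizes)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
```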

Challenges
  • Funding: more rigorous evaluations cost more, but "the money needs to be on the screen"
  • Timing: interest in quick results, then moving on to the next thing
  • Logistics: IRB approval, sample selection
  • Lack of buy-in: evaluation is seen as a necessary evil or a luxury rather than a necessity; good reviews/ratings are perceived to count for more
  • Reality: a thirty-minute program isn't going to change someone's life dramatically
Solutions/Suggestions:
  • Given adequate funding/timeframes…
  • Better/more rigorous and powerful methodologies:
    • Control Group Studies
    • Panel Studies
    • Better Samples/Strategic Audience Sampling (include non-traditional and reluctant audiences)
    • Longitudinal Studies
    • Multivariate analysis (finding new connections)
  • New/Unique methods:
    • Specific to informal learning (different from formal)
    • Beyond the individual (dyads, groups, etc.)
  • New Modes of Informal Learning: Web, Games, Mobile, etc.
  • Theory-based/Theory building: psychology, mass communication, cognitive science

Assessing Knowledge of Exploring Time

Kristin Bass

Kristin@rockman.com

www.rockman.com

San Francisco, CA • Bloomington, IN

New Directions in ISE Evaluation
  • Rigor
    • Design
    • Instrumentation
  • Practicing what we preach

TV show with accompanying website (http://www.exploringtime.com/)

  • Program objectives
  • Program format
Assessing Learning

Constructs

Items

Item scores

Review and Validation

Item Generation
  • Prior evaluations
  • Show producers
  • Script
Item Example 1

Please describe what is changing in the following scene.

Sample Item 2

In order to explain why a heart muscle goes into arrhythmia, scientists have to drill down to a chain of events in the thousandths of a second.

Why is this?

Item Scoring
  • Top-down
  • Bottom-up
    • Pilot responses
    • Pre and post responses
  • Inter-rater reliability
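A common way to quantify inter-rater reliability for rubric-scored open responses is Cohen's kappa on a double-scored subset. The sketch below uses hypothetical ratings, not the Exploring Time data.

```python
# Two raters independently score the same open-ended responses on a 0-2 rubric.
# Cohen's kappa adjusts the raw agreement rate for agreement expected by chance.
from sklearn.metrics import cohen_kappa_score

rater_a = [2, 1, 0, 2, 1, 1, 0, 2, 2, 1]
rater_b = [2, 1, 0, 1, 1, 1, 0, 2, 2, 0]

kappa = cohen_kappa_score(rater_a, rater_b)
raw_agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(f"Raw agreement: {raw_agreement:.0%}, Cohen's kappa: {kappa:.2f}")
```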
Review and Validation

Pilot response results

Scoring agreement

Patterns of responses

Results
  • Improved awareness of time
  • Improved timescale identification
  • No change in understanding of adjacent timescales
Lessons Learned
  • Begin at the beginning.
  • Need items? Use the script.
  • Budget enough time.
  • Remember less is more.
Future Directions
  • Group assessments
  • “Authentic” assessments

Survey and Panel Studies of QUEST Science Programming

Monnette Fung

monnette@rockman.com

www.rockman.com

San Francisco, CA • Bloomington, IN

QUEST

Radio

Television

Community Science Blog

Original Web Content

Audience Study

Year 1: 2007

Baseline surveys in Spring and Fall

Year 2: 2008

Panel surveys in June, August and October

New Media Users survey in September

Year 3: 2009

Educator Case Studies

Continue New Media Users Survey

Recruiting

High Engagement: Members • Recent visitors • Content consumers

Medium/Family Engagement: Families with children under 16

Lower Engagement: Non-members • Infrequent visitors • Interest in arts

Repeated Question

In the last two months, have you participated in any of the following science/nature-related activities? (Check all that apply)

Visited a science museum or nature organization

Attended a lecture at a science/nature organization

Attended a science café

Taken a science or nature-related class or workshop

Participated in a nature walk

Participated in another science/nature-related activity:
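Because this question repeats across panel waves, the check-all-that-apply responses can be tallied into participation rates per wave to track engagement over time. A sketch, assuming a hypothetical long-format response table rather than the actual QUEST survey data:

```python
# Hypothetical long-format data: one row per respondent, wave, and activity
# checked. Participation rate = respondents checking an activity divided by
# respondents answering that wave.
import pandas as pd

responses = pd.DataFrame({
    "wave":       ["June", "June", "June", "August", "August", "October"],
    "respondent": [1, 1, 2, 1, 3, 2],
    "activity":   ["museum", "lecture", "nature walk",
                   "museum", "science cafe", "museum"],
})

checked = pd.crosstab(responses["wave"], responses["activity"])
wave_n = responses.groupby("wave")["respondent"].nunique()

participation_rates = checked.div(wave_n, axis=0)
print(participation_rates.round(2))
```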

Unique Questions

Survey 2

  • Participation in arts-related activities (e.g., visits to art museums, performances, or classes)
  • Approximate number of times, and is this typical?
  • Describe a recent science/nature activity
    • What was the activity?
    • With whom did you engage?
    • Why did you engage?
Unique Questions

Survey 3

Which option below best describes your engagement with QUEST online video content?

I have not watched QUEST video online, and I am not interested in doing so.

I have not done so, but I might in the future.

I have done so, but I will not do so again.

I have done so, and I will continue to do so.

Findings
  • Collectively, engagement was steady
  • Families are most engaged
  • Environment-specific activities most popular
  • Potential for engaging high arts group
  • Growth of new media audience
New Media Survey

Recruiting

Links and aggregators

Findings

Half were outside the Bay Area

On-air broadcasts are still the primary medium

Year 3

Educator Case Studies

Formal & informal educators

New media for science teaching & learning

New Media Users Survey

Continue with the same survey

Things to Consider

Influence on participant behavior

Increased awareness