
Strategies for the Evaluation of Adult Science Media


Presentation Transcript


  1. Strategies for the Evaluation of Adult Science Media Saul Rockman - Saul@rockman.com • Jennifer Borland - Jennifer@rockman.com • Kristin Bass - Kristin@rockman.com • Monnette Fung - Monnette@rockman.com • www.rockman.com San Francisco, CA • Bloomington, IN

  2. Retrospective Review of Evaluations of Adult Science Media Saul Rockman • ROCKMAN ET AL • www.rockman.com • April 14, 2009

  3. Media-Based Learning Science in Informal Environments (2007), a commissioned paper (http://tinyurl.com/REA_NRC_Paper) • Learning Science in Informal Environments: People, Places, and Pursuits (2009), a comprehensive synthesis of research on science learning in informal environments (http://tinyurl.com/NRC_Book) • Supported by NSF, NRC, National Academy of Sciences, Board on Science Education

  4. Review of the Literature More than 50 studies from 1999–2006 • Covers both adult and children's programming • Much is fugitive literature, not readily accessible • Categorized by media form

  5. Research Themes • Adult vs. Youth: Differences between adult learning and youth learning. • Formal vs. Informal: Differences between formal and informal learning. • Media vs. Non-mediated: Learning from media vs. learning from non-mediated formats.

  6. Media for Children vs. Adults • Children's media: series, repetition, iterative, curriculum design, intentional • Adult media: one-offs, content not consistent, informational, news focus

  7. Adult Audience Older, wealthier, whiter, more educated Science, news, arts (persistent) Increasingly using multiple media Autonomous, self-directed, practical, looking for respect and relevance

  8. Differences between formal and informal learning Two Key Areas of Difference: Context for learning: • When and why the learning is taking place • Locus of responsibility for learning (teacher-directed or learner-directed) Potential or desired learning outcomes: • The goal of ISE is enabling ideas and information to be integrated more fully into ways of thinking or ways of behaving • ISE is not geared toward formal assessments

  9. Differences between learning in mediated and non-mediated formats • Mediated content can promote more self-directed learning and therefore deeper processing • Pacing varies in mediated formats (pros and cons) • Relevant frameworks: Instructional Design, Cognitive Science, Communication Theory

  10. Media • Television/video (including video files on the Internet) • Radio/audio (including podcasts and streamed audio) • Film • Large Format Film (e.g., IMAX) • Planetarium shows • Not: Websites, print, brief videos in museums, etc.

  11. Accessibility Continuum A continuum from highly accessible to limited accessibility: Internet, home video distribution, broadcast media, location-based • Most research has been done at either end of the spectrum • We are starting to see more research in the middle

  12. The Lay of the Land More than PBS / NPR Attributes of science and nature programming: Voice of God

  13. Why are Programs Like This? Schedule drives design and production Media requires significant funding The money is in production Built on values, not theory Review process focuses on media, not outcomes

  14. There would be no bucks without Buck Rogers. – Old NASA adage

  15. Framework for Adult Learning From Media • Media Production Context • Individual-level Inputs • Activities • Outcomes: • Short-term • Mid-term • Longer-term • Program Goals • External Influences

  16. Summary of Outcomes Limited range of outcomes Methodological weaknesses Limited generalizability (selection bias)

  17. The difference between outputs and outcomes is like the difference between "what is so" and "so what?" – Michael Scriven

  18. Getting to “So What?” Policy and practice More focus on research in RFPs Enhanced funding More creative research approaches More powerful research methodologies Interactive multiple media strategies

  19. Of course it works in practice, but will it work in theory? – French research saying

  20. Informal Science Evaluation Methodologies Jennifer Borland - Jennifer@rockman.com

  21. Four Main Categories of Evaluation Outcomes Learn • Feel • Think • Do

  22. Methods Mapped to Outcomes Learning: self-reports, recall, little application of learning Engagement: self-reports, appeal associated with regular viewing/listening Attitude change: short term, increased interest, rarely a control group Behavior: information seeking, discussions

  23. Learning • Self-rated level/amount of learning (most common; obvious limitations) • Recall tests, more than knowledge/procedural questions • Subjects answer questions related to content learning (rarely pre-post testing) • Observable learning outcomes (applied knowledge; control/treatment groups; different time/place) Methods to Consider: pre-post testing, assessments of higher-order thinking skills, control/treatment groups, better sampling, transfer tasks, more longitudinal assessment
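
The "methods to consider" above include pre-post testing. As a minimal sketch of what scoring such a design could look like (not the approach used in any of the reviewed studies), the snippet below compares hypothetical pre- and post-viewing quiz scores from the same respondents with a paired t-test; the data and variable names are invented for illustration.

```python
# Sketch of a pre/post comparison for a media-learning study.
# Scores are hypothetical; in practice they would come from the same
# respondents tested before and after viewing a program.
from statistics import mean
from scipy.stats import ttest_rel

pre_scores = [4, 6, 5, 3, 7, 5, 4, 6, 5, 4]    # pre-viewing quiz scores
post_scores = [6, 7, 6, 5, 8, 6, 5, 7, 6, 6]   # same respondents, post-viewing

gain = mean(post_scores) - mean(pre_scores)
t_stat, p_value = ttest_rel(post_scores, pre_scores)  # paired t-test

print(f"Mean gain: {gain:.2f} points")
print(f"Paired t = {t_stat:.2f}, p = {p_value:.4f}")
```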

  24. Engagement/Enjoyment • Attention measures: time-sampled observations, instruments to assess enjoyment, self-reported attention and appeal (most common) • Indicators: attention, appeal • Causes and correlates of more/less attention and appeal: personal interest in subject, age/gender, style of program, level/newness of content Methods to Consider: measures of physiological responses, better sampling and instrument construction to facilitate multivariate analysis

  25. Attitude Change • Self-reported change (limited reliability/validity) • Toward science in general, e.g., science plays a positive change-making role in our society; science does more good than harm • Toward science content in the program, e.g., greater appreciation for the natural world; better understanding of our impact on the environment; a sense of being able to understand scientific concepts Methods to Consider: more in-depth assessments and longitudinal studies

  26. Behavior • Intended ("What do you plan to do?"): self-report (reliability/validity?); real action occurs in a different time/place; e.g., talk to others, explore the topic further (books/web), change behaviors related to the topic such as water conservation or exercising more • Actual ("What did you do?"): self-report or observed (already done/seen); level of depth varies, ranging from shallow/casual to deep/meaningful Methods to Consider: more observed behavioral change, longitudinal behavioral change, randomized control/treatment studies

  27. Challenges • Funding: more rigorous evaluations cost more, but "the money needs to be on the screen" • Timing: interest in quick results, then moving on to the next thing • Logistics: IRB approval, sample selection • Lack of buy-in: evaluation seen as a necessary evil or a luxury rather than a necessity; good reviews/ratings are perceived to count for more • Reality: a thirty-minute program isn't going to change someone's life dramatically

  28. Solutions/Suggestions: • Given adequate funding/timeframes… • Better/more rigorous and powerful methodologies: • Control Group Studies • Panel Studies • Better Samples/Strategic Audience Sampling (include non-traditional and reluctant audiences) • Longitudinal Studies • Multivariate analysis (finding new connections) • New/Unique methods: • Specific to informal learning (different from formal) • Beyond the individual (dyads, groups, etc.) • New Modes of Informal Learning: Web, Games, Mobile, etc. • Theory-based/Theory building: psychology, mass communication, cognitive science
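
Among the suggestions above is multivariate analysis for finding new connections. As a hedged illustration only, the sketch below regresses a hypothetical knowledge-gain score on viewing frequency and prior interest with ordinary least squares; every variable name and value is invented and stands in for whatever measures a particular evaluation collects.

```python
# Illustrative multivariate analysis: regress a hypothetical knowledge-gain
# score on viewing frequency and prior interest. All data are invented.
import numpy as np

viewing_freq = np.array([1, 3, 2, 5, 4, 2, 6, 3, 5, 4], dtype=float)    # programs per month
prior_interest = np.array([2, 4, 3, 5, 4, 2, 5, 3, 4, 3], dtype=float)  # 1-5 self-rating
knowledge_gain = np.array([0.5, 1.2, 0.8, 2.1, 1.7, 0.6, 2.4, 1.0, 1.9, 1.5])

# Design matrix with an intercept column, fit by least squares.
X = np.column_stack([np.ones_like(viewing_freq), viewing_freq, prior_interest])
coefs, *_ = np.linalg.lstsq(X, knowledge_gain, rcond=None)

intercept, b_viewing, b_interest = coefs
print(f"intercept = {intercept:.2f}, viewing = {b_viewing:.2f}, interest = {b_interest:.2f}")
```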

  29. Assessing Knowledge of Exploring Time Kristin Bass Kristin@rockman.com www.rockman.com San Francisco, CA • Bloomington, IN

  30. New Directions in ISE Evaluation • Rigor • Design • Instrumentation • Practicing what we preach

  31. TV show with accompanying website (http://www.exploringtime.com/) • Program objectives • Program format

  32. Assessing Learning Constructs Items Item scores Review and Validation

  33. Construct Identification

  34. Item Generation • Prior evaluations • Show producers • Script

  35. Item Example 1 Please describe what is changing in the following scene.

  36. Item Example 2 In order to explain why a heart muscle goes into arrhythmia, scientists have to drill down to a chain of events in the thousandths of a second. Why is this?

  37. Item Scoring • Top-down • Bottom-up • Pilot responses • Pre and post responses • Inter-rater reliability
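
The scoring process above checks inter-rater reliability. As a minimal, self-contained sketch, the snippet below computes Cohen's kappa for two raters who scored the same open-ended responses on a 0-2 scale; the ratings and the three score categories are hypothetical.

```python
# Cohen's kappa for two raters scoring the same open-ended items.
# Categories (0 = incorrect, 1 = partial, 2 = full) and ratings are invented.
from collections import Counter

rater_a = [2, 1, 0, 2, 1, 2, 0, 1, 2, 1]
rater_b = [2, 1, 0, 1, 1, 2, 0, 1, 2, 0]

n = len(rater_a)
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement from each rater's marginal distribution of scores.
counts_a, counts_b = Counter(rater_a), Counter(rater_b)
expected = sum((counts_a[c] / n) * (counts_b[c] / n)
               for c in set(counts_a) | set(counts_b))

kappa = (observed - expected) / (1 - expected)
print(f"Observed agreement: {observed:.2f}, Cohen's kappa: {kappa:.2f}")
```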

  38. Review and Validation Pilot response results Scoring agreement Patterns of responses

  39. Results • Improved awareness of time • Improved timescale identification • No change in understanding of adjacent timescales

  40. Lessons Learned • Begin at the beginning. • Need items? Use the script. • Budget enough time. • Remember less is more.

  41. Future Directions • Group assessments • “Authentic” assessments

  42. Survey and Panel Studies of Quest Science Programming Monnette Fung monnette@rockman.com www.rockman.com San Francisco, CA • Bloomington, IN

  43. QUEST • Radio • Television • Community Science Blog • Original Web Content

  44. Audience Study • Year 1 (2007): baseline surveys in Spring and Fall • Year 2 (2008): panel surveys in June, August, and October; New Media Users survey in September • Year 3 (2009): educator case studies; continue New Media Users survey

  45. Recruiting • High Engagement: members, recent visitors, content consumers • Medium/Family Engagement: families with children under 16 • Lower Engagement: non-members, infrequent visitors, interest in arts

  46. Participants

  47. Repeated Question In the last two months, have you participated in any of the following science/nature-related activities? (Check all that apply) Visited a science museum or nature organization Attended a lecture at a science/nature organization Attended a science café Taken a science or nature-related class or workshop Participated in a nature walk Participated in another science/nature-related activity:
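
Because the question above is repeated across panel waves, one straightforward analysis is to tally how many respondents check each activity in each wave and watch for change over time. The sketch below does this for made-up responses; the wave labels and activity strings are placeholders rather than the study's actual data.

```python
# Tally check-all-that-apply responses to the repeated activity question
# across panel waves. All responses below are invented placeholders.
from collections import Counter

responses_by_wave = {
    "June": [["museum", "nature walk"], ["lecture"], ["museum"], []],
    "August": [["museum"], ["science cafe", "class"], [], ["nature walk"]],
    "October": [["museum", "lecture"], [], ["nature walk"], ["museum"]],
}

for wave, responses in responses_by_wave.items():
    counts = Counter(activity for checked in responses for activity in checked)
    n = len(responses)
    summary = ", ".join(f"{act}: {c}/{n}" for act, c in counts.most_common())
    print(f"{wave}: {summary}")
```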

  48. Unique Questions Survey 2 • Participation in arts-related activities (e.g., visits to art museums, performances, or classes) • Approximate number of times, and is this typical? • Describe a recent science/nature activity • What was the activity? • With whom did you engage? • Why did you engage?

  49. Unique Questions Survey 3

  50. Unique Questions Survey 3 Which option below best describes your engagement with QUEST online video content? I have not watched QUEST video online, and I am not interested in doing so. I have not done so, but I might in the future. I have done so, but I will not do so again. I have done so, and I will continue to do so.
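
Responses to an ordered item like the one above are commonly coded onto an ordinal scale before analysis. The mapping below is one hypothetical coding scheme, not the scheme used in the QUEST surveys.

```python
# Hypothetical ordinal coding of the engagement item above (0 = least engaged).
ENGAGEMENT_CODES = {
    "I have not watched QUEST video online, and I am not interested in doing so.": 0,
    "I have not done so, but I might in the future.": 1,
    "I have done so, but I will not do so again.": 2,
    "I have done so, and I will continue to do so.": 3,
}

answers = [
    "I have done so, and I will continue to do so.",
    "I have not done so, but I might in the future.",
]
print([ENGAGEMENT_CODES[a] for a in answers])  # [3, 1]
```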
