How did we do? Insights from a CLIMAS pilot evaluation

Dan Ferguson, Anne Browning-Aiken, Gregg Garfin, Dan McDonald, Jennifer Rice, Marta Stuart

Climate Prediction Application Science Workshop March 7, 2008

Overview
  • CLIMAS/RISA
  • Purpose of the project
  • Evaluation project team
  • Process and methods
  • Who is participating
  • Research/evaluation questions
  • Where we are in the process
CLIMAS/RISA
  • The Climate Assessment for the Southwest (CLIMAS) is one of eight currently funded Regional Integrated Sciences and Assessments (RISA) programs
CLIMAS mission/mode

We both conduct climate research and work iteratively with stakeholders, partners, and collaborators to provide timely, pertinent, and (hopefully) useful climate information, tools, and services (or access to them) to those who need them to make decisions.

CLIMAS team
  • Program is 10 years old—cast of characters changes through time
  • Currently 10 investigators + affiliate investigators + grad students + core office staff
  • HQ at University of Arizona, but currently have investigator (Deborah Bathke) at New Mexico State University
  • Highly interdisciplinary: anthropology, climatology, decision-support system development, geography, hydroclimatology, Latin American studies, paleoclimatology, resource economics
Purpose of the evaluation project
  • Broad evaluation of the RISA model as expressed by CLIMAS
    • Not an evaluation of a particular product, information source, etc., but rather a first crack at an overall evaluation of CLIMAS
    • Roughly bounded in time—2002-2007
  • Looking for key insights about penetration of information; perceived salience, credibility, and legitimacy* of CLIMAS; and changes in knowledge, behavior, understanding as a result of interactions with CLIMAS

*After: Cash, D., W. Clark, et al. (2002). Salience, Credibility, Legitimacy and Boundaries: Linking Research, Assessment and Decision Making. Cambridge, MA, John F. Kennedy School of Government, Harvard University: 24 pp.

Purpose of the evaluation project (cont.)
  • Input to CLIMAS program manager and investigators
  • Input to the Climate Program Office and the other RISAs
  • Input to NIDIS as it develops
Evaluation team
  • Mixed team:
    • two members directly affiliated with CLIMAS (Ferguson and Garfin)
    • four members not previously affiliated with CLIMAS (Browning-Aiken, McDonald, Rice, Stewart)
Evaluation team roles
  • Garfin=‘Encyclopedia of CLIMAS’
  • Ferguson=lead investigator/coordinator, but not conducting data collection
  • Browning-Aiken + Rice=interviews
  • McDonald=survey
  • Stewart, Browning-Aiken, Rice=focus groups
Methods
  • Survey (online)
  • Interviews (primarily telephone)
  • Focus groups will follow the survey and interviews; they will be used to probe deeper into issues and ideas that emerge from the survey and interview results
Survey
  • Multiple iterations involving whole team
  • Piloted survey with ~15 colleagues
    • Teased out obvious issues
    • Included required IRB disclaimer language, but hid it behind a click
  • Used professional web team to develop/build
    • Very fast turnaround, reliable product, able to customize and troubleshoot
    • Helped us understand options, e.g. email login
Team Process
  • Develop research questions based on broad strokes of proposal
  • Utilize whole team for development of research questions + all data collection instruments
    • Collaboratively and iteratively develop and refine data collection instruments=a very good thing
A sample of organizations with whom we work
  • Basic stats of evaluation participants
    • ~150 people will be contacted
    • Representing > 50 organizations
    • ~25-35 interviews
    • ~120 people surveyed
Our spectrum of relationships
  • Communication: e.g., receive Southwest Climate Outlook, e-mail updates or other publications; call or e-mail CLIMAS team members with specific questions
  • Consultancy: e.g., ask for expert speaker for workshop/meeting; seek consultation on project development
  • Partner: e.g., co-sponsor an event; been invited to speak at a meeting or workshop
  • Collaboration: e.g., ongoing or long-lasting research collaborations; long-term engagement to address a particular issue
Research/evaluation questions
  • Is CLIMAS achieving the overall RISA goals of being responsive, stakeholder-oriented, and use-inspired?
  • Is CLIMAS perceived as salient, credible and legitimate?
Research/evaluation questions (cont.)
  • Is CLIMAS perceived by collaborating organizations as a reliable and responsive partner?
  • What are the outcomes (short and medium term) that result from interactions with CLIMAS?
  • How is CLIMAS accessed and is it reaching populations in need of climate information?
Lessons learned (so far) or what I know today that I didn’t really know in October but probably should have
Lessons learned so far (common sense warning)
  • Try to keep track of your stakeholders
  • Mixed team (inside/outside program) has worked out very well
  • Use whole team to develop/refine research questions and instruments
  • Using professionals to develop the web survey interface and database=major time/headache saver
  • Take the time to pilot a survey; big return on small investment
  • Institutional Review Board, oye
  • Understand that sometimes “evaluation is intervention”
Where we are in the process
  • Interviews beginning this week
  • Survey link is being distributed over the next week
  • Focus groups will follow, probably late April/May
Then what?
  • White paper aimed at RISA, NIDIS, other programs and organizations grappling with similar issues
  • Peer-reviewed publication
  • Better grip on next steps for ongoing evaluation
Questions?

Dan Ferguson

University of Arizona/CLIMAS

dferg@email.arizona.edu

http://www.climas.arizona.edu/