
Building a Scientific Basis for Research Evaluation

Rebecca F. Rosen, PhD

Senior Researcher

Research Trends Seminar

October 17, 2012


Outline

  • Science of science policy

  • A proposed conceptual framework

  • Empirical approaches:

    • NSF Engineering Dashboard

    • ASTRA – Australia

    • HELIOS – France

  • Final thoughts


The emergence of a science of science policy

  • Jack Marburger’s challenge (2005)

  • Science of Science & Innovation Policy Program at the National Science Foundation (2007)

    • An emerging, highly interdisciplinary research field

  • Science of Science Policy Interagency Task Group publishes a “Federal Research Roadmap” (2008):

    • The data infrastructure is inadequate for decision-making

  • STAR METRICS (2010)


Why a science of science policy?

  • Evidence-based investments

    • Good metrics = good incentives

    • Science is networked and global

  • Build a bridge between researchers and policymakers

    • Researchers ask the right questions

  • The adjacent possible: leverage existing and new research and expertise

    • New tools to describe & measure communication




Getting the right framework matters

  • What you measure is what you get

    • Poor incentives

    • Falsification

  • Usefulness

  • Effectiveness


A proposed conceptual framework

(Framework diagram adapted from Ian Foster, University of Chicago.)


A framework to drive person-centric data collection

  • WHO is doing the research

  • WHAT is the topic of their research

  • HOW are the researchers funded

  • WHERE do they work

  • With WHOM do they work

  • What are their PRODUCTS
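
A minimal sketch of how the six questions could map onto one person-centric record. This is an illustrative Python schema only; the field names are assumptions, not the actual STAR METRICS data model:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ResearcherRecord:
        """One person-centric record; each field answers one framework question."""
        name: str                                               # WHO is doing the research
        topics: List[str] = field(default_factory=list)         # WHAT they research
        awards: List[str] = field(default_factory=list)         # HOW they are funded
        institution: str = ""                                   # WHERE they work
        collaborators: List[str] = field(default_factory=list)  # with WHOM they work
        products: List[str] = field(default_factory=list)       # their PRODUCTS

    # Hypothetical record assembled from administrative data sources.
    rec = ResearcherRecord(
        name="Jane Smith",
        topics=["stochastic optimization"],
        awards=["NSF award 1234567"],
        institution="Example University",
        collaborators=["John Doe"],
        products=["doi:10.0000/example"],
    )
    print(rec.name, rec.products)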


Challenge – The data infrastructure didn’t exist

However, some of the data do exist


Empirical Approaches

Leveraging existing data to begin describing the results of the scientific enterprise


An empirical approach

  • Enhance the utility of enterprise data

  • Identify authoritative “core” data elements

  • Develop an Application Programming Interface (API)

    • Data platform that provides flexible, programmatic access to public (or private) agency information

  • Develop a tool to demonstrate value of API
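
One way to picture the API step: a small client against a hypothetical awards endpoint. The URL, parameters, and response fields below are invented for illustration; real agency APIs differ:

    import requests

    # Hypothetical endpoint; not a real agency API.
    BASE_URL = "https://api.example.gov/v1/awards"

    def fetch_awards(keyword, limit=25):
        """Return award records whose text matches a keyword."""
        resp = requests.get(
            BASE_URL,
            params={"keyword": keyword, "limit": limit},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json().get("awards", [])

    # A minimal tool demonstrating the API's value: list matching awards.
    for award in fetch_awards("nanomanufacturing"):
        print(award["id"], award["title"])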


Topic modeling: Enhancing the value of existing data

The topic model uses the words from the full text of (all) NSF proposals to learn T topics; every proposal is then tagged with the topics it contains (e.g., t49, t18, t114, t305).

Automatically learned topics (e.g.):

t6. conflict violence war international military …

t7. model method data estimation variables …

t8. parameter method point local estimates …

t9. optimization uncertainty optimal stochastic …

t10. surface surfaces interfaces interface …

t11. speech sound acoustic recognition human …

t12. museum public exhibit center informal outreach …

t13. particles particle colloidal granular material …

t14. ocean marine scientist oceanography …

David Newman, UC Irvine
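
The slides do not name the exact algorithm, but models of this kind are typically LDA-style. A minimal sketch using scikit-learn's LatentDirichletAllocation on toy stand-ins for proposal text (not Newman's actual pipeline):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    # Toy stand-ins for proposal text; a real run uses the full text of all proposals.
    abstracts = [
        "stochastic optimization under uncertainty and optimal control",
        "speech sound and acoustic recognition in human hearing",
        "colloidal particles and granular materials at interfaces",
    ]

    vectorizer = CountVectorizer(stop_words="english")
    counts = vectorizer.fit_transform(abstracts)

    # Learn T topics (T=2 here; a production model learns hundreds).
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

    # Print each topic's top words, analogous to t6..t14 above.
    words = vectorizer.get_feature_names_out()
    for t, weights in enumerate(lda.components_):
        top = [words[i] for i in weights.argsort()[-5:][::-1]]
        print(f"t{t}.", " ".join(top))

    # Tag every proposal with its dominant topic.
    for text, dist in zip(abstracts, lda.transform(counts)):
        print(f"t{dist.argmax()}", "<-", text)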


What does getting it right mean?

  • A community-driven empirical data framework should be:

    • Timely

    • Generalizable and replicable

    • Low cost, high quality

  • The utility of “Big Data”:

    • Disambiguated data on individuals (a linkage sketch follows below)

      • Comparison groups

    • New text mining approaches to describe and measure communication
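
A toy sketch of the disambiguation idea: deciding whether two records refer to the same individual. The rule below (name similarity plus matching institution) is purely illustrative; systems like Torvik & Smalheiser's combine many more signals:

    from difflib import SequenceMatcher

    def same_person(rec_a, rec_b, threshold=0.85):
        """Crude linkage rule: similar names AND the same institution.

        Illustrative only; production disambiguation uses coauthors, topics,
        addresses, and calibrated match probabilities.
        """
        name_sim = SequenceMatcher(
            None, rec_a["name"].lower(), rec_b["name"].lower()
        ).ratio()
        return name_sim >= threshold and rec_a["institution"] == rec_b["institution"]

    a = {"name": "Jane Smith", "institution": "Example University"}
    b = {"name": "Jane A. Smith", "institution": "Example University"}
    print(same_person(a, b))  # True: treat as one disambiguated individual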



Policymakers can engage SciSIP communities:

  • Patent Network Dataverse; Fleming at Harvard and Berkeley

  • MEDLINE-Patent Disambiguation; Torvik & Smalheiser at U. Illinois

  • COMETS (Connecting Outcome Measures in Entrepreneurship Technology and Science); Zucker & Darby at UCLA


The power of open research communities

  • Internet and data technology can transform the effectiveness of science:

    • Informing policy

    • Communicating science to the public

    • Enabling scientific collaborations

  • Interoperability is key

  • Publishers are an important part of the community


THANK YOU!

Rebecca F. Rosen, PhD

E-Mail: [email protected]

1000 Thomas Jefferson Street NW, Washington, DC 20007

General Information: 202-403-5000

TTY: 887-334-3499

Website: www.air.org

