
CSREES Reporting Web Conference


Presentation Transcript


  1. CSREES Reporting Web Conference April 14, 2008

  2. User Support • (202) 690-2910 or C2IT@csrees.usda.gov • Do not contact Texas A&M support • FAQs and other information on the CSREES Reporting Web Conference web page at www.csrees.usda.gov/rwc

  3. Format and Logistics • E-mail questions to rwc@csrees.usda.gov • Also e-mail topic suggestions to rwc@csrees.usda.gov • Opportunity to vote for topics for the next conference • Conferences are recorded and will be available on the Reporting Web Conference web page at www.csrees.usda.gov/rwc

  4. Deb Hamernik Deb is the CSREES National Program Leader for Animal Physiology and NRI Program Director for Bovine Genome Sequencing and Porcine Genome Sequencing. She has represented USDA on the NSTC-COS-Research Business Models Subcommittee to develop the Research Performance Progress Report (RPPR) since 2004. (202) 401-4202 dhamernik@csrees.usda.gov www.csrees.usda.gov/onesolution

  5. Standard Progress Report

  6. Standard Reporting Across the Federal Government • Implementation of the Federal Financial Assistance Management Improvement Act of 1999 (Public Law 106-107) • Facilitate information collection in lieu of numerous agency-specific forms • Does not change reporting requirements in OMB Circulars A-102 and A-110; provides a standard format for collecting the information • Draft Research Performance Progress Report (RPPR) available at: www.nsf.gov/bfa/dias/policy/rppr/draftformat.pdf

  7. Standard Reporting Across the Federal Government • Agencies will use the standard categories & instructions developed for each category • Agencies may provide additional program-specific instructions to clarify program requirements • Agencies may develop additional agency- or program-specific reporting categories & instructions

  8. One Solution: CRIS Transition Standard Report • The revised CRIS AD 416 sections include: • Goals/Objectives/Expected Outputs • Methods • Non-Technical Summary • The revised CRIS AD 421 sections include: • Outputs • Outcomes/Impacts • Publications • Participants • Target Audiences • Project Modifications
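To make the two section lists concrete, here is a minimal sketch of the revised reports as plain records (illustrative only; the class and field names paraphrase the slide and are hypothetical, not an official CRIS data format):

    # Hypothetical sketch only: the revised CRIS report sections as plain records.
    # Names paraphrase the slide above; this is NOT an official CRIS schema.
    from dataclasses import dataclass, field

    @dataclass
    class AD416:
        """Project initiation report (revised sections)."""
        goals_objectives_expected_outputs: str = ""
        methods: str = ""
        non_technical_summary: str = ""

    @dataclass
    class AD421:
        """Annual progress report (revised sections)."""
        outputs: str = ""
        outcomes_impacts: str = ""
        publications: list[str] = field(default_factory=list)  # collected in a separate box (see slide 11)
        participants: str = ""
        target_audiences: str = ""
        project_modifications: str = ""

Structured this way, a progress report is easy to check for completeness: each section is a distinct field rather than one free-text blob.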

  9. Outputs • Activities, events, services, or products that reach people • Examples: conferences, field days, videos, curricula, patent applications, germplasm, genetic maps, students graduated, etc. • Do not report publications in this category

  10. Outputs: Dissemination • Dissemination refers to outreach activities to reach intended audiences to advance knowledge, encourage positive actions, or change conditions. • If educational materials and resources were distributed, describe the distribution method and the intended audience.

  11. Publications • Publications are outputs. • For technical reasons, the CRIS system collects publications in a separate box. • Include paper or electronic publications • Include status of publication (e.g., pending, accepted, in press)

  12. Outcomes/Impacts • Changes in knowledge, actions, conditions • Results of basic research projects should be described as a change in knowledge (rather than experimental/technical details) • Results of extension activities should be described as a change in actions or conditions

  13. Participants • Provide information about individuals who worked on the project: their role and how they participated in the project • If applicable, describe partner organizations, collaborators, and contacts. Include collaborators outside the U.S. • Describe opportunities for training or professional development (trainees, K-12 teachers, producers, farmers, staff, volunteers, etc.)

  14. Target Audiences • Provide information on target audiences for efforts designed to cause a change in knowledge, actions, or conditions. • Include individuals, groups, and communities served by the project • Describe delivery of science-based knowledge to people through formal or informal educational programs

  15. Project Modifications • Describe major changes in approach and reason for change • Examples: changes in Assurance Statements (animals, humans, or biohazards); major problems or delays that have significant impact on rate of expenditures

  16. Tips • Do NOT re-enter the objectives and methods (already entered on the AD 416) • Do NOT copy and paste abstracts for scientific meetings into the Standard Report • Use general terms for a lay audience • More information is not necessarily better information

  17. Standard Progress Report: For More Information • Deb Hamernik (202) 401-4202 dhamernik@csrees.usda.gov

  18. Questions? • E-mail questions to rwc@csrees.usda.gov • For more information, visit the One Solution web page at www.csrees.usda.gov/onesolution

  19. Djimé Adoum Djimé assists the Director, Office of Planning and Accountability, in developing monitoring and evaluation systems to analyze program activities funded by CSREES and implemented by our Land Grant System partners, and provides leadership in strategic planning and the CSREES Portfolio Review process. (202) 720-4564 dadoum@csrees.usda.gov www.csrees.usda.gov/opa

  20. Practical, Realistic Approaches to Measuring Impacts of Basic Research

  21. Outline of the Presentation • Reasons for Measuring Impact of Basic Science • Difficulties with Measuring Impact of Basic Science • Metrics and Efforts to Date • R&D Criteria as Starting Point • Experience from CSREES PREP • Use of the Logic Model • Summary and Conclusions

  22. Why Measure the Impact of Basic Research? • Pressure due to limited resources • Problems have become extremely complex and require multi-disciplinary collaboration • To secure public buy-in • To demonstrate public value

  23. Stating the Obvious • Measuring the impact of Basic Research is difficult • Attempts have been made to identify ways to measure • A few approaches have been determined to be of value

  24. A Few Suggested Indicators

  25. Metric Defined • A metric (Geisler 2000) is a system of measurement that includes three elements: • the item being measured • the unit of measurement • the value of the unit
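To make the three-element definition concrete, here is a minimal sketch (illustrative only; the example metric at the end is hypothetical, not from the slides):

    # Minimal sketch of Geisler's (2000) three-element metric.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Metric:
        item: str     # the item being measured
        unit: str     # the unit of measurement
        value: float  # the value of the unit

    # Hypothetical example: a simple publication count for one project.
    articles = Metric(item="refereed journal articles", unit="count per year", value=4)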

  26. A Few Suggested Metrics • Bibliometric Analysis refers to measures of scientific and technical published outputs from science and its disciplines. It measures both quantity and quality (Geisler 2000).

  27. A Few Suggested Metrics • Economic Analysis: a process that correlates financial measures for both investments/expenditures and outputs (Geisler 2000). It is extremely difficult to predict the outcomes of R&D.

  28. A Few Suggested Metrics • Peer Review: a process by which a selective jury of experts in a given scientific field is asked to evaluate the undertaking of scientific activity or its outcomes (e.g., research, projects, or scientific publications) (Geisler 2000)

  29. Efforts to Date • A few models are selected to highlight evaluation of publicly funded research: • The United Kingdom's Research Assessment Exercise (RAE) • The Japanese model • Australia's Relative Funding Model (RFM) • The United States: GPRA and PART (NIH, NSF, ARS and CSREES, NAS)

  30. Evaluation Based on the R&D Criteria • Relevance: R&D investments must have clear plans, must be relevant to national priorities, agency missions, relevant fields, and "customer" needs, and must justify their claim on taxpayer resources. • Quality: Programs should maximize the quality of the R&D they fund through the use of a clearly stated, defensible method for awarding a significant majority of their funding. • Performance: R&D programs should maintain a set of high-priority, multi-year R&D objectives with annual performance outputs and milestones that show how one or more outcomes will be reached. • These criteria are to be used as broad guidelines applicable to all Federally funded R&D efforts.

  31. CSREES Experience: Portfolio Review Expert Panel (PREP) Process • R&D Criteria and Dimensions • Relevance: Scope; Focus; Contemporary and/or Emerging Issues; Solicitation of and/or Receptiveness to Stakeholder Input; Utilization of Stakeholder Input • Quality: Significance of Results; Usefulness and Utilization of Results; Integration; Interdisciplinary Balance; Alignment with Current State of Science • Performance: Productivity; Comprehensiveness of Work Produced; Accountability; Management • These criteria are to be used as broad guidelines applicable to all Federally funded R&D efforts.

  32. The Logic Model • What is it? A roadmap, a conceptual framework, a program theory, a program theory of action (Weiss, 1998; Patton, 1997; Bickman, 1987) • It is a concise way to show how a program is designed and will make a difference (Harvard Family Research Project) • It is the core of program planning, evaluation, program management and communications (NAS, Kellogg Foundation, and UNW)

  33. Use of the Logic Model to: • Set the context within which research takes place • Consider the concept of public value • Provide a conceptual roadmap • Ascertain the extent to which outputs led to new knowledge, applications, and solutions, and whether these are reasonably consistent with expenditures • Provide a framework for evaluation

  34. Generic Logic Model for CSREES Reporting (CSREES Office of Planning & Accountability, Version 1.2). This model is intended to be an illustrative guide for reporting on CSREES-funded research, education and extension activities; it is not a comprehensive inventory of our programs.
  • Situation (description of challenge or opportunity): Farmers face increasing challenges from globalization; opportunity to improve animal health through genetic engineering; insufficient number of trained & diverse professionals entering agricultural fields; youth at risk; invasive species is becoming an increasing problem; bioterrorism; obesity crisis; impaired water quality.
  • Inputs (what we invest): Faculty; staff; students; infrastructure; federal, state and private funds; time; knowledge; the collection of stakeholder opinions.
  • Activities (what we do): Design and conduct research; publish scientific articles; develop research methods and procedures; teach students; conduct non-formal education; provide counseling; develop products, curriculum & resources.
  • Participation (who we reach): Other scientists; Extension faculty; teaching faculty; students; federal, state & private funders; scientific journal, industry & popular magazine editors; agencies; policy and decision-makers; agricultural, environmental, life & human science industries; the public.
  • Outputs: New fundamental or applied knowledge; scientific publications; patents; new methods & technology; plant & animal varieties; practical knowledge for policy and decision-makers; information, skills & technology for individuals, communities and programs; participants reached; students graduated in agricultural sciences.
  • Outcomes, Knowledge: Occur when there is a change in knowledge or the participants actually learn: new fundamental or applied knowledge; improved skills; how technology is applied; about new plant & animal varieties; increased knowledge of decision-making, life skills, and positive life choices among youth & adults; policy knowledge; new improved methods.
  • Outcomes, Actions: Occur when there is a change in behavior or the participants act upon what they have learned: apply improved fundamental or applied knowledge; adopt new improved skills; directly apply information from publications; adopt and use new methods or improved technology; use new plant & animal varieties; increased skill by youth & adults in making informed life choices; actively apply practical policy and decision-making knowledge.
  • Outcomes, Conditions: Occur when a societal condition is improved due to a participant's action taken in the previous stage. For example, specific contributions to: increased market opportunities overseas and greater economic competitiveness; better and less expensive animal health; a vibrant & competitive agricultural workforce; higher productivity in food provision; better quality of life for youth & adults in rural communities; a safer food supply; reduced obesity and improved nutrition & health; higher water quality and a cleaner environment.
  • External Factors: A brief discussion of variables that have an effect on the portfolio, program, or project but cannot be changed by its managers. For example, a plant breeding program's success may depend on the variability of the weather.
  • Assumptions: The premises, based on theory, research, evaluation knowledge, etc., that support the relationships of the elements above, and upon which the success of the portfolio, program, or project rests. For example, finding animal gene markers for particular diseases will lead to better animal therapies.
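As a small illustration (our own sketch, assuming one wants to tag report entries by stage; not an official CSREES format), the model's left-to-right chain can be encoded as an ordered sequence, which makes the "outputs come before outcomes" logic explicit:

    # Illustrative sketch only: the logic-model chain as an ordered tuple of stages.
    # Stage names paraphrase the generic model above; not an official CSREES format.
    LOGIC_MODEL_STAGES = (
        "situation",           # challenge or opportunity
        "inputs",              # what we invest
        "activities",          # what we do, who we reach
        "outputs",             # knowledge, publications, technology, participants
        "outcomes_knowledge",  # change in knowledge
        "outcomes_actions",    # change in behavior
        "outcomes_conditions", # change in societal conditions
    )

    def stage_index(stage: str) -> int:
        """Position of a stage in the chain (raises ValueError if unknown)."""
        return LOGIC_MODEL_STAGES.index(stage)

    # Example: actions are reported downstream of outputs in the chain.
    assert stage_index("outcomes_actions") > stage_index("outputs")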

  35. Logic Model as a Planning and an Evaluation Tool (diagram: the logic model framed by its Assumptions and External Factors)

  36. Summary and Conclusions • It is fully recognized that measuring the impact of basic science is difficult (OMB and NAS) • Impact might not be realized until long after studies are completed • The debate is not over yet, but limited public resources have led to scrutiny of returns on investment and the need to document the effectiveness and efficiency of investments • The use of the logic model as a planning and evaluation tool has gained some ground • The CSREES PREP Process has been quite helpful

  37. Questions? • E-mail questions to rwc@csrees.usda.gov • For more information, visit the Planning and Accountability web page at www.csrees.usda.gov/opa

  38. Topics for Next Time…

  39. See you in June!!! • The next CSREES Reporting Web Conference will be on Thursday, June 12 from 2-4 pm (Eastern) • Visit the conference web site at www.csrees.usda.gov/rwc for: • The recording of this conference • The slides from this conference • Recordings and slides from past conferences • Announcements
