
AGEP Evaluation Capacity Meeting 2008





  1. AGEP Evaluation Capacity Meeting 2008 • Yolanda George, Deputy Director, Education & Human Resources Programs

  2. Objectives • Identifying methods and questions for evaluation studies of STEM graduate student progression to the PhD and the professoriate (admissions/selection, retention/attrition, PhD completion, and postdoctoral experiences), including the collection of quantitative and qualitative data. • Identifying methods and questions for Alliance evaluations, particularly in terms of progression to the PhD and the professoriate. What can AGEPs learn from cross-institutional studies?

  3. As you listen to presentations… • What research informed the design of the study? • What type of data was collected? What was the rationale for deciding to collect these data? • What methods were used? What was the rationale for selecting them? • How were comparison groups constructed? What are the reporting limitations with regard to the construction of the comparison groups?

  4. Another Objective for this AGEP Meeting Developing and writing impact statements or highlights (nuggets) that include data, for use in: • The Findings section of AGEP NSF annual reports • AGEP Supplemental Report questions • NSF Highlights • Brochures and web sites

  5. The poster should include quantitative and qualitative data that provide evidence of: • Graduate student changes for selected STEM fields or all STEM fields • Infrastructure changes. This can include changes in institutional or departmental policies or practices • Alliance impact. This can include changes in institutional or departmental policies or practices related to graduate school affairs, postdoctoral arrangements, or faculty hiring. Stories and pictures are welcome, but the major emphasis must be on quantitative and, as appropriate, qualitative data. Program descriptions should be kept to a minimum and put in the context of the data behind decisions to keep or eliminate strategies. The focus can be on what works and what doesn't, as long as the emphasis is on the data showing whether different strategies worked or not.
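To make that concrete, here is a minimal sketch of how such completion data might be tabulated for a poster. All field names, cohort years, and counts below are invented for illustration; they are not AGEP data.

```python
# Hypothetical sketch: tabulating PhD completion rates by STEM field
# and entering cohort for a poster. Counts are invented, not AGEP data.
cohorts = {
    # field: {cohort year: (students entered, PhDs completed)}
    "Chemistry":   {2000: (12, 5), 2002: (15, 9)},
    "Engineering": {2000: (20, 8), 2002: (22, 13)},
}

for field, years in cohorts.items():
    for year, (entered, completed) in sorted(years.items()):
        rate = completed / entered
        print(f"{field:12} cohort {year}: {completed}/{entered} completed ({rate:.0%})")
```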

  6. Impact Evaluations and Statements An impact evaluation measures the program's effects and the extent to which its goals were attained. Although evaluation designs may produce useful information about a program's effectiveness, some may produce more useful information than others. • For example, designs that track effects over extended time periods (time series designs) are generally superior to those that simply compare periods before and after intervention (pre-post designs); • Comparison group designs are superior to those that lack any basis for comparison; and • Designs that use true control groups (experimental designs) have the greatest potential for producing authoritative results. http://www.ojp.usdoj.gov/BJA/evaluation/guide/documents/impact_eval_gangs.htm
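To make the design distinctions above concrete, here is a minimal sketch, with invented numbers, of why a comparison group can change the conclusion a simple pre-post design would reach.

```python
# Invented numbers: annual PhD completions before and after an AGEP-style
# intervention, for an intervention department and a comparison department.
treated_pre, treated_post = 6, 10        # department with the program
comparison_pre, comparison_post = 5, 8   # similar department, no program

# A pre-post design attributes the entire change to the program.
pre_post_estimate = treated_post - treated_pre                         # +4

# A comparison-group design subtracts the change that occurred anyway
# in the comparison department (a difference-in-differences estimate).
did_estimate = pre_post_estimate - (comparison_post - comparison_pre)  # +1

print(f"Pre-post estimate:     {pre_post_estimate:+d} completions")
print(f"With comparison group: {did_estimate:+d} completions")
```

A time series design would extend this by tracking completions over many years rather than a single before/after pair, which is why it is generally superior to a two-point pre-post comparison.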

  7. http://www.ed.gov/about/inits/ed/competitiveness/acc-mathscience/report.pdf

  8. Multiple Evidence Collection for AGEP • AAAS & Campbell Kibler, Inc. (collecting trend data): Are the numbers changing? Coming soon. • Carlos Rodriquez: impact evaluation with comparisons • Portfolio assessment (announced by Bernice Anderson yesterday) Alliance Evaluation • Alliance-level evaluation (annual reports that include highlights with comparisons, if appropriate). You might want to re-evaluate your Alliance design in light of the need to show attribution; read the ACC report.
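As one hedged illustration of the "Are the numbers changing?" trend question, a least-squares slope over annual counts is a simple first check. The years and enrollment counts below are invented.

```python
# Invented annual counts: is STEM graduate enrollment trending upward?
# A least-squares slope is one simple summary of a trend.
from statistics import linear_regression  # Python 3.10+

years  = [2003, 2004, 2005, 2006, 2007]
counts = [41, 45, 44, 52, 57]  # hypothetical enrollment numbers

slope, intercept = linear_regression(years, counts)
print(f"Average change: {slope:+.1f} students per year")
```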

  9. Highlights (Nuggets) • Send highlights to l-allen@nsf.gov • Include highlights in your annual reports to NSF • Address the AGEP Supplemental Annual Report questions • Send highlights to your communication office • Include highlights on your web sites, brochures, & posters

  10. Evaluation Capacity Tool Kit for AGEPs What do you want in the toolkit? Do you have sample evaluations that you want to include in the evaluation toolkit?

  11. Thank You
