
A Meta-Assessment of Statewide Program Evaluations




Presentation Transcript


1. A Meta-Assessment of Statewide Program Evaluations
Jeffrey Pomerantz & Carolyn Hank (School of Information & Library Science, UNC Chapel Hill) <pomerantz, hcarolyn>@unc.edu
Charles R. McClure & Jordon Andrade (College of Information, Florida State University) <cmcclure, jca07d>@fsu.edu
"Tapping the vast reservoir of human knowledge" --Louis Round Wilson, founder, 1931

2. http://is.gd/1eJQ
"Tapping the vast reservoir of human knowledge" --Louis Round Wilson, founder, 1931

3. Research questions
• What methodologies are used in LSTA evaluations?
• How do states’ goals map to LSTA goals?
• What, if any, correlations are there between the methodologies used and states’ and LSTA goals?
• How successful are different methodologies in providing useful evaluation data about library programs?

4. Background on IMLS and LSTA
• In 2008, IMLS awarded ~$161 M under the Grants to States program.
• LSTA funds < 15% of total state library funding, but ≈ 97% of all federal funding to state libraries.
• IMLS encourages grantees to use Outcome-Based Evaluation (OBE) methods.

5. Rendon’s LSTA goal categories
A: Establish or enhance electronic linkages among or between libraries.
B: Electronically linking libraries with educational, social, or information services.
C: Assisting libraries in accessing information through electronic networks.
D: Encouraging libraries in different areas, and encouraging different types of libraries, to establish consortia and share resources.
E: Paying costs for libraries to acquire or share computer systems and telecommunications technologies.
F: Targeting library and information services to persons having difficulty using a library and to underserved urban and rural communities, including children from families with incomes below the poverty line.
The Rendon Group (2003). National Profile: Analyses of the Five-Year Evaluations Submitted to the Institute of Museum and Library Services by the State Library Administrative Agencies under the Grants to States Program of the Library Services and Technology Act. Washington, DC: Institute of Museum and Library Services.

6. Methodology
Collected 5-year plans and evaluation reports for 28 states, covering 1998-2002 and 2003-2007.
From the 5-year plans:
• State’s 5-year goals,
• Connection between the state’s goals and LSTA’s goals.
From the evaluation reports:
• Methodologies and data collection instruments used,
• Stakeholder groups that provided data,
• Accomplishment of LSTA goals,
• Recommendations.
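A minimal sketch of how coded data from this kind of content analysis could be tallied, assuming a simple per-report coding sheet. The state names, method labels, and category assignments below are invented for illustration and are not data from the study.

```python
from collections import Counter

# Rendon goal categories, abbreviated; see slide 5 for the full wording.
RENDON_CATEGORIES = {
    "A": "electronic linkages among/between libraries",
    "B": "linking libraries with educational/social/information services",
    "C": "access to information through electronic networks",
    "D": "consortia and resource sharing across library types",
    "E": "acquiring/sharing computer and telecommunications systems",
    "F": "services to underserved and hard-to-serve populations",
}

# Hypothetical coded records, one per state evaluation report (invented values).
coded_reports = [
    {"state": "State 1", "methods": ["survey", "site visits"], "goals": ["A", "C"]},
    {"state": "State 2", "methods": ["survey"], "goals": ["F"]},
]

# Tally which methodologies were used and which Rendon categories states' goals address.
method_counts = Counter(m for r in coded_reports for m in r["methods"])
goal_counts = Counter(g for r in coded_reports for g in r["goals"])

print("Methodologies used:", dict(method_counts))
print("Goals by Rendon category:",
      {RENDON_CATEGORIES[g]: n for g, n in goal_counts.items()})
```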

7. Methods used in LSTA evaluations
[Charts: methodologies used; number of methods used]

  8. Data collected from whom

  9. States’ goals mapping to Rendon

  10. Accomplishment of states’ goals

  11. Accomplishment of Rendon goals

  12. Methods used to evaluate goals

13. Discussion
• The quality and readability of the evaluation reports varied considerably.
• A more explicit mapping is needed between states’ and LSTA goals and the evaluation methodologies used.
• Only 39% of goals were accomplished completely.

14. Recommendations
To libraries:
• Need for clearer goals and measurable objectives.
• Need for more appropriate evaluation methods.
To IMLS:
• Not just Outcome-Based Evaluation (OBE).
• Simplify the process.
To both:
• Need for more consistent use of terms.
• Need for planning prior to evaluation and to program implementation.

15. Answers to research questions
• What methodologies are used in LSTA evaluations? A lot of surveys.
• How do states’ goals map to LSTA goals? Not very well.
• What, if any, correlations are there between methodologies used and states’ and LSTA goals? Not many.
• How successful are different methodologies in providing useful evaluation data about library programs? Not very.

16. Thank you!
Jeffrey Pomerantz & Carolyn Hank (School of Information & Library Science, UNC Chapel Hill) <pomerantz, hcarolyn>@unc.edu
Charles R. McClure & Jordon Andrade (College of Information, Florida State University) <cmcclure, jca07d>@fsu.edu
"Tapping the vast reservoir of human knowledge" --Louis Round Wilson, founder, 1931
