
The National Research Council Assessment of Research Doctorate Programs, 2006-07


Presentation Transcript


  1. The National Research Council Assessment of Research Doctorate Programs, 2006-07
  Assessment Overview and Implications for Columbia
  Lucy Drotning, Associate Provost for Planning and Institutional Research
  John Scanlon, Data Manager, Planning and Institutional Research

  2. Background
  Goal: The stated goal of the study is to “help universities improve the quality of these programs through benchmarking; provide potential students and the public with accessible, readily available information on doctoral programs nationwide; and enhance the nation's overall research capacity.”
  The study will consist of:
  • Collection of quantitative data through questionnaires
  • Data on faculty publications, citations, and research activity, and on student dissertation keywords
  • Faculty opinion on the relative importance of measures of program quality
  Who is doing what?
  • NRC
  • Mathematica Policy Research (MPR)
  • Columbia

  3. End results: what will the study produce?
  • A quantitatively based methodology for rating graduate programs
  • An online database of graduate programs, available to the public
  • Expert analyses and reports

  4. Process and Timeline

  5. Programs
  Program List: submitted to MPR in September; 56 total programs submitted. See the website for all programs and field designations.
  Program Questionnaires
  • What’s being collected?
    • Faculty lists
    • A list of all doctoral graduates from the past three years
    • General program characteristics and policies
    • Admissions, enrollment, and degree completion data
    • Detailed financial support data for full-time doctoral students in fall 2005
  • How can programs help with the process?
    • Confirm/correct data that can be collected centrally
    • Provide answers to questions that cannot be answered centrally

  6. Programs - Faculty Lists
  Collection of faculty names by program. For each participating program, we have to submit a list of faculty members. Faculty members must be classified in one of 3 categories:
  • CORE
    • has served on a dissertation committee in the past 5 years OR is a member of an admissions or curriculum committee, and
    • is currently and formally designated as faculty in the program
  • NEW
    • is currently and formally designated as faculty in the program, but does not meet the Core requirements, and
    • has been hired in a tenured or tenure-track position in the past 3 academic years
  • ASSOCIATED
    • has served on a dissertation committee in the past 5 years, and
    • is not currently and formally designated as faculty in the program
  Note: Faculty members can be reported in more than one program. (A sketch of these categories expressed as simple rules follows below.)
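The three categories above can be read as a small set of boolean rules. The sketch below is one illustrative reading in Python; the field names and the classify helper are assumptions for illustration, not NRC code, and the NRC's official definitions govern any case this simplification does not cover.

```python
# Illustrative only: field names and this helper are assumptions, not NRC code.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FacultyRecord:
    designated_in_program: bool        # currently and formally designated as program faculty
    dissertation_committee_5yr: bool   # served on a dissertation committee in the past 5 years
    admissions_or_curriculum: bool     # member of the admissions or curriculum committee
    hired_past_3yr_tenure_track: bool  # hired tenured/tenure-track in the past 3 academic years

def classify(f: FacultyRecord) -> Optional[str]:
    """Return 'CORE', 'NEW', 'ASSOCIATED', or None if no category applies."""
    committee_service = f.dissertation_committee_5yr or f.admissions_or_curriculum
    if f.designated_in_program and committee_service:
        return "CORE"
    if f.designated_in_program and f.hired_past_3yr_tenure_track:
        return "NEW"  # designated faculty who do not (yet) meet the Core requirements
    if f.dissertation_committee_5yr and not f.designated_in_program:
        return "ASSOCIATED"
    return None

# Example: a recently hired program member with no committee service yet.
print(classify(FacultyRecord(True, False, False, True)))  # -> NEW
```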

  7. Programs - Faculty Lists, cont’d
  • What’s being collected?
    • Name
    • Street address and email address
    • Category (i.e., Core, New, Associated)
    • Highest degree awarded
    • Rank and tenure status
    • Number of dissertation committees “chaired” (i.e., “sponsored”) in the past 5 academic years
    • Total number of dissertation committees served on in the past 5 academic years
  • How will the data be used?
    • NRC will use email addresses to contact faculty members directly for the faculty questionnaire.
    • NRC will collect citation and publication data for faculty members.
    • NRC will use a formula to allocate faculty headcount and productivity (publications and citations) for faculty serving in more than one graduate program.
  • How can programs help with the process?
    • Confirm/correct the list: are all the appropriate faculty members listed? Is anyone missing? Is anyone on the list who shouldn’t be there? Is anyone categorized incorrectly?
    • Confirm/correct faculty name, contact, rank, tenure, and degree information, especially email addresses.
  For each program, we will indicate all other programs in which the faculty are listed; we will also list anyone not included at all but who is appointed in the program.

  8. Programs - Faculty Productivity
  Faculty productivity will be calculated based on status in the program and the nature of dissertation service.
  Example: Professor Smith is considered Core in Program A and Program B. Between 2001-02 and 2005-06, she served as principal advisor on 3 dissertations and participated in 4 others in Program A, and she served as principal advisor on 1 dissertation and participated in 2 others in Program B. Result: 69% allocated to Program A and, therefore, 31% allocated to Program B. (A sketch of one way to compute such an allocation follows below.)
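The slide does not show how the 69/31 split is computed. The sketch below is one hypothetical way to produce an allocation from committee counts; the function and the weights (1.0 per committee chaired, 2.0 per other committee served on) are illustrative assumptions chosen only because they happen to reproduce the slide's figures, and should not be taken as the NRC's actual formula.

```python
# Hypothetical allocation sketch; the weights are illustrative, not the NRC's.
def allocate_effort(programs, chair_weight=1.0, member_weight=2.0):
    """Share one faculty member's effort across programs.

    `programs` maps a program name to (committees chaired, other committees served on).
    """
    scores = {
        name: chair_weight * chaired + member_weight * served
        for name, (chaired, served) in programs.items()
    }
    total = sum(scores.values())
    return {name: score / total for name, score in scores.items()}

# Professor Smith: 3 chaired + 4 served in Program A, 1 chaired + 2 served in Program B.
print(allocate_effort({"Program A": (3, 4), "Program B": (1, 2)}))
# -> {'Program A': 0.6875, 'Program B': 0.3125}, i.e. roughly the 69% / 31% split above
```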

  9. Faculty Questionnaire
  • Who receives the survey?
    • All Core and New faculty members.
  • What’s being collected?
    • Name and contact information.
    • Demographic data.
    • Specific information about the faculty member’s research field(s).
    • Education and work experience.
    • Doctoral graduates for the past 5 years.
    • Information about scholarship/research activity (with an opportunity to upload a CV).
    • A section on the “relative importance of program characteristics to program quality.”
  • How will the data be used?
    • Name information will be used to aid NRC’s productivity data collection.
    • Information on doctoral graduates will be used to check NRC’s effort to gather placement data.
    • “Program quality” responses will be used to identify and weight the key study variables, in conjunction with the “Anchoring” study.

  10. Anchoring Study
  • Who receives the survey?
    • A random sample of Core faculty members
      • who completed the faculty questionnaire, and
      • who indicated willingness to participate.
  • What’s being collected?
    • The faculty will receive a list of program faculty for a sample of 15 to 20 programs, as well as additional program data and information (e.g., faculty diversity, student completion & placement).
    • Respondents will:
      • indicate their familiarity with each program, and
      • rate the quality of the programs on a 5-point scale.
  • How will the data be used?
    • Respondent ratings will be regressed on quantitative variables to derive weights for the key study variables (see the sketch below).
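As a rough illustration of that last step, the sketch below regresses reputational ratings on a few quantitative program measures using ordinary least squares. The measure names, the numbers, and the choice of OLS are all assumptions made for illustration, not the NRC's actual variables, data, or model.

```python
# Illustration only: hypothetical data and variable names, not NRC data.
import numpy as np

# One row per rated program; columns are example quantitative measures.
measures = ["pubs_per_faculty", "cites_per_pub", "completion_rate", "placement_rate"]
X = np.array([
    [1.8, 4.2, 0.55, 0.70],
    [2.5, 6.1, 0.62, 0.80],
    [0.9, 2.0, 0.40, 0.50],
    [3.1, 7.5, 0.70, 0.85],
    [1.2, 3.3, 0.48, 0.60],
    [2.0, 5.0, 0.58, 0.75],
])
ratings = np.array([3.2, 4.1, 2.0, 4.8, 2.7, 3.6])  # 5-point-scale ratings

# Add an intercept column and fit by ordinary least squares.
X1 = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(X1, ratings, rcond=None)

print("intercept:", round(coef[0], 3))
print("derived weights:", {m: round(w, 3) for m, w in zip(measures, coef[1:])})
```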

  11. Summary
  • How can programs help with the process?
    • Recognize that there still is uncertainty in the process,
    • Think through faculty lists carefully, and
    • Encourage faculty participation!
  • For more information:
    • Website: http://www.columbia.edu/cu/opir/nrc/index.html
    • Email: nrc_survey@columbia.edu
