[Resampled Range of Witty Titles]

Understanding and Using the NRC Assessment of Doctorate Programs

Lydia Snover, Greg Harris & Scott Barge

Office of the Provost, Institutional Research

Massachusetts Institute of Technology • 2 Feb 2010

Overview

*NB: All figures and data in this presentation are for illustrative purposes only and do not represent any actual institution.

Background & Context

Approaches to Ranking

The NRC Model: A Modified Hybrid

Presenting & Using the Results

Introduction

History of NRC Rankings

MIT Data Collection Process

Approaches to Ranking

- Use indicators (“countable” information) to compute a rating:
  - Number of publications
  - Funded research per faculty member
  - Etc.
- Try to quantify more subjective measures through an overall perception-based rating:
  - Reputation
  - “Creative blending of interdisciplinary perspectives”

The NRC Approach

The NRC used a modified hybrid of the two basic approaches:

- In total, a four-step, indicator-based process, carried out field by field
- The process yields two sets of indicator weights, developed through faculty surveys:
  - “Bottom-up”: the importance faculty assign to each indicator
  - “Top-down”: perception-based ratings of a sample of programs
- Multiple iterations (re-sampling) to model “the variability in ratings by peer raters.” *


*For more information on the rationale for re-sampling, see pp. 14-15 of the NRC Methodology Report

STEP 1: Gather raw data on programs from institutions, faculty & external sources. Random University (RU) submitted data for its participating doctoral programs.


STEP 2: Use faculty input to develop weights:

- Method 1: Direct prioritization of indicators: “What characteristics (indicators) are important to program quality in your field?”
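To illustrate the idea behind Method 1, one simple way to turn importance surveys into “direct” weights is to average the responses and normalize. The indicator names, the 1-5 scale, and all responses below are hypothetical; the NRC's actual aggregation procedure may differ.

```python
import numpy as np

# Hypothetical survey: 6 faculty rate the importance of 4 indicators
# (publications, citations, funding, awards) on a 1-5 scale.
responses = np.array([
    [5, 4, 3, 1],
    [4, 4, 2, 2],
    [5, 3, 3, 1],
    [4, 5, 2, 1],
    [5, 4, 4, 2],
    [4, 4, 3, 1],
])

# "Direct" weights: mean importance per indicator, normalized to sum to 1
mean_importance = responses.mean(axis=0)
direct_weights = mean_importance / mean_importance.sum()
print(direct_weights.round(3))
```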



- Method 2: A sample of faculty each rate a sample of 15 programs from which indicator weights are derived.

(Method 2 weights are derived via principal components analysis and regression.)
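A minimal sketch of the idea behind Method 2, on synthetic data: regress the perception-based ratings of a sample of programs on those programs' indicator values, and read the fitted coefficients as implicit weights. Plain least squares stands in here for the NRC's principal-components-and-regression procedure, and every number below is invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 15 rated programs x 4 standardized indicators
indicators = rng.standard_normal((15, 4))
true_weights = np.array([0.5, 0.3, 0.15, 0.05])

# Faculty ratings assumed to reflect the indicators plus noise
ratings = indicators @ true_weights + rng.normal(0.0, 0.1, 15)

# Least-squares fit recovers the implicit indicator weights
weights, *_ = np.linalg.lstsq(indicators, ratings, rcond=None)
print(weights.round(2))
```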

STEP 3: Combine both sets of indicator weights and apply them to the raw data:

Combined weights × raw data = Rating
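Step 3 amounts to a weighted sum: each program's rating is the dot product of the combined weights with its standardized indicator values. The weights and program values below are made up for illustration.

```python
import numpy as np

# Hypothetical combined weights for 3 indicators in one field
weights = np.array([0.5, 0.3, 0.2])

# Standardized indicator values for 4 illustrative programs
programs = np.array([
    [ 1.2,  0.4, -0.1],
    [ 0.3,  1.1,  0.8],
    [-0.5,  0.2,  1.5],
    [ 0.9, -0.3,  0.4],
])

# Weights x data = rating, one rating per program
ratings = programs @ weights
print(ratings.round(2))
```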

STEP 4: Repeat the process 500 times for each field

A) Randomly draw ½ of faculty “important characteristics” surveys

B) Calculate “direct” weights

C) Randomly draw ½ of faculty program rating surveys

D) Compute “regression-based” weights

E) Combine weights

F) Repeat (A)-(E) 500 times to develop 500 sets of weights for each field

G) Randomly perturb institutions’ program data 500 times*

H) Use each pair of iterations (1 perturbation of data (G) + 1 set of weights (F)) to rate programs and prepare 500 ranked lists

I) Toss out the lowest 125 and highest 125 rankings for each program and present the remaining range of rankings

*For more information on the perturbation of program data, see pp. 50-51 of the NRC Methodology Report
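The whole of Step 4 can be sketched as a small simulation. Everything here is invented: steps A-F are collapsed into drawing one fresh random weight set per iteration, and the perturbation in step G is modeled as added Gaussian noise, which is only a stand-in for the NRC's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(42)
n_programs, n_indicators, n_iter = 20, 5, 500

# Hypothetical standardized indicator data for one field
data = rng.standard_normal((n_programs, n_indicators))

rankings = np.empty((n_iter, n_programs), dtype=int)
for i in range(n_iter):
    # Stand-in for steps A-F: one combined weight set per iteration
    weights = rng.dirichlet(np.ones(n_indicators))
    # Stand-in for step G: perturb the program data
    perturbed = data + rng.normal(0.0, 0.1, data.shape)
    # Step H: rate every program and rank them (rank 1 = best)
    ratings = perturbed @ weights
    rankings[i] = (-ratings).argsort().argsort() + 1

# Step I: drop each program's 125 best and 125 worst ranks and
# report the middle 250 as a range (the 25th-75th percentiles)
low = np.percentile(rankings, 25, axis=0).astype(int)
high = np.percentile(rankings, 75, axis=0).astype(int)
print(list(zip(low, high))[:3])
```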

Presenting & Using the Results

- TABLE 1: Program values for each indicator, plus overall summary statistics for the field


- TABLE 2: Indicators and indicator weights: for each indicator, the range from one standard deviation below to one standard deviation above the mean of the 500 weights produced through the iterative process (plus a locally calculated mean)


*n.s. in a cell means the coefficient was not significantly different from 0 at the p=.05 level.
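Table 2's intervals can be reproduced from the 500 iterated weight sets by taking, per indicator, the mean plus and minus one standard deviation. The Dirichlet draws and indicator names below are stand-ins for real iteration output.

```python
import numpy as np

rng = np.random.default_rng(1)

# 500 hypothetical weight sets for 4 indicators (one set per iteration);
# Dirichlet draws just ensure each set is non-negative and sums to 1
weight_sets = rng.dirichlet(np.array([5.0, 3.0, 1.5, 0.5]), size=500)

means = weight_sets.mean(axis=0)
sds = weight_sets.std(axis=0)

# Table-2-style interval per indicator: mean +/- one standard deviation
for name, m, s in zip(["pubs", "cites", "funding", "awards"], means, sds):
    print(f"{name}: [{m - s:.2f}, {m + s:.2f}] (mean {m:.2f})")
```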

- TABLE 3: Range of rankings for RU’s Economics program alongside other programs, with overall and dimensional rankings


- TABLE 4: Range of rankings for all of RU’s programs


Q&A

Resources

- The full NRC Methodology Report
http://www.nap.edu/catalog.php?record_id=12676

- Helpful NRC Frequently Asked Questions page
http://sites.nationalacademies.org/pga/Resdoc/PGA_051962