
Graduate Program Review: Where We Are, Where We Are Headed and Why


Presentation Transcript


  1. Graduate Program Review
  Where We Are, Where We Are Headed and Why
  Duane K. Larick, Associate Graduate Dean
  Presentation to Directors of Graduate Programs
  Fall 2003

  2. Why Assess Graduate Programs?
  • Improvement of graduate education
  • Evaluation of program quality or need
  • Resource allocation/re-allocation
  • Long-term institutional/college/departmental goals
  • Changes in focus or emphasis
  • Accreditation requirements

  3. SACS Criterion for Accreditation
  • Section 3 – Comprehensive Standards
  • “The institution identifies outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.”

  4. Graduate Program Review at NC State – Current Process: Administration
  • Administered by the Dean of the Graduate School
  • Initiated by program or Graduate School
  • Often at the Department level
  • Includes multiple degrees/programs
  • Partner with College and/or accreditation reviews

  5. Graduate Program Review at NC State – Current Process: Objectives
  • Reviews are conducted to gain a clearer understanding of a program’s:
    • Purpose(s) within NC State
    • Effectiveness in achieving purposes
    • Overall quality
    • Future objectives
    • Changes needed to achieve objectives

  6. Graduate Program Review at NC State – Current Process: Operational Procedures
  • 10-year review cycle
  • Components
    • Internal self-study
    • External “team” review
    • Review team report – oral & written
    • Program response prepared
    • Administrative Meeting
      • Graduate Dean, Provost, Vice-Chancellor for Research & Graduate Studies, College Administration, Department Head, Director of Graduate Programs, Review Team Chair

  7. Graduate Program Review at NC State – Current Process: Information Made Available
  • Last program review report & response
  • Graduate program profile (updated annually – 10-year data)
    • Enrollment: numbers, demographics
    • Applications
      • Numbers applied/admitted/enrolled
      • Quality indicators – GRE, GPA, etc.
    • Number of degrees awarded, time to degree
    • Financial support
  • Exit interviews

  8. Graduate School Assessment of the Current Process
  • Self-study that identified problems with the current protocol
  • Decided not to combine with CUPR (undergraduate program review)
    • Differences in review process/expectations
  • Appointed a Task Force
    • Made up of faculty & administrators
    • Rely on on-campus expertise

  9. Graduate Program Review at NC State – Task Force Goals:
  • Evaluate purpose and goals of review
  • Examine current protocols, especially with respect to
    • Continuous and ongoing review
    • Outcomes-based assessment
  • Make recommendations on what could be done to make the outcome of a review more effective
  • Determine what the role of the Graduate School and the Administrative Board should be
  • Make recommendations on the infrastructure necessary to operate the process

  10. Questions the Task Force Asked
  • Does each of our degree programs have clearly defined student outcomes?
  • Are they measurable or observable?
  • Do we provide, or do programs collect, data to assess the achievement of degree program outcomes?
  • Do programs use assessment results to improve programs?
  • Do we document that we use assessment results to improve programs?

  11. Graduate Program Review at NC State – Task Force Key Findings:
  • The current process is fairly typical.
    • Graduate program reviews typically are conducted on a 6- to 10-year cycle.
    • The current process follows Council of Graduate Schools guidelines.
  • An external review component should be continued.
  • Greater emphasis should be placed on program-developed student learning outcomes.

  12. Graduate Program Review at NC State – Task Force Key Findings, continued:
  • The revised process should be more continuous and ongoing – 10-year intervals are not realistic.
  • The review process should result in appropriate follow-up and tie into COMPACT Planning.
  • Current resources do not allow review of all graduate programs on a 10-year cycle, much less on a more ongoing basis.

  13. What We Propose to Do
  • Continue the traditional external review program on an 8-year schedule.
  • Continue to partner with external reviews already conducted for accreditation or other purposes.
  • Emphasize development of program-specific student learning outcomes and assessment procedures to determine if they are being achieved.
    • Already part of the current protocol (section 4.5 of the self-study outline)

  14. What We Propose to Do
  • Provide the training necessary for programs to implement changes.
  • Modify the current “Profile” information and work with University Planning and Analysis to improve the utility of centralized data collection and processing.
  • Increase efforts relative to follow-up after the graduate program review – assess progress on recommendations.

  15. What is Outcomes-Based Assessment?
  • Shift to student-learning-centered concerns
    • “What do we want our students to know?”
    • “How well does the program promote learning?”
  • Moves from the “quality” of presentation to “How well did the student learn it?”
  • Assesses achievement of the outcomes on a continuous rather than episodic basis

  16. Common Graduate Outcomes
  • Students will demonstrate professional competencies, including:
    • Knowledge of concepts in the discipline
    • Ability to conduct independent research
    • Ability to work in teams
    • Ability to use appropriate technologies
    • Ability to find employment in the discipline
    • Ability to teach others
    • Oral & written communication skills
    • Critical thinking skills

  17. Examples of Assessment Opportunities
  • Pre-Program/Admissions
    • Quality indicators like GPA, GRE score, etc.
  • In-Program
    • Annual Progress Evaluation
    • Qualifying/Preliminary Examination, including proposal preparation & defense
    • Required seminar presentations
    • Portfolios – design, architecture, etc.

  18. Examples of Assessment Opportunities – continued
  • At Program Completion
    • Presentations
    • Thesis/Dissertation: preparation, defense, etc.
    • Published works
    • Exit interview or survey
    • Job placement
  • Long Term
    • Alumni surveys
    • Evidence of lifelong learning
    • Career success
