
Assessing the Mission of Doctoral Research Universities

J. Joseph Hoey, Georgia Tech

Lorne Kuffel, College of William and Mary

North Carolina State University Workshop

October 30-31, 2003

Guidelines for This Presentation
  • Please turn off or silence your cell phones
  • Please feel free to raise questions at any time during the presentation; we will also leave time at the end for general discussion
  • We are very interested in your participation
Agenda
  • Introduction and Objectives
  • Reasons for Graduate Assessment
  • Comparative Data Sources
  • Developing Faculty Expectations for Graduate Students
  • Principles of Graduate Assessment
  • Physics Case Study
  • Taking Assessment Online
  • Summary and Discussion
Objectives
  • Articulate motivations for undertaking graduate assessment
  • Increase awareness of comparative data sources
  • Identify program linkages for graduate assessment
  • Hands-on: develop faculty expectations for student competence; utilize diverse data sources to evaluate a graduate program’s first assessment efforts; etc.
Why Assess Graduate Programs?
  • We are all interested in the quality and improvement of graduate education
  • To help satisfy calls for accountability
  • Accreditation requirements: SACS accreditation imperatives
  • “To change or improve an invisible system, one must first make it visible”

– Schilling and Schilling, 1993, p. 172.

Common Internal Reasons for Graduate Assessment
  • Program marketing
  • Meet short-term (tactical) objectives or targets
  • Meet long-term (strategic) institutional/departmental goals
  • Funded project evaluation (GAANN, IGERT)
  • Understand sources of retention/attrition among students and faculty
SACS Principles of Accreditation
  • Core requirement #5: “The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission.”
SACS Principles of Accreditation
  • Section 3 – Comprehensive Standards: Institution Mission, Governance, And Institutional Effectiveness
    • “16. The institution identifies outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.”
SACS Principles of Accreditation
  • Section 3 – Comprehensive Standards: Standards for All Educational Programs
    • “12. The institution places primary responsibility for the content, quality, and effectiveness of its curriculum with the faculty”
    • “18. The institution ensures that its graduate instruction and resources foster independent learning, enabling the graduate to contribute to a profession or field of study.”
SACS Accreditation
  • The intent of the SACS procedures is to stimulate institutions to create an environment of planned change for improving the educational process.
Language
  • Much of the assessment literature employs a fair amount of industrial or business jargon
  • Feel free to develop and use your own
  • Keep it consistent across the institution
  • Produce and maintain a glossary of terms
So What Do We Need to Do?
  • Do our departments have a clear mission statement?
  • Do we have departmental plans to evaluate the effectiveness of our degree programs?
    • Do our degree programs have clearly defined faculty expectations for students?
    • Are they published and are they measurable or observable?
    • Do we obtain data to assess the achievement of faculty expectations for students?
    • Do we document that assessment results are used to change or sustain the excellence of program activities and further student gains in professional and attitudinal skills and experiences?
So What Do We Need to Do? (Cont.)
  • Based on assessment results, do we reevaluate the appropriateness of departmental missions as well as the expectations we hold for student competence?

The amount of work needed to satisfy accreditation requirements is proportional to the number of ‘No’ responses to the above questions.

Needed to Succeed
  • The department should want to engage in this process
  • The department must use the information collected
  • The institution must use the information collected
  • Use participation in the process as part of faculty reviews
Focusing Efforts
  • It is important to achieve a strategic focus for the program: decide what knowledge, skills, abilities, and experiences should characterize students who graduate from our program…
What is Important to Measure?
  • To decide this, it is first vital to ask:
    • What are our strong areas?
    • What are our limitations?
    • What do we want to accomplish in
      • Education of students?
      • Research?
      • Service?
Purpose Statement (sample)

The Anthropology Department serves the institution by offering courses and scholarly experiences that contribute to the liberal education of undergraduates and the scholarly accomplishments of graduate students. Program faculty members offer courses, seminars, directed readings, and directed research studies that promote social scientific understandings of human cultures. The Department offers a bachelor’s degree major and minor, an M.A. degree, and a Ph.D.

Developing a Plan to Evaluate Degree Programs
  • How to start a departmental plan: top down or bottom up (Palomba and Palomba, 2001)
    • Top Down – As a group of scholars, decide what are the important goals or objectives for the program.
    • Bottom Up – Identify the primary faculty expectations for student competence in core courses in the program and use this list to develop overarching expectations for student competence.
Develop an Assessment Plan
  • Desirable characteristics for assessment plans (Palomba and Palomba, 1999):
    • Identify assessment procedures to address faculty expectations for student competence;
    • Use procedures such as sampling student work and drawing on institutional data where appropriate;
    • Include multiple measures;
    • Describe the people, committees, and processes involved; and
    • Contain plans for using assessment information.
Words to Remember When Starting an Assessment Plan
  • It may be best to tackle the modest objectives first.
  • Assessment plans should recognize that students are active participants and share responsibility for their learning experience along with the faculty and administration.
  • It takes a long time to do assessment well. So be patient and be flexible.
  • The overriding goal is to improve educational programs, not to fill out reports or demonstrate accountability.
Use a Program Profile to Get Started

Related to Operational Objectives

Data for Profiles
  • Admissions: Applications, acceptance rates, and yield rates (see the sketch after this list)
  • Standardized Test Scores
    • Graduate Record Examination (GRE) http://www.gre.org/edindex.html
    • Graduate Management Admission Test (GMAT) http://www.gmac.com/
    • Law School Admission Test (LSAT) http://www.lsac.org/
  • Undergraduate GPA
  • Headcount or Major Enrollments (Full/Part-Time)
  • Degrees Awarded
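
The admissions items above reduce to simple funnel arithmetic: the acceptance rate is admits over applicants, and the yield rate is enrollees over admits. A minimal Python sketch, with invented counts:

  # Admissions funnel for a program profile (counts are invented for illustration).
  applied, accepted, enrolled = 220, 80, 36

  acceptance_rate = accepted / applied   # share of applicants offered admission
  yield_rate = enrolled / accepted       # share of admits who actually enroll

  print(f"Acceptance rate: {acceptance_rate:.1%}")  # 36.4%
  print(f"Yield rate:      {yield_rate:.1%}")       # 45.0%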
Profiles (Cont.)
  • Formula Funding Elements when appropriate
  • Time-to-Degree and/or Graduation/Retention Rates (a time-to-degree sketch follows this list)
  • Support for Students (Type of Assistance)
  • Faculty Headcount (Full/Part, Tenure Status)
  • Faculty Salaries
  • Faculty Productivity or Workload Compliance
  • Research Proposals Submitted/Awarded
  • Research Award/Expenditure Dollars
  • Instructional and Research Facility Space
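
Time-to-degree is straightforward to compute once matriculation and graduation dates are in hand. A minimal Python sketch, with invented dates:

  from datetime import date
  from statistics import median

  # Illustrative matriculation/graduation date pairs for doctoral completers.
  completers = [
      (date(1996, 8, 15), date(2002, 5, 10)),
      (date(1997, 8, 15), date(2003, 5, 9)),
      (date(1995, 8, 15), date(2003, 12, 12)),
  ]

  years_to_degree = [(grad - start).days / 365.25 for start, grad in completers]
  print(f"Median time-to-degree: {median(years_to_degree):.1f} years")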
Comparative Data
  • Survey of Earned Doctorates (SED)
  • National Center for Educational Statistics (NCES) Institutional Postsecondary Educational Data System (IPEDS)
  • National Research Council (NRC) Reports
  • Higher Education Data Sharing Consortium (HEDS) Graduate Student Survey (GSS)
  • American Association of University Professors (AAUP) or College and University Professional Association (CUPA) Faculty Salary Surveys
SED Data
  • Administered annually, with a very high response rate
  • Doctoral degrees awarded by broad field and subfield by gender, racial/ethnic group, and citizenship.
  • Institutional ranking by number of doctorate awards (top 20) by broad field and by racial/ethnic group
  • Time-to-Degree (three measures) by broad field, gender, racial/ethnic group, and citizenship
SED Data (Cont.)
  • Financial resources for student support by broad field, gender, racial/ethnic group, and citizenship
  • Postgraduate plans, employment, and location by broad field, gender, racial/ethnic group, and citizenship
  • Reports are available at http://www.norc.uchicago.edu/issues/docdata.htm
IPEDS Data
  • Fall enrollments by major field (2-digit CIP code) of study, race/ethnicity and citizenship, gender, attendance status (full/part-time), and level of student (undergraduate, graduate, and first professional)
    • The discipline field data are reported in even-numbered years only.
  • Annual degrees conferred by program (6-digit CIP code) or major discipline (2-digit CIP code), award level (associate degree, baccalaureate, Master’s, doctoral, and first professional), race/ethnicity and citizenship, and gender.
    • Reported annually
IPEDS Data (Cont.)
  • Useful for identifying peer institutions
  • Available at the IPEDS Peer Analysis System http://nces.ed.gov/Ipeds/
  • These data are also published in the National Center for Education Statistics (NCES), Digest of Education Statistics
National Research Council

Research-Doctorate Programs in the United States

  • This information is dated (1982 and 1993) with a new study scheduled for 2004 (?).
  • The benefit is rankings of programs, but some critics suggest that “reputational rankings cannot accurately reflect the quality of graduate programs” (Graham & Diamond, 1999)
  • The National Survey of Graduate Faculty
    • Scholarly quality of program faculty
    • Effectiveness of program in educating research scholars/scientists
    • Change in program quality in last five years
Profile Comparison for History and Physics – NRC Ranking
  • History department ranked 46.5
  • Physics department ranked 63

(Goldberger, Maher, and Flattau, 1995)

Why Describe Faculty Expectations for Students?
  • To sustain program excellence and productivity
  • To give faculty feedback and the ability to make modifications based on measurable indicators, not anecdotes
  • To inform and motivate students
  • To meet external standards for accountability
What Are Our Real Expectations?

Read each question thoroughly. Answer all questions. Time limit: four hours. Begin immediately.

  • MUSIC: Write a piano concerto. Orchestrate it and perform it with flute and drum. You will find a piano under your seat.
  • MATHEMATICS: Give today's date, in metric.
  • CHEMISTRY: Transform lead into gold. You will find a beaker and three lead sinkers under your seat. Show all work including Feynman diagrams and quantum functions for all steps.
  • ECONOMICS: Develop a realistic plan for refinancing the national debt. Run for Congress. Build a political power base. Successfully pass your plan and implement it.
Steps to Describing Expectations - 1
  • Write down the result or desired end state as it relates to the program.
  • Jot down, in words and phrases, the performances that, if achieved, would cause us to agree that the expectation has been met.
  • Phrase these in terms of results achieved rather than activities undertaken.
Steps to Describing Expectations - 2
  • Sort out the words and phrases. Delete duplications and unwanted items.
  • Repeat first two steps for any remaining abstractions (unobservable results) considered important.
  • Write a complete statement for each performance, describing the nature, quality, or amount we consider acceptable.
  • Consider the point in the program where it would make the most sense for students to demonstrate this performance.
Steps to Describing Expectations - 3
  • Again, remember to distinguish results from activities.
  • Test the statements by asking: If someone achieved or demonstrated each of these performances, would we be willing to say the student has met the expectation?
  • When we can answer yes, the analysis is finished.
Steps to Describing Expectations - 4
  • Decide how to measure the meeting of an expectation: can we measure it directly? Indirectly, through indicators?
  • In general, the more direct the measurement, the greater its content validity.
  • For more complex, higher-order expectations, we may need to use indicators of an unobservable result.
Steps to Describing Expectations - 5
  • Decide upon a preferred measurement tool or student task.
  • Describe the expectation in terms that measure student competence and yield useful feedback.
Try it!
  • Which faculty expectation? Our sample is this: Graduates will be lifelong learners
  • Decide: Under what condition? When and where will students demonstrate skills?
  • Decide: How well? What will we use as criteria?
Try it!
  • Under what condition?
  • Condition: Students will give evidence of having the ability and the propensity to engage in lifelong learning prior to graduation from the program.
Try it!
  • How well? Specify performance criteria for the extent to which students:
    • Display a knowledge of current disciplinary professional journals and can critique them
    • Are able to access sources of disciplinary knowledge
    • Seek opportunities to engage in further professional development activities
    • Other?
Principles of Graduate Assessment
  • Clearly differentiate master’s and doctoral level expectations
  • Assessment must be responsive to the more individualized nature of graduate programs
  • Assessment of real student work is preferable
  • Students already create the products we can use for assessment!
Principles of Graduate Assessment (continued)
  • Use assessment both as a self-reflection tool and an evaluative tool
  • Build in feedback to the student and checkpoints
  • Use natural points of contact with administrative processes
Common Faculty Expectations at the Graduate Level
  • Students will demonstrate professional and attitudinal skills, including:
    • Oral, written and mathematical communication skills;
    • Knowledge of concepts in the discipline;
    • Critical and reflective thinking skills;
    • Knowledge of the social, cultural, and economic contexts of the discipline;
    • Ability to apply theory to professional practice;
    • Ability to conduct independent research;
Common Faculty Expectations at the Graduate Level (continued)
  • Students will demonstrate professional and attitudinal skills, including:
    • Ability to use appropriate technologies;
    • Ability to work with others, especially in teams;
    • Ability to teach others; and
    • Demonstration of professional attitudes and values such as workplace ethics and lifelong learning.
Areas and Linkage Points to Consider in Graduate Assessment
  • Deciding on what is important to measure
  • Pre-program assessment
  • In-program assessment
  • Assessment at program completion
  • Long-term assessment
  • Educational process assessment
  • Comprehensive assessment (program review)
Use Natural Linkage Points
  • Admission: use diagnostic exam or GRE subject test
  • Annual: advising appointment/progress check
  • Qualifying/Comprehensive exams: embed items relevant to program objectives
  • Thesis and dissertation: develop rubrics to rate multiple areas relevant to program objectives (see the sketch after this list)
  • Exit: exit interview; exit survey at thesis appointment, check-out, or commencement
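
To make the rubric idea concrete, here is a minimal Python sketch. The areas echo the common faculty expectations listed earlier; the 1-to-4 scale and the per-area averaging are assumptions for illustration, not a prescribed instrument.

  # Hypothetical dissertation rubric: each area maps to a program objective and
  # is rated by each committee member on an assumed 1 (weak) to 4 (strong) scale.
  RUBRIC_AREAS = [
      "Knowledge of concepts in the discipline",
      "Ability to conduct independent research",
      "Written communication",
      "Oral defense",
  ]

  def summarize_ratings(ratings_by_rater):
      # Average each area across raters so the program sees area-level
      # results rather than a single overall score.
      return {
          area: sum(r[area] for r in ratings_by_rater) / len(ratings_by_rater)
          for area in RUBRIC_AREAS
      }

  ratings = [
      {"Knowledge of concepts in the discipline": 4,
       "Ability to conduct independent research": 3,
       "Written communication": 3, "Oral defense": 4},
      {"Knowledge of concepts in the discipline": 3,
       "Ability to conduct independent research": 3,
       "Written communication": 4, "Oral defense": 4},
  ]
  for area, avg in summarize_ratings(ratings).items():
      print(f"{area}: {avg:.1f}")

Area-level averages like these feed results back to specific program objectives, which is what makes the thesis/dissertation linkage point useful for assessment.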
Pre-Program Assessment
  • Re-Thinking Admissions Criteria (Hagedorn and Nora, 1997):
    • Problem: Graduate persistence.
    • GRE is only designed to predict first-year performance.
    • UG GPA and GRE are not measures of professional and attitudinal competency.
    • A variety of skills, talents, and experiences is necessary for success but not usually included in admissions criteria.
  • Evaluating the fit between the program and the student is important.
Other Pre-Program Assessment Tools
  • Portfolio and/or structured interviews featuring:
    • Research interests and previous products
    • Critique of a report or research paper
    • Plan for a research project
    • Prior out-of-class experiences
  • Inventories to assess motivation, personality, fit to program
In-Program Assessment of Student Learning
  • Based on faculty expectations
  • Methods may include assessment of:
    • Case studies, term papers, projects
    • Oral seminar presentations
    • Preliminary exams, knowledge in field
    • Research and grant proposals
    • Portfolios
    • Problem-Based Learning or Team projects
    • Input from advisors, graduate internship director
Assessment at Program Completion
  • Allows demonstration of synthesis of knowledge, skills and attitudes learned
  • Ideal comprehensive assessment point, but a sense of where the student began is desirable to assess change, growth, and value added
    • Qualitative analysis may be appropriate
    • Portfolio of research, scholarly products
Assessment at Program Completion (continued)
  • Methods may include assessment of:
    • Thesis/dissertation; oral defense
    • Professional registration or licensure exam
    • Published works, conference papers
    • Portfolio
    • Exit interview
    • Exit survey
Long-Term Assessment
  • Common sentiment: graduates can adequately self-assess the outcomes of their program only after they have been applying their skills for several years following graduation.
  • Pursuing long-term assessment, based on identified learning objectives, is an important component of a graduate assessment program.
Long-Term Assessment (continued)
  • AAU (1998): important to track graduates of post-baccalaureate programs:
    • to gain information on expectations vs. learning experiences;
    • to gain data on outcomes and placement.
  • Other reasons: to keep them involved in the life of the school; to bring them back as speakers, mentors, advisory board members…and donors.
Long-Term Assessment (continued)
  • May include assessment of:
    • Job placement and linkage to degree
    • Career success
    • Production of scholarly work
    • Evidence of lifelong learning
    • Awards and recognition gained
    • Participation in professional societies
    • Satisfaction with knowledge gained
Long-Term Assessment (continued)
  • Common Assessment Methods:
    • Follow-up interviews, surveys or focus groups
    • Journal publications
    • Citation indices
    • Membership lists and papers presented in professional/disciplinary associations
Value of Assessing the Educational Process
  • Widely viewed as key to graduate retention
  • Helps us understand the strengths and improvement needs of graduate coursework, research experience, teaching experience, advising, and support services.
  • Environment and process assessment: see Golde and Dore (2001) survey for Pew Charitable Trusts.
Ways of Assessing the Educational Process (continued)
  • Graduate student advisory groups
  • Surveys of students, focus groups
  • Peer review of teaching
  • Institutional data: time to degree, graduation rate
  • Advising process
  • Mentoring process
Assessing the Mentoring Process
  • A primary graduate learning and professional enculturation process
  • Mentoring at UC Berkeley (Nerad and Miller, 1996):
    • All faculty advise individuals, but mentoring is the shared responsibility of all members of the department
    • Individual faculty mentors to students
    • Departmental seminars and workshops
Comprehensive Assessment: Program Review
  • The combination of an internal self-study and an external review of the program by qualified faculty peers forms a very powerful and comprehensive assessment device.
  • Program review encompasses an examination of resources, processes, and student learning outcomes.
Program Review: Examples of Areas to Evaluate
  • Achievement of Faculty Expectations
    • communication skills appropriate to the discipline, professional and attitudinal competency, ability to conduct independent research, etc.
  • Processes
    • coursework, research opportunities, teaching, internships, comprehensive exams, theses, and time in residence
  • Resources (Profile)
    • faculty, students, library, instructional and lab space, financial support, extramural support, etc.
Putting the Pieces Together
  • Adapted from Baird (1996): matrix of faculty expectations, linkage points to use in conducting assessment, and some possible methods to use.
  • Adapt for use by each department by inserting appropriate faculty expectations for each program.
Case Study
  • See case study handout
  • Doctoral program in Physics at Muggy Research University (MRU)
  • First time through their assessment process
  • Data in hand: What now?
  • You are the consultants!
Case Study: Debriefing Questions
  • What do you see in the results?
  • What do you recommend?
  • What actions do they need to take?
  • In light of their mission, what should they do next time?

Taking Assessment Online

Georgia Tech’s Approach: Online Assessment Tracking System (OATS)

OATS: Purpose
  • Annual Assessment Updates are a key piece in Tech’s efforts to demonstrate compliance with SACS Principles of Accreditation.
  • The Annual Assessment Updates concept was generated by GT unit coordinators in 1998 as a way of documenting Tech’s responsiveness to SACS recommendations regarding assessment practices.
  • Many people have requested that the process be moved to an online environment.
  • The online process provides structure, formalizes best practices in assessment of student learning, and thus facilitates demonstration of compliance.
  • SACS 2005 will be an electronic remote review.
Annual Assessment Update

[Cycle diagram: Objectives → Methods → Results → Actions]

  • Objectives: What did you look at?
  • Methods: How did you look at it?
  • Results: What did you find?
  • Actions: What did you do?
Feature Comparison

Old System:
  • Many different formats
  • Hard copy only
  • Difficult to track progress over time
  • Flexibility, but no consistency across the Institute
  • Difficult to provide feedback internally and to facilitate institutional sharing of good practices

OATS:
  • Consistent format
  • Database storage
  • Ability to track progress over time
  • Flexibility maintained
  • Process facilitates accreditation e-review
  • Easier to provide feedback; facilitates institutional sharing
OATS Application
  • Includes user id/password logon
  • Web accessible from any location
  • Defined format structure: Objectives, Methods, Results, and Actions/Impact (see the sketch after this list)
    • Allows posting of formatted text (tables, charts, etc.)
    • Allows notes and written feedback
  • Review at School/Unit and College level keeps everyone in the loop
  • OATS Production Date: October 1
  • Assessment Updates due: December 1 this year
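
To illustrate the defined format structure, here is a minimal, hypothetical sketch of an assessment-update record in Python. OATS is Georgia Tech's internal system, so the field names and sample values below are assumptions for illustration, not its actual schema.

  from dataclasses import dataclass, field

  @dataclass
  class AssessmentUpdate:
      # Hypothetical record mirroring the OATS format structure:
      # Objectives, Methods, Results, and Actions/Impact.
      program: str
      year: int
      objectives: list    # What did you look at?
      methods: list       # How did you look at it?
      results: list       # What did you find?
      actions: list       # What did you do?
      reviewer_notes: list = field(default_factory=list)  # feedback from school/college review

  update = AssessmentUpdate(
      program="Physics Ph.D.",   # invented sample values
      year=2003,
      objectives=["Graduates will be able to conduct independent research"],
      methods=["Dissertation rubric ratings", "Exit survey"],
      results=["Most dissertations rated acceptable or above on independence"],
      actions=["Added a research-design seminar in the second year"],
  )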
Example - School Level: History, Technology & Society

[Screenshot: assessment updates for the School of History, Technology & Society, each marked “Sent to College”]
Summary
  • SACS requires assessment of graduate programs, research, and public service
  • Make it relevant to the program
  • Keep it simple and focused
  • Consider different assessments for each stage of student progress
  • Start now: it takes several years to fine tune
References:
  • See references in back of handout
Session Evaluation
  • What one aspect was the most useful to you?
  • What one aspect most needs improvement, and what kind of improvement?
  • Other suggestions?

Thank You!

Questions? Contact us!

Joseph.hoey@oars.gatech.edu

Lorne@wm.edu