
A Basic Toolbox for Assessing Institutional Effectiveness

Michael F. Middaugh

Assistant Vice President for Institutional Research and Planning

University of Delaware

Commissioner and Vice Chair

Middle States Commission on Higher Education



Workshop Objectives

  • Identify context for why assessment of institutional effectiveness is important

  • Identify key variables in developing measures for assessing institutional effectiveness

  • Identify appropriate data collection strategies for measuring those variables

  • Identify appropriate strategies for communicating information (NOTE THAT I DID NOT SAY DATA!) on institutional effectiveness



Context



Robert Zemsky and William Massy - 1990

“[The academic ratchet] is a term to describe the steady, irreversible shift of faculty allegiance away from the goals of a given institution, toward those of an academic specialty. The ratchet denotes the advance of an entrepreneurial spirit among faculty nationwide, leading to increased emphasis on research and publication, and on teaching one’s specialty in favor of general introduction courses, often at the expense of coherence in an academic curriculum. Institutions seeking to enhance their own prestige may contribute to the ratchet by reducing faculty teaching and advising responsibilities across the board, enabling faculty to pursue their individual research and publication with fewer distractions. The academic ratchet raises an institution’s costs, and it results in undergraduates paying more to attend institutions in which they receive less attention than in previous decades.”

(Zemsky and Massy, 1990, p. 22)



Boyer Commission on Educating Undergraduates - 1998

“To an overwhelming degree, they [research universities] have furnished the cultural, intellectual, economic, and political leadership of the nation. Nevertheless, the research universities have too often failed, and continue to fail, their undergraduate populations…Again and again, universities are guilty of advertising practices they would condemn in the commercial world. Recruitment materials display proudly the world-famous professors, the splendid facilities and ground breaking research that goes on within them, but thousands of students graduate without ever seeing the world-famous professors or tasting genuine research. Some of their instructors are likely to be badly trained or untrained teaching assistants who are groping their way toward a teaching technique; some others may be tenured drones who deliver set lectures from yellowed notes, making no effort to engage the bored minds of the students in front of them.”

(Boyer Commission, pp. 5-6)



U.S. News “America’s Best Colleges” - 1996

“The trouble is that higher education remains a labor-intensive service industry made up of thousands of stubbornly independent and mutually jealous units that support expensive and vastly underused facilities. It is a more than $200 billion-a-year economic enterprise – many of whose leaders oddly disdain economic enterprise, and often regard efficiency, productivity, and commercial opportunity with the same hauteur with which Victorian aristocrats viewed those ‘in trade’… The net result is a hideously inefficient system that, for all its tax advantages and public and private subsidies, still extracts a larger share of family income than almost anywhere else on the planet…”

(America’s Best Colleges, p. 91)



National Commission on the Cost of Higher Education - 1998

  • “…because academic institutions do not account differently for time spent directly in the classroom and time spent on other teaching and research activities, it is almost impossible to explain to the public how individuals employed in higher education use their time. Consequently, the public and public officials find it hard to be confident that academic leaders allocate resources effectively and well. Questions about costs and their allocation to research, service, and teaching are hard to discuss in simple, straightforward ways and the connection between these activities and student learning is difficult to draw. In responding to this growing concern, academic leaders have been hampered by poor information and sometimes inclined to take issue with those who asked for better data. Academic institutions need much better definitions and measures of how faculty members, administrators, and students use their time.”

(National Commission on the Cost of Higher Education, p. 20)



Spellings Commission on the Future of Higher Education 2006

“We believe that improved accountability is vital to ensuring the success of all of the other reforms we propose. Colleges and universities must become more transparent about cost, price, and student success outcomes, and must willingly share this information with students and families. Student achievement, which is inextricably connected to institutional success, must be measured by institutions on a “value-added” basis that takes into account students’ academic baseline when assessing their results. This information should be available to students, and reported publicly in aggregate form to provide consumers and policymakers an accessible, understandable way to measure the relative effectiveness of different colleges and universities.”

(Spellings Commission, p. 4)



Middle States Accreditation Standards Expectations: Assessment & Planning

It is the Commission’s intent, through the self-study process, to prompt institutions to reflect on those assessment activities currently in place (both for institutional effectiveness and student learning), to consider how these assessment activities inform institutional planning, and to determine how to improve the effectiveness and integration of planning and assessment.



MSCHE Linked Accreditation Standards: Standard 14: Student Learning Outcomes

Assessment of student learning demonstrates that, at graduation, or other appropriate points, the institution’s students have knowledge, skills, and competencies consistent with institutional and appropriate higher education goals.



Selected Fundamental Elements for MSCHE Standard 14

  • Articulated expectations for student learning (at institutional, degree/program, and course levels)

  • Documented, organized, and sustained assessment processes (that may include a formal assessment plan)

  • Evidence that student learning assessment information is shared and used to improve teaching and learning

  • Documented use of student learning assessment information as part of institutional assessment



MSCHE Linked Accreditation Standards: Standard 7: Institutional Assessment

The institution has developed and implemented an assessment process that evaluates its overall effectiveness in achieving its mission and goals and its compliance with accreditation standards.



Selected Fundamental Elements for MSCHE Standard 7

  • Documented, organized, and sustained assessment processes to evaluate the total range of programs and services, achievement of mission, and compliance with accreditation standards

  • Evidence that assessment results are shared and used in institutional planning, resource allocation and renewal.

  • Written institutional strategic plan(s) that reflect(s) consideration of assessment results



MSCHE Linked Accreditation Standards: Standard 2: Planning, Resource Allocation and Institutional Renewal

An institution conducts ongoing planning and resource allocation based on its mission and goals, develops objectives to achieve them, and utilizes the results of its assessment activities for institutional renewal. Implementation and subsequent evaluation of the success of the strategic plan and resource allocation support the development and change necessary to improve and to maintain quality.



Selected Fundamental Elements for MSCHE Standard 2

  • Clearly stated goals and objectives that reflect conclusions drawn from assessments that are used for planning and resource allocation at the institutional and unit levels

  • Planning and improvement processes that are clearly communicated, provide for constituent participation, and incorporate the use of assessment results

  • Assignment of responsibility for improvement and assurance of accountability



Variables

While we will discuss several variables today that contribute to assessment of institutional effectiveness, keep in mind that you don’t have to measure everything.

PRIORITIZE within the context of your institution’s culture and needs.



Students

  • Admitted

  • Entering

  • Continuing

  • Non-Returning

  • Graduating

  • Alumni



Environmental Issues

  • Student and Faculty Engagement

  • Student and Staff Satisfaction

  • Employee Productivity

  • Compensation

    - Market

    - Equity

  • Campus Climate

  • Economic Impact



STUDENTS



Admitted Students

  • What can we learn from monitoring admissions cycles?

  • What additional drill down is needed to fully understand student admissions behavior?



A Typical Admissions Monitoring Report



Drilling Down

  • Why do some students to whom we extend an offer of admission choose to attend our institution?

  • Why do other students to whom we extend an offer of admission choose to attend a different school?

  • How is our institution perceived by prospective students within the admissions marketplace?

  • What sources of information do students draw upon in shaping those perceptions?

  • What is the role of financial aid in shaping the college selection decision?



Survey Research is Useful in Addressing These Questions

  • “Home-Grown” College Student Selection Survey

  • Commercially Prepared

    - College Board Admitted Student Questionnaire

    - College Board Admitted Student Questionnaire-Plus

  • Commercially prepared allows for benchmarking



Academic Preparedness of Respondent Pool





The survey allows respondents to rate 16 items with respect to their influence on the college selection decision. In choosing which institution to attend, the top three considerations for both enrolling and non-enrolling students are availability of majors, academic reputation of the institution, and commitment to teaching undergraduates. These are followed closely by educational value for price paid.



What Admitted Students are Seeking



Survey respondents are then asked to rate the focal institution on the 16 dimensions, compared with other institutions to which they applied and were accepted. NOTE: Student perceptions don’t have to be accurate to be real. It is the reality of student perceptions that must be addressed.



Important Perceptions about University of Delaware



Financial Aid as a Factor in College Selection

  • Survey allows respondents to report financial aid awards from “The college you plan to attend.”

  • Work study awards at UD and at competitors are virtually identical, while the average loan award at competitors is about $1,000 higher than at UD.

  • The average need-based grant at competitors is double that for UD.

  • The average merit grant at competitors is about $5,000 higher than at UD.

  • Total financial aid packages awarded by competitors are about double that awarded by UD.



A Potential Competitive Disadvantage for UD



Entering Students

  • ACT College Student Needs Assessment Survey: Asks respondents to identify skill areas – academic and social – where they feel they will need assistance in the coming year.

  • College Student Expectations Questionnaire: Asks respondents to assess their level of expectations with respect to intellectual, social, and cultural engagement with faculty and other students in the coming year.



Continuing/Returning Students

  • Student Satisfaction Research

    - ACT Survey of Student Opinions

    - Noel-Levitz Student Satisfaction Inventory

  • ACT Survey of Student Opinions

    Student use of, and satisfaction with 21 programs and services typically found at a college or university (e.g. academic advising, library, computing, residence life, food services, etc.)

    Student satisfaction with 43 dimensions of campus environment (e.g., out-of-classroom availability of faculty, availability of required courses, quality of advisement information, facilities, admissions and registration procedures, etc.)

    Self-estimated intellectual, personal, and social growth; Overall impressions of the college experience

    NOTE: Survey is available in four-year and two-year college versions.



What About Non-Returning Student Research?



What About Non-Returning Student Research? Drilling Deeper…

  • Commercial instruments exist, but response rates tend to be low, and the reported reasons for leaving tend to be the politically correct ones: personal or financial.

  • For the last several years, we have administered the Survey of Student Opinions during the Spring term to a robust sample of students across freshman, sophomore, junior, and senior classes.

  • The following Fall, the respondent pool is disaggregated into those who took the Survey and returned in the Fall, and those who took the Survey, did not return in the Fall, and did not graduate.

  • Test for statistically significant differences in response patterns between the two groups.
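A minimal sketch of that comparison in Python (not the University's actual procedure), assuming a hypothetical file sso_responses.csv with one row per Survey of Student Opinions respondent, a returned_fall flag, and satisfaction items prefixed sat_ (all names invented for illustration):

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("sso_responses.csv")                     # hypothetical file name
items = [c for c in df.columns if c.startswith("sat_")]   # hypothetical item prefix

returned = df[df["returned_fall"] == 1]   # took the survey and returned in the Fall
left = df[df["returned_fall"] == 0]       # took the survey, did not return, did not graduate

for item in items:
    # Welch's t-test on each satisfaction item; flag items where the two
    # groups differ at the .05 level.
    t, p = stats.ttest_ind(returned[item].dropna(), left[item].dropna(), equal_var=False)
    if p < 0.05:
        print(f"{item}: returners={returned[item].mean():.2f}, "
              f"non-returners={left[item].mean():.2f}, p={p:.3f}")
```

Items that non-returners rated significantly lower are natural candidates for the follow-up Campus Pulse Surveys described next.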



Campus Pulse Surveys

  • Based upon information gleaned from the Survey of Student Opinions, we annually develop five or six short, focused, web-based Campus Pulse Surveys directed at specific issues that surfaced. Among recent Campus Pulse Surveys:

    • Registration Procedures Within a PeopleSoft Environment

    • Quality of Academic Advising at the University

    • Personal Security on Campus

    • Issues Related to Diversity within the Undergraduate Student Body



Note: While there are a number of instruments that allow for assessment of student satisfaction among undergraduate students, there is very little in the way of instrumentation for survey research on graduate students. Graduate students are virtually forgotten when it comes to any facet of student research, and data collection instruments are generally locally developed, if they exist at all.

We have just developed a Graduate Student Satisfaction Survey and will be happy to share it with interested parties.



Student Engagement

  • College Student Expectations Questionnaire (CSXQ)

  • College Student Experiences Questionnaire (CSEQ)

  • National Survey of Student Engagement (NSSE)



Benchmarks of Effective Educational Practice(NSSE)

  • Level of academic challenge

    • Course prep, quantity of readings and papers, course emphasis, campus environment emphasis

  • Student interactions with faculty members

    • Discuss assignments/grades, career plans & readings outside of class, prompt feedback, student-faculty research

  • Supportive campus environment

    • Resources to succeed academically, cope w/ non-academic issues, social aspect, foster relationships w/ students, faculty, staff

  • Active and collaborative learning

    • Ask questions & contribute in class, class presentations, group work, tutor peers, community-based projects, discuss course-related ideas outside class

  • Enriching educational experiences

    • Interact w/ students of a different race or ethnicity, w/ different religious & political beliefs, different opinions & values, campus environment encourages contact among students of different economic, social, & racial or ethnic backgrounds, use of technology, participate in wide-range of activities (internships, community service, study abroad, independent study, senior capstone, co-curricular activities, learning communities)



Graduating Students



Alumni Research

  • Commercially prepared instruments exist. Decide if they meet your needs or if you have to develop your own.

  • Decide early on why you are doing this research: Are you assessing the continuing relevance of the college experience? Are you cultivating prospects for the Development Office? Both? Be up front if fund raising is a component.

  • Decide which classes you need to survey; don’t go after every living alumnus unless you are a very young institution.



Assessing Student Learning Outcomes

  • I’ll provide only a brief overview, as there are others (Linda Suskie, Trudy Banta, Jeff Seybert) who are far better versed than I am.

  • That said, understand that assessment of student learning is at the core of demonstrating overall institutional effectiveness.

  • Assessment of student learning is a direct response to the inadequacy of student grades for describing general student learning outcomes.



According to Paul Dressel of Michigan State University (1983), Grades Are:

“An inadequate report of an inaccurate judgment by a biased and variable judge of the extent to which a student has attained an undefined level of mastery of an unknown proportion of an indefinite amount of material.”



There is no “one size fits all” approach to assessment of learning across the disciplines

None of these should be applied to evaluation of individual student performance for purposes of grading and completion/graduation status.

1. Standardized Tests

  • General Education or Discipline Specific

  • State, Regional, or National Licensure Exams

2. Locally Produced Tests/Items

  • “Stand Alone” or Imbedded

3. Portfolios/Student Artifacts

  • Collections of Students’ Work

  • Can Be Time Consuming, Labor Intensive, and Expensive

4. Final Projects

  • Demonstrate Mastery of Discipline and/or General Education

5. Capstone Experiences/Courses

  • Entire Course, Portion of a Course, or a Related Experience (Internship, Work Placement, etc.)



Non-Student Measures of Institutional Effectiveness: Teaching Productivity, Instructional Costs, and Externally Funded Scholarship



Budget Support Metrics



In 1988, the University of Delaware…

  • Was transitioning from a highly centralized, “closed” management style with respect to sharing of information.

  • Had grown from a total enrollment of 7,900 students in the late 1960s to 20,000+ students in the mid-1980s.

  • When financial data were examined, found itself with $9 million in recurring expenses supported by non-recurring revenue sources.

  • Needed to eliminate 240+ full time positions from the basic budget to achieve a balanced budget.



Ground Rules in Making Budgetary Decisions

  • Decisions would be rooted in quantitative and qualitative data that would be collegially developed and broadly shared.

  • Savings would initially be achieved by eliminating vacant positions, outsourcing non-essential functions, and taking advantage of technology.

  • In eliminating human and fiscal resources, to the largest extent possible the academic core of the University would be insulated. However, over the long term, resources would need to be reallocated between and among academic units.



How Best to Make Resource Reallocation Decisions Within Academic Units?

  • A series of budget support metrics would be developed for measuring instructional productivity and costs within academic departments and programs.

  • Input as to which variables should be used was sought from deans and department chairs. These variables were supplemented by those identified by the Office of Institutional Research and Planning, and a final set of productivity/cost variables was achieved through consensus.

  • The resulting product became known as “Budget Support Notebooks.”
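As a hypothetical illustration only (the actual Budget Support Notebooks are far more detailed), the kind of productivity/cost ratios involved can be computed from a handful of departmental figures; all names and numbers below are invented:

```python
from dataclasses import dataclass

@dataclass
class DeptYear:
    """One academic department's figures for a single year (illustrative)."""
    name: str
    student_credit_hours: float   # SCRH taught
    fte_faculty: float            # instructional FTE faculty
    direct_instr_cost: float      # direct instructional expenditures ($)
    external_funds: float         # externally funded scholarship ($)

def budget_support_metrics(d: DeptYear) -> dict:
    # Ratios echo the indicators named later in the presentation
    # (SCRH/FTE Faculty, Direct Cost/SCRH, External Funds/FTE Faculty).
    return {
        "SCRH per FTE faculty": d.student_credit_hours / d.fte_faculty,
        "Direct cost per SCRH": d.direct_instr_cost / d.student_credit_hours,
        "External funds per FTE faculty": d.external_funds / d.fte_faculty,
    }

example = DeptYear("History", 9_500, 25.0, 2_100_000, 150_000)
print(budget_support_metrics(example))
```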



Looking at a Budget Support Notebook Page from a Humanities Department Within the College of Arts and Science



In the Initial Stages, Summary Data Such as Budget Support Data May be Challenged, and Must be Supported by Solid Background Analysis



Who Is Teaching What to Whom?



At What Cost?



Ground Rules For Using Budget Support Data

  • Decisions are never made on the basis of a single year of data. Trend information is the goal.

  • Data are not used to reward or penalize, but rather as tools of inquiry as to why productivity/cost patterns are as they appear.

  • It is understood that there are qualitative dimensions to productivity and cost that are not captured in this analysis.



Budget Support Data

  • Gave academic units a sense of buy-in and ownership of the data being used for academic management and planning.

  • Helped transform academic departments from 54 loosely confederated fiefdoms into a coherent university, with each unit contributing in meaningful ways to realization of the University’s mission areas of teaching, research, and service.



Extending Budget Support Analysis….

  • As useful as appropriate comparisons are between and among like departments within the University, comparisons would be substantially enhanced if, for example, the History Department at the University of Delaware could be compared with History Departments at actual peer universities, and at universities with History Departments to which the University of Delaware aspires.

  • Out of this need for external comparative information, the Delaware Study of Instructional Costs and Productivity was born.



Delaware Study of Instructional Costs and Productivity

  • Over the past decade, the Delaware Study of Instructional Costs and Productivity has emerged as the tool of choice for benchmarking data on faculty teaching loads, direct instructional costs, and externally funded faculty scholarship, all at the academic discipline level of analysis.

  • The emphasis on disciplinary analysis is non-trivial. Over 80 percent of the variance in instructional expenditures across four-year postsecondary institutions is accounted for by the disciplines that comprise a college’s or university’s curriculum.
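The 80 percent figure is a share-of-variance statistic. A minimal sketch of how such a share could be computed from discipline-level cost records, assuming a hypothetical file and column names (delaware_style_costs.csv, discipline, cost_per_scrh):

```python
import pandas as pd

df = pd.read_csv("delaware_style_costs.csv")   # hypothetical extract: one row per institution/discipline
y = df["cost_per_scrh"]

grand_mean = y.mean()
ss_total = ((y - grand_mean) ** 2).sum()

# Between-discipline sum of squares: variation of discipline means around the grand mean.
group_means = df.groupby("discipline")["cost_per_scrh"].transform("mean")
ss_between = ((group_means - grand_mean) ** 2).sum()

print(f"Share of variance associated with discipline: {ss_between / ss_total:.1%}")
```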



Delaware Study: Teaching Load/Cost Data Collection Form



Using Delaware Study Data



  • We provide the Provost with data from multiple years of the Delaware Study, looking at the University indicators as a percentage of the national benchmark for research universities.

  • The Provost receives a single sheet for each academic department, with graphs reflecting the following indicators: Undergraduate Fall SCRH/FTE Faculty; Total Fall SCRH/FTE Faculty; Total AY SCRH/FTE Faculty (All); Fall Class Sections/FTE Faculty; Direct Cost/SCRH; External Funds/FTE Faculty.
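A small sketch of the percentage-of-benchmark calculation behind those graphs; the indicator names follow the list above, and all values are purely illustrative:

```python
# Hypothetical departmental values and national research-university benchmarks.
dept = {
    "Undergraduate Fall SCRH/FTE Faculty": 180.0,
    "Total Fall SCRH/FTE Faculty": 230.0,
    "Fall Class Sections/FTE Faculty": 2.1,
    "Direct Cost/SCRH": 205.0,
    "External Funds/FTE Faculty": 41_000.0,
}
benchmark = {
    "Undergraduate Fall SCRH/FTE Faculty": 200.0,
    "Total Fall SCRH/FTE Faculty": 250.0,
    "Fall Class Sections/FTE Faculty": 2.4,
    "Direct Cost/SCRH": 190.0,
    "External Funds/FTE Faculty": 55_000.0,
}

for indicator, value in dept.items():
    pct_of_benchmark = 100.0 * value / benchmark[indicator]
    print(f"{indicator}: {pct_of_benchmark:.0f}% of national benchmark")
```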



Science Department



The Delaware Study – Next Steps

  • As useful as Delaware Study teaching load/cost benchmarks are, they do not address the non-classroom dimensions of faculty activity in an institution and its academic programs.

  • It is possible that quantitative productivity and cost indicators for a given program/discipline may differ significantly from other institutional, peer, and national benchmarks for wholly justifiable reasons of quality that can be reflected in what faculty do outside of the classroom.

  • However, this cannot be determined unless measurable, proxy indicators of quality are collected.



The Delaware Study – Faculty Activity Study

  • In Fall of 2001, the University of Delaware was awarded a second three-year FIPSE grant.

  • This grant underwrote the cost of developing data collection instruments and protocols for assessing out-of-classroom facets of faculty activity.

  • Once again, the grant supported an Advisory Committee charged with responsibility for refining and enhancing data collection instrumentation, data definitions, and study methodology.

  • Faculty Activity Study collects data on 43 discrete variables related to instruction, scholarship, service to the institution, service to the profession, and public service.



Delaware Study: Faculty Activity Study Data Collection Form



Transparency in Assessment

www.udel.edu/ir



Other Issues Related to Institutional Effectiveness

  • In order to attract and retain the most capable faculty and staff, you have to compensate them.

    AAUP’s Academe (March/April issue) annually publishes average faculty salary, by rank, for most 2-year and 4-year institutions in the country.

    CUPA-HR and the Oklahoma Salary Study annually publish average faculty salary by rank, by discipline, and by Carnegie institution type; they also publish the average salary for newly hired Assistant Professors.

    Salary equity and salary compression studies (a simple sketch of one common equity approach follows).
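The sketch below uses a regression-based approach, with a hypothetical input file and column names; it is illustrative only and not necessarily the presenter's specific method:

```python
import pandas as pd
import statsmodels.formula.api as smf

faculty = pd.read_csv("faculty_salaries.csv")  # hypothetical file of individual faculty records

# Model salary as a function of legitimate pay factors: rank, years in rank,
# and a discipline market ratio (e.g., discipline average salary / overall average).
model = smf.ols("salary ~ C(rank) + years_in_rank + market_ratio", data=faculty).fit()

faculty["residual"] = model.resid
# Large negative residuals (paid well below what the model predicts) may warrant
# a closer, individual-level equity review.
flagged = faculty.sort_values("residual").head(10)
print(flagged[["dept", "rank", "salary", "residual"]])
```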



Other Issues Related to Institutional Effectiveness

  • In order to attract and retain the most capable faculty and staff, you have to provide a hospitable workplace.

    Employee Satisfaction Studies

    Campus Climate Studies



Economic Impact Studies

  • While not a direct measure of institutional effectiveness, economic impact studies can be a powerful tool in shaping college/government relations for public institutions in particular. The methodology is straightforward, and we will share it if you contact our office.



Institutional “Dashboards” that report on key success indicators can be a succinct means of reporting basic measures of institutional effectiveness.

www.udel.edu/ir/UDashboard



Claims of institutional effectiveness are stronger when the focal institution’s performance on important measures is compared with that of peer institutions.



Choosing Peer Groupings

  • Scientific – Cluster Analysis or Other Multivariate Tool (see the sketch after this list)

  • Pragmatic – e.g., Compensation Peers

  • “Whatever!” – e.g., Admissions Peers
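A minimal sketch of the cluster-analysis route; the input file, feature list, and cluster count are assumptions, and other multivariate tools (hierarchical clustering, principal components) would serve equally well:

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

inst = pd.read_csv("ipeds_extract.csv")        # hypothetical IPEDS-style institutional extract
features = ["enrollment", "research_expenditures", "endowment",
            "pct_graduate_students", "degrees_awarded"]

# Standardize the metrics so no single variable dominates the distance measure.
X = StandardScaler().fit_transform(inst[features])
inst["cluster"] = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)

# Candidate peers are institutions that land in the same cluster as the focal school.
focal_cluster = inst.loc[inst["name"] == "University of Delaware", "cluster"].iloc[0]
print(inst.loc[inst["cluster"] == focal_cluster, "name"])
```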



We are extending the Dashboard concept to include key variables related to the University’s new Strategic Plan that enable us to compare our position vis-à-vis actual peers and aspirational peers.



Dashboard Variables Being Collected

STUDENT CHARACTERISTICS

Percent of Accepted Applicants Matriculated

Total Minority Percentage in Undergraduate Student Body

Freshman to Sophomore Retention Rate

Four Year Graduation Rate

Six Year Graduation Rate

RESEARCH ACTIVITY

Total R&D Expenditures per Full Time Faculty

Total Service Expenditures per Full Time Faculty

FINANCE

University Endowment as of June 30

Alumni Annual Giving Rate

INSTRUCTION

Full Time Students per Full Time Faculty

Total Degrees Granted

Total Doctoral Degrees Granted

Percent of Faculty With Tenure

Percent of Faculty That Are Full Time

Percent of Women Among Full Time Faculty

Percent of Minorities Among Full Time Faculty

Percent of Minorities Among Full Time Staff

Total Faculty Who Are National Academy Members



A Possible Format for the Strategic Dashboard



Another Potential Format for Strategic Dashboard



That’s All, Folks!

  • What have I missed that you would like covered?

  • Other questions?

  • [email protected]

