A Basic Toolbox for Assessing Institutional Effectiveness
Michael F. Middaugh
Assistant Vice President for Institutional Research and Planning, University of Delaware
Commissioner and Vice Chair, Middle States Commission on Higher Education
Workshop Objectives
“[The academic ratchet] is a term to describe the steady, irreversible shift of faculty allegiance away from the goals of a given institution, toward those of an academic specialty. The ratchet denotes the advance of an entrepreneurial spirit among faculty nationwide, leading to increased emphasis on research and publication, and on teaching one’s specialty in favor of general introduction courses, often at the expense of coherence in an academic curriculum. Institutions seeking to enhance their own prestige may contribute to the ratchet by reducing faculty teaching and advising responsibilities across the board, enabling faculty to pursue their individual research and publication with fewer distractions. The academic ratchet raises an institution’s costs, and it results in undergraduates paying more to attend institutions in which they receive less attention than in previous decades.”
(Zemsky and Massy, 1990, p. 22)
“To an overwhelming degree, they [research universities] have furnished the cultural, intellectual, economic, and political leadership of the nation. Nevertheless, the research universities have too often failed, and continue to fail, their undergraduate populations…Again and again, universities are guilty of advertising practices they would condemn in the commercial world. Recruitment materials display proudly the world-famous professors, the splendid facilities and ground breaking research that goes on within them, but thousands of students graduate without ever seeing the world-famous professors or tasting genuine research. Some of their instructors are likely to be badly trained or untrained teaching assistants who are groping their way toward a teaching technique; some others may be tenured drones who deliver set lectures from yellowed notes, making no effort to engage the bored minds of the students in front of them.”
(Boyer Commission, pp. 5-6)
“The trouble is that higher education remains a labor-intensive service industry made up of thousands of stubbornly independent and mutually jealous units that support expensive and vastly underused facilities. It is a more than $200 billion-a-year economic enterprise – many of whose leaders oddly disdain economic enterprise, and often regard efficiency, productivity, and commercial opportunity with the same hauteur with which Victorian aristocrats viewed those ‘in trade’… The net result is a hideously inefficient system that, for all its tax advantages and public and private subsidies, still extracts a larger share of family income than almost anywhere else on the planet…”
(America’s Best Colleges, p. 91)
(National Commission on the Cost of Higher Education, p. 20)
“We believe that improved accountability is vital to ensuring the success of all of the other reforms we propose. Colleges and universities must become more transparent about cost, price, and student success outcomes, and must willingly share this information with students and families. Student achievement, which is inextricably connected to institutional success, must be measured by institutions on a “value-added” basis that takes into account students’ academic baseline when assessing their results. This information should be available to students, and reported publicly in aggregate form to provide consumers and policymakers an accessible, understandable way to measure the relative effectiveness of different colleges and universities.”
(Spellings Commission, p.4)
It is the Commission’s intent, through the self-study process, to prompt institutions to reflect on those assessment activities currently in place (both for institutional effectiveness and student learning), to consider how these assessment activities inform institutional planning, and to determine how to improve the effectiveness and integration of planning and assessment.
Assessment of student learning demonstrates that, at graduation, or other appropriate points, the institution’s students have knowledge, skills, and competencies consistent with institutional and appropriate higher education goals.
The institution has developed and implemented an assessment process that evaluates its overall effectiveness in achieving its mission and goals and its compliance with accreditation standards.
An institution conducts ongoing planning and resource allocation based on its mission and goals, develops objectives to achieve them, and utilizes the results of its assessment activities for institutional renewal. Implementation and subsequent evaluation of the success of the strategic plan and resource allocation support the development and change necessary to improve and to maintain quality.
While we will discuss several variables today that contribute to assessment of institutional effectiveness, keep in mind that you don’t have to measure everything.
PRIORITIZE within the context of your institution’s culture and needs.
- College Board Admitted Student Questionnaire
- College Board Admitted Student Questionnaire-Plus
The survey allows respondents to rate 16 items with respect to their influence on the college selection decision. In choosing which institution to attend, the top three considerations for both enrolling and non-enrolling students are availability of majors, academic reputation of the institution, and commitment to teaching undergraduates. These are followed closely by educational value for price paid.
Survey respondents are then asked to rate the focal institution on the 16 dimensions, compared with other institutions to which they applied and were accepted.
NOTE: Student perceptions don’t have to be accurate to be real. It is the reality of student perceptions that must be addressed.
- ACT Survey of Student Opinions
- Noel-Levitz Student Satisfaction Inventory
Student use of, and satisfaction with, 21 programs and services typically found at a college or university (e.g., academic advising, library, computing, residence life, food services, etc.)
Student satisfaction with 43 dimensions of campus environment (e.g., out-of-classroom availability of faculty, availability of required courses, quality of advisement information, facilities, admissions and registration procedures, etc.)
Self-estimated intellectual, personal, and social growth; Overall impressions of the college experience
NOTE: Survey is available in four-year and two-year college versions.
Note: While there are a number of instruments for assessing satisfaction among undergraduate students, there is very little instrumentation for assessing graduate student satisfaction. Graduate students are virtually forgotten in this facet of student research, and data collection instruments are generally locally developed, if they exist at all.
We have just developed a Graduate Student Satisfaction Survey and will be happy to share it with interested parties.
“An inadequate report of an inaccurate judgment by a biased and variable judge of the extent to which a student has attained an undefined level of mastery of an unknown proportion of an indefinite material.”
None of these should be applied to evaluation of individual student performance for purposes of grading and completion/graduation status.
2. Locally Produced Tests/Items
3. Portfolios/Student Artifacts
4. Final Projects
5. Capstone Experiences/Courses
AAUP Academe (March/April issue) annually publishes average faculty salaries, by rank, for most two-year and four-year institutions in the country.
CUPA-HR and the Oklahoma Salary Study annually publish average faculty salaries by rank, by discipline, and by Carnegie institution type. They also publish average salaries for newly hired assistant professors.
Salary equity and salary compression studies
Employee Satisfaction Studies
Campus Climate Studies
Institutional “Dashboards” that report on key success indicators can be a succinct means of reporting basic measures of institutional effectiveness: www.udel.edu/ir/UDashboard
Claims of institutional effectiveness are stronger when the focal institution’s performance on important measures is compared with that of peer institutions.
We are extending the Dashboard concept to include key variables related to the University’s new Strategic Plan, enabling us to compare our position with that of both actual and aspirational peers.
Percent of Accepted Applicants Matriculated
Total Minority Percentage in Undergraduate Student Body
Freshman to Sophomore Retention Rate
Four Year Graduation Rate
Six Year Graduation Rate
Total R&D Expenditures per Full Time Faculty
Total Service Expenditures per Full Time Faculty
University Endowment as of June 30
Alumni Annual Giving Rate
Full Time Students per Full Time Faculty
Total Degrees Granted
Total Doctoral Degrees Granted
Percent of Faculty With Tenure
Percent of Faculty That Are Full Time
Percent of Women Among Full Time Faculty
Percent of Minorities Among Full Time Faculty
Percent of Minorities Among Full Time Staff
Total Faculty Who Are National Academy Members