
Assessment Committee Meeting Continuous Program Improvement






Presentation Transcript


  1. Assessment Committee Meeting: Continuous Program Improvement. 12-7-2017, DL2/706

  2. Goals for the meeting • Understand why a new conception of “assessment” is necessary at NYIT. • Discuss a new assessment framework – continuous program improvement (CPI) – that addresses overall program health, relevance, and viability, and that works for administrative as well as academic programs. • Discuss the timeline and process for rolling out the new framework.

  3. I. Why a change in the annual assessment process is necessary • Changing MSCHE accreditation standards and processes are the main reasons we need to change our annual assessment process. • In short, PLO assessment needs to continue, but it must be situated within a broader educational context and reflect all stakeholders’ endeavors in the learning process. • Furthermore, few at NYIT would argue that the current process adds significant value.

  4. MSCHE expects: • Show institutional improvement over time, indicated by financial sustainability metrics and student achievement metrics. – Trends rather than bright lines. – Compare trends to an aggregated peer group. • Report on major initiatives that are expected to lead to improvements. In the context of those initiatives, report what you learned from assessment and how the results informed the initiatives. • All of this is done in the context of institutional mission.

  5. The new 8-year process • Annual Institutional Updates: financial and student achievement data elements; responses to recommendations (if needed); opportunities/input for institutional improvement. • Midpoint Review: cumulative peer review of AIU data; feedback from the Commission. • Self-Study Evaluation: campus engagement in a self-study process that culminates with an onsite team visit by peer evaluators.

  6. Rethinking assessment: “Student Achievement” – Academic Progress. Mandatory: • Retention Rates • IPEDS Graduation Rates • Mean Time to Graduation • And, of course, Program Learning Outcomes (PLOs). Optional: • % Credits Completed/Attempted • % Pell • % Minority • % Developmental • % 1st Generation • % Non-Traditional • % Part-Time • Self-Identified

  7. Rethinking assessment: “Student Achievement” – Post-Institutional. Mandatory: • Loan Default Rate • Loan Repayment Rate. Optional: • 1st-Time Pass Rates on Licensure Exams • Graduate Survey Satisfaction Results • Career Placement Rates • “First Destination” Survey Placement Rates • Transfer Rates • Self-Identified

  8. Rethinking assessment: Financial Health Indicators – Documents. All: • Most Recent Audited Financials • IPEDS Finance Data • Title IV Compliance Audits • Catalog/URL • Most Recent USDE Composite Score. If applicable: • Bond Rating for New Debt Issued • Financial Audit from Parent Corporation

  9. II. Proposal for the new assessment framework: Continuous Program Improvement (CPI). At the heart of NYIT’s new framework: • Using assessment data of all kinds – e.g., financial data, external research data, cutting-edge technology, or professional standard updates – to inform improvement initiatives at the school/department/division level, and connecting assessment with resource allocation within the context of NYIT’s strategic priorities. High-impact practices: • Using assessments to demonstrate that the school/department/division achieves its mission and goals, which relate to students (learning and achievement), faculty/staff, and the unit itself, and which align with NYIT’s mission (e.g., DPT program goals and outcomes). • Using assessments to demonstrate that the school/department/division achieves its aspirational (strategic) goals beyond compliance with professional accreditation standards. Those goals are about what your school/department/division does to bring itself to “the next level” in ways that will impact students, faculty, and the program’s quality (e.g., SOM Strategic Planning).

  10. Dean’s data set (internal) • Student credit hours • FTE faculty • FTE staff • Total revenue and expense, total margin • Three-year enrollment trend • DFW (drop/failure/withdrawal) rates • Three-year trend data: retention rate, graduation rate, and mean time to graduation • School/college-level graduating student survey (GSS) and NSSE • Others (requested by deans)

  11. Dean’s data set (outside environment) • Market data • Benchmarking to peer programs • Student/alumni/employer/client satisfaction data • Others?

  12. III. New framework (CPI) implementation (the specifics of some items are yet to be determined) • Timeline: a 4-year cycle with an annual update to track its implementation? • Responsibilities and collaborations: the school/department/division owns it, and PADS facilitates the multi-year planning and assessment with data and data analysis. • A tech platform to facilitate? • Presentation to a review team (who should be on it?)

  13. By the end of a CPI cycle, the school/department/division presents: • Students’ learning improvement, with evidence. • Students’ achievement improvement, with evidence (graduation rate, retention rate, certification pass rate, job placement, advancement in education, career development and success, etc.). • Teaching effectiveness improvement (course evaluation scores, implementation of best teaching practices, and, of course, student learning). • Scholarship productivity improvements. • Curriculum improvement, with evidence of updated syllabi. • Alumni and current student recommendations of NYIT. • Other evidence of improvement (e.g., rankings).
