
Knowing What We Know Begins with Benchmarking


Presentation Transcript


  1. Knowing What we Know Begins with Benchmarking Janet Moore, Sloan-C Claudine SchWeber, UMUC John Sener, Sener Learning Services Karen Vignare, Michigan State University Betsy Bedigian, Hezel Associates 11th Annual ALN Conference November 18, 2005

  2. Overview
  • Research on quality distance learning
  • The push to benchmark
  • Examples of targeted approaches
  • Examples of internal benchmarks
  • The next level of effectiveness
  • Benchmarking tools

  3. What is benchmarking?
  • Applying common, consistent standards, methods, observations, and measurement
  • Identifying common metrics for process, costs, and other measures of quality
  • Using quantitative and qualitative standards to compare performance over time
  • The continuous process of measuring your outcomes, processes, and practices against strong competitors, peers, or recognized distance learning leaders

  4. Benchmarking Research
  • What do we need to know from other institutions that can guide our practice and internal decision making?
  • Difference between “best practices” and benchmarking
  • Examples of benchmarking research (year over year):
    • Campus Computing Survey
    • Sloan-C “Entering the Mainstream”
    • NUTN-Hezel benchmarking research

  5. Sloan-C Quality, Scale, Breadth
  • 10 in 10
  • Faculty
  • Course design
  • Course completions

  6. Map of the Territory

  7. Numbers of Learners, in millions (chart). For context: total US population = 300M; world population = 6.5B.

  8. Learner estimates, in millions (chart). Note: 10 in 10!

  9. Faculty
  • Nearly 3 million learners are taking online courses. About 100,000 faculty are likely engaged in teaching these students, roughly 15-20% of the teaching faculty in America. A growing proportion of regular faculty are teaching online.
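A quick arithmetic check of the slide's figures; the ratios below are derived from the slide's estimates, not stated on it:

```python
# Back-of-envelope check of the slide's faculty and learner figures.
learners = 3_000_000        # learners taking online courses (slide estimate)
online_faculty = 100_000    # faculty likely teaching them (slide estimate)

print(learners / online_faculty)          # 30.0 learners per online instructor

# If 100,000 faculty is 15-20% of all teaching faculty, the implied
# national total is roughly 500,000 to 667,000 teaching faculty:
print(online_faculty / 0.20, online_faculty / 0.15)
```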

  10. Faculty training benchmarks (68 schools)
  • 61% of schools have optional instructional design help
  • 51% have voluntary faculty training
  • 16% have mandatory training
  • 58% of schools evaluate course design only through end-of-course student evaluations

  11. Course completion rates (sample of 358 programs; figures below are for the responding programs)

      Program type              Programs   Completion rate
      Two-year schools                58            56.87%
      Four-year schools                5            84.31%
      Upper division only              6            84.94%
      Graduate schools                 7            85.53%
      All responding programs         76            61.23%

      Source: Ingle, F. K., Student retention and completion rates in a postsecondary online distance learning environment.
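A simple weighted mean can probe the table's total. Weighting the four rates by program counts gives about 63.5%, not the reported 61.23%, so the published total presumably weights by something the table omits (such as student enrollment); the sketch below only illustrates the calculation:

```python
# Program-count-weighted mean of the completion rates tabled above.
groups = {
    "Two-year schools":    (58, 56.87),
    "Four-year schools":   (5,  84.31),
    "Upper division only": (6,  84.94),
    "Graduate schools":    (7,  85.53),
}
n_total = sum(n for n, _ in groups.values())                 # 76 programs
weighted = sum(n * r for n, r in groups.values()) / n_total
print(f"{weighted:.2f}%")   # 63.53%, vs. the slide's 61.23% total
```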

  12. Push to Benchmark: Tipping Points
  • Continual small changes which build on themselves (geometric progression) → critical mass → one more small change ‘tips’ the system
  • The point beyond which even a small change would result in a dramatic alteration of the equilibrium, producing a significant change in the environment
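A minimal sketch of the dynamic described above: small changes compound geometrically until a critical mass is reached and one more change tips the system. The starting value, growth rate, and threshold here are hypothetical:

```python
# Geometric growth toward a hypothetical critical-mass threshold.
adopters = 100            # hypothetical initial adopters
growth = 1.5              # each period builds on the last (geometric progression)
critical_mass = 10_000    # hypothetical tipping threshold

periods = 0
while adopters < critical_mass:
    adopters *= growth
    periods += 1
print(f"System tips after {periods} periods ({adopters:,.0f} adopters)")
```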

  13. Application of “Tipping Point” to ALN
  • Indicators:
    • Student enrollments (number, %)
    • Faculty involvement (number, %)
    • Infrastructure support for online learning (e.g., technology, IT support, library resources, faculty support)

  14. Educational Tipping Point
  • Point at which the 3 indicators intersect: no numerical formula (yet)
  • Warning: indicators must all move together (upwards) or there will be significant problems:
    • High student enrollment but low faculty involvement → student discontent
    • High faculty involvement but low student enrollment → faculty disengagement
    • High faculty + high student + low infrastructure → disaster
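The slide's warning can be read as a simple consistency check on the three indicators. Since the slide says there is no numerical formula yet, the indicator values and cutoffs below are hypothetical placeholders:

```python
def check_alignment(students, faculty, infrastructure, high=0.6, low=0.3):
    """Flag the mismatches the slide warns about; thresholds are
    hypothetical (the slide gives no formula)."""
    warnings = []
    if students >= high and faculty <= low:
        warnings.append("High enrollment, low faculty involvement -> student discontent")
    if faculty >= high and students <= low:
        warnings.append("High faculty involvement, low enrollment -> faculty disengagement")
    if students >= high and faculty >= high and infrastructure <= low:
        warnings.append("High faculty + high students + low infrastructure -> disaster")
    return warnings or ["Indicators are moving together"]

print(check_alignment(students=0.7, faculty=0.2, infrastructure=0.5))
```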

  15. Examples
  • Two institutions which have “tipped” or are on that trajectory:
    • University of Maryland University College
    • University of Illinois/Springfield

  16. Tipping Point is…
  • “that moment when ideas, trends and social behaviors cross a critical threshold and ‘take’, causing a tidal wave of far-reaching effect”
  • Online Learning…!?

  17. QM Rubric as Benchmarking Tool
  • Applying common, consistent standards:
    • Application of 40 Specific Review Standards
    • Standards sets (WICHE, SREB, NEA, et al.)
    • Research literature support, cross-referenced with the Specific Review Standards
  • Identifying common metrics for measures of quality:
    • Common, consistent scoring system
    • Essential standards; 85% “ME” threshold
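The slide names the scoring ingredients (40 Specific Review Standards, essential standards, an 85% “ME” threshold) without spelling out the mechanics. A sketch, assuming “ME” marks a standard as met and that a course must meet every essential standard plus 85% of all standards; the real rubric may instead weight standards by points:

```python
def qm_course_passes(results, essential_ids, threshold=0.85):
    """results: dict mapping standard id -> True if marked met ("ME").
    Assumed mechanics: all essential standards met AND at least 85%
    of all 40 Specific Review Standards met."""
    if not all(results[s] for s in essential_ids):
        return False
    return sum(results.values()) / len(results) >= threshold

# Hypothetical review of 40 standards, 35 of them met.
results = {i: (i % 8 != 0) for i in range(1, 41)}
print(qm_course_passes(results, essential_ids=[1, 2, 3, 5, 9]))  # 0.875 -> True
```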

  18. QM Peer Review as Benchmarking Process
  • Applying common, consistent standards, methods, observations, measurement:
    • Parameters: purpose = quality improvement; focus = course design; relationship = collegial, collaborative/individual
    • Standards: experienced online faculty with peer review training
    • Methods & observations: the faculty-centered process is “subjective”, but common protocols are applied to the process, and inter-rater reliability research is in progress (see the sketch below)
  • Identifying common metrics for measures of quality:
    • Faculty peer review team as ‘benchmarking’ body
    • Individual, peer deliberation
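The slide does not name a reliability statistic; Cohen's kappa is one common choice for paired met/not-met judgments, sketched here on hypothetical ratings:

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two reviewers' binary met/not-met ratings."""
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n   # observed agreement
    p1, p2 = sum(r1) / n, sum(r2) / n               # each reviewer's "met" rate
    p_e = p1 * p2 + (1 - p1) * (1 - p2)             # agreement expected by chance
    return 1.0 if p_e == 1 else (p_o - p_e) / (1 - p_e)

# Two hypothetical reviewers rating the same ten standards.
reviewer_a = [1, 1, 0, 1, 1, 1, 0, 1, 1, 0]
reviewer_b = [1, 1, 0, 1, 0, 1, 0, 1, 1, 1]
print(f"kappa = {cohens_kappa(reviewer_a, reviewer_b):.2f}")  # kappa = 0.52
```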

  19. MSU Big 10 Benchmarking Study
  • Looked at the adoption of web-based courses and programs at Big Ten universities
  • To understand how online teaching and learning is developing at similar institutions

  20. MSU Big 10 Study: Metrics
  • # of fully/partially online courses, programs, degrees
  • # of fully/partially online disciplines
  • # of fully/partially online enrollments
  • Organization description
  • Process description
  • Marketing description
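One way to operationalize these metrics is a per-institution record that such a study could aggregate; the field names below are hypothetical paraphrases of the slide's list:

```python
from dataclasses import dataclass

@dataclass
class InstitutionProfile:
    # Hypothetical fields paraphrasing the slide's metric list.
    name: str
    online_courses: int = 0        # fully/partially online courses
    online_programs: int = 0       # fully/partially online programs/degrees
    online_disciplines: int = 0    # disciplines with online offerings
    online_enrollments: int = 0    # fully/partially online enrollments
    organization: str = ""         # how online learning is organized
    process: str = ""              # development/approval processes
    marketing: str = ""            # how offerings are marketed

# Illustrative record with made-up values.
profile = InstitutionProfile(name="Example Big Ten University",
                             online_courses=200, online_enrollments=5_000)
```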

  21. RIT Online Learning Examples
  • RIT DL Benchmarks adopted (2002)
  • RIT retention/graduation work (2001)
  • RIT student services dashboard (2003/04)
  • RIT course inventory (2001)

  22. RIT Online Learning Examples (chart)

  23. RIT Online Learning Examples
  • RIT student services dashboard (2003/04):
    • General support (stats)
    • Customized orientation (log-in)
    • Technical support (logs)
    • Bookstore support (report)
    • Library support (librarian)
    • Advising (coordinator)
    • New student survey (analysis)

  24. NUTN-Hezel Associates Benchmarking Initiative
  • Initial survey: began March 2005; completed May 2005
  • Survey 2: began June 2, 2005
  • IQAT: expected release first quarter of 2006

  25. NUTN-Hezel Associates Benchmarking Initiative
  • Key questions:
    • What are colleges/universities measuring?
    • What are they benchmarking?
    • What do they think are the most important things to benchmark?

  26. Survey Results
  • Key findings:
    • 54% say they’re benchmarking
    • Few similarities in the types of institutions being benchmarked against
    • Benchmarking by convenience

  27. www.iqat.org

  28. Join the NUTN-Hezel Associates Distance Learning Benchmarking Initiative
  • Leave a business card
  • Complete an inquiry sheet
  • Identify priority areas
  • Join us in the benchmarking inquiry process at www.nutn.org
  • Compare your institution against others in the database
