
Presentation Transcript


  1. Online Program Review: Reflections on Lessons Learned and Paving the Way Forward

  2. Why we went online... • Why’d you come? • What are the outcomes you’d like to achieve? • What do you want to learn? • Guess what... • Our colleges had the same concerns • In addition...

  3. Introduction and Framework • Antelope Valley College (Aeron) • Weave Online • Cosumnes River College (Kathy) • Homegrown • Mesa College (Bri & Jill) • TaskStream • Yuba College (Erik) • TracDat

  4. Antelope Valley College: Integrated Planning using Online Program Review

  5. Benefits of Going Online • Universal access for multiple constituents, in comparison to a document. • Standard formatting and layout. • Simple compliance reporting.

  6. WEAVEonline • Customizable to effectively accommodate our PR questions. • Easy compliance reporting. • Evidence database allows users to support statements by attaching documentation to the questions. • No internal IT support needed. • Streamlined reporting via PDF or MS Word formats.

  7. END USER EXPERIENCE • Common themes: • Likes • Universal access • Simple entry process • Evidence database • Challenges • Data overload • Unable to enter tables into document • No auto saving

  8. Customization • Customizable homepage which allows for news, updates, and deadlines to be posted. • Question customizations, from character limitations to link integration to HTML coding capabilities. • Customizable reporting.

  9. Workspace

  10. INTEGRATION into other processes • The same tool provides access to SLO data, which can support decisions about action plans within the program review process. • Customized reports can focus on specific questions, which can be used as evidence in other planning reports (e.g., Educational Master Plan, Staffing plans, ACCJC reports, etc.).

  11. Cosumnes River College A Homegrown Online Program Review System (PrOF)

  12. Why Homegrown? • Faculty driven project • Vision • Understanding • Programming expertise

  13. Why Homegrown? • Practical Considerations • Commitment to current program review model • Had immediate access to reassigned time • Could create connections with other databases • No budget for ongoing costs • Could institutionalize and expand • Could phase in the roll-out

  14. Selected Comments • I am typically the type of person that finds this kind of activity frustrating, and a complete waste of my time. I feel PrOF has become a useful tool for real program evaluation as opposed to useless paperwork. • Filling out the sections really only took about two hours... the other hours were spent meeting with fellow colleagues in the department to have a genuine program review. • Thanks for making it a smoother process for end users.

  15. Customization • Can easily customize during implementation • Currently • updating and modifying PrOF • providing training to institutionalize and fully utilize • planning for development of other applications

  16. Information is easily extracted to • inform planning and resource allocation • fulfill other functions • live and static extraction capabilities • System can be integrated with other systems to enhance integrity and save work
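The "live and static extraction capabilities" in slide 16 are the kind of thing a homegrown system can expose directly against its own database. As a rough illustration only (the actual PrOF schema, table names, and review-cycle labels are not described in this deck), a live query plus a static snapshot might look like this in Python:

```python
# Hypothetical sketch: "live" vs. "static" extraction from a homegrown
# program review database. Table and column names are illustrative only;
# the actual PrOF schema is not described in the presentation.
import csv
import sqlite3

DB_PATH = "prof.db"             # assumed location of the homegrown database
SNAPSHOT = "prof_snapshot.csv"  # static extract handed to planning groups

def live_resource_requests(conn, cycle="2012-13"):
    """Live extraction: query current resource requests on demand."""
    cur = conn.execute(
        """
        SELECT program, request, estimated_cost
        FROM resource_requests
        WHERE review_cycle = ?
        ORDER BY program
        """,
        (cycle,),
    )
    return cur.fetchall()

def write_static_snapshot(rows, path=SNAPSHOT):
    """Static extraction: freeze the same data into a file for later use."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Program", "Request", "Estimated cost"])
        writer.writerows(rows)

if __name__ == "__main__":
    with sqlite3.connect(DB_PATH) as conn:
        rows = live_resource_requests(conn)
    write_static_snapshot(rows)
```

The live query can feed committee views on demand, while the static file gives planning and resource-allocation groups a fixed extract to work from.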

  17. Need More Info? • Dr. Katherine McLain, Dean of College Planning and Research; mclaink@crc.losrios.edu; 916-691-7144 • Mark Ford, Faculty Developer and Librarian fordm@crc.losrios.edu; 916-691-7628

  18. Integrated Planning using Online Program Review (RP Conference, April 1, 2013)

  19. Why TaskStream? • Worked with our program review model • Could handle our complex process (e.g., multiple reviewers, collaboration with Liaisons, etc.) • Cost-effective • College was already using TaskStream for SLO assessment • Can integrate with SLO component • Customization capabilities • Workspaces, forms, and roles/access are created, modified, or managed locally • Reporting capabilities • Real-time analytics and Excel exporting capabilities

  20. The User Experience • Based on preliminary feedback in fall 2012… process evaluation in progress… • Online module was tested and adjusted based on feedback from the Program Review Committee and shared governance groups • What did users like? • 24/7 access • Virtual spaces for collaboration • One-stop shop for data, program review, resource requests • What did users find challenging? • The software learning curve • Managing multiple roles (Lead Writer, Liaison, etc.) • Reviewer functions

  21. Customizing It for Mesa • Customized workspace template that can be adjusted with some minor programming • Workspace layouts, data, attachments, and requirements were adjusted to fit college needs • Custom forms/fields and attachment capabilities • Created custom feedback forms for reviewers • Ability to create custom rubrics in future • Exported reports to Excel for reformatting and customization
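The last point in slide 21, exporting reports to Excel for reformatting, can be scripted once the export is in hand. As a hypothetical illustration (the file name and column headings below are invented; the real TaskStream export layout is not shown in this deck), a pandas script could split a raw export into a detail sheet and a cost summary:

```python
# Hypothetical sketch: reformatting a report that has been exported to Excel.
# File and column names are illustrative; the actual export layout from the
# review system is not shown in the presentation.
import pandas as pd

# Raw export, one row per program response (assumed layout).
raw = pd.read_excel("program_review_export.xlsx")

# Keep only the fields the planning committee asks for (assumed headings).
trimmed = raw[["Program", "New Goal", "Resource Request", "Estimated Cost"]]

# Aggregate estimated costs by program for a one-page summary tab.
summary = trimmed.groupby("Program", as_index=False)["Estimated Cost"].sum()

# Write a reformatted workbook: detail on one sheet, summary on another.
with pd.ExcelWriter("program_review_reformatted.xlsx") as writer:
    trimmed.to_excel(writer, sheet_name="Detail", index=False)
    summary.to_excel(writer, sheet_name="Cost Summary", index=False)
```

Scripting the reformatting keeps the raw export untouched, so the same workbook can be rebuilt each cycle without hand edits.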

  22. What It Looks Like: Workspace Layout

  23. What It Looks Like: Instruction Update Form

  24. What It Looks Like: New Goals Form

  25. Pulling It All Together • Online module allows for a more dynamic experience for users • Information is stored and can be viewed or exported as: • A holistic report (full program review document) OR • As pieces (e.g., new goals, resource requests, etc.) to inform planning and resource allocation recommendations

  26. Reporting • Aggregate Reports • Individual Program Reports

  27. Questions? • Jill Baker, Dean of Institutional Effectiveness • jibaker@sdccd.edu • 619-388-2320 • Bri Hays, Campus Based Researcher • bhays@sdccd.edu • 619-388-2319

  28. Erik Cooper ecooper@yccd.edu

  29. Why TracDat? • Using TracDat for SLOs • WCC Faculty • Considered homegrown • Adobe Database • Survey Monkey • Colleague Integration

  30. The User Experience “TracDat: An ancient Swedish word meaning ‘evil one’” • Year 1 • Unintuitive • Unfamiliar • Unexcited • Unsure

  31. The User Experience • Year 2 • Collaborative • Time Saving • Meaningful • Useful • Still Unintuitive...but more support helps

  32. Administration • Integration with MIS • Excel uploads...so not really • Integration with Planning/Budgeting • Standard and Ad Hoc reports • Multi-user access...except • Customization • Multi-college, similar but different processes • Easy enough
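The "Excel uploads... so not really" bullet in slide 32 describes moving MIS data into the system by hand rather than through a true interface. A minimal sketch of that workflow, with entirely hypothetical file and column names (TracDat's actual import format is not documented here), might be:

```python
# Hypothetical sketch: shaping an MIS extract into an Excel file for manual
# upload. Column names and the target layout are assumptions; the real
# import format is not documented in this presentation.
import pandas as pd

# MIS extract with enrollment and success rates by program (assumed columns).
mis = pd.read_csv("mis_extract.csv")

# Map MIS fields onto the columns the upload template is assumed to expect.
upload = mis.rename(columns={
    "PROGRAM_CODE": "Unit",
    "ENROLLMENT": "Enrollment",
    "SUCCESS_RATE": "Course Success Rate",
})[["Unit", "Enrollment", "Course Success Rate"]]

# One file per review cycle, uploaded by hand ("integration... so not really").
upload.to_excel("tracdat_upload_2012-13.xlsx", index=False)
```

Keeping the mapping in a script at least makes the manual upload repeatable from one review cycle to the next.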

  33. Reporting

  34. Reporting

  35. Lessons Learned • Trials, Pains, and Refinement • Implementation • Successes • Challenges • Evolution • Cultural considerations • Next steps • Questions?

  36. Need to contact us? • Aeron Zentner • azentner@avc.edu • Bri Hays • bhays@sdccd.edu • Jill Baker • jibaker@sdccd.edu • Kathy McLain • mclaink@crc.losrios.edu • Erik Cooper • ecooper@yccd.edu
