
1. 8th Improving Student Learning Symposium: Improving Student Learning Strategically
“Strategies for Harnessing Information Technology to Facilitate Institutional Assessment”
Gloria M. Rogers, Ph.D.
Institutional Research, Planning, and Assessment
Rose-Hulman Institute of Technology, Terre Haute, Indiana, USA

2. Overview
• Use of models to guide institutional strategies for improving student learning
• Assessing student learning
• Best practices for student assessment
• Brief history of RHIT process
• Assessment model/taxonomy
• A case study: demonstration
• Benefits to teaching/learning
• Assessment method truisms
• Barriers to faculty involvement
• Advice from the field

3. Use of Principles of Best Practice for Assessment of Student Learning in guiding development of an assessment “system”
• Value of using models to guide practice
• Recognition of local constraints
[Diagram labels: INPUTS, OUTCOMES]

4. Rose-Hulman Institute of Technology
• Terre Haute, Indiana, USA
• 1600+ undergraduate students
• B.S. degrees in engineering, science, and mathematics
• Median SAT scores 1350 (700M, 650V)
• 80%+ engineering students

5. Brief History
• Presidential Commission of faculty, staff, and students appointed in Spring of 1996 to develop a plan for the assessment of student outcomes
• Provide for continuous quality improvement
• Meet outcomes-based accreditation standards
  • Regional (NCA)
  • Program (ABET)

6. Assessment for Continuous Improvement
[Flow diagram: Institutional Mission → Educational Goals & Objectives (shaped by Constituents and Accreditation Requirements) → Measurable Performance Criteria → Educational Practices/Strategies → Assessment: Collection, Analysis of Evidence → Evaluation: Interpretation of Evidence → Feedback for Continuous Improvement, closing the loop; Program Outcomes feed the accreditation requirements.]

7. Taxonomy of Approaches to Assessment (Terenzini, Journal of Higher Education, Nov/Dec 1989)
Three dimensions: Purpose of Assessment (Why?): Learning/Teaching (Formative) vs. Accountability (Summative); Level of Assessment (Who?): Individual vs. Group; Object of Assessment (What?): Knowledge, Skills, Behavior, Attitudes & Values.
• Individual / Formative: competency-based instruction, assessment-based curriculum, individual performance tests, placement, advanced placement tests, vocational preference tests, other diagnostic tests
• Individual / Summative (“gatekeeping”): admissions tests, rising junior exams, comprehensive exams, certification exams
• Group / Formative (program enhancement): individual assessment results may be aggregated to serve program evaluation needs
• Group / Summative (campus and program evaluation): program reviews, retention studies, alumni studies, “value-added” studies
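The purpose and level dimensions of the taxonomy can be read as a 2x2 grid (the object dimension cuts across every cell). A minimal illustrative sketch: the cell contents come straight from the slide, but the representation itself is an assumption, not Terenzini's.

```python
# Illustrative sketch of the Terenzini taxonomy as a (purpose, level) grid.
# Cell contents are from the slide; the data structure is assumed.
TAXONOMY = {
    ("formative", "individual"): [
        "competency-based instruction", "assessment-based curriculum",
        "individual performance tests", "placement",
        "advanced placement tests", "vocational preference tests",
        "other diagnostic tests",
    ],
    ("summative", "individual"): [  # "gatekeeping"
        "admissions tests", "rising junior exams",
        "comprehensive exams", "certification exams",
    ],
    ("formative", "group"): [  # program enhancement
        "aggregated individual results for program evaluation",
    ],
    ("summative", "group"): [  # campus and program evaluation
        "program reviews", "retention studies",
        "alumni studies", "value-added studies",
    ],
}

# Example lookup: which approaches serve summative, group-level assessment?
print(TAXONOMY[("summative", "group")])
```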

8. Rose-Hulman’s Mission
To provide students with the world’s best undergraduate education in engineering, science, and mathematics in an environment of individual attention and support.

9. Input: Recruit highly qualified students, faculty, and staff
Quality: Provide an excellent learning environment
Climate: Encourage the realization and recognition of the full potential of all campus community members
Outcomes: Instill in our graduates skills appropriate to their professions and life-long learning
Resources: Provide resource management & development that supports the academic mission

10. Outcomes: Instill in our graduates skills appropriate to their professions and life-long learning
• Ethics and professional responsibility
• Understanding of contemporary issues
• Role of professionals in the global society and ability to understand diverse cultural and humanistic traditions
• Teamwork
• Communication skills
• Skills and knowledge necessary for mathematical, scientific, and engineering practice
• Interpret graphical, numerical, and textual data
• Design and conduct experiments
• Design a product or process to satisfy a client's needs subject to constraints

11. Why portfolios?
• Authentic assessment
• Capture a wide variety of student work
• Involve students in their own assessment
• Professional development for faculty

12. Why “electronic” portfolios?
• Student-owned laptop computer program since 1995
• Classrooms, residence halls, common areas, library, fraternity houses all wired
• Access
• Efficient
• Cost effective
• Asynchronous assessment

13. RosE-Portfolio Structure
Roles and their features (see the sketch below):
• ADMIN: User Management, Group Management, System Configuration, Criteria Tree, Activity Management
• Student: Submit, Review, Search, Dynamic Resume, Access Control
• Advisor: View Advisee’s portfolio, Search Advisee’s portfolio
• Rater: Inter-rater Reliability, Rating Sessions, Feedback, Rating Management
• Faculty: Curriculum Map, PTR Portfolio, Submit, Review, Search
• Employer: View, Search
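As a minimal illustrative sketch (not the actual RosE-Portfolio implementation; all identifiers are assumed), the role-to-feature structure above could be modeled as a simple permission table:

```python
# Illustrative sketch only -- not the actual RosE-Portfolio code.
# Each role maps to the set of portfolio features it may use,
# following the groupings on the slide above.
ROLE_FEATURES = {
    "admin":    {"user_management", "group_management", "system_configuration",
                 "criteria_tree", "activity_management"},
    "student":  {"submit", "review", "search", "dynamic_resume", "access_control"},
    "advisor":  {"view_advisee_portfolio", "search_advisee_portfolio"},
    "rater":    {"inter_rater_reliability", "rating_sessions", "feedback",
                 "rating_management"},
    "faculty":  {"curriculum_map", "ptr_portfolio", "submit", "review", "search"},
    "employer": {"view", "search"},
}

def can_use(role: str, feature: str) -> bool:
    """Return True if the given role is permitted to use the feature."""
    return feature in ROLE_FEATURES.get(role, set())

# Example: a student may submit work, but may not manage rating sessions.
assert can_use("student", "submit")
assert not can_use("student", "rating_sessions")
```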

  14. Show Me!

15. Assessment of student material
• Faculty work in teams
• Each team assesses one learning objective
• Score holistically
• Emerging rubrics (see the sketch below):
  • Does the reflective statement indicate an understanding of the criterion?
  • Does the reflective statement demonstrate or argue for the relevance of the submitted material to the criterion?
  • Does the submitted material meet the requirements of the criterion at a level appropriate to a graduating senior at R-HIT?
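A minimal sketch of how such holistic ratings might be recorded and tallied, assuming one yes/no judgment per rubric question (hypothetical code; `Rating` and `summarize` are not Rose-Hulman's actual tooling):

```python
from dataclasses import dataclass

@dataclass
class Rating:
    """One rater's holistic judgment of one submission (hypothetical record)."""
    rater: str
    understands_criterion: bool    # reflective statement shows understanding?
    material_is_relevant: bool     # statement argues relevance of the material?
    meets_graduate_standard: bool  # work at the level of a graduating senior?

def summarize(ratings: list[Rating]) -> dict[str, float]:
    """Percent of raters answering 'yes' to each rubric question."""
    n = len(ratings)
    return {
        "understands_criterion": 100 * sum(r.understands_criterion for r in ratings) / n,
        "material_is_relevant": 100 * sum(r.material_is_relevant for r in ratings) / n,
        "meets_graduate_standard": 100 * sum(r.meets_graduate_standard for r in ratings) / n,
    }

team = [Rating("A", True, True, False), Rating("B", True, False, False)]
print(summarize(team))
# {'understands_criterion': 100.0, 'material_is_relevant': 50.0,
#  'meets_graduate_standard': 0.0}
```

Aggregating yes/no judgments this way also gives rating teams a simple basis for checking inter-rater agreement.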

  16. Show Me!

17. Example of Results
• Understand criterion?
• Submission relevant to criterion?
• Meet standards for R-HIT graduate?

18. Example of Results: Does submission meet the standards for a graduate of R-HIT?
• Appropriate for audience
• Organization
• Content factually correct
• Test audience response
• Grammatically correct

19. Show Me! Linking results to practice
• Development of Curriculum Map
• Linking curriculum content/pedagogy to knowledge, practice, and demonstration of learning outcomes

20. Curriculum Map Results, Fall 1999-2000 (181 courses/labs): Communication Skills [chart]

21. Curriculum Map Results, Fall 1999-2000 (181 courses/labs): Ethics [chart]
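A curriculum map like the one behind these charts is essentially a course-to-outcome matrix. A minimal sketch of the tally, with made-up course codes and outcome names (illustrative only, not Rose-Hulman's data):

```python
# Illustrative curriculum map: for each course/lab, the learning outcomes
# its content or pedagogy addresses. Courses and outcomes are made up.
curriculum_map: dict[str, set[str]] = {
    "RH101": {"communication", "teamwork"},
    "RH205": {"ethics", "communication"},
    "RH310": {"experiments", "design"},
}

def coverage(cmap: dict[str, set[str]]) -> dict[str, int]:
    """Count how many courses/labs address each outcome."""
    counts: dict[str, int] = {}
    for outcomes in cmap.values():
        for outcome in outcomes:
            counts[outcome] = counts.get(outcome, 0) + 1
    return counts

print(coverage(curriculum_map))
# e.g. {'communication': 2, 'teamwork': 1, 'ethics': 1, ...} (order may vary)
```

Run over all 181 courses/labs, a tally like this is what the two charts above summarize.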

22. Closing the loop (annual cycle)
• Fall (Sep-Nov): Institute assessment committee prepares reports of the collected data (e.g., surveys, e-portfolio ratings) for submission to Department Heads.
• Winter (Dec-Feb): Evaluation Committee receives and evaluates all data; makes report and refers recommendations to appropriate areas.
• Spring (Mar-May): Institute acts on the recommendations of the Evaluation Committee.
• Summer (Jun-Aug): Reports of actions taken by the Institute and the targeted areas are returned to the Evaluation Committee for iterative evaluation.

23. Primary focus
• It is not about electronic portfolios.
• It is about:
  • supporting teaching and learning
  • faculty and student development
  • the transformation of the teaching/learning environment

24. Benefits to teaching
• Faculty are asked to reflect on learning outcomes in relation to practice
• Consider the value of stated outcomes
  • Right ones?
  • Right performance criteria?
• Individual faculty role in creating the context for learning
• Develop a common language and understanding of program/institutional outcomes
• Explicit accountability
• Promotes interdisciplinary discussions/collaborations

25. Benefits to learning
• Students review their own progress as it relates to expected learning.
• Portfolios provide a way for students to make learning visible and become the basis for conversations and other interactions among students and faculty.
• Learning is viewed as an integrated activity, not isolated courses.
• Students learn to value the contributions of out-of-class experiences.
• Student reflections are metacognitive as they appraise their own ways of knowing.
• Promotes a sense of personal ownership over one’s accomplishments.

26. Assessment method truisms
• There will always be more than one way to measure any outcome.
• No single method is good for measuring a wide variety of different student abilities.
• There is a consistently inverse relationship between the quality of measurement methods and their expediency.
• Pilot testing matters: check whether a method works for your program (students and faculty).

27. Barriers to implementation
• Faculty
  • current workload
  • lack of incentive to participate in the process (rewards)
  • “what’s in it for me?” (cost/benefits)
• Institutional/program leadership
  • lack of vision for the program/institutional assessment process (no existing, efficient models)
  • cost/benefit unknown
  • difficulty of restructuring the reward system to facilitate faculty participation

28. Process deficiencies
• Lack of understanding of the dynamics of organizational change
• Absence of “tools” to facilitate collaborative work
Portfolio deficiencies:
• Ill-defined purpose
• Lack of efficient ways to manage the portfolio process
• Systematic review of portfolio contents is ill-defined or non-existent
• Student and faculty roles not clear
• Portfolio process not integrated into the teaching/learning environment
Resource deficiencies:
• Expertise in portfolio development
• Development of an “authentic” portfolio

29. Advice from the field (E=MC²)
• You cannot do it all: prioritize.
• All assessment questions are not equal.
• One size does not fit all.
• It’s okay to ask directions.
• Take advantage of local resources.
• Don’t wait until you have a “perfect” plan.
• Decouple from faculty evaluation.

30. DEMO Site: http://www.rose-hulman.edu/ira/reps/
