Evaluation and Replacement of Individual Development and Educational Assessment (IDEA)


Presentation Transcript


  1. Evaluation and Replacement of Individual Development and Educational Assessment (IDEA)
  Stephen Burd (burd@unm.edu), Associate Professor, ASM Academic Technology Liaison
  Presentation copies available online: http://averia.unm.edu
  Last revised: 11/11/2014 10:38 AM

  2. Project Context
  • In summer 2012, Associate Provost Greg Heileman charged the Academic Technology Liaison (Stephen Burd) to identify and evaluate alternative tools for student assessment of courses and instructors
  • Rationale:
    • High administrative complexity of the current system
    • Difficulty in gathering and using survey responses/results for further analysis (e.g., data analytics and text mining)
    • Concerns about the usefulness of results in promotion and tenure evaluation
    • Faculty dissatisfaction with the current system
  • A working group was formed, with most faculty members drawn from the Faculty Senate Teaching Enhancement and IT Use Committees
  • http://averia.unm.edu/IdeaNextStep

  3. Working Group Members
  Faculty
  • Stephen Burd (ASM)
  • Robert Busch (Chemical & Nuclear Engineering)
  • Kevin Comerford (Library)
  • Nick Flor (ASM)
  • Kristopher Goodrich (Counselor Education)
  • Chris Holden (Honors)
  • Amy Neel (Speech & Hearing)
  • Caleb Richardson (History)
  • Mary Margaret Rogers (ASM)
  • Julie Sykes (Spanish & Portuguese)
  Other
  • Moira Gerety (Deputy Chief Information Officer)
  • Greg Heileman (Associate Provost for Curriculum)
  • Grace Liu (ASUNM)
  • Kris Miranda (GPSA)

  4. Goals for IDEA Replacement (IDEA-R)
  • Increase the use and usability of student feedback on courses/instructors for formative and summative purposes
  • Adopt a modern tool with:
    • Greater flexibility for faculty, departments, and programs
    • Online and mobile survey capabilities
    • Improved reporting
    • Support for analytics
  • These goals drove the RFP authoring process

  5. Timeline
  • Fall 2012 – Working group examines faculty technology survey results and available products; determines that a replacement for IDEA is warranted
  • Spring/Summer 2013 – Working group examines available alternatives and sample RFPs in detail; develops and releases the RFP
  • Fall 2013 – RFP responses close in October; preliminary evaluation begins
  • Spring 2014 – Detailed evaluation of RFP responses; top responses identified; vendors demo in early May:
    • ConnectEDU (CourseEval)
    • EvaluationKIT – used at CNM and NMSU
    • eXplorance (Blue)
  • June 2014 – AVP Heileman reviews choices and feedback and chooses EvaluationKIT; working group concurs unanimously
  • July–Sept 2014 – Acceptance (sandbox) testing is successful
  • Sept–Oct 2014 – Contract negotiated; Provost approves purchase
  • Oct–Dec 2014 – Steering committee formed; pilot testing; initial discussion of related policies with the Faculty Senate
  • Spring 2015 – Evaluate pilot results and make adjustments; phase 1 rollout to 33–50% of UNM; finalize related policies
  • Summer 2015 – Full switchover to EvaluationKIT

  6. Summary of Finalist Evaluations
  • eXplorance (Blue) – “The Lexus”
    • Tops in functionality/features
    • Much more expensive than the other two (roughly $400K)
  • EvaluationKIT – “The Hyundai”
    • Acceptable in functionality/features
    • Reasonable cost (< $40K)
    • Some reservations about:
      • Ease of use
      • Usability beyond summative end-of-semester evaluations
  • ConnectEDU
    • Barely acceptable in functionality/features
    • Significant concerns about the vendor’s viability, resources, and strategic direction for future product development

  7. EvaluationKIT Selection Reasons
  • License and operational cost a bit less than IDEA
  • Positive feedback from CNM and NMSU
  • Satisfies “must-have” requirements
  • Moves us firmly into the 21st century
  • Gets us out of the paper-shuffling business
  • Extra features of eXplorance are unlikely to be used in the near term
  • Alternative tools exist for ad-hoc instructor-initiated surveys (e.g., UNM Learn, Opinio, paper, …)

  8. Key EvaluationKIT Features
  • Survey structure similar to the old ICES system
    • Develop a UNM question bank and/or “roll-your-own” questions
    • A survey can “layer” questions from multiple organizational levels
    • No explicit tie to learning objectives or inter-institutional norms
    • Best use is mid-semester and end-of-semester evaluations – not well-suited to ad-hoc instructor-initiated surveys
  • Fully online system
    • Hosted on vendor servers – no local installation option
    • Survey definition and administration via a browser-based application
    • Students complete surveys via browser or cellphone app
    • Reports generated in PDF/Excel and viewed online, delivered via email, or downloaded
    • Surveys/results can be extracted for downstream analytics (see the sketch below)
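  The slide above notes that surveys and results can be extracted for downstream analytics, but the deck does not show an export format. The Python sketch below is a minimal illustration under stated assumptions, not EvaluationKIT's actual export schema: it assumes results have been pulled into a CSV file named evaluation_export.csv with hypothetical course, question, rating, and comment columns, then computes per-question mean ratings and a crude comment word count as a starting point for text mining.

```python
# Minimal sketch of downstream analytics on exported survey results.
# The file name and column names (course, question, rating, comment) are
# hypothetical placeholders, not EvaluationKIT's actual export schema.
import csv
from collections import Counter, defaultdict

ratings = defaultdict(list)  # (course, question) -> numeric ratings
words = Counter()            # word frequencies across open-ended comments

with open("evaluation_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row.get("rating"):
            ratings[(row["course"], row["question"])].append(float(row["rating"]))
        for word in (row.get("comment") or "").lower().split():
            word = word.strip(".,!?\"'")
            if word:
                words[word] += 1

# Per-course/question mean ratings -- the kind of summary that could feed
# promotion/tenure review or further statistical analysis.
for (course, question), values in sorted(ratings.items()):
    print(f"{course} | {question}: mean={sum(values) / len(values):.2f} (n={len(values)})")

# Crude text-mining starting point: most frequent comment terms.
print(words.most_common(20))
```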

  9. Where to From Here?
  • Fall 2014
    • Start policy discussion with the Faculty Senate
    • Plan and execute first small pilot for fall end-of-semester evaluations
      • Participants: ASM, Architecture, Public Administration, UNM Gallup
    • Experiment with (see the response-rate sketch below):
      • Centralized and distributed administration
      • Survey content
      • Survey open/close dates – response rate impact
      • Email communication with students – response rate impact
      • Other communication with students – response rate impact
  • Spring 2015
    • Evaluate first pilot results and plan phase 1 roll-out
      • Who will participate in this roll-out?
    • Develop training materials
    • Plan summer/fall roll-out
  • Summer/Fall 2015
    • Turn off IDEA
    • Roll out EvaluationKIT across UNM
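  Several of the pilot experiments above boil down to comparing response rates across conditions (survey window, reminder emails, other communication). The sketch below shows the basic comparison; the condition names and counts are hypothetical placeholders, not actual pilot data.

```python
# Minimal sketch of the pilot's response-rate comparison. The conditions
# and counts below are hypothetical placeholders, not actual pilot data.
pilot_groups = {
    # condition -> (responses received, students enrolled)
    "2-week window, reminder emails": (412, 610),
    "2-week window, no reminders": (287, 598),
    "4-week window, reminder emails": (455, 622),
}

for condition, (responded, enrolled) in pilot_groups.items():
    print(f"{condition}: {responded}/{enrolled} = {responded / enrolled:.1%}")
```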

  10. Policy Issues for Faculty Senate Consideration
  • Administration
    • How will control over survey content and administration be distributed among academic affairs, schools & departments, faculty, and central IT services?
  • Tool specificity
    • Should use of a UNM-approved tool be required?
  • Survey content requirements
    • Will UNM adopt a set of standard questions included in all surveys? If so, what are they?
    • Will UNM populate an institutional question bank from which questions can be chosen and/or enable schools, departments, and instructors to create their own?
  • Confidentiality of survey respondents
    • Is the existing language too strong, about right, or not strong enough?
  • Distribution and/or confidentiality of survey data and reporting
    • Who gets to see what data/reports, and under what conditions?
    • Do students or the public get to see any of it?

  11. EvaluationKIT Mobile Interface Examples

  12. EvaluationKIT Browser-Based Interface Example

  13. EvaluationKIT Instructor/Course Report Example

  14. Report Example - Continued
