
Michigan Technology Readiness Tool (MTRAx) Pilot Training


Presentation Transcript


  1. Michigan Technology Readiness Tool (MTRAx) Pilot Training March 20, 2013

  2. Welcome

  3. The MTRAx Story

  4. Major Events: • In 2010, the Smarter Balanced Assessment Consortium (SBAC) was formed • To help determine test readiness, SBAC developed a tool to collect readiness data (vendor: Pearson) • Spring 2012: the TechReadinessTool was launched in Michigan (36% participation) • OEII and BAA began meeting regularly to discuss test readiness

  5. Issues with TechReadinessTool: • Not customizable to Michigan needs • Very limited in reporting capabilities • Could only be compared to SBAC specifications • No ability to expand beyond SBAC testing • Could not expand into other areas of technology planning To solve these issues, MDE in partnership with MAISA contracted the Metiri Group to build a Michigan solution.

  6. Development Committee • Mathew Ayotte, BAA • David Palme, Portland Public Schools • Mark Samp, Monroe ISD • Jay Schupp, EUPISD • Beth Soggs, Bay-Arenac ISD • Jan Vogel, MDE

  7. Timeline • November 2012 – February 2013: Development of the MTRAx tool • February 2013: Alpha testing, beta testing • March 20, 2013: Pilot (all ISDs and 2 districts per ISD) • Fall 2013: Pilot districts receive training on enhancements; statewide rollout

  8. Beta Test Districts • Summerfield Schools • Forest Hills Public Schools • East Lansing Public Schools • Brown City Community Schools • Unionville-Sebewaing Area Schools • Portland Public Schools

  9. Pilot Training Objectives By the end of the training, participants will be able to: • complete the MTRAx survey • read and understand MTRAx reports • train others on the MTRAx survey process

  10. What MTRAx Is and Is Not MTRAx is: • a planning tool • a method to prepare for online testing • a way to inform local decision makers • a process that will be flexible as targets move MTRAx is not: • an inventory tool • a network monitoring device • a method to report test readiness to the public

  11. Introduction to the Training • A cast of thousands… well, several • Start with what matters • Follow the Palme Process • Questions, questions

  12. Reports Cheryl Lemke Metiri Group

  13. Page

  14. School Report Readiness Ratings: • Overall School Technology Readiness for Online Testing • Device Readiness for Online Testing • School Network Readiness for Online Testing

  15. School Report • Testing Specifications • Device Readiness for Testing • Network Readiness for Testing

  16. • Overall District Technology Readiness for Online Testing • District Internet Readiness for Online Testing • Percentage of Schools That Are Overall Ready

  17. District Report Readiness Ratings: Overall District Technology Readiness for Online Testing District Internet Readiness for Online Testing Percentage of Schools That are Overall Ready

  18. Reports • School • District • ISD • Consortium • State

  19. MTRAx and Your Data • The source • The quality • The future • Support

  20. District and School Data

  21. Room Data

  22. Enrollment Data

  23. Test Information Test Window – 12 weeks Test Structure – 2 components • Computer Adaptive • Performance Task Test Length • Varies by grade/content area

  24. Test Length Two components • Part 1 (Adaptive) 1 class period • Part 2 (Performance Task) 2-4 class periods* All test lengths are estimates *Back-to-back class periods

  25. Performance Tasks Two components • Classroom portion (20-30 minutes) • Computer portion (90-180 successive minutes) All times are estimates Recommendation: • Set the number of test sittings per student to 9 • Keep successive time aspect of computer portion in mind when scheduling
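The time estimates on the last two slides can be turned into a rough scheduling check. The sketch below is illustrative only, assuming a hypothetical 50-minute class period (`CLASS_PERIOD_MIN` and `sittings_needed` are made-up names, not part of MTRAx or SBAC); all inputs are the estimates quoted in the slides, not official specifications.

```python
# Hypothetical scheduling sketch based on the slide estimates (not official
# SBAC figures). CLASS_PERIOD_MIN is an assumed local class-period length.

CLASS_PERIOD_MIN = 50

def sittings_needed(total_minutes, period=CLASS_PERIOD_MIN):
    """Round an estimated testing time up to whole class periods."""
    return -(-total_minutes // period)  # ceiling division

# Part 1 (computer adaptive): roughly one class period.
adaptive = sittings_needed(CLASS_PERIOD_MIN)

# Part 2 (performance task, computer portion): 90-180 successive minutes,
# which must land in back-to-back periods per the slides.
pt_low = sittings_needed(90)
pt_high = sittings_needed(180)

print(adaptive, pt_low, pt_high)  # 1 2 4 with a 50-minute period
```

At a 50-minute period this yields the slide's "2-4 class periods" range for the performance task; the recommended 9 sittings per student then leaves headroom beyond the 1 + 4 worst-case computer sittings for the classroom portion and make-ups.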

  26. Wrapping Up Your Survey • Hit Update!

  27. Pilot Key Dates

  28. Future Plans • On demand reporting • Low effort network monitoring • Instructional data • Professional Development data • Transition to Technology Plan

  29. Current Technology Planning Process Local District creates plan based on: • Committee input • Current environment • Best guess for a three year plan ISD/RESA Review Plans • Checking for requirements • Can recommend modifications MDE Review/Approval • Check for completion and compliance

  30. Proposed Technology Planning Process MDE sets Goals/Targets based upon • Assessments • Educational goals ISD/RESA Guidance in Planning • Evaluation of current data • Gap analysis Local District Planning • Annual • Planning toward MDE Goals/Targets • Monitor progress toward goals

  31. Questions?

  32. Closing
