
Minnesota MAP Member’s Mtg. Oct. 22 & 24, 2013


Presentation Transcript


  1. Minnesota MAP Member’s Mtg. Oct. 22 & 24, 2013
  • Intro & Welcome
  • Updates – Technology, Curriculum Partners
  • Product Updates – Skills Pointer, DesCartes, Science, Field Testing, Knowledge Academy
  • Minnesota Updates: MCA Alignment, New Norms 2015
  • Special Feature: “MAP & STAR: What’s the Difference?”
  • LUNCH
  • Afternoon Breakout Sessions
  • Featured Guest: Matt Hicks, NWEA – Partner Advisory Roundtable Discussion

  2. Afternoon Sessions
  12:45 to 1:45
  • S1: Using RIT Norm Calculators: Setting Goals at All Levels – Dan Henderson
  • S1: Bring Your Data – Bring Your Questions – Eric Merchant & Lynn Lamers
  1:50 to 2:50
  • S2: College and Career Readiness: Tracking Growth toward Post-H.S. Success – Les Perry
  • S2: MPG/CPAA – Assessing the Learning Needs of Primary Age Students – Lynn Lamers & Eric Merchant

  3. Minnesota NWEA Team
  • Les Perry – Account Executive for Kansas, Missouri, Minnesota; Minnesota based (Detroit Lakes) – les.perry@nwea.org
  • Lynn Lamers – Account Manager for Minnesota & Wisconsin – lynn.lamers@nwea.org (Minneapolis)
  • Eric Merchant – Account Manager for Minnesota & Wisconsin – eric.merchant@nwea.org (Modesto, CA)
  • Linda Andres – Account Manager for Minnesota & Wisconsin – linda.andres@nwea.org (Portland, OR)
  • Dan Henderson – Central Region Accounts Manager (Iowa, Wisconsin, Minnesota, Illinois, Missouri, Kansas, Nebraska) – dan.henderson@nwea.org (Washington, IA)
  General Number: 503.624.1951 | Tech Support: 877-469-3287

  4. Special Guest: Mr. Matthew Hicks
  • Director of Product Strategy
  • Forming a Partner Advisory Panel
  Matt previously served as the Senior Product Manager for NWEA’s MAP assessment product line. In his current role, Matt is responsible for leading the strategy for NWEA’s growing product portfolio, ensuring that each of NWEA’s products is aligned with NWEA’s mission of helping all kids learn. Before joining NWEA, Matt worked in high tech, market research, secondary education, and eCommerce, where he successfully developed and launched over a dozen products. He developed his deep interest in education from his wife, a first-grade teacher of 12 years. Matt graduated from Sheffield Hallam University in England and now lives in Portland, Oregon, with his wife and two dogs.

  5. Special Guest: Mr. Jeff Strickler, Executive Vice President and Chief Operating Officer
  • Jeff joined Northwest Evaluation Association (NWEA) in 2007. In addition to his role as Chief Operating Officer, Jeff also serves as NWEA’s CFO and Vice President of Corporate Services. Jeff brings to NWEA more than 20 years of experience helping organizations attain their goals, first as an attorney at Perkins Coie and later in various leadership roles at notable Northwest companies, including CFI ProServices/Harland Financial System, Vesta Corporation, Centrisoft Corporation, and CakeBoxx LLC.
  • At CFI ProServices, Jeff was Vice President and General Counsel from 1994 through its acquisition by Harland Financial Services Systems in 2000. He was Vice President of Finance and Administration at Vesta Corporation during a critical time in that company’s expansion. Jeff has served as Chief Operating Officer at both Centrisoft Corporation and CakeBoxx LLC, a startup company developing a new, more secure design for international shipping containers.
  • Jeff received a B.S. in Business Administration from Oregon State University in 1982 and a J.D. from the University of California at Berkeley (Boalt Hall) in 1985.

  6. Technology: The Good, The Bad & The Ugly (and the Beautiful)
  CLIENT-SERVER MAP – Good
  • System in use since the early 2000s
  • Archive your previous term data located in the NTE folder.
  • Submit your Class Roster File (CRF) at least two weeks in advance of testing. If applicable, also submit your Special Programs File (SPF).
  • Client-Server performance has been good this year – tech calls are mostly of the “how-to” variety
  • Process for transitioning to Web MAP – contact Julia (currently a waiting list)

  7. Technology Happens: Internet Access

  8. Mobile Devices? Performing Well – Good
  • iPad – implemented this fall (not MPG)
  • Chromebooks? YES

  9. Web-Based MAP Update
  • Ugly start this year
  • Many of you experienced interrupted or suspended testing because of multiple technical problems in MAP:
  • hardware failures in network equipment
  • resiliency failures
  • degraded performance due to anomalous traffic generated by the changing patterns of testing
  • Long wait times on the tech support line
  • Heavy concurrency load at certain times of the day (8:30 to 10:00 am)
  • NOT the optimum partner experience that we want to provide
  • Hard for districts to determine whether issues are local while NWEA is having issues
  • Resulted in delayed testing; a drawn-out window; stops and starts; less than stellar results

  10. What We Have Done – What We Will Do
  WHAT STEPS HAVE WE TAKEN?
  • We replaced hardware and audited our network configuration; we then asked Cisco to confirm the configuration and health of our system.
  • We strengthened our solutions to automatically switch to redundant systems in case of failure; this work was confirmed with third-party experts.
  • We expanded our hardware and systems infrastructure to increase the overall processing power available to the MAP assessment.
  • We temporarily delayed end-of-test reporting until the end of the day to optimize students’ testing experience. Students will still receive their results immediately at the end of the testing session.
  • As we made these changes, we asked many of you to suspend or delay testing for one or more days. This allowed us to ensure the changes we made fully solved the problems.

  11. NEW! Web MAP Update Website – Better Communication Process
  http://www.nwea.org/node/18860/ – Latest Tech Updates (no calling required)
  • Hour-by-hour updates on the technical system at NWEA
  • Alerts posted in real time
  • Technical reminders for local configuration
  • Notices of future maintenance windows

  12. Questions from Partners
  • Are results valid when students had to stop and restart? (see letter from the research team)
  • The research team analyzed about 100,000 test results from students whose tests were interrupted.
  • From our findings, we are confident the scores are valid and that the MAP test behaved as designed in spite of unintentional interruptions.

  13. CRITERIA TO DETERMINE SCORE VALIDITY
  • Time on test – less than 30 minutes is suspect
  • Standard Error of Measurement (Std Err) – more than 4 points is suspect
  • Percent correct for the test:
  • Normal = 45% to 55% correct
  • Below 40–45% correct is suspect
  These data are all in the Comprehensive Data File, available via the Data Export Scheduler.
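  As an illustration only, here is a minimal sketch of how a district data team might apply these thresholds to an exported file. The column names ("StudentID", "TestDurationMinutes", "TestStandardError", "PercentCorrect") are assumptions made for the sketch, not the actual Comprehensive Data File field names.

```python
# Minimal sketch: flag suspect test events in an exported CSV using the
# thresholds from the slide. Column names are illustrative placeholders,
# not the real Comprehensive Data File headers.
import csv

def flag_suspect(row):
    """Return the reasons (if any) a test event looks suspect."""
    reasons = []
    if float(row["TestDurationMinutes"]) < 30:
        reasons.append("time on test under 30 minutes")
    if float(row["TestStandardError"]) > 4:
        reasons.append("standard error above 4 points")
    if float(row["PercentCorrect"]) < 45:  # below the normal 45-55% range
        reasons.append("percent correct below 45%")
    return reasons

with open("comprehensive_data_file.csv", newline="") as f:
    for row in csv.DictReader(f):
        reasons = flag_suspect(row)
        if reasons:
            print(row["StudentID"], "-", "; ".join(reasons))
```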

  14. Partner Questions
  If we had to delay our test window to later in the fall, will the norms still apply? (see letter from Mike Nesterak – Director of NWEA Research Services)
  “It is recommended that districts not delay beyond the eighth week of instruction for their fall MAP test administration. Testing within these first eight weeks should have no material impact on the fall-to-spring RIT growth.”

  15. Partner Questions
  Since the Web MAP system had problems this year, will we receive any compensation?
  Yes, we will work with districts one-on-one to determine fair compensation.

  16. Check List for Technology on the District Side
  • Check the Whitelist: If your district/school is experiencing intermittent slowness, white screens, or the inability to click on various buttons when testing, please ensure that you have whitelisted *.mapnwea.org on equipment that sits between the testing machine and NWEA servers.
  • Double-Check That You Are Using the LATEST Version of the Lockdown Browser – the latest version was released on Aug. 28. If you are not running that version, Web MAP will not be compatible.
  • Check Internet Usage During Testing Hours – your district may have plenty of total bandwidth, but other internet use during testing hours can leave too little for testing.
  • Check the Tech Requirements for Compatible Browsers – Chrome, Safari (NO INTERNET EXPLORER)
  • Use the “Proctor Console Issues” Quick Guide – if students are “stuck,” don’t panic; use the guide and a few keystrokes.
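  As a rough illustration of the whitelist check above, here is a small connectivity test a district tech could run from a testing workstation. The hostname "test.mapnwea.org" is a placeholder, not a confirmed NWEA endpoint; substitute whichever *.mapnwea.org host your testing software actually contacts.

```python
# Quick DNS + HTTPS reachability check from a testing workstation.
# HOST is a placeholder under the *.mapnwea.org domain named on the slide.
import socket
import ssl

HOST = "test.mapnwea.org"  # placeholder hostname; replace with your actual host
PORT = 443

try:
    addr = socket.gethostbyname(HOST)
    print(f"DNS OK: {HOST} -> {addr}")
    with socket.create_connection((HOST, PORT), timeout=5) as raw:
        ctx = ssl.create_default_context()
        with ctx.wrap_socket(raw, server_hostname=HOST) as tls:
            print(f"TLS OK: connected to {HOST}:{PORT} ({tls.version()})")
except OSError as exc:
    print(f"FAILED: {exc} -- check whitelisting/proxy rules for *.mapnwea.org")
```

  If the check fails only from student machines but succeeds elsewhere, the block is likely on a content filter or proxy between the workstation and NWEA, which is the equipment the whitelist item refers to.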

  17. BIGGEST CHANGE: No new migrations to the web-based system this school year – migrations will open again for Fall 2014.

  18. Product Updates: Skills Pointer for Progress Monitoring & DesCartes (Fall 2014)
  What Will Be New with DesCartes?
  • Interactive learning continuum
  • Student- and teacher-friendly language
  • Ability to create “data ladders” for differentiated instruction

  19. What Are Our Partners Saying?
  • “Our teachers give up on it pretty quick.”
  • “It’s become a hurdle. It’s too intense and stymies the teachers and detracts from the value of MAP.”
  • “…intimidating and takes too much time.”
  • “The statements don’t appear to be directly based on the CCSS.”

  20. DesCartes Vision
  The vision for a future DesCartes addresses the specific limitations of the current offering, including the following:
  • Teacher- and student-friendly content that clearly describes a learnable skill or understanding
  • Improved organizational structure providing options for sorting Learning Statements by Topic and Standard in addition to Sub-Goal
  • Interactive and customizable reporting capabilities and features that support instructional planning
  • Ability to link to external resources including lesson plans, interactives, and student learning resources

  21. Advanced options for report display, filtering & content
  • Ability to view statements by standards language
  • Improved content for instructional use
  • Linking to additional resources

  22. View Learning Statements by State Standards
  • Automatically generated class-based ladders for instructional groupings

  23. What Will Be New in Skills Pointer?
  • User-friendly design & interface
  • Easy-to-use reports
  • Rich digital learning content
  • High-quality test items & larger item bank
  • Revised CCSS skills framework
  • Integration with the MAP platform
  NEW! Skills Pointer – diagnostic tool (gr. 3-9)

  24. Improved UI

  25. Skills Pointer Today
  • Skills mastery assessment for grades 3-9: Math, Reading, Language Usage and Science
  • Aligned to the Common Core and state standards
  • Intended for frequent use, primarily for remediation
  • Provides access to printable learning plans based on individual skills
  • Standalone system, independent of MAP

  26. What will the new Skills Pointer provide?
  • Ability to leverage MAP as an RTI screener to inform remediation plans for students in Skills Pointer
  • A CAT-based Skills Locator assessment determines a student’s actual skill level, independent of grade, to find the exact learning objectives/skills each student needs to learn
  • Track frequent progress toward the plan with Skills Pointer’s short skills mastery tests
  • Access learning and instructional resources based on each student’s plan, saving teachers time and giving every student resources to learn
  • Developed from the ground up based on the Common Core State Standards

  27. Display Resources for:
  • Interactives
  • Lesson Plans
  • Handouts
  • Questions
  • Articles

  28. Science Test Update
  Changes to Science:
  • One test instead of two
  • Concepts and Processes is no longer a standalone test – it is incorporated into the “General Science” test
  • 64 items reduced to 45 items – saving time
  • Current “General Science” norms are applicable

  29. Field Testing Opportunities
  NWEA Needs YOUR Help!
  3 Areas for Field Testing This Year:
  • Spanish Math, grades 2-5
  • Upper Level Math (high school)
  Standalone Field Testing (SFT) is NWEA’s means of introducing and testing new items, with a goal of increasing the breadth and depth of assessment knowledge in future Measures of Academic Progress® (MAP) testing. The field-testing of items also allows NWEA to maintain the stability and accuracy of our measurement scale.

  30. What is required of a field test partner?
  ANY district is eligible:
  • No limit to the number of students that can participate (just limited to the grade levels mentioned)
  • Each test will take about an hour – students can take multiple tests in different subjects
  • Workstations must meet current technical requirements – web-based testing
  • Either server-based districts or Web-Based MAP districts can participate

  31. What’s in it for YOUR District?
  • NWEA will give renewal or new-product credit up to $5.00 per completed test
  • You will expose your students to new item types that will eventually become part of the NWEA standard item bank
  • Your district will help to create the “Next Generation” of MAP assessments

  32. Knowledge Academy

  33. Knowledge Academy

  34. Knowledge Academy

  35. Knowledge Academy Nomination

  36. Minnesota Updates: MCA Alignment
  • The Kingsbury Center is collecting data NOW for a new MAP/MCA cut-point alignment
  • Please send Don Draper your MCA data
  • Need at least 1,000 students per grade
  • Once data are collected, linking will take about 2 weeks
  • Write and publish the report
  • Look for a new alignment by Jan–Feb this year

  37. Norm Study – New Norms?
  • A norm study is conducted every 3 years
  • The last norm study was in 2011
  • Due to implementation of the Common Core State Standards in many states, the research team has decided to wait until 2015

  38. MAP & STAR: What’s the Difference?
  STAR claims it is:
  • Cheaper
  • Faster
  • Better
  Let’s look “under the hood” to see if these claims are valid.

  39. Buying a New Car….. By only looking at the shiny new outside???

  40. You Have to Look Under the Hood….. …..not all assessments are equal under the hood

  41. NWEA is the only adaptive test that…
  • Adapts across all grades on a single scale (pre-K through 12)
  • Is based on 30+ years of solid research
  • Has the most stable, predictive scale in the industry
  • Uses norms that are truly “national”
  • Is content neutral
  • Has the largest item bank of any assessment program
  • Is a non-profit, mission-driven organization
  Setting the Standard in Adaptive Testing for 30+ Years

  42. Review the Questions
  • How stable is the scale?
  • How are norms created, and how often are they updated?
  • Is the assessment FULLY adaptive across a pre-K through 12 scale?
  • What type of items are used, and how many items support the assessment? By strand?
  • How long is the assessment, and does it include enough items to give accurate data?
  • What type of organization is the assessment company? Will they support you?

  43. Scale Stability & Accuracy
  RIT Scale: stability and predictability
  • Virtually no shift (change) in the RIT Scale for 20 years
  • The scale is stable, and scores mean the same thing over time
  • Ex: a student with a 200 RIT is at the same instructional level today as 20 years ago
  Question: How stable is the scale an assessment is using?

  44. Norms for Comparisons
  Norming Process
  • The Kingsbury Center at NWEA re-norms the MAP test every 3 years (4 years this time)
  • Norms for both growth and status are nationally representative
  • Our norm studies now include 5+ million students and tens of millions of test events
  • Some test makers use a “quartile” or “quintile” regression model (misrepresents growth at the extremes)
  Question: How representative are the norms? What methodology do they use?

  45. Adaptive Tests
  • NWEA MAP assessments are “fully adaptive” – across all grades
  • Some “adaptive” assessments are “constrained adaptive”:
  • within grade level (MCA)
  • a couple of grades above or below (STAR)
  • Constrained adaptive tests will still miss outlier students
  Question: How adaptive is the assessment?

  46. Item Depth & Breadth
  • New item types assess deeper levels of DOK (Depth of Knowledge)
  • Sample of new items: here.
  • STAR is still primarily multiple choice and assesses lower cognitive skills
  • NWEA uses 7 to 10 items per strand to determine goal-strand scores
  • STAR uses fewer items per strand
  • STAR tests are shorter tests

  47. Item Depth & Breadth
  • Item exposure: STAR limits items to 75- to 90-day exposure; MAP limits item exposure to 14 months
  • STAR item bank: Reading – 4,700 items; Math – 4,000 items
  • MAP item bank: Reading – 15,000 ELA items; Math – 12,000 items
  Question: What type of items and how many items does an assessment use?

  48. Time On Test
  Why does a MAP test take longer for students to complete? (average is 45 to 50 minutes)
  • Assessing depth of knowledge – higher-order skills (not just lower cognition)
  • Providing accurate information based on adequate item selection per strand
  • Gives you MORE information on a broader range and depth of skills
  • Fewer items means a higher SEM (see the sketch below)
  LESS IS NOT ALWAYS MORE… especially in assessment
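  To illustrate the “fewer items means a higher SEM” point, here is a simplified sketch, not NWEA’s actual psychometric model, assuming each item contributes roughly equal information, so the standard error of measurement shrinks with the square root of test length. The per-item information value is arbitrary and chosen only for the demo.

```python
# Simplified illustration: in an IRT-style model, SEM ~ 1 / sqrt(total test
# information). If each item contributes roughly equal information, doubling
# the number of items cuts the SEM by a factor of about sqrt(2).
import math

def sem(n_items, info_per_item=0.15):
    """Standard error of measurement under the equal-item-information assumption."""
    return 1 / math.sqrt(n_items * info_per_item)

for n in (20, 40, 52):
    print(f"{n} items -> SEM ~ {sem(n):.2f} (scale units)")
```

  Under this simplification, a test with half as many items per strand produces roughly 1.4 times the measurement error, which is the trade-off behind MAP’s longer test length.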

  49. Time on Test
  • MAP is untimed: students can show their achievement without time constraints
  • STAR is timed per item: forced selection after a set period of time
  • How will this impact performance?
  Question: How does time affect student performance?
