

  1. Assessment Challenges, Contradictions, and Inequities: An Analysis of the Use of Digital Technology and OALCF Milestones
  Christine Pinsent-Johnson and Matthias Sturm, 2016

  2. What was the project about?

  3. Why was the project done?
  • The Use Digital Technology Milestones are used far more than other Milestones
  • In PMS training sessions it was suggested that the way LBS programs report on Milestone data was not consistent
  • Three previous studies described challenges
  • The theory and methods behind the Milestones could themselves be a problem

  4. Our research questions
  • Why are digital technology Milestones used more often?
  • Why are other Milestones not as popular?
  • How do assessors and instructors understand and use results?
  • What practices do programs develop when using Milestones?

  5. How did we collect our data?
  • Analysed OALCF documents and Milestones
  • Surveyed 181 assessors (28% from school boards, representing 31% of learners)
  • Interviewed 26 coordinators, assessors and practitioners from six programs across the province, in all streams and sectors
  • Analysed data from EOIS-CaMS (2013-2014)

  6. What are the main findings?
  • The Milestones' unique design and generic content disconnect them from teaching and learning
  • They are primarily used for compliance, not instruction
  • Programs therefore rely on the digital technology Milestones, which are more predictable
  • Guidelines make the Milestones administrable but disregard teaching and learning
  • Programs are impacted in different ways and have devised various strategies to mitigate impacts and show compliance

  7. Unique test design and content
  Based on our analysis, we found that:
  • The OALCF was developed in part from IALS, the International Adult Literacy Survey
  • This sort of testing doesn't measure how, or whether, people are learning literacy skills
  • It also doesn't consider the skills and abilities of those with low levels of education
  • Making a direct connection between program performance and results from international adult literacy surveys is an unachievable goal

  8. What did survey respondents say?
  • Milestones are confusing
  • Milestones are too difficult for some learners and too easy for others
  • Learners can be confused by the instructions
  • Learners are not familiar with the content and the kinds of questions asked
  • Milestones do not benefit learner goals, program purposes or the curriculum already used in the program
  • Milestones cannot provide useful information to assess learner progress

  9. What did the survey data indicate about the use of assessments in programs?
  • Nearly all respondents (89%) continue to use other assessments
  • Nearly all (86%) continue to use other curriculum frameworks aligned with K-12 and traditional approaches
  The OALCF is mainly an assessment framework. ESKARGO and other documents from CESBA make the OALCF Curriculum Framework usable: they provide tangible guidance for program administrators and instructors to follow and adopt into programming.

  10. “All the service providers were terrified that they were getting funding pulled if they don’t get the numbers in and if they don’t have the Milestones. So there is all that tension implementing without the skill base to know what we are actually doing. This whole curriculum framework didn’t change what we did at all. We changed the language but we still do the same things. We are using text books. We are teaching to specific skills. We always try to show you how to use the skills in the real world […] So we are changing how we are reporting it, but the other information we want we are tracking ourselves.” (Interview participant)

  11. Survey respondents also said that they are experiencing a general sense of time pressure

  12. EOIS-CaMS data revealed that Milestones are used to meet reporting minimums
  [Chart: Milestones per learner, 2013-14]

  13. What did survey respondents say?
  “Milestones are often in too big a chunk for learners in Secondary School goal path to show progress. There are so many skills needed to prepare for PLAR and credit system.”
  “Milestones aren’t really related to what we teach to prepare them for college. Most will score the same on Milestones the day they start and the day they leave.”
  “They're not really a measure of progress, just a measure for the ministry to use.”
  “Milestones are rarely applicable and seem like a 'hoop' we have to jump through—useless much of the time in relation to true goals, and even goal path.”

  14. EOIS-CaMS data shows assessors rely on digital Milestones
  [Chart: Rates of OALCF Milestones completed, 2013-14]

  15. Digital Milestone use by LBS sector
  [Chart: Comparison of selection and completion of Use Digital Technology Milestones by sector, 2013-14]

  16. Our analysis explains why digital Milestones are different

  17. What did the survey and interview data reveal about the digital tech Milestones?
  • The digital technology Milestones are appealing to learners, easy to use, and predictable
  • They can be adapted to ensure the use of truly authentic and skill-appropriate texts and activities, particularly for learners with limited literacy skills, knowledge and strategies
  • They align with actual activities that learners are engaged in (learners recognize the texts used for testing purposes)
  • They can be used in blended learning programs and computer courses
  • They introduce learners to the Milestone testing scheme and the concept of competencies
  • They ensure compliance

  18. Guidelines make the Milestones administrable but not teachable
  • Only assessors can see the Milestones
  • Instructors cannot:
    • Analyse Milestone content to adequately prepare their learners for the Milestones and use this information to plan lessons
    • Provide additional information about the content and test instructions (they are not allowed to add content or information during testing)
    • See detailed results that could be used to help learners complete unsuccessful Milestones (they are not allowed to share results with learners)

  19. Do all learners experience these impacts?

  20. Do all learners experience these impacts?

  21. Education levels by sector

  22. Strategies to mitigate impacts and show compliance

  23. “Milestones are silly in their present form...I appreciate the need for a structured and consistent method of evaluation, but the Milestones miss the mark. I generally regard them as a means to an administrative end rather than an educational tool. I never select Milestones at intake to get service plans open ASAP, but I also rarely select them in the way that they are intended (select a planned start date, end date, mark the actual start and finish as the student progresses, etc.). I get my students to attempt a Milestone and enter it if they complete it. If they aren't successful, I don't record it since it is a waste of my time and will reflect negatively on my program.” (Interview participant)

  24. What a large program did (case study)
  • Work together
  • Hire extra admin. staff
  • Shift extra work to admin.
  • Embed Milestones into existing activities
  • Keep learning activities the same
  • Downplay the meaning and importance
  • Not spend time preparing learners

  25. Milestones embedded in a course

  26. What a small program did (case study)
  • Work independently to develop a strategy
  • Change and adapt regular curriculum planning
  • Spend time preparing students and developing additional activities to help complete Milestones
  • The coordinator questioned her own strategies and expertise

  27. Feedback from a School Board
  • “Offsetting” learners who need more time to complete a Milestone in their program
  • Focus on developing short-term computer-related activities with employment goals
  • Use of digital Milestones to report progress
  “We’re bringing in people who can show progress more efficiently.”
  “Learners have to be able to complete a Milestone. We’re just bringing in anyone who meets this criteria for learning.”
  “The problem with a lot of the Milestones, unless you have a person working on those exact skills, they really don’t fit. The Milestones are very limited.”
  (Interview participant)

  28. Conclusion: The contradictions
  • Results are primarily used to provide data rather than assess learning
  • Too difficult, too easy
  • High-stakes, low-stakes
  • No alignment with other systems
  • The most commonly used Milestone results are useful for programs but do not provide objective and standardized results to MAESD (formerly MTCU)

  29. Conclusion: The inequities
  • Extra work and effort
  • Unfair assessment
  • Interference with existing curriculum
  • Disconnection of the LBS system from provincial education and training initiatives

  30. Implications for the LBS system
  Milestones work counter to LBS objectives:
  • Compliance-centred, not learner- and learning-centred
  • Programs can’t show actual progress and learner accomplishments
  • They undermine the aim to “ensure accountability to all stakeholders”
  • They interfere with program aims to help learners transition
  • They are not appropriate for learners with the least education and/or with disabilities
  • LBS programs are being held accountable for things they don’t actually do

  31. Discussion
  • What are your conclusions about the use of the Milestones?
  • What resources and forms of knowledge building have you found most useful?
  • What are the benefits to and barriers for your learners?

  32. Your recommendations
  • For CESBA and each other
  • For researchers like us
  • For the LBS Program and MAESD
  • For other stakeholders

  33. Our recommendations
  • Do not connect funding to results
  • Do not rely on international literacy testing methods and results to build the effectiveness measure in the PMF
  • Review the complex and inconsistent assessment system
  • Ensure appropriate, fair, consistent and meaningful assessment by using experts who will not profit
  • Involve practitioners in a meaningful way in future re-design efforts

  34. Moving forward
  • You can use the research report and results to support your decisions
  • You can also use the findings (and slides) to explain your concerns to your ETC
  • Do you have other suggestions about what we can do moving forward?

  35. More information
  • Research Overview — Assessment Challenges, Contradictions and Inequities: An analysis of the use of digital technology and OALCF Milestones
  • Literacy and Basic Skills (LBS) Program Data
  • Lessons Learned From Analysing the OALCF Use Digital Technology Milestones
  • Practices Developed When Using the OALCF Milestones
  Download at AlphaPlus.ca
