
Assessment 2009





Presentation Transcript


  1. Assessment 2009 IMPACT ASSESSMENT – ACTION RESEARCH – PLEs – OLEs – CGOs – Kirkpatrick – ACCESS – Bertelsmann – Schön – Reflective Practice – Societal Impact – Personal Impact – Operative Attention

  2. Scared or Inspired? • "We are currently preparing students for jobs that don't yet exist, using technologies that haven't been invented, in order to solve problems we don't even know are problems yet." – Karl Fisch • "I want to discover. Be comfortable in not knowing. Identify the questions; the answers will come." – David Warlick

  3. Time to REALLY Assess VALUE…

  4. The Obligatory Joke • How many Academic Administrators does it take to change a light bulb? • CHANGE!?

  5. Generation Pong – time to evolve • Are you ready for the Gamers? • Learning MUST BE – Continuous / Relevant / Adaptive • Flexible, adaptive environments encourage fluid thinking, easily reinforce concepts, engage mechanics, and allow their creators to iterate quickly • Chris Melissinos – Chief Gaming Officer, Sun Microsystems

  6. These New-Fangled Ideas… • 1998 Bertelsmann report: • Cognitive tools empower learners to design their own representations of knowledge rather than absorbing representations preconceived by others. • Cognitive tools can be used to support the deep reflective thinking that is necessary for meaningful learning. • Ideally, tasks or problems for the application of cognitive tools will be situated in realistic contexts with results that are personally meaningful for learners. • Using multimedia programs as cognitive tools engages many skills in learners such as: project management skills, research skills, organization and representation skills, presentation skills, and reflection skills.

  7. Learners as Designers • Spoehr (1993) reports that students who build and use hypermedia develop a proficiency in organizing knowledge about a subject in a more expert-like fashion. They are able to represent multiple linkages between ideas and organize concepts into meaningful clusters. • Salomon et al. (1991) - Educators should empower learners with cognitive tools and assess their abilities in conjunction with the use of these tools. Such a development will entail a new conception of ability as an intellectual partnership between learners and the tools they use.

  8. Asking the Right Questions • Tools – P • Questions – tbd • Start with action research: a reflective process of progressive problem solving led by individuals working with others in teams or as part of a "community of practice" to improve the way they address issues and solve problems. • Dr. Joan McMahon – mcmahon@towson.edu • John Sener – jsener@senorlearning.com

  9. So What? • Impact Assessment basically asks the question “So what?” After anything you do or try, simply asking that question gets you closer to the “impact” issue. – Dr Joan McMahon

  10. Kirkpatrick’s evaluation of T+L • Kirkpatrick's 1975 book Evaluating Training Programs consolidated the ideas he first published in 1959 and raised their profile; his theory has arguably become the most widely used and popular model for the evaluation of training and learning. • Kirkpatrick's four-level model is considered an industry standard across the HR and training communities. • The four levels of Kirkpatrick's evaluation model essentially measure:

  11. Kirkpatrick’s Four Levels • L1 – Teaching, or reaction of the student – what they thought and felt about the training • L2 – Learning – the resulting increase in knowledge or capability • L3 – Results – extent of behaviour and capability improvement and implementation/application • L4 – Impact – the effects on the business or environment resulting from the trainee's performance. All these measures are recommended for a full and meaningful evaluation of learning in organizations, although their application broadly increases in complexity, and usually cost, through the levels from L1 to L4.
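The four levels above can be sketched as a simple data model. This is a minimal illustrative sketch, not part of the presentation: the level names follow the slide's labels, and the sample questions are hypothetical examples of what each level might ask.

```python
from enum import IntEnum

class KirkpatrickLevel(IntEnum):
    """The four levels as labelled on the slide (L1-L4)."""
    REACTION = 1   # L1 - what learners thought and felt about the training
    LEARNING = 2   # L2 - the resulting increase in knowledge or capability
    RESULTS = 3    # L3 - behaviour/capability improvement and application
    IMPACT = 4     # L4 - effects on the business or environment

# Hypothetical evaluation questions for each level (our examples,
# not drawn from the presentation).
SAMPLE_QUESTIONS = {
    KirkpatrickLevel.REACTION: "Did you find the course engaging?",
    KirkpatrickLevel.LEARNING: "Can you now explain the key concepts?",
    KirkpatrickLevel.RESULTS: "Have you applied the new skills at work?",
    KirkpatrickLevel.IMPACT: "What changed in your organisation as a result?",
}

def evaluation_plan(max_level: KirkpatrickLevel) -> list[str]:
    """Questions for a full evaluation up to max_level; each added
    level broadly increases complexity and usually cost."""
    return [SAMPLE_QUESTIONS[lvl] for lvl in KirkpatrickLevel if lvl <= max_level]
```

Because the levels are ordered (`IntEnum`), asking for a deeper evaluation automatically includes all the shallower levels, mirroring the slide's point that a full evaluation uses all four measures.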

  12. What questions can you ask to get to “impact”? • T = I taught it. • TL = I taught it but did they learn it? • TLR = I taught it but did they learn it AND RETAIN IT? • TLRI = I taught it but did they learn it and retain it AND SO WHAT?

  13. Examples – Better Questions!! • The teacher encouraged interaction – Yes / No • 1 2 3 4 5 • The teacher was tasked to encourage interaction among students – what would have motivated you personally to interact more with your peers? • Did you meet the learning goals of the course as stated in the syllabus? • What has been the impact of this course on your work or personal life? What new skills have you used to effect change?

  14. The Evolution of Assessment This slide attributed to Bobby Elliott – Scottish Qualifications Authority – bobby.elliott@sqa.org.uk

  15. Assessment 1.0 • Paper-based • Classroom-based • Formalised • Synchronised • Controlled • Industrialised • Changed little since the early 20th Century

  16. Assessment 1.5 • Computer-based assessment • Characteristics: • E-testing • E-portfolios • Simulations • Embedded in most VLEs • Stand-alone systems • Familiar to students and teachers

  17. Student Perceptions (of 1.0/1.5) • Artificial and contrived • Something that is done to them • Doesn’t measure anything important • (It’s a) Hurdle to be jumped • Not part of their learning • Sole purpose of their learning

  18. Web 2.0 • User-generated content • Architecture of participation • Network effects • Openness • Data on an epic scale • Power of the crowd

  19. Assessment 2.0 needs to be… • Authentic • Personalised • Negotiated • Collaborative • Recognising existing skills

  20. Web 2.0 Services

  21. Reasons to open things up • “Because you’re pouring money into a black hole that students don’t like, which is unnatural to them, which can’t possibly keep up with developments on the Web, and which is little more than a comfort blanket to teachers who can’t or won’t embrace the 21st Century.” – Bobby Elliott

  22. A Little More Conservatively • It’s crude, but it’s an important evolutionary step • Not every student is a digital native • Not every teacher can use Web 2.0 • “I can’t get my staff to use the quiz in Moodle, so what chance is there that they’ll embrace Web 2.0?”

  23. Meld the right questions to the appropriate (student-selected) media • Ask the right questions – Open vs. Closed / Fat vs. Skinny / IMPACT Assessment = SO WHAT?! • Review objectives to encourage multiple means of achieving success. • Even when you are in a confined Blackboard/Moodle world, be open to a variety of deliverables, and look to nurture that creativity. • Accept OLEs (Online Learning Environments) / PLEs (Personal Learning Environments) – VoiceThread / Skitch …

  24. (Impactful) QUESTIONS??
