
Project Learning

Presentation Transcript


  1. Project Learning No longer scissors, glue and dioramas…

  2. Spaghetti and Marshmallows • Astrological signs • Spaghetti and marshmallows • Terri Steimer – Marshmallow commissioner… • 18 minutes!!! • Why This Activity?

  3. What is Project Learning? What it is NOT!!! • An additional demand • Scissors and glue • Dioramas • Necessarily long, drawn-out activities What it is • A strategy for application • An approach – a way of thinking about instruction • Integrated instructional method

  4. What is project learning? Project Based Learning is a teaching and learning model (curriculum development and instructional approach) that emphasizes student-centered instruction by assigning projects. It allows students to work more autonomously to construct their own learning, and culminates in realistic, student-generated products.

  5. WHY???? What, Why, Benefits

  6. The Homework “Project” • Quadratic Equations • Comma splices • Table of Elements • Civil War Reconstruction • 4/4 vs. Cut Time • Portrait drawing • “Where’s the restroom” in four different languages

  7. From the Simple to the Extraordinary • Resources: • http://www.hightechhigh.org • http://www.edutopia.org/project-based-learning • http://pbl-online.org • http://www.bie.org • http://21centuryedtech.wordpress.com/2010/01/16/free-project-based-learning-resources-that-will-place-students-at-the-center-of-learning • http://lone-eagles.com/pbl.htm • http://cell.uindy.edu/NTHS/PBLresources.php

  8. EPIC – Generation iY • Experiential • Participatory • Image Rich • Collaborative

  9. Break Time… Ticket to break – 2 sentences on the Marshmallow Challenge

  10. Standards Grading • Paper Airplane Challenge • Does it Fly? • 1 – little or no sustained flight – “crash & burn” • 2 – flies but not on a predictable path, or makes an immediate turn • 3 – sustained flight on some throws, unpredictable performance, above-average reliability • 4 – flies straight and true for distance and speed • Standards vs. Traditional Grading?

  11. The Elementary Report Card • Learner Objectives • Defined standards • Separates effort and behavior from mastery • Can it work at the high school level?

  12. Standards-Based Grading (SBG) A Case Study…Kinda @MR_ABUD #TEAMPHYSICS

  13. Motivation for the Shift • Meaning of a grade • What you did vs. what you know • Students motivated (by meaning of the grade) to do assignments/tasks • Like finishing the weekend chores checklist • Focus on learning and mastery is lost • Students “make up” missing work at the end of a marking period for credit to help their grade • Without focus on learning for mastery, taking learning risks can be too costly

  14. The Problem How might we make a grade better represent what students know rather than what they did?

  15. The Solution At least, it was an attempt at one…

  16. Actions Taken: This plan was developed over Christmas break & implemented at the start of 2nd semester • Stopped checking in homework altogether • Previously was checked for “completion” (an approach to assessing student attempts) • Wrote standards for each unit (adapted from curriculum map and state HSCEs) • Written in student-centered “I statements” • Developed assessments that generated observable evidence of student learning on standards • Linked individual assessments (and even individual questions) to one or more standards • Kept record of student scores on assessments • Reported student assessment scores according to standards • Used an average of all assessment scores for a given standard • Managed scores in Microsoft Excel • Created “Standards” in Pinnacle instead of assignments • Input standards scores in Pinnacle • Calculated summative score & reported all standards and ratings to students separately from their report card
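
The record-keeping described above boils down to grouping assessment scores by standard and averaging them. Here is a minimal Python sketch of that calculation, assuming scores are stored as (standard, score) pairs on a 4-point scale; the standard labels and numbers are hypothetical placeholders, and the actual gradebook lived in Microsoft Excel and Pinnacle.

```python
# Minimal sketch of the per-standard averaging described on this slide.
# The standard labels and scores are illustrative placeholders.
from collections import defaultdict

# Each assessment item is linked to one or more standards and scored 0-4.
assessment_scores = [
    ("I can interpret position-time graphs", 3),
    ("I can interpret position-time graphs", 4),
    ("I can calculate average velocity", 2),
    ("I can calculate average velocity", 3),
]

scores_by_standard = defaultdict(list)
for standard, score in assessment_scores:
    scores_by_standard[standard].append(score)

# Report the average of all assessment scores for each standard.
for standard, scores in scores_by_standard.items():
    print(f"{standard}: {sum(scores) / len(scores):.1f}")
```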

  17. Tracking Progress with Graphs

  18. Tracking Progress with Tables From: Frank Noschese’s blog post, “The Tower” (7/27/11)

  19. Gradebook Before

  20. The SBG Scale Inspired by the 4.0 grade point average system, the rating system used in SBG simply assigns a numeric value to a level of proficiency. Here is how it looks in comparison to how we are used to grading:
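
The comparison the slide refers to was shown as a chart, which is not included in the transcript. The sketch below shows how such a mapping might look, assuming a 4-point proficiency scale; the descriptors and percentage bands are illustrative assumptions, not the slide's actual values (only the "approaching proficiency" wording appears elsewhere in this presentation).

```python
# Assumed, illustrative mapping only -- the labels and percentage bands are
# not taken from the slide's actual comparison chart.
SBG_SCALE = {
    4: "Mastery of the standard",
    3: "Proficient",
    2: "Approaching proficiency",
    1: "Beginning / little evidence yet",
}

TRADITIONAL_EQUIVALENT = {  # rough analogy to familiar percentage grading
    4: "A (90-100%)",
    3: "B (80-89%)",
    2: "C/D (65-79%)",
    1: "F (below 65%)",
}

for level in sorted(SBG_SCALE, reverse=True):
    print(f"{level}: {SBG_SCALE[level]:<35} ~ {TRADITIONAL_EQUIVALENT[level]}")
```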

  21. Gradebook After

  22. Findings • Students: • Agreed with SBG as a means to more accurately reflect what they know • Used the feedback from their SBG report to improve their understanding and strived for mastery (completed reassessments) • Stopped copying just to get assignments completed so they would “get [their] points” toward a grade • Shifted their focus from doing to learning • Struggled with the adjustment to a different grading approach • Did not readily see how to use the SBG feedback, or chose not to use it • Still asked, “What assignments am I missing that my grade is so low?” • Wished it had been implemented from day 1

  23. Complaints & Criticisms of SBG Mr. Abud, I was wondering what my grade is; it says “approaching proficiency” a bunch of times. • Students: • Cognitive dissonance with regard to grading • “What the formula” (WTF) moments occurred when students could not figure out how their grade was calculated • SOLUTION: MORE TRANSPARENT PROCESS • Forces them to actually know it • Many are used to being able to “fake it to make it” (the game of school) • SOLUTION: START FROM DAY 1 & INTRODUCE PROCESS WITH AN INVITING EXPERIENCE

  24. Complaints & Criticisms of SBG • Parents: • Pinnacle gave them no way to see why their student earned the ratings they did on each standard • Searching for a missing “checklist” that their student did not complete • SOLUTION: STUDENTS KEEP TRACK OF THEIR PROGRESS • Grade lower than expected, yet student “does their work” • Parents were used to grades holding students accountable for completion • SOLUTION: LETTERS HOME EXPLAINING PROCESS Perhaps my grade was artificially inflated by all that copying I did on my homework for completion…

  25. Conclusions, Reflections, & Recommendations • Strengths (+): • Focus on learning, not just doing • Opportunity for improvement • Removes pressure of academic risk-taking • Connects grade to learning • Makes assessment and grading more transparent/relevant • Changes (∆): • Standards in gradebook instead of tasks • Starts from day 1 • Students track their own progress • More formative assessment (obtrusive, unobtrusive, student-generated) • Rubrics • Letter home to explain process

  26. Next Steps…for Interested Parties • Consider • How you ALREADY use rubrics to assess students • What your performance objectives look like • To what extent your assessments connect to your objectives • The function of homework in your class (practice or chore) • Reflect on • How often students want to make up missing work for points yet gain nothing from doing that work • The extent to which a grade in your class truly reflects learning • Whether your students are motivated to learn or do • What opportunities exist for students to recover from early mistakes

  27. ACTIONS YOU CAN TAKE TO TRY SBG The following actions can be part of a small action research project in your own classroom: • Consider for an upcoming unit/chapter/lesson/project • Writing objectives in “I statement” language that connect to your content standards • Developing assessments (formative and summative) that make it easy to observe proficiency with the objectives • Adding a rubric to that assessment if it doesn’t already have one (omit components unrelated to objectives, e.g., 1” margins) • Creating proficiency rankings with explanations of each ranking level • Assessing students according to the rubric components • Reporting students’ scores on the rubric and generating a summative score as well • Providing students a means to track their own performance • Debriefing the grading approach with your class

  28. Final Thoughts • SBG is not a replacement for a summative grade • It just gives more substantive meaning to that grade • It is completely possible to implement with any number of students in any content area • It is best for teaching and learning • It promotes formative assessment, feedback, and student ownership over learning • It can be done in a low-tech (paper grids / graphs) or a high-tech way (Excel, Pinnacle, cloud-based apps) It is more “fun” when you do it with others

  29. Resources • Always Formative Blog: • http://bit.ly/ksFvZk • ActiveGrade – a cloud-based web app for SBG • http://activegrade.com • Sample Classroom SBG Policy Handout • http://bit.ly/nnAGly • SBG w/Voice • http://t.co/MBvlgNM • US Dept. of Ed. SBG Resources • http://1.usa.gov/lOgtWu • Frank Noschese’s Action-Reaction Blog • http://fnoschese.wordpress.com/

  30. GPPSS Format for Effective Teacher Evaluation A guide to the new GPPSS teacher evaluation process

  31. The Process • Representatives from teachers’ union, building administrators, and central office began meeting in winter of 2011 • Established parameters and philosophies • Examined various models and tools, with significant focus on Marzano and Danielson teacher evaluation models • Developed a framework • Developed an instrument to fit the framework • Agreed on rubric • Agreed on rating system based on rubric • Implementation for 2011-2012

  32. The Team • Teachers: • Ranae Beyerlein – GPEA President • Dan Quinn – GPEA Executive Board • Chris Geerer • Peter Signorello • Nancy Nihem • Administrators: • Tom Harwood – Asst. Supt. for HR • Monique Beels – Asst. Supt. for C & I • Tim Bearden – North Principal • Mary Macdonald-Barrett – Richard Principal • Mark Mulholland – Parcells Principal

  33. Foundational Beliefs • The committee established the following beliefs as the basis for the new process: • The goal of the instrument is collaboration to improve instruction with the intent of improving student learning • The instrument / rubric must define good instruction • Multiple rating categories • Adaptable Instrument • Focus Observation Areas • Locally established growth and measurement models • Establish clear standards for effective performance • Focus on evidence of planning & Preparation • Focus on a variety of instructional methodologies

  34. The Basis - Danielson • The committee agreed upon the Charlotte Danielson domains and rubric as the basis for the process. Danielson’s Framework for Enhancing Professional Practice was published in the mid-1990s as a guide for improving instruction, and has been a standard ever since. • “A framework for professional practice can be used for a wide range of purposes, from meeting novices’ needs to enhancing veterans’ skills.” – Charlotte Danielson

  35. Michigan Law – New Requirements • Among other things, key features of Michigan’s new legislation relative to teacher evaluation include: • Mandate that all teachers and administrators must be evaluated annually • Evaluation must include measurement data relative to student achievement • Ratings must use the categories of Ineffective, Effective, and Highly Effective

  36. GPPSS New Format - Features • Same process for on-cycle tenured teachers and probationary teachers except for the # of required observations • Teachers and administrators reach mutual agreement on measurement tools • For some observations, walk-through visits can be substituted for longer formal visits • Clearly defined categories of effectiveness based on an established rubric

  37. Process • Initial meeting with all teachers being evaluated to review process • Email or personal notification to set goal-setting meeting • Teacher and evaluator agree on three goals: an instructional goal from Danielson rubric Domain 1 or 3, an achievement goal, and a classroom environment/affective goal from Danielson Domain 2

  38. Process (cont.) • Teacher and Evaluator agree on measurement tools for each goal. A variety of options are outlined in the instrument, and allowance is made for the teacher and evaluator to agree on a tool not listed as an option. • Classroom Observations • Written Evaluation Completed

  39. Process – Classroom Observation Requirements • Probationary Teacher • Minimum of 1 pre-scheduled observation of 30 minutes or more in the teacher’s first month of teaching • Minimum of 1 unannounced observation of 30 minutes or more, to occur in an agreed-upon one-week window within the teacher’s first four months of teaching and approx. 60 days after the first observation • 1 additional unannounced observation of 30 minutes or more, OR 3 or more walk-through visits of ten minutes or more each • Tenured On-Cycle Teacher • Minimum of 1 pre-scheduled observation of 30 minutes or more • Minimum of 1 unannounced observation of 30 minutes or more, to occur in an agreed-upon one-week window, OR 6 or more walk-through visits of a minimum of 5 minutes each

  40. Evaluation Tool - Handout • Focus Areas: • Administrators will evaluate at least ten total categories, with a minimum of two per domain • Administrators will provide comments in the expandable table for each domain component used for evaluation • There is an opportunity for teacher comment within each domain • Administrators will comment on progress towards identified goals in the provided narrative space

  41. Evaluation Tool - Rating • Probationary Year 1 Effective = 100% of ratings in rubric columns 2-4 • Probationary Year 2 Effective = 100% of ratings in columns 2-4, 75% or more in columns 3 & 4 • Probationary Year 3 Effective = 100% of ratings in columns 2-4, 90% in columns 3 & 4 • Probationary Year 4 Effective = 100% of ratings in columns 3 & 4 • Tenured Teacher Effective = 100% of ratings in columns 3 or 4. Any ratings in columns 1 or 2 result in an IDP • For a rating of “Effective” or “Highly Effective”, a teacher must have made measurable progress towards identified goals.
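
These thresholds are simple percentage checks over the rubric ratings. The sketch below shows one way the logic could be expressed in Python, assuming each rating is recorded as its rubric column number (1-4); the function and status names are hypothetical, and the separate requirement of measurable progress toward identified goals is not modeled.

```python
# Hypothetical sketch of the "Effective" thresholds listed on this slide.
# Ratings are rubric column numbers (1-4), one per evaluated component.

def fraction_in(ratings, columns):
    """Return the fraction of ratings that fall in the given rubric columns."""
    return sum(r in columns for r in ratings) / len(ratings)

def is_effective(ratings, status):
    no_column_1 = fraction_in(ratings, {2, 3, 4}) == 1.0  # 100% in columns 2-4
    top_share = fraction_in(ratings, {3, 4})              # share in columns 3 & 4
    if status == "probationary_year_1":
        return no_column_1
    if status == "probationary_year_2":
        return no_column_1 and top_share >= 0.75
    if status == "probationary_year_3":
        return no_column_1 and top_share >= 0.90
    if status in ("probationary_year_4", "tenured"):
        return top_share == 1.0                           # all ratings in columns 3 & 4
    raise ValueError(f"unknown status: {status}")

# Example: ten component ratings for a second-year probationary teacher.
print(is_effective([3, 4, 3, 3, 2, 4, 3, 4, 3, 3], "probationary_year_2"))  # True
```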

  42. Interim Annual Evaluation Process for Off-Cycle Tenured Teachers • Michigan law requires annual evaluation of teachers and administrators. In GPPSS, tenured teachers will continue to be on a three-year evaluation cycle. In interim years, the following process will be used: • Email notification of assigned evaluator. • Minimum of one unannounced classroom observation of 30 minutes or more, OR 3 walk-through observations of 5-10 minutes each. • Teacher self-reflection on the year. • Evaluator narrative summary of observations in writing by May 31st.

  43. Handouts Include: • GPPSS Format for Effective Teacher Evaluation • Definition of Terms • Teacher Evaluation Report • Probationary and Tenured Checklists • Domain rubric

  44. Goal The goal is effective, dynamic instruction that leads to growth in student achievement. This is intended to be a collaborative process built on a researched model to enhance professional practice.
