USING TPA as Inquiry for Improving TE Programs – the MINNESOTA MODEL

Presentation for Minnesota TPA Liaisons

Minnesota Department of Education

9:00am-12:00pm, December 19, 2011

Cathy Zozakiewicz

TPAC Consultant, Stanford University


Introductions

  • Introduce yourself, your affiliation and one thing you hope to get out of today

  • Review of MN TPA Process So Far – My Outsider/Insider Perspective Based on My TPAC Travels

Stanford Center for Assessment, Learning and Equity 2011


TPA Liaison Goals

  • Participate, as a faculty member, in TPA as Inquiry for Improving our TE Programs

  • As you participate as faculty, be metacognitive and consider how this model will work and/or need to be modified for your TE program context

  • Ask questions about this process as a TPA Liaison who may prepare and/or facilitate a similar session at your institution


OVERALL Goals/Outcomes

  • Briefly reorient ourselves to the architecture of the TPA, including the conceptual framework, constructs of the rubrics, and the scoring process

  • Consider how to utilize quantitative and qualitative TPA data for program renewal/improvement

  • Review, analyze and map evidence to rubric concepts for a “Representative” TPA program sample in an appropriate Content Area – Elementary Literacy/4th Grade

  • Reflect upon our TE program content, practices and structures in light of the sample candidate’s strengths and areas of need in understanding and implementing effective teaching practices in the field


OUR GUIDING QUESTIONS

  • What do we value as teacher educators in preparing new teachers for the cycle of teaching (planning, academic language, instruction, assessment, and reflection), and how does this align with what is measured in the TPA?

    • How effectively are candidates making sense of and performing the act of teaching? What do the data show in terms of teacher candidates’ understandings and professional performance?

      AND

    • How are we making sense of candidates’ performances and data in order to reflect upon, improve and have evidence-based dialogue about our own TE program, practices, and structures? What are the implications for our program in terms of what and how we teach?


WHAT TODAY IS NOT ABOUT: OUR BE CAREFULS

TODAY IS NOT ABOUT:

  • ACCURATELY SCORING A TPA SAMPLE

  • CRITIQUING/EVALUATING THE PROGRAM FROM WHICH THE TPA SAMPLE CAME

  • CRITIQUING/JUDGING THE STUDENT TEACHER WHO COMPLETED THE SAMPLE

  • CRITIQUING THE TPA/TPA RUBRIC

Stanford Center for Assessment, Learning and Equity 2011


TPAC GOALS

TPAC is working to develop and implement at scale a way of assessing teaching that…

  • Provides evidence of teaching effectiveness,

  • Supports teacher preparation program improvement, and

  • Informs policy makers about qualities of teaching associated with student learning.

    TPAC is ONE example of an assessment system that is designed to leverage the alignment of policies and support program renewal.

Stanford Center for Assessment, Learning and Equity 2011


Accountability reframed

…as our professional responsibility

USING INQUIRY

How can we gather and use evidence of the qualities of teaching performance that inspire, engage, and sustain students as learners – to improve teaching and teacher preparation?

Stanford Center for Assessment, Learning and Equity 2011


TPA Overview and Architecture

Stanford Center for Assessment, Learning and Equity 2011


TPAC Lineage

  • National Board for Professional Teaching Standards (NBPTS) portfolio assessments – accomplished teachers

  • Connecticut BEST assessment system – teachers at end of induction

  • Performance Assessment for California Teachers (PACT) – pre-service teachers

Stanford Center for Assessment, Learning and Equity 2011


Standards and TPAC

  • Common Core alignment

  • InTASC alignment

  • NCATE/CAEP endorsement

  • SPA endorsement

Stanford Center for Assessment, Learning and Equity 2011


Design Principles for Educative Assessment

  • Discipline specific and embedded in curriculum

  • Student Centered: Examines teaching practice in relationship to student learning

  • Analytic: Provides feedback and support along targeted dimensions

  • Integrative: Maintains the complexity of teaching

  • Affords complex view of teaching based on multiple measures

Stanford Center for Assessment, Learning and Equity 2011


TPA Architecture

  • A summative assessment of teaching practice

  • Collection of artifacts and commentaries

  • “Learning Segment” of 3-5 days

  • Plans based on context and knowledge of students

    • Academic, social emotional and language development

    • Prior learning, lived experiences, family, community and cultural assets

Stanford Center for Assessment, Learning and Equity 2011


TPAC Artifacts of Practice

Stanford Center for Assessment, Learning and Equity 2011


Multiple Measures Assessment System

TPAC Capstone Assessment

  • Integration of:

    • Planning

    • Instruction

    • Assessment

    • Analysis of Teaching

  • with attention to Academic Language

Embedded Signature Assessments

  • Child Case Studies

  • Analyses of Student Learning

  • Curriculum/Teaching Analyses

  • Observation/Supervisory Evaluation & Feedback

Stanford Center for Assessment, Learning and Equity 2011


Rubric progression

  • Early novice → highly accomplished beginner

  • Rubrics are additive and analytic

  • Candidates demonstrate:

    • Expanding repertoire of skills and strategies

    • Deepening of rationale and reflection

  • Teacher focus → student focus

    • Whole class → generic groups → individuals

Stanford Center for Assessment, Learning and Equity 2011



CONSIDERING BOTH SETS OF DATA

  • QUANTITATIVE DATA – RUBRIC SCORES AND NUMBER OF CANDIDATES WHO PASS/DEMONSTRATE COMPETENCY AND WHO DO NOT PASS/DO NOT DEMONSTRATE COMPETENCY

  • QUALITATIVE DATA – TPA SAMPLES – WHOLE TPAs OR INDIVIDUAL TASKS (PLANNING, INSTRUCTION, ASSESSMENT, REFLECTION/ANALYZING TEACHING)

Stanford Center for Assessment, Learning and Equity 2011


USING QUANTITATIVE TPA DATA

  • Analyze scoring patterns of strengths and weaknesses (competency and failing) within and across program areas, and within and across TPA tasks.

  • Consider using an electronic data platform to save and manipulate quantitative data across time for program improvement, accreditation and research purposes (for institutions and individual faculty), as sketched below. For example, at San Diego State University, we use Tableau for our TPA data.

  • Using data patterns, choose representative content area TPAs or TPA tasks to review and score for inquiry purposes as a whole faculty or in strategic faculty groups (including administrators as appropriate).

Stanford Center for Assessment, Learning and Equity 2011
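
The sketch below is one minimal way to surface such scoring patterns alongside a platform like Tableau. It is an illustration only, not part of the Minnesota materials: the file name tpa_rubric_scores.csv, the column names (candidate_id, program_area, task, rubric, score), and the passing cut score of 2 are all assumptions.

```python
# Minimal sketch (illustrative assumptions, not a TPAC/Minnesota specification):
# summarize exported TPA rubric scores to find strengths and weaknesses
# within and across program areas and TPA tasks.
import pandas as pd

# Assumed export with columns: candidate_id, program_area, task, rubric, score
scores = pd.read_csv("tpa_rubric_scores.csv")

PASS_CUT = 2  # assumed cut score for "demonstrates competency"

summary = (
    scores.assign(passed=scores["score"] >= PASS_CUT)
          .groupby(["program_area", "task", "rubric"])
          .agg(mean_score=("score", "mean"),
               pass_rate=("passed", "mean"),
               n_candidates=("candidate_id", "nunique"))
          .round(2)
)
print(summary)

# Rubrics with the lowest pass rates are candidates for qualitative review
# of whole TPAs or individual tasks.
print(summary.sort_values("pass_rate").head(5))
```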


Targeted Competencies

PLANNING

  • Planning for content understandings

  • Using knowledge of students to inform teaching

  • Planning assessments to monitor and support student learning

INSTRUCTION

  • Engaging students in learning

  • Deepening student learning during instruction

ASSESSMENT

  • Analyzing student work

  • Using feedback to guide further learning

  • Using assessment to inform instruction

REFLECTION

  • Analyzing teaching effectiveness

ACADEMIC LANGUAGE

  • Identifying language demands

  • Supporting students' academic language development

  • Evidence of language use

Stanford Center for Assessment, Learning and Equity 2011


Competencies to Map – QUANTITATIVE DATA

PLANNING

1. Planning for content understandings

2. Using knowledge of students to inform teaching

3. Planning assessments to monitor and support student learning

ACADEMIC LANGUAGE

10. Identifying language demands

11. Supporting students' academic language development

ASSESSMENT

6. Analyzing student work

8. Using assessment to inform instruction

REFLECTION

9. Analyzing teaching effectiveness

Stanford Center for Assessment, Learning and Equity 2011
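
When mapping quantitative data, it can help to record the rubric numbers above as a small lookup table. The sketch below is one hedged way to do that in Python; the structure and the tag_score helper are illustrative assumptions, not drawn from the TPA handbook.

```python
# Minimal sketch (illustrative, not an official TPA data structure): the rubric
# numbers listed above keyed to their task and competency, so exported scores
# can be tagged before aggregation.
RUBRIC_MAP = {
    1:  ("Planning", "Planning for content understandings"),
    2:  ("Planning", "Using knowledge of students to inform teaching"),
    3:  ("Planning", "Planning assessments to monitor and support student learning"),
    6:  ("Assessment", "Analyzing student work"),
    8:  ("Assessment", "Using assessment to inform instruction"),
    9:  ("Reflection", "Analyzing teaching effectiveness"),
    10: ("Academic Language", "Identifying language demands"),
    11: ("Academic Language", "Supporting students' academic language development"),
}

def tag_score(rubric_number: int, score: int) -> dict:
    """Attach task and competency labels to a single rubric score."""
    task, competency = RUBRIC_MAP[rubric_number]
    return {"rubric": rubric_number, "task": task,
            "competency": competency, "score": score}

# Example: a score of 2 on Rubric 10 (Identifying language demands)
print(tag_score(10, 2))
```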


Discussing Academic Language

  • What are our present understandings/framings of academic language?

  • Why do we think it is included in the TPA?

  • How is it included in our present TE Program practices?

  • What are our questions about AL so far?

Stanford Center for Assessment, Learning and Equity


WHY include Academic Language IN TPA?

  • Academic language is different from everyday language. Some students are not exposed to this language outside of school.

  • Much of academic language is discipline-specific and deepens subject matter THINKING.

  • Unless we make academic language explicit for learning, some students will be excluded from classroom discourse and future opportunities that depend on having acquired this language.

Stanford Center for Assessment, Learning and Equity


Academic language
Academic Language

  • Academic language is the oral and written language used in school that is necessary for learning content.

  • This includes the “language of the discipline” (vocabulary and forms/functions of language associated with learning outcomes) and the “instructional language” used to engage students in learning content.

Stanford Center for Assessment, Learning and Equity


Vocabulary

  • Technical vocabulary: triangle, metaphor, metabolize

  • Words whose technical meaning differs from their everyday meaning: “balance” in chemistry, “plane” in mathematics, “ruler” in history/social science, “force” in science

  • Connector words: and, but, because, therefore, however

Stanford Center for Assessment, Learning and Equity


Three F Words

The FUNCTIONS of Academic Language are to clearly and explicitly define, classify, analyze, explain, argue, interpret and evaluate ideas for distant audiences.

Every language function has FORMS or structures that are common and often discipline specific (text, sentence or graphic/symbolic).

Developing students’ FLUENCY in academic language forms and functions provides access to the “language of school” and academic success.

Stanford Center for Assessment, Learning and Equity


Academic Language Competencies Measured

  • Understanding students’ language development and identifying language demands

  • Supporting language demands (vocabulary, form and function) to deepen content learning

  • Identifying evidence that students understand and use targeted academic language in ways that support content learning and language development.

Stanford Center for Assessment, Learning and Equity 2011



Additional Resources

  • Jeff Zwiers

    • Building Academic Language: Essential Practices for Content Classrooms, Grades 5-12

  • SIOP

  • Academic Language webinars archived on the TPAC Ning

    • Melanie Hundley - Tennessee

    • Ann Lippincott and Laura Hill Bonet

Stanford Center for Assessment, Learning and Equity 2011


Context of classroom FOR TPA SAMPLE

  • Urban Elementary School

  • 4th grade, 30 students, 15 girls, 15 boys, 1 ELL, 2 students identified as Gifted and Talented

  • Several students with ADHD, 1 with Asperger’s and 1 with developmental delays – both of these students go to another room for language arts each day

Stanford Center for Assessment, Learning and Equity 2011


Planning Rubric 1 – PLANNING FOR UNDERSTANDING

READ LESSONS AND COMMENTARY SECTIONS - 1. CHOICE OF LEARNING TASKS/MATERIALS & 2. LESSON SEQUENCE AND CONNECTIONS (Prompts 1, 3a-b). PAY ATTENTION TO EVIDENCE FOR THESE CENTRAL CONCEPTS.

EL1: How do the candidate’s plans develop students’ abilities to comprehend or compose text through literacy skills and strategies?

  • Standards, objectives and learning tasks are aligned. Objectives define measurable outcomes.

  • Plans for instruction build on each other to support student learning of conventions/skills with connections to strategies for comprehending or composing text


EVIDENCE FOR Planning 1 (SEE DOC.)

SEGMENT FOCUS: For students to apply pre-reading strategies to become more efficient readers and improve comprehension.

Standard Grade 4: Refer to details and examples in a text when explaining what the text says explicitly and when drawing inferences from text.

Objectives: Activate PK (prior knowledge) to make predictions about text.

Lesson 1: Introduce and define PK. Chart PK versus predictions by skimming 3 text pages selected in pairs. Share out with class. Whole group questions to check for understanding.


MORE EVIDENCE FOR Planning 1

Objective: Analyze text to make predictions.

Lesson 2: Sts read pages in text, then skim next set of pages. Sts write 3-5 predictions based on skimming section. Sts close read to self assess predictions using rubric [not included]. Share out.

Objective: Utilize BK (background knowledge) on topic before reading it.

Standard: Interpret information presented visually, orally… and explain how information contributes to understanding of text in which it appears

Lesson 3: Brainstorm what you know about Minneapolis as a whole group. Teacher-led KWL on earthquakes. Teacher shares sample concept map on rainforest. Sts create map on text character.


MORE EVIDENCE FOR Planning 1

Objective: Infer based on reading of text.

Lesson Plan 4: Mime actions for students to infer from, with class discussion. Demonstrate inferring with cue cards, then pairs practice and review answers with teacher. Review definition of infer. Sts read text section – The Earth Shakes. Complete Collins Writing Assessment – write 5 sentences on inferences from text.


Excerpts from Planning Commentary

I feel that lesson sequence and connecting skills and strategies may have been one of the strengths of my TPA. On Mon., they made predictions using pictures. On Wed., they were handed sentence strips on LRRH and asked to use PK to put them in order. On Thur., teams competed in coming up with known facts about Minneapolis. And on Fri., they used what they knew to infer what I was doing during charades. At the end of each anticipatory set, I reviewed key terms (PK, prediction), asking volunteers to define them for class.

…From day to day, the pre-reading strategies became a little more complex, which allowed them to build on what they had previously practiced. Mon. and Wed. featured making predictions, Thurs. included graphic organizers to assist with reading, and Fri. was about using information to infer, which proved to be the most difficult for the students to grasp.


Planning Rubric 1

Stanford Center for Assessment, Learning and Equity 2011


Planning Rubric 2A – USING KNOWLEDGE OF STUDENTS

EL2: How does the candidate use knowledge of his/her students to target support for students’ understandings of comprehending or composing text?

  • Plans draw on students’ prior knowledge and experiences, and social/emotional development or interests.

  • Planned tasks and/or scaffolding are tied to learning objectives and student characteristics, including 504 and IEP requirements.


Evidence Planning 2A

  • To Begin:

    • Review LPs & Read Planning Commentary Prompts 2a-d, 3a, 3f – Knowledge of STs, Developmental Approximations, and Adaptations/Differentiation

    • Collect Evidence for Planning Concepts below:

    • Plans draw on students’ prior knowledge and experiences, and social/emotional development or interests.

    • Planned tasks and/or scaffolding are tied to learning objectives and student characteristics, including 504 and IEP requirements.


RECORDING Evidence FOR Planning 2A

  • Plans draw on knowledge of students:

  • Scaffolding:



Planning RUBRIC 2B

  • GQ EL 2B: How does candidate use knowledge of students to target support for students’ literacy development?

    Candidate uses examples of students’ prior learning and experience OR/AND relevant research/theory to justify learning tasks.

    READ PLANNING SECTION ON: Social and Emotional Development (Prompt 5a)


EVIDENCE FOR RUBRIC 2B

Candidate uses examples of students’ prior learning and experience OR/AND relevant research/theory to justify learning tasks:

  • Discusses cultural and social experiences used (Simpsons, clock partners, culturally relevant content)

  • Refers to reading tests given without discussing how results are used in planning

  • Uses Piaget – claims students are at concrete operational stage and therefore, “majority of lesson planning takes this into account… with exception of lesson on inferences… bordering on formal operational.”

  • Explains Vygotsky and the Zone of Proximal Development, and discusses what she does



ACAD. LANG. Rubric 10 – UNDERSTANDING LANG. DEV. AND LANG. DEMANDS

READ COMMENTARY - ACAD. LANG. (Prompts 2a-b, 3b) and AL sections of lessons.

EL10: How does the candidate use knowledge of students’ language development to identify a key language demand central to literacy learning?

  • Description of academic language development identifies strengths and needs.

  • Candidate identifies vocabulary and a language demand that are central to learning segment and appropriate to students’ language development.


EVIDENCE FOR ACAD. LANGUAGE Rubric 10

Description of academic language development identifies strengths and needs.

  • Candidate identifies vocabulary and a language demand that are central to learning segment and appropriate to students’ language development.

    EVIDENCE:

    How does the candidate describe students’ academic language development?

    What is/are the identified vocabulary and language demands?


EVIDENCE FOR ACAD. LANGUAGE Rubric 10

Description of academic language development identifies strengths and needs.

  • Candidate identifies vocabulary and a language demand that are central to learning segment and appropriate to students’ language development.

    EVIDENCE:

    Identifies predict and infer as language demands, and prior knowledge and predict as vocabulary.

    Refers to students’ needs in terms of reading levels, but not in connection to AL demands. Instead focuses more on what she as the teacher will do.



ACAD. LANG. Rubric 11 – SCAFFOLDING STS’ ACAD. LANG. & DEEPENING LIT. LEARNING

READ COMMENTARY - ACADEMIC LANG. (Prompts 3c-d) and lessons

EL11: How does the candidate support academic language development associated with literacy learning?

  • Candidate provides support so students can use language demand to engage in literacy tasks.


EVIDENCE OF ACAD. LANG. Rubric 11

Candidate provides support so students can use language demand to engage in literacy tasks.

EVIDENCE – What supports for predicting or inferring are provided?



Planning Rubric 3 – PLANNING ASSESSMENT

READ LESSON ASSESSMENTS & COMMENTARY – MONITORING STUDENT LEARNING (Prompts 4a-b)

EL3: How are the informal and formal assessments selected or designed to provide evidence of student progress toward standards/objectives?

  • Assessments are aligned to standards/objectives

  • Assessments provide evidence for monitoring student learning


Evidence for Planning 3

  • Excerpts from Candidate Planning Commentary for Elementary Literacy Sample – Task 1:

    • When using the strategy of predicting, each student read Food in America and then was asked to write 2-3 sentences that began with “I predict.”

    • On the 3rd day, I taught 2 pre-reading strategies – concept mapping and KWL – which acted as natural assessments in which I could simply check how they used the graphic organizers. [Map on Minneapolis & rainforest char. from text, KWL on earthquakes]

    • The final day, for making inferences, featured a Collins writing assignment, where kids were awarded 40 points for making inferences, 40 points for capital letters and periods and 20 points for good spelling.

Stanford Center for Assessment, Learning and Equity 2011


Evidence for Planning 3

  • More Excerpts from Candidate Planning Commentary for Elementary Literacy Sample – Task 1:

    • Students were also assessed verbally over the course of the TPA. In the predicting lesson, they worked in pairs and shared their foresight of what they thought would happen in passages read.

    • On the second day, they shared what they had predicted versus what they actually read and discussed if they matched up.

Stanford Center for Assessment, Learning and Equity 2011


Evidence P3: Standards/Objectives in Literacy TPA

  • LS Focus: Demonstrate pre-reading strategies to help students better comprehend texts

  • 4th Grade Standards: 1. Refer to details and examples in text when explaining what the text says explicitly and drawing inferences from text; 2. Provide reasons that are supported by facts and details.

  • Objectives: 1. Activate prior knowledge to make predictions; 2. Analyze texts to make predictions; 3. Utilize BK on a topic before reading about it; 4. Create inferences based upon their reading of text.

Stanford Center for Assessment, Learning and Equity 2011


Excerpts – STUDENT WORK SAMPLES

  • Title: Earthquakes

    • I think that the people of Kobe feel sad and mad because one they were covered in ruble and they’re wondering why they deserved it.

  • Title: Earth Cwack

    • Douglas copp risked his life to save many other people. Red Cross worker rescues a boy from a ditch. A rescue dog and his handler spreach earthquake rubble. Rescue dogs have good smell.

Stanford Center for Assessment, Learning and Equity 2011



Returning to OUR GUIDING QUESTIONS

  • What do we value as teacher educators in preparing new teachers for the cycle of teaching (planning, academic language, instruction, assessment, and reflection), and how does this align with what is measured in the TPA?

    • How effectively are candidates making sense of and performing the act of teaching? What do the data show in terms of teacher candidates’ understandings and professional performance?

      AND

    • How are we making sense of candidates’ performances and data in order to reflect upon, improve and have evidence-based dialogue about our own TE program, practices, and structures? What are the implications for our program in terms of what and how we teach?


Some Independent Analysis of Tasks 3-4

  • Read and analyze Assessment Task 3 and the Reflection/Analyzing Teaching Task 4

  • Gather Evidence for Rubrics 6, 8 and 9 across the tasks/artifacts and remaining commentaries. SKIP RUBRIC 7 - FEEDBACK

  • Consider the candidate’s strengths and areas of need as you review the materials

  • TIME: 20 minutes

Stanford Center for Assessment, Learning and Equity 2011


ASSESSMENT Rubric 6 - ANALYZING STUDENT WORK

  • EL6: How does the candidate demonstrate an understanding of student performance with respect to standards/objectives?

  • Criteria are clearly aligned with standards/objectives

  • Analysis focuses on what students did right and wrong OR/AND patterns of student understandings

  • Analysis is supported by summary and work samples, and includes some differences in whole class learning OR/AND evidence for patterns for individuals or groups


ASSESSMENT Rubric 8 – USING ASSESSMENT TO INFORM INSTRUCTION

EL8: How does candidate use conclusions about what students know and are able to do to plan next steps in instruction?

  • Next steps propose general support OR/AND targeted support that improves student performance related to the standards/objectives assessed.


Reflection Rubric 9 – ANALYZING TEACHING EFFECTIVENESS

EL9: How does candidate use evidence to evaluate and change teaching practice to meet students’ varied learning needs?

  • Proposed changes address collective needs OR/AND some individual needs connected to standards/objectives.

  • Candidate cites evidence of student learning, experiences or prior knowledge OR/AND examples of successful/unsuccessful practices


“Cultures of Evidence”

  • From Peck and McDonald’s Study of PACT Implementation (2011)

    • Critical and collegial conversations about PACT adoption

    • Inquiry and program improvement as motivational orientation, not compliance

    • Affirmed values and identity of programs

    • Strategic inclusion of faculty in examining cases of candidate performance at regularly scheduled events

Stanford Center for Assessment, Learning and Equity 2011


Final thought
Final Thought…

  • Maintain the focus on program outcomes for improved program quality and effective teaching

    • NOT for success on TPA

    • NOT as compliance

  • Inquiry within a professional learning community

    • Locally

    • Statewide

    • Nationally


Questions and Goals

  • What are your remaining questions?

  • What will you set as a goal for your campus given the “workshop” this afternoon?

Stanford Center for Assessment, Learning and Equity


Critical decisions

  • Build faculty consensus about valued outcomes

  • Make decisions based on candidate performance data

  • Sequence ESAs (and rubric criteria/levels) in ways that reflect candidate development

  • Allow faculty autonomy in instruction supporting ESAs

  • Standardize only what is necessary!

Stanford Center for Assessment, Learning and Equity


Field Test Participation

  • Pearson will support scoring training and scoring stipends for a national sample of 18,000 candidates

  • Scorers to include IHE faculty, field supervisors, cooperating teachers, principals, NBCTs and others with pedagogical content knowledge and experience with beginning teacher development. Recruitment via TPAC Online

  • Scoring training and certification online begins March 1, 2012. Scoring begins in April.

  • Local, state and national scoring in field test and beyond

Stanford Center for Assessment, Learning and Equity 2011


Field Test Participation

  • Subject Areas to be field tested

    • Elementary Literacy, Elementary Mathematics, English/Language Arts, History/Social Science, Secondary Mathematics, Science

    • Special Education, Early Childhood Development, Middle Grades (Science, ELA, Math, and History Social Science), Art, Performing Arts (Music, Dance, Theater), Physical Education, and World Language

Stanford Center for Assessment, Learning and Equity 2011


Highlights of Pearson’s Role in the TPA

  • Pearson has been selected as Stanford’s operational partner.

  • Support Stanford and AACTE with assessment development and technical review.

  • Train and certify scorers, provide a scoring platform and report results for the operational TPA.

Stanford Center for Assessment, Learning and Equity 2011


Field Test Design

  • Design is driven by overall goals:

    • Data to enhance validity evidence

    • Reports to describe technical aspects and the set of validity and reliability studies

    • Effectiveness and efficiency of scorer training materials and process

    • Refinements to the assessment

    • Reporting design and distribution

    • Support systems:

      • portfolio management system

      • scoring management system

  • Participation/Sampling plan – location (state-based or national population) and discipline-specific

Stanford Center for Assessment, Learning and Equity 2011


View Teaching Performance

What do you see in the video clip that tells you the candidate is ready to be the teacher of record?

Take notes and then after the clip share with a partner.


Instruction Rubrics

  • Engaging Students in Learning

    EL 4 GQ: How does the candidate actively engage students in integrating skills and strategies to comprehend and/or compose text?

  • Deepening Student Learning

    EL5: How does the candidate elicit and monitor students’ responses to develop literacy skills and strategies to comprehend and/or compose text?


Instruction Rubrics

Rubric 4

  • Extent of opportunities for students to develop their own understanding

  • Use of strategies that reflect attention to students’ backgrounds and experiences.

    Rubric 5

  • Use of strategies eliciting student responses related to reasoning/problem solving

  • Representation of concepts in ways that help students understand




