Quality Matters 2008 - PowerPoint PPT Presentation


Presentation Transcript
Quality Matters 2008

“We’ve got the Data Reports, now how do we use them?” -- UWM’s Data Feedback Loop

UWM Team: Alison Ford, Kerry Korinek, Barbara Bales

Some Background, then Three Parts

I. What Data (8 Annual Reports)

II. Using the Data (Data Feedback Loop)

III. Questions/Sharing Your Strategies

Some Background

The Council on Professional Education (CPE) & its Steering Group oversees the Assessment System.

Some Background

The assessment process is overseen by our Council on Professional Education (CPE) & its Steering Group.

  • We have 34 programs [60 licenses] across 5 schools/colleges
  • The CPE has voting members representing each program, in addition to some other appointments (e.g., an Associate Dean from Letters & Science and 2 program graduates)
  • The CPE meets 4x/year and usually has a couple of workgroups each year.

Some Background (continued)

The CPE Data Management Team compiles the reports under the leadership of the “Assessment & E-Portfolio Coordinator”.

  • The Team meets every three weeks
  • Designated offices/individuals take the lead on collecting and compiling data for assigned reports
  • Reports are written by the Assessment and E-Portfolio Coordinator and the individual(s) who collect and compile the data; the reports include summary findings for CPE discussion
  • The CPE Steering Group reviews reports before they are presented to the CPE

I. What Data?
  • 8 Annual “Unit” Reports

(1) Exit Survey Report

(2) Standards/Portfolio Assessment Report

(3) Follow-up Report

- Graduates; 1-2 Years Out (odd years)

- Graduates; 4-5 Years Out (even years)

8 Annual “Unit” Reports (Cont.)

(4) Candidate Status Report

(5) Praxis I Report

(6) Praxis II Report

(7) Field Experience Report

(8) Job Location Report

1) Exit Survey Report
  • How do completers assess the extent to which they…
    • Have been prepared to meet standards
    • Know their subject areas & draw upon coursework in Letters & Science & the Arts
    • Are prepared for the first year in the profession (a grade is given) & whether they would choose the same certification program

…and other areas: Urban Education/Equity Mission, Field Experiences, Support Services & Employment Prospects

2) Standards/Portfolio Report
  • At what level of proficiency do faculty members rate completers (using the program’s standards rubric)?
    • Proficient
    • Emerging Proficiency
    • Lacks Evidence of Proficiency
  • How do candidates self-assess (or programs assess) standards related to:
    • UWM’s Core Guiding Principle of Urban Ed/Equity (critically reflective & responsive about diversity & equity; aware of commitment & need for persistence; a learner & collaborator)?
    • Interpersonal Communication & Routine Use of Technology?
3) Follow-up Report
  • 1-2 Years Out (odd years)… Our graduates:
    • Where are they working?
    • How satisfied are they with teaching/selected profession?
    • How well prepared are they for the standards, their first years of working… and how do employers view their preparation?
  • 4-5 Years Out (even years):
    • Graduates let us know what they are doing; they give us information about professional development, accomplishments, impressions of our program, their plans, and some demographic information
4) Candidate Status Report
  • Who is admitted?
  • Who completed our programs?
  • Who withdrew or was discontinued & why?

[This report includes numbers and demographics for each major question.]

5) Praxis I Report
  • What are the PPST pass rates for UWM candidates?
  • How do scores break down by gender, race/ethnicity, and age group?
  • How many waivers are there by subtest and by program?
  • What is the relationship of those receiving Praxis I waivers to their pass rates on Praxis II?
6) Praxis II Report
  • What are the Praxis II pass rates for UWM candidates & by program?
  • How do scores break down by gender, race/ethnicity, and age group?
  • What is the passing percentage based on attempts?
  • Which tests have pass rates of 95% or below?
  • What is the breakdown of test categories for tests of concern?
7) Field Experiences Report
  • How many placements are made, what type (e.g., On-the-Job), and in what location [MPS, Milwaukee (but not MPS), non-urban]?
  • What is the candidates’ level of satisfaction with their field placements and supervision?
8) Job Location & Outlook Report
  • In what job locations do our completers work [MPS, Milwaukee (but not MPS), other large urban, non-urban]?
  • What is the employment outlook? [Using the DPI Supply and Demand Report; Table 20 – Supply ratings (Wisconsin vs. CESA #1); and Table 21 (Ratio of Applicants to Vacancies, Average Supply Rating, and Number of Emergency Licenses)]
Report format, facilitating the use of data…
  • We try to:
    • keep the format straightforward – answering key questions (& avoiding collecting data that we will not use)
    • cluster programs, so we look beyond our own program
    • provide summary findings at the start of each report
    • make connections with other findings
    • include data over a 3-year period
    • note limitations

Data Feedback Loop – Step 1

  • Assessment & E-Portfolio Coordinator and the Data Management Team Collect Data and Develop Reports
  • Reports are addressed on a regular schedule:
    • Sept. Meeting - Exit Survey
    • Nov. Meeting - Standards & Portfolio; Follow-up
    • Feb. Meeting - Candidate Status; Praxis I & II
    • May Meeting - Job Location & Outlook

Data Feedback Loop – Step 2

  • CPE Meetings: Reports are Reviewed, Major Implications Discussed, Workgroups Convened
  • Steering Group reviews first
  • Draft report distributed in advance of CPE Meeting
  • There are benefits to the “public” discussion
  • “Unit view” taken – strengths noted; problem-solving approach to top items of concern
  • Workgroups convened as appropriate (e.g., Praxis Workgroup); Assessment tools & procedures revised as needed.

Data Feedback Loop – Step 3

  • Program Coordinators share results with program faculty and staff
    • Coordinators are asked to “bring back” & share findings with their program colleagues
    • Unit reports are made available online with a password
    • Individual program reports are sent directly to Coordinators and are not online


Data Feedback Loop – Step 4

  • Program Coordinators share answers to the following questions at the May Meeting and have an opportunity to discuss:
    • What are the top 1-2 program strengths emerging from the reports?
    • What are 1-2 areas of concern based on the data?
    • What actions are planned or underway & what progress has been made?


Data Feedback Loop: Summary Points

Using Data …influenced by:

  • The predictable flow of data (e.g., 8 annual reports on a regular schedule)
  • Data that matter – not cluttered with data that are of little consequence
  • The report itself (e.g., straightforward; not a lot of narrative; key summary points up front)

Data Feedback Loop: Summary Points

Using Data …influenced by:

  • A process of engagement (e.g., data go to the Steering Group first, then the CPE, then to all programs & are accessible online; in May, programs report out on how the data are used, and again in the 5-year review in the Licensure Program Report)
  • A process that focuses on the “unit” & encourages support and concern for the whole

Data Feedback Loop: Summary Points

Using Data …influenced by:

  • Connections made among reports and other data
  • Workgroups that follow through on top concerns
  • Emphasis on strengths, too (results might find their way into program documents, recruitment brochures, etc.)

Data Feedback Loop: Summary Points

Using Data …influenced by:

  • Climate – high concern for program quality, respect for work of program faculty, comfort with revealing and meaningfully addressing program weaknesses
  • An infrastructure that supports this work

III. Your Turn

  • What else? What are your questions?
  • What strategies are working for you as you develop a system to ensure use of the data collected?