CRESST Quality School Portfolio System

Reporting on School Goals and Student Achievement

Catchwords of the Day

  • Standards

  • Accountability

  • Total Quality Management

  • School Inquiry

  • Learning Communities

  • Continuous Improvement

Common Perspectives

  • Student performance matters

  • Status quo is insufficient

  • Students’ future success requires rigorous new standards and expectations

  • Local school communities hold the keys to students’ success

  • Continuous school improvement requires on-going assessment and self evaluation.

Why Else?

  • Federal, state and local programs mandate evaluation

  • School accreditation requires it

  • Funders like it

Using Data to Address Important Questions: Some Examples

  • How are students doing? How well are they accomplishing standards?

  • What are the relative strengths and weaknesses of student performance?

  • Why are results as they are? What factors help explain students’ progress?

  • Are our special programs working?

Using Data to Address Important Questions (cont):

  • Are we equitably helping all students to achieve? Whom are we serving well? Whom are we not?

  • What previously hidden challenges do we face?

  • Who needs special help? On what?

Using Data to Address Important Questions (cont):

  • Are specific program components accomplishing their objectives? e.g. professional development?

  • Are new strategies making a difference in student learning? For example, new curriculum materials being tried by some teachers, tutoring services, or a new mathematics course?

  • Are our day to day standards consistent? Do our grades mean anything?

Reasons to Use QSP

  • Understand how things are working so they can be improved

  • Raise awareness and support for action

  • Communicate with the school community

  • But don’t:

    • Use to assess blame

    • Use to evaluate teachers

Assessing Information Systems: Not All Data Are Good

  • Aligned

  • Technically Accurate

  • Fair

  • Credible

  • Useful

1. System Is Aligned with Goals for Student Performance

  • Tests reflect:

    • standards for student performance

    • valued aspects of learning

    • what’s being taught

  • Multiple measures required

2. Assessments Are Technically Accurate

  • Evidence of validity: test measures what is intended, not something else; gives accurate information for intended purposes

  • Evidence of reliability: measure is stable and consistent

  • All scores are fallible and imperfect

3. Assessments Are Fair

  • Includes all students -- including students with disabilities and ELL students

  • Enables all students to show what they know and can do

  • All students have the opportunity to learn

4. Assessments Are Credible

  • Valued and believable to important stakeholder groups

    • Teachers

    • Students

    • Parents and Community

    • Funders

  • Different interests mean multiple measures

5. Assessment System Is Useful

  • Diagnostic value

  • Focus on malleable variables

  • Action-oriented

  • Dynamic

Ultimately, There Are NO Easy Answers

  • Data provides evidence

  • School professionals must find answers:

    • Use expertise to make sense of the data

    • Make decisions on what to do next

Includes

  • Norm Referenced Tests

    • SAT9 administered in 1998 and 1999

    • At least Math, Reading, and Spelling are represented.

  • Descriptive Variables

    • Ethnicity, Gender, Grade, etc.

  • Score Types

    • Percentile Rankings

    • Longitudinal data for the most recent 2 years for your school (1997 - 1999)


Does NOT Include

  • Most Program Specific Data

  • Course Specific Grading Variables

    • Last Year’s English GPA for example

  • Non Cognitive Outcome Data

    • Student affect measures or survey data

  • Longitudinal SCHOOL data beyond two years back

Longitudinal Data Issues

  • Database is missing key data.

    • Dropouts, Graduates, transfers

  • No current variable that provides number of years at school.

Limitations

  • Very difficult to accomplish school program evaluation with only two years of data.

  • Difficult to accomplish deep inquiry of subject matter strengths and weaknesses without subscale data.

  • Difficult to make recommendations.

    • Often depend on contextual information

    • Should have multiple measures to verify

So, What CAN we do with the data provided?

Prospects

  • Identify apparent general strengths and weaknesses.

  • Focus goals on apparent deficits.

    • Students at risk

  • Identify our apparent stars.

  • Inform our decisions



1. Choose the correct file type first.

2. Choose the file you want to import.

3. Clicking Open will begin the import process.






1. If the variables in your data file are separated by commas (and each student is on a separate line), click OK.

2. Otherwise, adjust the item separator (and student separator, if necessary). Then press OK.
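The dialog above is essentially configuring a delimiter-based parse: one student per line, fields split on an item separator. As a rough sketch of the idea (not QSP's actual code; the sample records and fields are made up), Python's csv module does the same job:

```python
import csv
import io

def parse_student_records(text, item_sep=","):
    """Parse delimited student data: one student per line,
    fields separated by item_sep (comma by default)."""
    reader = csv.reader(io.StringIO(text), delimiter=item_sep)
    return [row for row in reader if row]  # skip blank lines

# Hypothetical sample: each line is one student, fields comma-separated.
sample = "ID001,F,10,72\nID002,M,11,45\n"
rows = parse_student_records(sample)

# Adjusting the item separator handles other formats, e.g. semicolons.
alt = parse_student_records("ID003;F;9;88\n", item_sep=";")
```

Changing `item_sep` here corresponds to adjusting the item separator in the dialog.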





1. If the first row of the data file holds variable names, indicate that here.

(The contents of the first row are shown here for your convenience.)

2. Click OK.





Next, match the variables from your data to existing QSP Slots. Doing so carefully is critical. Validity of future analysis may be compromised by mistakes made here.


The Objective is to match a column in the file being imported with an appropriate place for that data within your QSP Data file.

It is best if you know the file being imported very well before you attempt to import it.
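The matching step can be pictured as building a lookup from incoming column names to QSP slots, with unmatched columns set aside. A minimal sketch with hypothetical column and slot names (only "Opt Backgnd 1" is taken from the slides that follow):

```python
# Hypothetical mapping from incoming file columns to QSP-style slots.
# These slot names are illustrative, not QSP's actual internal names.
column_to_slot = {
    "StudentID": "ID",
    "Sex": "Gender",
    "LangProf": "Opt Backgnd 1",  # an "optional" variable slot
}

def match_variables(file_columns, mapping):
    """Return (matched, ignored): columns paired with a slot,
    and columns left over (the Ignore Rest step)."""
    matched = {c: mapping[c] for c in file_columns if c in mapping}
    ignored = [c for c in file_columns if c not in mapping]
    return matched, ignored

matched, ignored = match_variables(["StudentID", "Sex", "HomeRoom"],
                                   column_to_slot)
```

Mistakes in this mapping are exactly what compromises later analysis, which is why knowing the incoming file well matters.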



1. If you are not sure what information a variable in your database contains, the Column Items box can help by showing you a snapshot of the data.




Some variables will not have a preset QSP variable appropriate for matching. For these, you may choose from a set of “optional” variables.




For example, you would:

1. Select “Language Proficiency” in the first box

2. Select “Opt Backgnd 1” in the second box

3. Press Select.



1. When you have matched all the variables you want to import, you can choose Ignore Rest to quickly dispense with the others. They will appear IGNORED! in the variable pairs window to the right.

2. Finally, choose the correct year for the incoming data. (You must import ONE year at a time!)

3. Click on the Next button to proceed with the import.






You will then see this import confirmation window. If all pairs appear correctly and are preceded by blue check marks, press OK to proceed with the import. Otherwise, you may click the Back button to make necessary changes.

Disaggregations Within QSP

To create new groups in QSP, first choose the Groups option under the Database menu, or select the button indicated here.

Once Groups has been chosen, the box below will appear. You would then click the New button.

Creating Groups

The Current Group category represents the group that is presently selected.

The Custom Groups category represents any groups that have been created.

The System Groups category represents the groups that already exist within the data.

The Combo Groups category represents any combination of groups.

Click the New button to create a new group.

Once you click the New button, the following dialog box will appear. You then have the choice of Custom Group or Combo Group.

Creating Groups

First choose Custom or Combo group, then click OK.

If you choose the Custom Group category, this dialog box will appear.

Creating Groups

Depending upon the variable you select, the operator options will change.

Creating Groups

Choosing a Categorical variable will provide the Booleans (Is or Is not). Choosing a Numerical variable will provide (=, <>, <, >) as possible selections.

To create a Custom Group, you must name the group, then provide the criteria for the group.

Creating Groups

For instance, in this example, we’d create a group that represents all students with a GPA below 2.0.

This Group can then be added to reports or used to filter Graphic representations like Pie charts and Histograms.

Type in the name of the group.

Choose a variable.

Select the Operator

Then indicate the comparison value.

Click OK when done.
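Outside of QSP, the same Custom Group logic amounts to filtering records on one variable, an operator, and a comparison value. A sketch with made-up student data, using the numerical operators listed above:

```python
# Hypothetical student records; field names are illustrative.
students = [
    {"name": "A", "gpa": 1.7},
    {"name": "B", "gpa": 3.2},
    {"name": "C", "gpa": 1.9},
]

def custom_group(population, variable, op, value):
    """Mimic a Custom Group: a variable, an operator, and a
    comparison value select a subset of students."""
    ops = {
        "<":  lambda a, b: a < b,
        ">":  lambda a, b: a > b,
        "=":  lambda a, b: a == b,
        "<>": lambda a, b: a != b,
    }
    return [s for s in population if ops[op](s[variable], value)]

# The Low GPA example from the slide: all students with GPA below 2.0.
low_gpa = custom_group(students, "gpa", "<", 2.0)
```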

You can also create Combo Groups by choosing the Combo option.

Creating Groups

Click on Combo Group.

Click OK when done.

The Combo Group allows you to combine up to eight groups in order to focus your reporting on a particular population of interest.

Creating Groups

Let’s say you want to report on the Hispanic male students who have Spanish as a primary language. You can select those groups, then combine them with other outcome-related groups you have defined yourself, like our Low GPA group.

Name the Combo Group.

Choose your Groups. You may combine up to eight.

Click OK when done.
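A Combo Group is effectively the intersection of several group criteria. A sketch with hypothetical student records, reproducing the Hispanic male / Spanish / Low GPA example above:

```python
def combo_group(population, *predicates):
    """Mimic a Combo Group: keep students matching every selected
    group (QSP allows up to eight)."""
    assert len(predicates) <= 8
    return [s for s in population if all(p(s) for p in predicates)]

# Illustrative records, not QSP's real variable names.
students = [
    {"ethnicity": "Hispanic", "gender": "M", "primary_lang": "Spanish", "gpa": 1.8},
    {"ethnicity": "Hispanic", "gender": "F", "primary_lang": "Spanish", "gpa": 1.5},
    {"ethnicity": "White",    "gender": "M", "primary_lang": "English", "gpa": 3.0},
]

focus = combo_group(
    students,
    lambda s: s["ethnicity"] == "Hispanic",
    lambda s: s["gender"] == "M",
    lambda s: s["primary_lang"] == "Spanish",
    lambda s: s["gpa"] < 2.0,  # the Low GPA custom group defined earlier
)
```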

Creating Groups

Any Group can also be viewed in the QSP Database View. Select the Group in the dialog and hit “OK,” then return to the Database View, which also indicates the number of students who fit the criteria.

QSP Tabular Report Options (5 Types)

Crosstabulations And Frequencies

Answers only Count and %

Not Longitudinal

Only Categorical Variables

QSP Tabular Report Options (cont.)

Tables - Mean, Median, and % at or above

Groups By Variable on Multiple Outcomes

Longitudinal Option

Questions Table - Mean, Median, # & % Tested

Groups by Variable on One Outcome

Multiple Operators Simultaneously

Advanced Table (Mean, Median, Number)

Groups by Groups on One Outcome

Key Issues with Tables

Often Difficult to Read

Formatting Difficulties

Information Overload

Require more effort on the part of the reader.

Provide more Information, especially Longitudinal options.

Crosstabs

Frequency tables for one categorical variable, sub-divided by another.

We Will Make a Crosstab that includes Gender & Grade
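A crosstab of two categorical variables is just a count (and percentage) for each combination of values. A sketch with made-up gender-by-grade records:

```python
from collections import Counter

# Illustrative (gender, grade) pairs, one per student.
records = [("F", 9), ("M", 9), ("F", 10), ("F", 9), ("M", 10)]

def crosstab(pairs):
    """Count each (row, column) combination and report it as
    (count, percent of all students) -- a crosstab cell."""
    counts = Counter(pairs)
    total = len(pairs)
    return {cell: (n, 100.0 * n / total) for cell, n in counts.items()}

table = crosstab(records)
```

Dropping one of the two variables reduces this to a plain frequency table, the other report type described below.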

Frequency Tables

List the numbers of times each value occurs in a set of values

We will create a Frequency Table of the Language Fluency Variable

Tables

The best way to display exact numbers.

We will create a Table for Grade by Median %ile for Reading, Math and Spelling

Advanced Tables

Tables showing the mean, median, or number for groups sub-divided by other groups on one outcome.

We will create an Advanced Table for Grade by Ethnicity for Median Reading %tile Ranking

Questions Table

Best for showing multiple aspects of a test, including the mean, median, and number and percent tested.

We will create a Questions Table for Grade By Math Median %ile rank.

Graphical Report Objects (3 Bars)

Bar Chart - Graphs the mean, median, % at or above, count, and percentage. Groups by variable on a single outcome.

Histogram - Graphs the Distribution of a Population or a Group score. User selects Range (width of bars).

Percentile Bar - Separates Scores into Quartiles or Quintiles. Then displays distribution of User-selected Groups

Graphical Report Objects (cont.)

Line Chart - Graphs the mean or median of four groups on One Outcome measure over time. Or shows the scores of the entire population on four like outcome measures.

Scatterplot - Graphs the scores of a group on two outcome measures (one on the x-axis, the other on the y-axis).

Pie Chart - Displays part-whole relationships of a filtered group against a categorical variable. Can apply any group to restrict the population of comparison.

Gauge - Displays % of filtered group that meets criteria of a second grouping. Best for use with goals.

Bar Chart

We Will Make a Bar Chart of % of students in the Top Quartile by Ethnicity.

Histogram

We Will Make a Histogram of Math NRT Percentile Rank.

Percentile Bar Chart

We Will Make a Percentile Bar Chart of Reading quartiles by Grade.
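The quartile split behind a percentile bar can be sketched in a few lines: cut the score distribution at the quartile points, then count how many students fall in each quarter. The scores below are made up for illustration.

```python
from statistics import quantiles

# Hypothetical percentile-rank scores for one grade.
scores = [12, 35, 47, 50, 63, 71, 78, 85, 90, 99]

def quartile_counts(values):
    """Split scores at the quartile cut points and count how many
    fall in each quarter -- what a Percentile Bar displays."""
    q1, q2, q3 = quantiles(values, n=4)  # three cut points
    bins = [0, 0, 0, 0]
    for v in values:
        if v <= q1:
            bins[0] += 1
        elif v <= q2:
            bins[1] += 1
        elif v <= q3:
            bins[2] += 1
        else:
            bins[3] += 1
    return bins
```

Passing `n=5` to `quantiles` gives the quintile variant the slide mentions.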

Line Graphs

We Will Make a Line Chart of median Spelling, Math & Reading.

Scatterplot

We Will Make a scatterplot of Spelling & Reading.

Pie Charts

We Will Make two Pie Charts: one of the Ethnicity of all Students, and one of the Ethnicity of students <= 25th Percentile on Reading.

Note the Outcome for Hispanic Students.

Gauges

We Will Make a Gauge of Hispanic Students <= 25th Percentile on Reading in 1998, putting in the 1997 % as a comparison.

Different Types of Progress Analyses

  • Cross sectional -- this year’s 10th graders compared to last year’s 10th graders

  • Quasi-longitudinal

    • this year’s 11th graders compared to last year’s 10th graders

    • This year’s 11th graders compared to 9th grade results two years ago

  • True longitudinal -- includes only those students who were present over the entire period
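The three designs differ only in which students are compared. A sketch with a hypothetical two-year roster (student ID mapped to grade level, per year):

```python
# Hypothetical two-year roster: student ID -> grade level.
year1 = {"s1": 10, "s2": 10, "s3": 10}
year2 = {"s1": 11, "s2": 11, "s4": 10}

# Cross sectional: this year's 10th graders vs. last year's 10th graders
# (different students in each list).
cross_last = sorted(s for s, g in year1.items() if g == 10)
cross_this = sorted(s for s, g in year2.items() if g == 10)

# Quasi-longitudinal: this year's 11th graders vs. last year's 10th graders
# (mostly the same students, but transfers in/out are mixed in).
quasi_this = sorted(s for s, g in year2.items() if g == 11)

# True longitudinal: only students present in both years.
longitudinal = sorted(set(year1) & set(year2))
```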

Strengths and Weaknesses

  • Cross sectional: Cheapest data collection option, but influenced by cohort differences

  • Quasi Longitudinal:

    • Requires multi-grade testing

    • Influenced by student transiency

  • True Longitudinal: Most powerful but what about responsibility for other students?

Which Analysis Is Best?

  • Depends on available data: sometimes only cross sectional analyses are possible

  • Longitudinal usually stronger for looking at questions of school and program impact

  • In any case, multiyear analyses are preferred over single year

  • But beware changes in tests that make results non comparable.

Basic Approaches To Diagnosis

  • Consider Alignment -- What is being tested? What of value is and is not being tested?

  • Look at Patterns of Performance Across Different Tests

  • Look at Patterns of Performance Across Subscales

  • Look at Distribution of Performance

Examine Performance Across Different Tests

  • What’s being assessed by each?

    • What is emphasized? What is missing?

    • Is the assessment aligned with our standards?

    • Are/should teachers be teaching what’s assessed?

  • Is there differential performance across different tests?

  • Implications for curriculum planning?

Examine Distribution of Performance

  • Look at the distribution of performance -- is it “normal” or are there higher or lower than expected proportions of low and high achieving students?

  • Implications for the effectiveness of different courses or course sequences?

Examine Patterns of Performance Across Subscales

  • Do the Subscales Represent Important Curriculum Foci?

  • Is There Differential Performance Across the Different Subscales?

  • Given Number and Percentage of Items Correct, if available, is this a reasonable area to show improvement?

  • Beware: Unreliable subscales composed of few items

Consider Implications

  • Curriculum Planning/Revision

  • Assessment Planning/Revision

Are We Serving All Students?

  • Federal mandates insist on progress for all students:

    • Students of diverse ethnicity

    • Students living in poverty

    • Limited English proficient students

    • Students with disabilities

  • Requires analysis of performance by subgroup and accountability for progress

  • Beware over-interpreting small n’s

Analysis of Subgroup Performance

  • Looking for action, not excuses.

  • Goal: Assure equity and progress toward excellence for all

    • Upward trends in performance

    • Narrower gaps in performance between subgroups

How Can We Improve Effective Learning Opportunities for Traditionally Under Achieving Subgroups?

  • Data raises the question and the imperative -- answers will require serious reflection and additional assessment

  • What factors are getting in the way?

  • What are potential strategies for overcoming them?

Who Are Students At Most Risk?

  • At Risk of Dropping Out of School

  • At Risk of Dropping Out of the College Pipeline

Research on Factors Predicting Drop Outs

  • Prior grade retention

  • Poor grades

  • Excessive absences

  • Poverty

  • School transiency

  • Permissive parenting

  • Low participation in school organizations

  • Depression

Research on Factors Influencing Pipeline Success

  • Mathematics course taking, e.g., college preparatory math courses, completion of one advanced mathematics course (e.g., calculus)

  • School help filling out college applications

  • Parent interest in school matters

  • Number of friends with plans to attend college

  • Involvement in after-school activities

  • Aspirations and self expectations

Overview

  • Establish critical question for analysis and inquiry

  • Determine indicators that inform our question(s)

  • Set specific performance goals for primary indicators

  • Move forward on vision for accomplishing goals

Establishing A Critical Question for Inquiry

  • What are our goals for student learning and performance?

  • What do the data say?

  • What do our observations and experience suggest?

  • Are there new opportunities or mandates that imply a specific focus for the year?

Indicators of Student Learning and Performance: Existing Tests

  • State, standards based assessment

  • District standardized tests

  • District course exams

  • Placement tests

  • School or department exams

  • Classroom tests

Indicators of Performance (cont): Getting to Classroom Practice

  • Student work samples/assignments

  • Writing and/or performance assessments

  • Portfolios

  • Projects

  • Grades

Indicators of Performance (cont): Academic Preparation

  • Completion of college preparatory coursework

  • Enrollment/passing AP courses

  • SAT/ACT performance

  • Enrollment/success in gatekeeper courses

Indicators of Performance (cont): Other Important Outcomes

  • School completion: graduation, drop outs

  • Attendance, tardies

  • Social/civic indicators -- extra-curricular involvement, suspensions, attitudes

  • Health and safety indicators

  • Post secondary success indicators

Establishing Priority Indicators

  • Don’t get lost in “wouldn’t it be interesting…?”

  • Set priorities based on

    • Alignment with important goals

    • Technical quality

    • Utility/credibility

    • Feasibility of access

Overarching Design Concern

  • How to connect with classroom practice?

  • How to get early and repeated assessments of whether we’re making progress over the year?

  • How will the assessment process connect to reform strategies for classroom practices?

Setting Specific Performance Goals on Priority Indicators

  • Where do we want to go in terms of specific performance?

  • How long should it take us to get there?

  • Where should we be in the next year?

    No “right answer”

    Judgmental process, in conjunction with mandates, if any

What Are Reasonable Performance Goals?

  • Reflect high expectations

  • Consider established benchmarks and relevant comparisons:

    • What does “all students achieving standards” mean?

    • Where should our performance be relative to the state, district, similar schools, exemplary programs?

  • Are realistic

What Are Reasonable Performance Goals? (cont)

  • Are specific: what we’re accountable for

  • Some choices:

    • Average performance

    • Percentage of students scoring at or above a performance level (meets standard, at/above 50%ile)

    • Percentage of students moving from one performance level to another
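The three goal metrics above are straightforward to compute. A sketch with made-up percentile-rank scores; the before/after lists are assumed to be matched, student for student:

```python
# Hypothetical percentile ranks for the same six students,
# one year and then the next.
before = [22, 35, 48, 51, 60, 75]
after_ = [30, 41, 52, 55, 66, 80]

def pct_at_or_above(scores, cutoff):
    """Percentage of students scoring at or above a performance
    level (e.g. the 50th percentile)."""
    return 100.0 * sum(s >= cutoff for s in scores) / len(scores)

def pct_moved_up(before, after, cutoff):
    """Percentage of matched students who moved from below the
    level to at or above it."""
    moved = sum(b < cutoff <= a for b, a in zip(before, after))
    return 100.0 * moved / len(before)

average = sum(after_) / len(after_)       # average performance
at_50 = pct_at_or_above(after_, 50)       # % at/above 50th percentile
moved = pct_moved_up(before, after_, 50)  # % crossing the level
```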


Theory of Action: A Vision of How We’ll Get To Goals

  • Vision of how we’ll get from where we are to where we want to be

  • Starts with specific performance goal(s)

  • Backward chain to identify practices that need to change to achieve the goals and strategies for achieving those changes

Theory of Action: A Vision of How We’ll Get To Goals

Integrate available program components (staff development, curriculum resources, parent involvement) to focus on goal

Model A


Model B


Theory of Action

  • Use model that makes sense to you

  • Focus on explanatory factors and program components that will make a difference

  • Focus on things you can influence/change, not things that are less malleable

  • Forward and backward chain to and from performance goals

Planning for QSP Augmentation

  • What variables will we add?

  • How will we collect/access the data?

  • Do we need new data collection?

    • New instruments selected/constructed

    • Schedule for creation, review, production, administration

  • How will we involve others in new assessments?

Planning for QSP Augmentation

  • What schedule -- for access/collection, QSP entry, analysis, and reporting?

  • Who’s responsible?

  • Process or technical ideas for “getting it done?”

Communicating With Others

  • Who are the important stakeholders?

  • How can we represent our analyses and vision?

  • How can we use the process to build consensus and collaboration?

  • How can we involve others actively in the assessment and reform process?

Building Professional Learning Communities

  • High Norms of teacher collaboration and planning

  • Strong leadership and focus

  • High norms of collegiality

  • Teacher use of data to inform practice (McLaughlin)