Presentation Transcript


  1. COMPASS National and Local Norming Sandra Bolt, M.S., Director Student Assessment Services South Seattle Community College February 2010

  2. TABLE OF CONTENTS • What Is COMPASS? • Is Placement Testing Necessary? • How Are Cut-scores Developed And Evaluated? • Answers To The Most Common Questions • Concluding Notes

  3. WHAT IS COMPASS? • It is a placement test: curriculum based, with a diagnostic component. • COMPASS ensures the examinee is ready for the rigor of the course; the cut-scores signal that the course will be neither too hard nor too easy. It is used for post-secondary placement. • It is not an entrance exam and not an aptitude test. COMPASS does not meet the criteria the EEOC requires of a workplace testing instrument. • The instrument is correlated with the institution’s academic courses. Reliability is maintained by correlating test scores to course grades. • Reliability: Grammar .85; ESL-Reading .86; Listening .85; Writing .85; Reading .85; Math .85 - .86 (These national figures are based on the standard test length; reliability increases with the extended or maximum test lengths. South’s test is based on the standard length; we can adjust the length with data.)
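As a rough illustration of that score-to-grade correlation step, here is a minimal Python sketch. The score/grade pairs are hypothetical and scipy is assumed to be available; this shows only the idea, not South's or ACT's actual analysis.

# Hypothetical score/grade pairs for students who completed the correlated course.
# Values are illustrative only, not South's actual data.
from scipy.stats import pearsonr

placement_scores = [42, 55, 61, 68, 73, 77, 81, 88, 90, 95]
course_grades = [1.7, 2.0, 2.3, 2.7, 2.3, 3.0, 3.3, 3.0, 3.7, 4.0]

r, p_value = pearsonr(placement_scores, course_grades)
print(f"score-grade correlation: r = {r:.2f} (p = {p_value:.3f})")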

  4. WHAT IS COMPASS? • The instrument is progressive (adaptive). • The ESL COMPASS is linked to the Standard COMPASS: with the appropriate skill set, an ESL student can, for example, start in the ESL COMPASS, test into the Standard COMPASS, and place into ENGL 101. • Examinees are given sets of six questions; depending on how those are answered, the system presents easier or harder questions within a domain. Examples of domains are pre-algebra, algebra, college algebra, and trigonometry. With data, we set the routing scale, what is tested, and the diagnostic measures. (A toy sketch of this routing idea follows.)
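ACT's actual routing rules are proprietary and are not described in these slides; the Python sketch below only illustrates the general idea of moving to easier or harder material after each six-question set. The domain list, thresholds, and scoring are invented for illustration.

# Toy illustration of adaptive routing between math domains.
# The domain order, thresholds, and scoring below are invented, not ACT's rules.
DOMAINS = ["pre-algebra", "algebra", "college algebra", "trigonometry"]

def route(initial_domain: str, answers: list[bool]) -> str:
    """Move to a harder domain after a strong six-item set, easier after a weak one."""
    level = DOMAINS.index(initial_domain)
    for start in range(0, len(answers), 6):        # score each six-question set
        correct = sum(answers[start:start + 6])
        if correct >= 5 and level < len(DOMAINS) - 1:
            level += 1                             # route up to harder items
        elif correct <= 2 and level > 0:
            level -= 1                             # route down to easier items
    return DOMAINS[level]

# Two six-question sets answered mostly correctly route the examinee upward.
print(route("algebra", [True] * 6 + [True, True, False, True, True, True]))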

  5. IS A PLACEMENT TEST NECESSARY? • When the COMPASS test is correlated to the course, the instructor is able to teach students who are ready for the course content at the level of the course outline and course description.

  6. INSTRUCTOR COMMENTS FROM AN UNDERUTILIZED COMPASS TEST • “I don’t use COMPASS, it doesn’t work for my class.” • “I review the whole chapter in class; I find it necessary to teach to the test.” • “I rarely use supplemental material; my time is spent reviewing, over-and-over, the basic text material.” • “I was hired to teach in my field; I’m doing less of that and more instruction in English and math skills.” • “Over the years, I have had to ‘dumb down’ my course.”

  7. HOW ARE CUT-SCORES DEVELOPED AND EVALUATED? • Cut-scores are developed in two ways: • The Norm-Referenced Process • The Criterion-Referenced Process

  8. HOW ARE CUT-SCORES DEVELOPED AND EVALUATED? • Each measurement helps to identify: • Success rate: the percent of students above the median who earn a “B” or better (when 85% of the class receives a “C” or better, “B” is used as the criterion). • Accuracy rate: the sum of true positives and true negatives. True positives are students who scored above the median and passed with a “B”; true negatives are students who scored below the median and failed the course. • Percent placed into lower classes.
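A minimal Python sketch of those two measurements, using hypothetical score/grade records, an invented cut-score, and a “B or better” (3.0) success criterion; it is meant only to make the definitions concrete.

# Hypothetical placement records: (placement score, final grade on a 4.0 scale).
# The data, cut score, and success criterion are illustrative only.
records = [(48, 1.0), (52, 2.0), (58, 1.7), (63, 3.0), (67, 2.3),
           (71, 3.3), (75, 3.0), (80, 2.7), (86, 3.7), (92, 4.0)]
CUT = 65          # hypothetical cut score (e.g., the estimated median)
SUCCESS = 3.0     # "B" or better

above = [(s, g) for s, g in records if s >= CUT]
true_pos = sum(1 for s, g in records if s >= CUT and g >= SUCCESS)
true_neg = sum(1 for s, g in records if s < CUT and g < SUCCESS)

success_rate = true_pos / len(above)                    # % above the cut earning a "B"+
accuracy_rate = (true_pos + true_neg) / len(records)    # true positives + true negatives
print(f"success rate = {success_rate:.0%}, accuracy rate = {accuracy_rate:.0%}")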

  9. HOW ARE CUT-SCORES DEVELOPED AND EVALUATED? • The Norm-Referenced Process • All examinees are tested. Scatter graphs of test scores are sent with course descriptions to ACT. The data are analyzed against national data and returned with preliminary cut-scores. • ACT reviews, for example, 100 colleges that offer the same English courses and computes the median: the minimum score at which we estimate a student has a 50% chance of earning a ‘B’ or better in the course. • The percent of correct or incorrect placement can then be calculated from the course grades earned at each candidate cut-score above the median.
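One way a campus could approximate that “50% chance of a ‘B’ or better” point from its own data is a simple logistic fit. The sketch below assumes scikit-learn is available and uses invented score/outcome pairs; it is not ACT's actual norming procedure.

# Sketch: estimate the score at which the modeled chance of a "B" or better is 50%.
# Data are hypothetical; this is not ACT's norming method.
import numpy as np
from sklearn.linear_model import LogisticRegression

scores = np.array([45, 50, 55, 60, 65, 70, 75, 80, 85, 90]).reshape(-1, 1)
b_or_better = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])  # 1 = earned a "B" or better

model = LogisticRegression().fit(scores, b_or_better)
# For a logistic model, the 50% point is where intercept + coef * score = 0.
median_score = -model.intercept_[0] / model.coef_[0][0]
print(f"estimated score with a 50% chance of a 'B' or better: {median_score:.0f}")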

  10. HOW ARE CUT-SCORES DEVELOPED AND EVALUATED? • The Criterion-Referenced Process • Instructors review the COMPASS test categories and determine what percent of the battery a student should answer correctly based on the needs of their class. The results of this process are added to the national data we receive. • Placement accuracy is checked by assigning each student a rating based on ACT’s ‘course readiness’ definition. The rating is assigned within the first week of class and is then matched to the final course grade. The cut-score is determined from the students who earn a ‘B’ grade in the course and receive a rating of 3.0 or higher.
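The slide does not spell out how the final number is chosen from those students; the sketch below shows one plausible reading, with hypothetical (score, first-week rating, final grade) records and the lowest qualifying score taken as the candidate cut.

# Sketch of the criterion-referenced check described above.
# Records of (placement score, readiness rating, final grade) are hypothetical.
records = [
    (55, 2.5, 2.0), (60, 2.8, 2.3), (64, 3.0, 3.0),
    (68, 3.2, 2.7), (72, 3.4, 3.3), (78, 3.6, 3.7),
]

# Students rated course-ready (3.0+) in week one who went on to earn a "B" (3.0+).
qualified = [score for score, rating, grade in records if rating >= 3.0 and grade >= 3.0]

# One simple choice: set the candidate cut at the lowest score among those students.
cut_score = min(qualified)
print(f"candidate cut score: {cut_score}")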

  11. HOW ARE CUT-SCORES DEVELOPED AND EVALUATED? • The Student Assessment Office reviews the national and local data with faculty. The initial scores are set. • The process must not end at that point. For cut-scores to remain reliable, scores must be measured against grades every few years. This is not a difficult or time-consuming process, but it is necessary.

  12. QUESTIONS TYPICALLY ASKED • “When cuts are normed, is it purely South’s data, or is the data compiled from all three campuses?” • All three campuses must show a rationale for placement. The test developer, ACT, details the procedure. Some institutions choose other methods. • District-aligned scores have only been set for college-level English at this time. Developmental courses will likely not be set district-wide. • “Do we see different levels of classroom success among different groups of students?” • I do not have data on that question. If, however, students are not placed into courses by the established cut-scores, simple reliability studies are not conducted, and teaching and grading are not consistent, then the instructor will likely have to teach outside the course outline.

  13. CONCLUDING NOTES: South’s Story • English and math were normed in 1989 for ASSET. When COMPASS was established in 1995, we correlated scores and faculty conducted reliability studies. Some academic and professional-technical courses went through the same 1989 rigor to determine their cuts.

  14. CONCLUDING NOTES • Proper prerequisites are very important in sequential courses. Scores need to be upheld. Instruction and grades need to be consistent. • The institution is receiving dollars based on the State’s criteria of student retention and completion. COMPASS can ensure course readiness, the foundation for success and retention, but grades and program entry need to be no less reliable than the COMPASS scores.
