Laura Renninger Shepherd University
Overview
• CLA Approach
• CLA Administration
• CLA Measures
• CLA Scoring and Shepherd’s CLA Results
• CLA Additional Data and Next Steps
CLA Approach
Holistic assessment of important skills
• Critical Thinking
• Analytic Reasoning
• Written Communication
• Problem Solving
Measurement of value-added
Institution as unit of analysis
CLA Administration
The CLA is administered by the Council for Aid to Education (CAE), a non-profit organization based in New York City.
Reporting Products
• Institutional Presentation
• Institutional Report
• Student Data File
Results are not reported publicly
• Schools can share data within consortia of peer institutions
CLA Administration
We participated in a cross-sectional study, in which growth between freshmen and seniors is estimated by testing samples of students, not the entire class. Students take the CLA online in proctored settings. Testing time is approximately 90 minutes.
• 150 “native” first-year students were selected at random and tested at Orientation in August 2004, 2005 and 2006.
• 150 “native” seniors were selected via a stratified random sampling process and tested during the spring 2005, 2006 and 2007 semesters.
Tests were proctored by members of the Assessment Task Force: Dr. Gordon DeMeritt, Dr. Barri Tinkler, Dr. John Adams, Shannon Holliday, Laura Renninger and John Sheridan.
CLA Measures
Analytic Writing Task
• Make-an-Argument
• Critique-an-Argument
Performance Task
Analytic Writing Task: Make-an-Argument “In our time, specialists of all kinds are highly overrated. We need more generalists -- people who can provide broad perspectives.” Directions: In 45 minutes, agree or disagree and explain the reasons for your position.
Analytic Writing Task: Critique-an-Argument
“Butter has now been replaced by margarine in Happy Pancake House restaurants throughout the southwestern United States. Only about 2 percent of customers have complained, indicating that 98 people out of 100 are happy with the change. Furthermore, many servers have reported that a number of customers who still ask for butter do not complain when they are given margarine instead. Clearly, either these customers cannot distinguish margarine from butter, or they use the term "butter" to refer to either butter or margarine. Thus, to avoid the expense of purchasing butter, the Happy Pancake House should extend this cost-saving change to its restaurants in the southeast and northeast as well.”
Directions: In 30 minutes, discuss how well-reasoned you find the argument.
Performance Task
• Performance Tasks place students in a real-world scenario.
• In the following case, students have 90 minutes to advise the mayor on crime reduction strategies and evaluate two potential policies:
• Invest in a drug treatment program, or
• Put more police on the streets.
• Students are provided with a Document Library, which includes different types of information sources, such as…
• A MEMO by a private investigator that reports on connections between a specific drug treatment program and a vocal critic of placing more police on the streets.
• CRIME STATISTICS that compare the percentage of drug addicts to the number of crimes committed in the area.
• Crime and community DATA TABLES provided by the Police Department.
• A NEWS story highlighting a rise in local drug-related crime.
• A RESEARCH BRIEF summarizing a scientific study that found the drug treatment program to be effective.
• A CHART that shows that counties with a relatively large number of police officers per resident tend to have more crime than those with fewer officers per resident.
• WEB SEARCH results of other studies evaluating the drug treatment program.
Performance Task
• Performance Tasks require students to use an integrated set of critical thinking, analytic reasoning, problem solving, and written communication skills.
• There are no “right” answers. The goal is to stimulate students’ abilities to make reasoned, reflective arguments.
Performance Task
Students are expected to evaluate evidence by:
• Determining what information is or is not pertinent
• Distinguishing between fact and opinion
• Recognizing limitations in the evidence
• Spotting deception and holes in the arguments of others
Performance Task
Students are expected to analyze and synthesize the evidence by:
• Presenting their own analysis of the data
• Breaking down the evidence into its component parts
• Drawing connections between discrete sources of data
• Attending to contradictory or inadequate information
Performance Task
Students are also expected to draw conclusions by:
• Constructing cogent arguments rooted in data rather than speculation
• Selecting the strongest set of supporting evidence
• Avoiding overstated or understated conclusions and suggesting additional information to complete the analysis
CLA Scoring and our CLA Results
CLA scores for a school represent the average (or “mean”) score for all students who completed a CLA task and who also have an SAT score (or an ACT score converted to the SAT scale) on file with the registrar. The CLA scale approximates the SAT scale.
CLA Scoring and our CLA Results Mean SAT Scores (on the horizontal x-axis) are used to control for incoming academic ability. Put another way, it allows for a level playing field when comparing performance across all CLA schools.
CLA Scoring and our CLA Results This blue dot represents the mean CLA score and mean SAT score for the 78 freshmen we sampled in 2006.
CLA Scoring and our CLA Results These blue circles represent mean CLA and SAT scores at the other 114 schools testing freshmen in fall 2006. Once again, the unit of analysis is institutions, not students.
CLA Scoring and our CLA Results The diagonal blue line shows the typical relationship between academic ability and mean CLA scores of freshmen across all participating institutions.
CLA Scoring and our CLA Results Points along the line represent expected CLA scores for a school testing freshmen across the range of mean SAT scores.
CLA Scoring and our CLA Results The focus is on the difference between a college’s actual and expected CLA scores—graphically, the vertical distance between the dot and the line. These differences are reported in both CLA scale points and standard errors.
CLA Scoring and our CLA Results Colleges with actual mean scores between -1.00 and +1.00 standard errors from their expected scores are categorized as being “At Expected.”
CLA Scoring and our CLA Results Institutions with actual mean CLA scores greater than one standard error (but less than two standard errors) from their expected scores are in the “Above Expected” or “Below Expected” categories (depending on the direction of the deviation).
CLA Scoring and our CLA Results The schools with actual scores greater than two standard errors from their expected scores are in the “Well Above Expected” or “Well Below Expected” categories.
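The three banding rules above can be summarized in a small function. This is a hypothetical sketch for illustration only, not CAE’s actual scoring code; the boundary handling at exactly 1.0 and 2.0 standard errors is an assumption.

```python
def performance_category(actual_mean, expected_mean, standard_error):
    """Classify a school's deviation from its expected CLA score.

    Illustrative sketch of the banding rules described on the
    preceding slides; not CAE's official implementation.
    """
    deviation_se = (actual_mean - expected_mean) / standard_error
    if abs(deviation_se) <= 1.0:
        return "At Expected"
    direction = "Above" if deviation_se > 0 else "Below"
    if abs(deviation_se) <= 2.0:
        return f"{direction} Expected"
    return f"Well {direction} Expected"
```

For example, a school scoring 66 points below its expected score with a standard error of about 41 points sits 1.6 standard errors below the line, landing in the “Below Expected” band.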
CLA Scoring and our CLA Results Based on the average SAT score (1005) of the 78 freshmen we sampled, their expected average CLA score was 1038. Our freshmen scored 972, which is Below Expected (-66 CLA scale points and -1.6 standard errors from the line).
CLA Scoring and our CLA Results Repeating the process for seniors, this solid red square represents the mean CLA score and mean SAT score for the 79 seniors we sampled in spring 2007.
CLA Scoring and our CLA Results These red squares represent mean CLA and SAT scores at the other 105 schools testing seniors in spring 2007.
CLA Scoring and our CLA Results The diagonal red line shows the typical relationship between academic ability and mean CLA scores of seniors across all participating institutions.
CLA Scoring and our CLA Results Points along the line represent the expected CLA score for a school testing seniors across the range of mean SAT scores.
CLA Scoring and our CLA Results Based on the average SAT score (1042) of the 79 seniors we sampled, their expected average CLA score was 1148. Our seniors scored 1150, which is At Expected (2 CLA scale points and 0.0 standard errors from the line).
CLA Scoring and our CLA Results Based on the average SAT scores of our freshmen and seniors, we would expect a difference of 110 CLA scale points.
CLA Scoring and our CLA Results
So how did we do? The difference between our seniors’ and freshmen’s scores was 178 points, which places us in decile group 10; we performed better than 90 percent of institutions (68 CLA scale points and 1.6 standard error units above expected).
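The value-added arithmetic from the preceding slides can be reproduced directly from the reported figures; this is simply a restatement of the slides’ numbers, not an official CAE computation.

```python
# Mean scores reported on the preceding slides.
freshman_actual, freshman_expected = 972, 1038    # fall 2006 freshman sample
senior_actual, senior_expected = 1150, 1148       # spring 2007 senior sample

# Expected gain, based on each cohort's mean SAT score.
expected_gain = senior_expected - freshman_expected  # 1148 - 1038 = 110

# Observed gain between the two samples.
actual_gain = senior_actual - freshman_actual        # 1150 - 972 = 178

# Value added beyond expectation, in CLA scale points.
value_added = actual_gain - expected_gain            # 178 - 110 = 68

print(expected_gain, actual_gain, value_added)  # 110 178 68
```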
CLA Data and Next Steps Institution-level CLA results operate as a signaling tool of overall institutional performance that we can compare with other outcomes, such as retention and graduation rates.
CLA Data and Next Steps Student-level CLA results are also provided for us to link with other data sources (e.g., course‐taking patterns, grades, portfolio assessments, student satisfaction and engagement, major-specific tests, etc.) so we can identify correlations, begin to explain our results and formulate additional questions for investigation.
CLA Data and Next Steps
In-depth sampling focuses on specific populations:
• transfers versus “native” students
• fields of study
• academic majors
• students living on/off campus
• work-study students
• financial aid recipients
• athletes
Longitudinal studies track the same students over time:
• Students tested as freshmen, rising juniors and seniors
• Cross-sectional sample of seniors tested in the first year to establish a baseline for performance
CLA Data and Next Steps
Finally, the Performance Task described earlier in this presentation will be released publicly in fall 2007 as an instructional tool, complete with a scoring guide. This will provide faculty with the chance to work with students to understand why they achieved the scores they did, and what to do next to improve their skills. This initiative is called “CLA in the Classroom.” How can you help?
MAPP Test
• “Measure of Academic Proficiency and Progress”
• The MAPP is a standardized measure of college-level reading, mathematics, writing, and critical thinking in the context of the humanities, social sciences, and natural sciences. The exam is designed by the Educational Testing Service.
• According to ETS, “The MAPP test is designed for colleges and universities to assess their general education outcomes, so they may improve the quality of instruction and learning. It focuses on the academic skills developed through general education courses, rather than on the knowledge acquired about the subjects taught in these courses.”
• Multiple-choice exam
MAPP Administration
• Shepherd University “native” sophomores (students who have completed between 25 and 56 semester hours of work)
• 100 sophomores randomly selected
• Spring 2007: N = 97; mean GPA = 2.80; median GPA = 2.84
• Paper-and-pencil test proctored by members of the Assessment Task Force
MAPP Proficiency Levels
MAPP Proficiency Classifications: Reading/Critical Thinking
To be considered Proficient at level 1, a student should be able to:
• Recognize factual material explicitly presented in a reading passage
• Understand the meaning of particular words or phrases in the context of a reading passage
To be considered Proficient at level 2, a student should be able to:
• Synthesize material from different sections of a passage
• Recognize valid inferences derived from material in the passage
• Identify accurate summaries of a passage or of significant sections of the passage
• Understand and interpret figurative language
• Discern the main idea, purpose, or focus of a passage or a significant portion of the passage
To be considered Proficient at level 3, a student should be able to:
• Evaluate competing causal explanations
• Evaluate hypotheses for consistency with known facts
• Determine the relevance of information for evaluating an argument or conclusion
• Determine whether an artistic interpretation is supported by evidence contained in a work
• Recognize the salient features or themes in a work of art
• Evaluate the appropriateness of procedures for investigating a question of causation
• Evaluate data for consistency with known facts, hypotheses or methods
• Recognize flaws and inconsistencies in an argument
MAPP Proficiency Levels
Writing Skills
To be considered Proficient at level 1, a student should be able to:
• Recognize agreement among basic grammatical elements (e.g., nouns, verbs, pronouns and conjunctions)
• Recognize appropriate transition words
• Recognize incorrect word choice
• Order sentences in a paragraph
• Order elements in an outline
To be considered Proficient at level 2, a student should be able to:
• Incorporate new material into a passage
• Recognize agreement among basic grammatical elements (e.g., nouns, verbs, pronouns, and conjunctions) when these elements are complicated by intervening words or phrases
• Combine simple clauses into single, more complex combinations
• Recast existing sentences into new syntactic combinations
To be considered Proficient at level 3, a student should be able to:
• Discriminate between appropriate and inappropriate use of parallelism
• Discriminate between appropriate and inappropriate use of idiomatic language
• Recognize redundancy
• Discriminate between correct and incorrect constructions
• Recognize the most effective revision of a sentence
MAPP Proficiency Levels
Mathematics
To be considered Proficient at level 1, a student should be able to:
• Solve word problems that would most likely be solved by arithmetic and do not involve conversion of units or proportionality. These problems can be multi-step if the steps are repeated rather than embedded.
• Solve problems involving the informal properties of numbers and operations, often involving the number line, including positive and negative numbers, whole numbers and fractions (including conversions of common fractions to percent, such as converting 1/4 to 25%).
• Solve problems requiring a general understanding of square roots and the squares of numbers.
• Solve a simple equation or substitute numbers into an algebraic expression.
• Find information from a graph. This task may involve finding a specified piece of information in a graph that also contains other information.
To be considered Proficient at level 2, a student should be able to:
• Solve arithmetic problems with some complications, such as complex wording, maximizing or minimizing, and embedded ratios. These problems include algebra problems that can be solved by arithmetic (the answer choices are numeric).
• Simplify algebraic expressions, perform basic translations, and draw conclusions from algebraic equations and inequalities. These tasks are more complicated than solving a simple equation, though they may be approached arithmetically by substituting numbers.
• Interpret a trend represented in a graph, or choose a graph that reflects a trend.
• Solve problems involving sets; the problems would have numeric answer choices.
To be considered Proficient at level 3, a student should be able to:
• Solve word problems that would be unlikely to be solved by arithmetic; the answer choices are either algebraic expressions or are numbers that do not lend themselves to back-solving.
• Solve problems involving difficult arithmetic concepts such as exponents and roots other than squares and square roots, and percent of increase or decrease.
• Generalize about numbers, e.g., identify the values of x for which an expression increases as x increases.
• Solve problems requiring an understanding of the properties of integers, rational numbers, etc.
• Interpret a graph in which the trends are to be expressed algebraically or in which one of the following is involved: exponents and roots other than squares and square roots, percent of increase or decrease.
• Solve problems requiring insight or logical reasoning.