
Presentation Transcript


  1. Laura Renninger Shepherd University

  2. Overview • CLA Approach • CLA Administration • CLA Measures • CLA Scoring and Shepherd’s CLA Results • CLA Additional Data and Next Steps

  3. CLA Approach • Holistic assessment of important skills: Critical Thinking, Analytic Reasoning, Written Communication, Problem Solving • Measurement of value-added • Institution as unit of analysis

  4. CLA Administration The CLA is administered by the Council for Aid to Education (CAE), a non-profit organization based in New York City. Reporting Products: • Institutional Presentation • Institutional Report • Student Data File. Results are not reported publicly • Schools can share data within consortia of peer institutions

  5. CLA Administration We participated in a cross-sectional study, in which growth between freshmen and seniors is estimated by testing samples of students, not the entire class. Students take the CLA online in proctored settings. Testing time is approximately 90 minutes. • 150 “native” first-year students were selected at random and tested at Orientation in August 2004, 2005 and 2006. • 150 “native” seniors were selected via a stratified random sampling process and tested during the spring 2005, 2006 and 2007 semesters. • Tests were proctored by members of the Assessment Task Force: Dr. Gordon DeMeritt, Dr. Barri Tinkler, Dr. John Adams, Shannon Holliday, Laura Renninger and John Sheridan.
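
The slides do not spell out how the stratified sample of seniors was constructed. As a purely illustrative aside, a stratified draw of roughly 150 students, allocated across strata in proportion to their size, can be sketched in a few lines of Python; the roster, the "division" strata, and the numbers below are hypothetical, not Shepherd's actual procedure.

```python
# Hypothetical illustration: the actual strata used for the senior sample
# are not described in this presentation; "division" is an assumed example.
import pandas as pd

def stratified_sample(roster: pd.DataFrame, stratum_col: str, n: int, seed: int = 0) -> pd.DataFrame:
    """Draw roughly n students, allocated across strata in proportion to their size."""
    frac = n / len(roster)
    return (roster.groupby(stratum_col, group_keys=False)
                  .apply(lambda g: g.sample(frac=frac, random_state=seed)))

# Example with a made-up senior roster of 1,200 students:
seniors = pd.DataFrame({
    "student_id": range(1, 1201),
    "division": ["Arts", "Sciences", "Business", "Education"] * 300,
})
sample = stratified_sample(seniors, "division", n=150)
print(sample["division"].value_counts())
```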

  6. CLA Measures • Analytic Writing Task (Make-an-Argument and Critique-an-Argument) • Performance Task

  7. Analytic Writing Task: Make-an-Argument “In our time, specialists of all kinds are highly overrated. We need more generalists -- people who can provide broad perspectives.” Directions: In 45 minutes, agree or disagree and explain the reasons for your position.

  8. Analytic Writing Task: Critique-an-Argument “Butter has now been replaced by margarine in Happy Pancake House restaurants throughout the southwestern United States. Only about 2 percent of customers have complained, indicating that 98 people out of 100 are happy with the change. Furthermore, many servers have reported that a number of customers who still ask for butter do not complain when they are given margarine instead. Clearly, either these customers cannot distinguish margarine from butter, or they use the term "butter" to refer to either butter or margarine. Thus, to avoid the expense of purchasing butter, the Happy Pancake House should extend this cost-saving change to its restaurants in the southeast and northeast as well.” Directions: In 30 minutes, discuss how well-reasoned you find the argument.

  9. Performance Task • Performance Tasks place students in a real-world scenario. • In the following case, students have 90 minutes to advise the mayor on crime reduction strategies and evaluate two potential policies: invest in a drug treatment program, or put more police on the streets. • Students are provided with a Document Library, which includes different types of information sources, such as…

  10. Performance Task A MEMO by a private investigator that reports on connections between a specific drug treatment program and a vocal critic of placing more police on the streets.

  11. Performance Task CRIME STATISTICS that compare the percentage of drug addicts to the number of crimes committed in the area.

  12. Performance Task Crime and community DATA TABLES provided by the Police Department.

  13. Performance Task A NEWS story highlighting a rise in local drug-related crime.

  14. Performance Task A RESEARCH BRIEF summarizing a scientific study that found the drug treatment program to be effective.

  15. Performance Task A CHART that shows that counties with a relatively large number of police officers per resident tend to have more crime than those with fewer officers per resident.

  16. Performance Task WEB SEARCH results of other studies evaluating the drug treatment program.

  17. Performance Task • Performance Tasks require students to use an integrated set of critical thinking, analytic reasoning, problem solving, and written communication skills. • There are no “right” answers. The goal is to stimulate students’ abilities to make reasoned, reflective arguments.

  18. Performance Task • Students are expected to evaluate evidence by: • Determining what information is or is not pertinent • Distinguishing between fact and opinion • Recognizing limitations in the evidence • Spotting deception and holes in the arguments of others

  19. Performance Task • Students are expected to analyze and synthesize the evidence by: • Presenting their own analysis of the data • Breaking down the evidence into its component parts • Drawing connections between discrete sources of data • Attending to contradictory or inadequate information

  20. Performance Task • Students are also expected to draw conclusions by: • Constructing cogent arguments rooted in data rather than speculation • Selecting the strongest set of supporting evidence • Avoiding overstated or understated conclusions and suggesting additional information to complete the analysis

  21. CLA Scoring and our CLA Results CLA scores for a school represent the average (or “mean”) score for all students who completed a CLA task and who also have an SAT score (or ACT score converted to the SAT scale) on file with the registrar. The CLA scale approximates the SAT scale.

  22. CLA Scoring and our CLA Results Mean SAT scores (on the horizontal x-axis) are used to control for incoming academic ability. Put another way, this allows for a level playing field when comparing performance across all CLA schools.

  23. CLA Scoring and our CLA Results This blue dot represents the mean CLA score and mean SAT score for the 78 freshmen we sampled in 2006.

  24. CLA Scoring and our CLA Results These blue circles represent mean CLA and SAT scores at the other 114 schools testing freshmen in fall 2006. Once again, the unit of analysis is institutions, not students.

  25. CLA Scoring and our CLA Results The diagonal blue line shows the typical relationship between academic ability and mean CLA scores of freshmen across all participating institutions.

  26. CLA Scoring and our CLA Results Points along the line represent expected CLA scores for a school testing freshmen across the range of mean SAT scores.

  27. CLA Scoring and our CLA Results The focus is on the difference between a college’s actual and expected CLA scores—graphically, the vertical distance between the dot and the line. These differences are reported in both CLA scale points and standard errors.

  28. CLA Scoring and our CLA Results Colleges with actual mean scores between -1.00 and +1.00 standard errors from their expected scores are categorized as being “At Expected.”

  29. CLA Scoring and our CLA Results Institutions with actual mean CLA scores greater than one standard error (but less than two standard errors) from their expected scores are in the “Above Expected” or “Below Expected” categories (depending on the direction of the deviation).

  30. CLA Scoring and our CLA Results The schools with actual scores greater than two standard errors from their expected scores are in the “Well Above Expected” or “Well Below Expected” categories.

  31. CLA Scoring and our CLA Results Based on the average SAT score (1005) of the 78 freshmen we sampled, their expected average CLA score was 1038. Our freshmen scored 972, which is Below Expected (-66 CLA scale points and -1.6 standard errors from the line).
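
The rule described on slides 27–30, combined with the freshman figures on slide 31, can be expressed as a small function. This is only a sketch of the logic as described here, not CAE's actual model: the intercept and slope of the fitted line are not given in the slides, so the example works directly from an actual score, an expected score, and a standard error. The roughly 41-point standard error used below is back-calculated from the -66 points / -1.6 standard errors reported on slide 31.

```python
# Sketch of the deviation-score categorization described on slides 27-30.
def performance_category(actual: float, expected: float, std_error: float) -> str:
    """Classify a school by how many standard errors its actual mean CLA score
    sits above or below the expected score on the fitted line."""
    deviation = (actual - expected) / std_error
    if abs(deviation) <= 1:
        return "At Expected"
    if abs(deviation) <= 2:
        return "Above Expected" if deviation > 0 else "Below Expected"
    return "Well Above Expected" if deviation > 0 else "Well Below Expected"

# Freshman figures from slide 31; the ~41-point standard error is inferred
# from the reported -66 points / -1.6 standard errors, not taken from CAE.
print(performance_category(actual=972, expected=1038, std_error=41.25))  # Below Expected
```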

  32. CLA Scoring and our CLA Results Repeating the process for seniors, this solid red square represents the mean CLA score and mean SAT score for the 79 seniors we sampled in spring 2007.

  33. CLA Scoring and our CLA Results These red squares represent mean CLA and SAT scores at the other 105 schools testing seniors in spring 2007.

  34. CLA Scoring and our CLA Results The diagonal red line shows the typical relationship between academic ability and mean CLA scores of seniors across all participating institutions.

  35. CLA Scoring and our CLA Results Points along the line represent the expected CLA score for a school testing seniors across the range of mean SAT scores.

  36. CLA Scoring and our CLA Results Based on the average SAT score (1042) of the 79 seniors we sampled, their expected average CLA score was 1148. Our seniors scored 1150, which is At Expected (2 CLA scale points and 0.0 standard errors from the line).

  37. CLA Scoring and our CLA Results Based on the average SAT scores of our freshmen and seniors, we would expect a difference of 110 CLA scale points.

  38. CLA Scoring and our CLA Results So how did we do? The difference between how our seniors and freshmen scored was 178 points: 68 CLA scale points (1.6 standard error units) above expected. This places us in decile group 10; we performed better than 90 percent of institutions.
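
Pulling together the numbers from slides 31, 36, 37 and 38, the value-added figure is simply the actual freshman-to-senior difference minus the difference the two expected scores would predict. A quick arithmetic check, using only the figures reported on those slides:

```python
# Figures taken from slides 31 and 36.
freshman_actual, freshman_expected = 972, 1038
senior_actual, senior_expected = 1150, 1148

expected_gain = senior_expected - freshman_expected  # 1148 - 1038 = 110 (slide 37)
actual_gain = senior_actual - freshman_actual        # 1150 - 972  = 178 (slide 38)
value_added = actual_gain - expected_gain            # 178 - 110   = 68 CLA scale points

print(expected_gain, actual_gain, value_added)       # 110 178 68
```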

  39. CLA Scoring and our CLA Results

  40. CLA Data and Next Steps Institution-level CLA results operate as a signaling tool of overall institutional performance that we can compare with other outcomes, such as retention and graduation rates.

  41. CLA Data and Next Steps Student-level CLA results are also provided for us to link with other data sources (e.g., course‐taking patterns, grades, portfolio assessments, student satisfaction and engagement, major-specific tests, etc.) so we can identify correlations, begin to explain our results and formulate additional questions for investigation.
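
As one hypothetical illustration of that linkage step (the file names and columns below are placeholders, not the actual layout of the CAE Student Data File or of our campus records), the merge-and-correlate work might look like this:

```python
# Illustrative only: file names and columns are assumed for the example.
import pandas as pd

cla = pd.read_csv("cla_student_data_file.csv")   # e.g., student_id, cla_score, sat_score
campus = pd.read_csv("campus_records.csv")       # e.g., student_id, gpa, credit_hours, engagement_score

merged = cla.merge(campus, on="student_id", how="inner")

# Simple correlations to generate questions for further investigation,
# not to draw causal conclusions.
print(merged[["cla_score", "sat_score", "gpa", "engagement_score"]].corr())
```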

  42. CLA Data and Next Steps In-depth sampling focuses on specific populations: • transfers versus “native” students • fields of study • academic majors • students living on/off campus • work-study students • financial aid recipients • athletes. Longitudinal studies track the same students over time: • Students tested as freshmen, rising juniors and seniors • Cross-sectional sample of seniors tested in first year to establish baseline for performance

  43. CLA Data and Next Steps Finally, the Performance Task described earlier in this presentation will be released publicly in fall 2007 as an instructional tool, complete with a scoring guide. This will provide faculty with the chance to work with students to understand why they achieved the scores they did, and what to do next to improve their skills. This initiative is called “CLA in the Classroom”. How can you help?

  44. Questions?

  45. MAPP Test • “Measure of Academic Proficiency and Progress” • The MAPP is a standardized measure of college-level reading, mathematics, writing, and critical thinking in the context of the humanities, social sciences, and natural sciences. The exam is designed by the Educational Testing Service. • According to ETS, “The MAPP test is designed for colleges and universities to assess their general education outcomes, so they may improve the quality of instruction and learning. It focuses on the academic skills developed through general education courses, rather than on the knowledge acquired about the subjects taught in these courses.” • Multiple Choice Exam

  46. MAPP Administration • Shepherd University “native” sophomores (students who have completed between 25 and 56 semester hours of work). • 100 sophomores randomly selected • Spring 2007 N = 97; mean GPA = 2.80; median GPA = 2.84 • Paper and pencil test proctored by members of the Assessment Task Force

  47. MAPP Proficiency Levels. MAPP Proficiency Classifications: Reading/Critical Thinking
  To be considered Proficient at level 1, a student should be able to:
  • Recognize factual material explicitly presented in a reading passage
  • Understand the meaning of particular words or phrases in the context of a reading passage
  To be considered Proficient at level 2, a student should be able to:
  • Synthesize material from different sections of a passage
  • Recognize valid inferences derived from material in the passage
  • Identify accurate summaries of a passage or of significant sections of the passage
  • Understand and interpret figurative language
  • Discern the main idea, purpose, or focus of a passage or a significant portion of the passage
  To be considered Proficient at level 3, a student should be able to:
  • Evaluate competing causal explanations
  • Evaluate hypotheses for consistency with known facts
  • Determine the relevance of information for evaluating an argument or conclusion
  • Determine whether an artistic interpretation is supported by evidence contained in a work
  • Recognize the salient features or themes in a work of art
  • Evaluate the appropriateness of procedures for investigating a question of causation
  • Evaluate data for consistency with known facts, hypotheses or methods
  • Recognize flaws and inconsistencies in an argument

  48. MAPP Proficiency Levels: Writing Skills
  To be considered Proficient at level 1, a student should be able to:
  • Recognize agreement among basic grammatical elements (e.g., nouns, verbs, pronouns and conjunctions)
  • Recognize appropriate transition words
  • Recognize incorrect word choice
  • Order sentences in a paragraph
  • Order elements in an outline
  To be considered Proficient at level 2, a student should be able to:
  • Incorporate new material into a passage
  • Recognize agreement among basic grammatical elements (e.g., nouns, verbs, pronouns, and conjunctions) when these elements are complicated by intervening words or phrases
  • Combine simple clauses into single, more complex combinations
  • Recast existing sentences into new syntactic combinations
  To be considered Proficient at level 3, a student should be able to:
  • Discriminate between appropriate and inappropriate use of parallelism
  • Discriminate between appropriate and inappropriate use of idiomatic language
  • Recognize redundancy
  • Discriminate between correct and incorrect constructions
  • Recognize the most effective revision of a sentence

  49. MAPP Proficiency Levels: Mathematics
  To be considered Proficient at level 1, a student should be able to:
  • Solve word problems that would most likely be solved by arithmetic and do not involve conversion of units or proportionality. These problems can be multi-step if the steps are repeated rather than embedded.
  • Solve problems involving the informal properties of numbers and operations, often involving the Number Line, including positive and negative numbers, whole numbers and fractions (including conversions of common fractions to percent, such as converting "1/4" to 25%).
  • Solve problems requiring a general understanding of square roots and the squares of numbers.
  • Solve a simple equation or substitute numbers into an algebraic expression.
  • Find information from a graph. This task may involve finding a specified piece of information in a graph that also contains other information.
  To be considered Proficient at level 2, a student should be able to:
  • Solve arithmetic problems with some complications, such as complex wording, maximizing or minimizing, and embedded ratios. These problems include algebra problems that can be solved by arithmetic (the answer choices are numeric).
  • Simplify algebraic expressions, perform basic translations, and draw conclusions from algebraic equations and inequalities. These tasks are more complicated than solving a simple equation, though they may be approached arithmetically by substituting numbers.
  • Interpret a trend represented in a graph, or choose a graph that reflects a trend.
  • Solve problems involving sets; the problems would have numeric answer choices.
  To be considered Proficient at level 3, a student should be able to:
  • Solve word problems that would be unlikely to be solved by arithmetic; the answer choices are either algebraic expressions or are numbers that do not lend themselves to back-solving.
  • Solve problems involving difficult arithmetic concepts such as exponents and roots other than squares and square roots, and percent of increase or decrease.
  • Generalize about numbers, e.g., identify the values of x for which an expression increases as x increases.
  • Solve problems requiring an understanding of the properties of integers, rational numbers, etc.
  • Interpret a graph in which the trends are to be expressed algebraically or in which one of the following is involved: exponents and roots other than squares and square roots, or percent of increase or decrease.
  • Solve problems requiring insight or logical reasoning.

  50. Shepherd MAPP Scores, spring 2007
