
University of North Carolina at Greensboro Educational Research Methodology




  1. Gathering Assessment Evidence To Evaluate Your Courses and Your Program. Terry Ackerman, Department of Educational Research Methodology, University of North Carolina at Greensboro. Building the Assessment Bridge, Charlotte Law School: Assessment and Student Outcomes Conference.

  2. Overview of my talk • Part I • Interesting parallels between practicing law and educational measurement • Thoughts on writing test items • Interesting assignments for your faculty • Concept Mapping • The race analogy • Thoughts on teaching and testing • Part II • Your questions • An interesting assessment story • United States Supreme Court

  3. Interesting parallels For me, there are some interesting similarities between law and educational measurement. In both arenas one builds a case based upon evidence.

  4. Interesting parallels In law you have different types of evidence, e.g., testimony, documentary, physical/real, digital, exculpatory, etc. In law you build a case by laying a foundation of relevance and authenticity for the admission of evidence in the form of exhibits. You even have the best evidence rule.

  5. Interesting parallels In educational measurement we have different types of evidence, e.g., quiz and test scores, class projects, presentations, observations, etc. In educational measurement we build a case by laying a foundation of reliability and validity for the admission of evidence in the form of psychometric analysis of response data. We follow best practice as outlined in the Standards for Educational and Psychological Testing.

  6. Thoughts on writing test items I think sometimes the assessment of students is separated from, and given a back seat to, the actual teaching of content to students. It should be an integral part of pedagogy. It should inform instruction, help us to evaluate our teaching, and enable us to monitor student progress.

  7. Thoughts on writing test items • Items should be written in the vocabulary in which the subject matter was taught, but from novel, challenging perspectives. • Items should reflect the goals/objectives of the lectures. • Instructors should take the time to actually take the test that they have created.

  8. Thoughts on writing test items I encourage you to apply the same measure of passion and enthusiasm, the same amount of systematic rigor and intensity, and the same level of creativity and ingenuity to your educational assessment efforts that you have for practicing law. Whenever possible, perform basic measurement analyses to evaluate your items and tests. Recycle this information back into your instruction.

  9. Example item analysis for a 15-item classroom test with 17 students

      ITEM 2        0   1   2*  3   4     PBIS = .4985
      UPP  N = 6    0   0   3   3   0     BIS  = .6623
      MID  N = 6    0   1   2   2   1     DIFF = .2941
      LOW  N = 5    0   3   0   2   0     IREL = .1466
      TOT  N = 17   0   4   5   7   1     DIS  = .2727

      (* = keyed answer. 5 out of 17 got the correct answer; 7 people selected distractor C.)
      [Figure: frequency of each option plotted against test score for the UPP, MID, and LOW groups]
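Statistics like the difficulty (DIFF) and point-biserial (PBIS) values in the item analysis above can be computed from a simple scored response matrix. Below is a minimal sketch in Python, assuming dichotomous 0/1 item scores; the data here are illustrative, not the actual class results.

```python
import numpy as np

def item_analysis(scores):
    """Per-item difficulty (proportion correct) and corrected
    point-biserial (item score vs. total score with the item removed)."""
    scores = np.asarray(scores, dtype=float)
    total = scores.sum(axis=1)
    difficulty = scores.mean(axis=0)
    pbis = np.array([
        np.corrcoef(scores[:, j], total - scores[:, j])[0, 1]
        for j in range(scores.shape[1])
    ])
    return difficulty, pbis

# Illustrative data: 6 students x 4 items, each item scored 0/1.
scores = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
]
difficulty, pbis = item_analysis(scores)
print("difficulty:", difficulty.round(2))
print("point-biserial:", pbis.round(2))
```

A low or negative point-biserial flags an item that the stronger examinees are missing, which is exactly the kind of item worth revising before the next offering of the course.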

  10. *************** TEST TOTALS ***************

      HIGHEST POSSIBLE SCORE = 15
      MEAN = 10.941
      STANDARD DEVIATION = 3.702
      SKEWNESS = .067
      KURTOSIS = -.808
      OBSERVED RANGE = 4 to 14
      KR20 = .746

      The average score was 10.9, and the shape of the raw score distribution was close to normal.
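The KR20 value reported above is an internal-consistency reliability estimate for dichotomously scored tests. A minimal sketch of the Kuder-Richardson Formula 20, assuming a 0/1 response matrix (the data are illustrative, not the class results):

```python
import numpy as np

def kr20(scores):
    """Kuder-Richardson Formula 20 for 0/1 item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                   # number of items
    p = scores.mean(axis=0)               # proportion correct per item
    var_total = scores.sum(axis=1).var()  # variance of the total scores
    return (k / (k - 1)) * (1 - (p * (1 - p)).sum() / var_total)

# Illustrative data: 4 students x 3 items.
scores = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(round(kr20(scores), 3))  # 0.75
```

Values near 1 indicate the items hang together as a single composite; a value of .746 on a 15-item classroom test is quite respectable.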

  11. Interesting assignments for your faculty • Have each member of your faculty: • A. List the strategic goals of your program and what role they play in achieving these goals. How similar would the lists be? • B. Identify the expectations and skill set that they feel each graduate of your program should/would acquire. How similar would the lists of skills be? • C. If you had a group of graduates from ten different law schools, what are the characteristics that would distinguish the graduate from your law school from the rest?

  12. Have each member of your faculty • D. Draw a diagram/flow-chart listing the components (i.e., courses, activities, etc.) of your program and how they relate to each other. How similar would the diagrams be? • E. Create a timeline of your program and indicate the sequence (and subsequent rationale) of the components/courses of your program. How similar would the timelines and rationales be?

  13. An assignment for your faculty • F. How well do the goals and objectives of your courses map onto the metrics of success/impact of your graduates? • (This information is key for planning and accreditation.)

  14. Concept Mapping is a planning tool that incorporates inclusiveness, provides for participatory planning, is structured for efficient use of time, utilizes both qualitative and quantitative analysis of data, generates a visualization of the concepts developed by the planning group, and can assist with the articulation of program theory, thus laying the groundwork for meaningful assessment of results.

  15. The procedures for Concept Mapping as designed by Trochim (1989) are organized into six stages: 1) preparation, 2) idea generation, 3) idea structuring, 4) representation of the ideas, 5) interpretation, and 6) utilization of those results for action plans. Generally, a trained facilitator guides the group through each of the stages and performs the analysis necessary to transform the group's input into a Concept Map.

  16. Experiential Learning Project: example concept-map statements

      (10) Define experiential learning terms
      (25) Improve existing community relationships
      (28) Compensate faculty who offer experiential learning with research credits
      (77) Award extra credit hours for experiential learning classes
      (85) Give each professor incentives to increase service learning/experiential learning in their classrooms

      [Figure: concept map linking these numbered statements]

  17. The analogy of running a race • I think it is helpful to think about the journey through each course in the program in terms of a race. • Once viewed from this perspective, it will help faculty see things from the student's standpoint. Faculty need to provide the following information: • 1. The distance of the race they are about to run • 2. What level of training is needed to run this race? • 3. What is the pace of this race? • 4. Who is my competition? • 5. What are the conditions of the track/course I am running on? Uphill, flat, downhill?

  18. The analogy of running a race • 6. What are the coaches' (professors') expectations? • 7. What are the characteristics of the coach (overbearing, friendly, mentoring, knowledgeable)? • 8. What are the expectations for success (I will consider my race a success if…)? • 9. Why am I running this race? • 10. What are the rules of this race? • 11. Is this race an individual event or a relay?

  19. Thoughts on teaching and testing • We tend to teach how we were taught, no matter how distasteful it may have been, because it is all we know. • True master teachers: • Are always searching for new ways to present the material • Engage other faculty for their ideas, thoughts, suggestions, criticisms • Use student feedback to improve their teaching • Recognize diversity and that not all students learn in the same manner or at the same rate

  20. Thoughts on teaching and testing • C. Test data represent the interaction between examinees and a set of items. If you keep the questions the same but change the examinees, the results could be quite different. For one group of examinees the data may be measuring multiple skills and be multidimensional; for another group of examinees it may measure a single skill and thus be unidimensional. • D. If you report a single score you are implicitly saying all the items are measuring the same skill or same composite of skills.

  21. Thoughts on teaching and testing • Once you understand a concept or master a skill, you forget what it was like when you didn't understand and struggled to comprehend. Keep this in mind when working with students who are having a difficult time understanding.

  22. Part II Your questions

  23. Today I am struggling with students who were among the best prepared in class, but who did poorly on the exam – they did not demonstrate their knowledge on it. In the future, to try to offset this, I have decided to give a graded midterm that counts towards their grade, and I'll probably give a percentage for class preparation. Any other ideas? What was the purpose of the exam? Students have to be motivated to take the test. Unfortunately, no firm rationale, or a test that doesn't "count," translates to a lack of motivation.

  24. Which is the better assessment tool, the lengthy final examination that attempts to cover most of the course or the shorter examination that covers the most essential course topics? What is the purpose of the final exam? How will the scores be used? Are the skills hierarchical? If you believe students have mastered previous concepts, then I wouldn't retest them. If the skills build upon one another, then testing the higher-order skills will also test the prerequisite skills.

  25. I gave what I believed to be a very reasonable examination. As usual, the papers varied widely. However, I experienced more papers than ordinary that demonstrated a lack of real analysis. They were very terse, with little discussion of the relevant law. I plan to place the best two papers and the worst two papers with my comments on my TWEN page. Because this is a continuing problem, I want to help more, but have run out of ideas.

  26. I think we all believe that most of the tests that we have given have been reasonable. I applaud your effort to demonstrate what is acceptable and what is not. I would also have a talk with the students to find out what happened. Did they not understand your expectations? Did they understand the format and conditions under which they would be tested?

  27. As Director of Experiential Learning, I know that faculty struggle with how to assess simulations, live client case work, and even professionalism. In other words, how do you assess beyond written tests? Is this outside your expertise? There is much talk about use of standardized clients for assessing client interaction/counseling skills. So, if you have any thoughts on this, I would love to hear them.

  28. I consult for the Medical College of Wisconsin and have been somewhat involved in their testing using "standardized patients". I think this has been a very successful and novel approach to their licensure testing. It is very realistic and, I believe, one of the more valid approaches to assessing how physicians will perform after they graduate. I would strongly encourage you to study what they have done and see if there are legal parallels.

  29. How does one do meaningful assessment of any type in a class of 75 students? Large classes are always problematic, not only for teaching, but for learning, and for effective assessment. One approach that I have used with large classes is to have them work in groups and identify everyone's contribution. I also like to have students evaluate and critique each other. Certain test formats (e.g., essay) become prohibitive unless you have TA help. "Clicker" technology, if available, can be really helpful.

  30. I am a practitioner with 36 years' experience and an adjunct teacher of a contract drafting seminar. I believe that the teaching of contract drafting needs to be broken down into several parts, such as the language of contracts and the structure of contracts, that are separate courses but integrated with other courses on negotiation and on the substantive law of particular fields, such as real estate, M & A, sales, etc. The actual drafting of a particular contract is a "keystone performance" that integrates the discrete skills learned in these courses and is best left for students to perform as beginning associates in a law firm under the one-on-one supervision of a partner.

  31. My experience has taught me that one of the major issues with contracts is ambiguity, so the seminar I teach is entitled "Contract Drafting: The Problem of Ambiguity." The purpose is to teach students to identify ambiguities in contract drafting and to eliminate them. In each class we examine examples of one kind of ambiguity and consider how to eliminate that kind of ambiguity. The final exam is to review a contract and identify examples of the ambiguities studied in the seminar, explain them, and make recommendations as to whether amending the contract would be in the client's interest and, if so, how it should be amended.

  32. I am trying to think how I might apply an "outcomes assessment" approach to this seminar. One suggestion would be to have the students see what they had learned in the seminar in a very graphic way. This would involve asking the students, as the reading for the first class, to read a sample contract and try to find ambiguities. More concretely, ask them to mark up all the ambiguities they could find. Then, at the end of the course, for the last class, give them that same contract to review to identify ambiguities and mark them up. The students could then compare their markups of the contract before they took the seminar and after, and see how much they had learned. It would even be possible to reduce the assessment process to a statistical result, asking the students to count the number of ambiguities they found the first time and then how many they found the second time. The contrast would be instructive to them and would also provide at least one simple statistic that could provide concrete evidence for an assessment outcome.
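The before-and-after count described above reduces to a few lines of Python. The student names and counts below are hypothetical, purely to show the shape of the summary:

```python
# Hypothetical pre/post counts of ambiguities each student marked
# in the same sample contract before and after the seminar.
pre  = {"student_a": 3, "student_b": 1, "student_c": 4}
post = {"student_a": 9, "student_b": 7, "student_c": 8}

gains = {s: post[s] - pre[s] for s in pre}        # per-student improvement
mean_gain = sum(gains.values()) / len(gains)      # one simple outcome statistic

print("per-student gain:", gains)
print("mean gain:", round(mean_gain, 2))
```

The mean gain is the single concrete number the slide suggests reporting, while the per-student gains show whether the improvement was broad or driven by a few students.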

  33. This suggestion, however, poses several issues of outcomes assessment. Is the assessment suggested a bit too concrete? I like your approach. It is, after all, accomplishing what you are trying to achieve in a very valid and realistic way. It sounds like this is more of a craft, and that one becomes an "expert" at this by simply doing it.

  34. Is it worthwhile sacrificing the original reading for the first class (a general description of contract drafting) in order to be able to achieve this kind of concrete outcome assessment? I think you need to examine the goals of the course and, given that you only have a certain amount of time to cover the material, experiment and see what works best. Perhaps there are more efficient/effective item/test formats.

  35. What is the right balance in allocating course time between content and assessment? I don't think there is any formula for this. It depends on the content, the students, and the instructor's ability to effectively cover the material. Teaching is iterative and each time we should strive to improve. How do we devise ways to measure what the students have learned? There are guidelines for creating items. Each instructor is probably better at using some formats than others. I would strive for realism, and try to be as creative as possible. Students should always be told about the type of test they are to take.

  36. We at the Center for Professionalism, along with our Commission on Professionalism, have been working to encourage law schools and the ABA accreditation standards committee to take issues relating to character seriously as they relate to legal education. I am a former College of Education teacher and my Ph.D. dealt with moral development. Now I run this Center and work very hard to encourage legal education to put some focus upon the character development of their students. So I would very much like to engage the participants in: 1. A conversation about how law schools can enhance students' understanding of their role vis-a-vis character development and how that can be measured (a la Mickey Bebeau at the U. of Minnesota School of Dentistry and her work on moral development in professional education).

  37. 2. How we could include in the law school accreditation standards language relating to good character (another word for moral development, really) which would enshrine that idea into the curricula of all law schools. I am aware that most legal educators want nothing to do with this, but our Supreme Court Commission (and our Supreme Court here in Florida) are both VERY interested in this issue because of all the discipline issues the courts are dealing with.

  38. I have seen some work in I/O psychology in which items are constructed to assess moral character. Steve Stark at the U. of Miami and Fritz Drasgow at the U. of Illinois have done some creative research with forced-choice ipsative item types and "ideal point models". I also know that the military is interested in this type of testing.

  39. For most law professors, the shift from viewing their job as talking and then sorting their students to identifying goals, developing appropriate opportunities for learning, giving feedback, and then assessing progress towards those goals is a huge one. Any wisdom you have on how to facilitate that shift would be greatly appreciated. I understand that institutional goals and assessment will help in that regard, but I'm wondering whether you have any thoughts about how to deal with the individual and institutional psychology involved?

  40. I’m not certain what you mean by institutional psychology. I do understand, I think, the strategic goals of my university and my school of education. I try not to focus on hidden agendas, or try to figure out why the administration does this or that…I don’t have time to do this…and it could drive me crazy. I do expect my administrators to be honest, fair, transparent, and to provide me with the evidence they base their rationales upon.

  41. This may (or may not) be related to your first topic: Is there a way to design tests to minimize the degree to which pre-knowledge is counted in the assessment? This may be a little idiosyncratic for the whole group, but one of my concerns is that (a) teachers actually teach what they test and (b) students from privileged educational backgrounds do not always end up on top of the heap because of a halo of well-developed skills and knowledge not relevant to the assessment (or only marginally relevant). You could give a pretest and see where people are before instruction. Unfortunately, you can't control for privilege, disparity, or diversity on tests or in real life. I believe everyone should be able to be successful in my class.

  42. 3. Another idiosyncratic question, probably not for the group: I've experimented with using Bloom's taxonomy to develop questions that test students in sequence through a unit of material, kind of as a mastery instrument, and then at the end of the unit or course to see where they are. I just kind of made this up and am wondering if there's any validity or utility to this. Bloom's taxonomy is a great way to measure at different levels or depths of reasoning. Suggested phrasings for questions at each level are available.
