
Using Data to Improve Learning

Using Data to Improve Learning. Pre-Conference Workshop. Presenters: Rob Johnstone, Senior Research Fellow, The Research & Planning Group for California Community Colleges, Berkeley, CA


Presentation Transcript


  1. Using Data to Improve Learning

    Pre-Conference Workshop. Presenters: Rob Johnstone, Senior Research Fellow, The Research & Planning Group for California Community Colleges, Berkeley, CA; Kurt Ewen, Assistant Vice President, Assessment and Institutional Effectiveness, Valencia College, Orlando, FL
  2. Overview of the Workshop
    Introductions
    Discussion about the nature and use of data in higher education
    Implications for practice
    Playing with actual data and discussions about data display
  3. Introductions
    Your name
    The name and location of your college
    Your role at the college
    What are you hoping for from the workshop?
  4. Using Data to Improve Student Learning
    Think – Pair – Share: In your work in higher education, what is the best piece of data you have ever seen? The most effective / actionable data? Why?
    An example of great data
    What is different about data from learning assessment and student success?
  5. Lessons Learned about Data Collection and Use

  6. Lesson 1: The purpose of assessment (data) is to improve student learning.
    Assessment of learning creates the possibility of better conversations:
    Course-level assessment: Faculty – Student; Student – Students
    Program-level assessment: Faculty – Faculty
  7. Lesson 2: Data from learning outcomes assessment at the program level is messy, and the messiness is an important part of the process.
    An example: What should we expect from students exiting Comp 1 at a community college?
  8. Lesson 3: The more useful / understandable data about student learning is to an “outside” audience, the less actionable that data will be to faculty.
    An example: What kinds of data can be reported as a result of assessment efforts using this rubric?
  9. Lesson 4: A “culture of evidence” requires the development of an institutional practice that gives careful consideration both to the question being asked and to the data needed in order to answer that question. A genuine “culture of evidence” is dependent on a “culture of inquiry.”
  10. What is a Culture of Inquiry? Institutional capacity for supporting open, honest, and collaborative dialogue focused on strengthening the institution and the outcomes of its students.
  11. Culture of Inquiry: Features
    Widespread sharing of, and easy access to, user-friendly information on student outcomes
    Encouraging more people to ask a wider collection of questions and use their evidence and conclusions to enhance decision making
    Shared, reflective, and dynamic discussions
  12. Culture of Inquiry: More Features
    Multiple opportunities to discuss information within and across constituency groups
    Continuous feedback so adjustments can be made along the way and processes can be adapted
    A culture that values curiosity, questions, and robust conversations
  13. Climate of Innovation (“Eye for Evidence”: the standard of evidence and reflection increases at each level; each level maintains a research and development component)
    Level I – Hypothesis: Thousands of opportunities tried; 100 are selected for support as Phase I Innovations (“Angel Capital Stage” – Prototype).
    Level II – Pilot Implementation (Limited Scale): 10 supported as Phase II Innovations (“Venture Capital Stage”). Level II innovations must be scalable and must show potential to bring systemic change and “business-changing results.”
    Level III – Institutionalization Strategy: 1 or 2 are brought up to scale and institutionalized. The challenge is moving from Level II to Level III; we have yet to figure out how this will work in the new structure.
  14. When gathering evidence, make sure you are focusing on the right data.
  15. [Chart: 20-year trend (1989–2008) in California community college course retention rates and success rates] What does that tell us about the usefulness of these metrics in setting institutional strategies?
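A minimal sketch, with invented numbers, of how the two metrics on the trend slide are typically computed from enrollment records. The field names and values are assumptions for illustration, not actual California CC data; the point is that near-flat aggregates over two decades offer little leverage for strategy.

```python
# Hypothetical cohort records: (year, enrolled, retained to end of term, passed).
# Values are invented for illustration only.
records = [
    (1989, 1000, 850, 640),
    (2008, 1000, 845, 655),
]

for year, enrolled, retained, passed in records:
    retention_rate = retained / enrolled   # stayed through the term
    success_rate = passed / enrolled       # earned a passing grade
    print(year, f"retention={retention_rate:.1%}", f"success={success_rate:.1%}")
```

With numbers like these, both rates barely move across twenty years, which is why the slide questions the metrics' usefulness for institutional strategy.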
  16. Lesson 5: A culture of evidence is one that seeks data-supported decisions.
    Data-driven decision making runs the risk of overlooking / underestimating the human factor, which is very often concealed by the desire for statistical significance.
    Data-driven decision making runs the risk of underestimating the role / significance / importance of evidence-informed hunches in informing our decisions.
  17. Lesson 5 (cont’d): Data-supported decisions concerning student-success-oriented programs require a consideration of “meaningful improvement” and may require balancing all or most of the following:
    Statistically significant improvement in target measures
    Reflection on the “human impact”
    Economic efficiency in relationship to the difficulty of the task at hand
    A consideration of perception as it relates to benefit versus cost
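The gap between statistical and meaningful improvement can be made concrete with a standard two-proportion z-test. The cohort sizes and success counts below are invented for illustration: with very large n, a 1.5-point gain in a success rate is highly statistically significant, yet may still fall short of “meaningful improvement.”

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test; returns (z, two-sided p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return z, p_value

# Invented example: 64.0% vs 65.5% course success across very large cohorts.
z, p = two_proportion_z(64000, 100000, 65500, 100000)
print(f"z={z:.2f}, p={p:.2g}, absolute gain={65500/100000 - 64000/100000:.1%}")
```

The p-value comes out far below .05 while the absolute gain is only 1.5 percentage points — exactly the case where the slide's other criteria (human impact, cost, perception) have to carry the decision.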
  18. Lesson 6: Structured reflection and dialogue allow data to be transformed into meaningful (actionable) information.
    The “meaning” of data in higher education is not generally self-evident and requires the benefit of the intersection of multiple perspectives.
    The more meaningful learning data is to an outside audience, the less actionable it is to those who work with students.
  19. Data do not speak for themselves.
  20. The Vital Role of Conversation!
    In order to make data useful, ample time and space are needed to discuss and analyze the information and connect it back to the original research question.
    Answers are not always immediately apparent, so skilled facilitation may be needed to dig out the deeper meaning.
    Multiple perspectives and types of information are often needed to make sense of individual data points.
  21. Dialogue vs. Discussion: An Etymological Distinction
    Dialogue: Seeing the whole among the parts; seeing the connections between parts; inquiring into assumptions; learning through inquiry and disclosure; creating shared meaning
    Discussion: Breaking issues / problems into parts; seeing distinctions between parts; justifying / defending assumptions; gaining agreement on one meaning / result
  22. Lesson 7: Meaningful information promotes consensus about lessons learned and a shared vision / plan for the future. No data should be shared as information until it has been processed in a collaborative and thoughtful way.
  23. Lesson 8: Meaningful information from data does not generally emerge from a single data point but from the intersection of multiple and varied (quantitative and qualitative) sources of data.
    There is rarely a silver bullet (and if there is, then the question being asked is probably not particularly interesting).
    What data can be added to learning data to make it more meaningful? Student assessment of instruction data; CCSSE; grade distribution report; ??
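One mechanical way to get that intersection is simply to join the sources by course section before anyone interprets a single number. The section IDs, scores, and field names below are all invented for illustration; the shape — rubric scores, survey results, and grade distributions viewed side by side — is what matters.

```python
# Invented data keyed by course section (names and values are hypothetical).
assessment = {"ENC1101-01": 72, "ENC1101-02": 81}    # avg program rubric score
survey = {"ENC1101-01": 4.6, "ENC1101-02": 3.9}      # student eval average (1-5)
grades = {
    "ENC1101-01": {"A": 10, "B": 8, "F": 6},
    "ENC1101-02": {"A": 7, "B": 12, "F": 2},
}

# Line up the three sources per section so no single data point is read alone.
for section in assessment:
    dist = grades[section]
    pass_share = (dist["A"] + dist["B"]) / sum(dist.values())
    print(section, f"rubric={assessment[section]}",
          f"eval={survey[section]}", f"A/B share={pass_share:.0%}")
```

Even in this toy join, the section with the higher rubric score has the lower survey average — the kind of tension a single source would hide and that the slide's “structured conversation” is meant to surface.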
  24. Lesson 9: Data about student learning / success must be tailored to the needs / questions of particular groups.
    Data concerning students’ ability to write at the college level: Course; Program; Department; Institution; External Audience; Etc.
  25. Lesson 10: A simple assessment measure does not necessarily produce less meaningful / actionable data. Examples: a checklist; a 4-question multiple-choice “test.”
  26. Lesson 11: The best insights often come as an unintended result of simple questions asked about things you were not planning to question.
  27. Lesson 11 (cont’d)
    How are our new students doing? Data was provided on FTIC degree-seeking students.
    Who are our new students? Development of our philosophy statement on the new student: all students with fewer than 15 college-level credits at Valencia.
    Who are our new students?
  28. Who are our New Students?
  29. How are our New Students Doing?
  30. National Institute for Learning Outcomes Assessment (NILOA) “From Gathering to Using Assessment Results: Lessons from the Wabash National Study”
  31. From Gathering to Using Assessment Results “Most institutions have routinized data collection, but they have little experience in reviewing and making sense of data. It is far easier to sign up for a survey offered by an outside entity or to have an associate dean interview exiting students than to orchestrate a series of complex conversations with different groups on campus about what the findings from these data mean and what actions might follow.”
  32. Lesson 12: The sharing of information should reflect standards of scholarly communication and evidence.
    Biases and conclusions about the information should be clearly articulated.
    Open and unanswered questions should be articulated (but should not be allowed to stop the process).
    Differing perspectives on the meaning of the information should be given equal time.
    The use of programmatic and academic jargon should be kept to a minimum.
    The visual presentation of information should be monitored for consistency.