
Presentation Transcript


  1. The Wonderful (and daunting) World of Data. Presenters: Janet Fulks, Bakersfield College; Cathy Hasson, San Diego CCD; Richard Mahon, Riverside City College. Academic Senate for California Community Colleges Accreditation Institute, March 19-20, 2010

  2. Overview (Richard) • Purpose of the session • SLOs for session • Presenter intro/background

  3. Developing a Culture of Collaborative Inquiry • Operational Definition • Building a Culture of Inquiry • Data Integration & Inquiry Strategies

  4. Characteristics of Evidence Good evidence used in evaluations has the following characteristics: • It is intentional, and a dialogue about its meaning and relevance has taken place. • It is purposeful, designed to answer questions the institution has raised. • It has been interpreted and reflected upon, not just offered up in its raw or unanalyzed form. • It is integrated and presented in a context of other information about the institution that creates a holistic view of the institution or program. • It is cumulative and is corroborated by multiple sources of data. • It is coherent and sound enough to provide guidance for improvement. Source: ACCJC/WASC Guide to Evaluating Institutions, 2009

  5. Evidence and Inquiry. Culture of Evidence: • Data are provided for multiple purposes. • Data demonstrate performance & outcomes. • Data are accessible to all constituency groups. • Data are continuously distributed. Culture of Inquiry: • Dialogue centers around data/evidence. • Dialogue focuses on taking action. • Dialogue is open and collaborative. • Dialogue is continuous and widespread. • Dialogue is reflective and dynamic.

  6. Building a Culture of Inquiry • Make evidence and inquiry the paradigm. • Communicate often, widely and clearly. • Embed the cycle of collectively assessing, planning, implementing, re-assessing and re-planning… • Create opportunities for continuous collaboration with multiple and mixed constituency groups.

  7. Data Integration & Inquiry Matrix [2×2 matrix: Scope (low to high) on one axis, Impact (low to high) on the other]

  8. Identifying Accessible External Sources of Data • Chancellor’s Datamart http://www.cccco.edu/ChancellorsOffice/Divisions/TechResearchInfo/MIS/DataMartandReports/tabid/282/Default.aspx • NCES National Center for Education Statistics http://nces.ed.gov/ • CPEC California Postsecondary Education Commission http://www.cpec.ca.gov/ • CalPASS http://www.cal-pass.org/

  9. Institutional Data Fills Two Important Gaps [diagram: a Knowledge Gap separates poor planning from good planning, and a Performance Gap separates good planning from good performance; closing them moves the institution from a standard to an elevated level of organizational awareness and toward improved processes & performance. Related elements: Strategic Planning Function, Institutional Effectiveness & Student Success Function, Institutional Outcomes and Benchmarks, Classroom and Service Area Assessment]

  10. Identifying Internal Data Sources

  11. Students Progress in a Non-Linear Fashion

  12. Defining Actionable Data – Using Data Interactively Data collection does not equate to action or improvement. Even the most valid and reliable data are not a substitute for action and will not by themselves motivate action. Actionable data provide information that leads to improved practice. Actionable data result from “action-oriented” research.

  13. Actionable Data? Are these data actionable?

  14. Actionable Data? Are these data actionable?

  15. Actionable Data? Are these data actionable?

  16. Actionable Data? Are these data actionable?

  17. Student Learning Outcomes Data: ESL Level 1 outcomes, assessments, and % passed • Listening/Speaking SLO: Demonstrate understanding of frequently used words, phrases and questions in familiar contexts; engage in limited social conversations to communicate basic survival needs. Assessment = Interview. Outcome = 84% passed. • Reading SLO: Construct meaning from simplified print materials on familiar topics. Assessment = CASAS Level A (a comprehensive standardized test). Outcome = 73% success rate. • Writing SLO: Produce simple sentences in paragraph format and complete simple forms. Assessment = Written paragraph scored with a rubric. Outcome = 74% pass rate.
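
Pass rates like those above are simple aggregations over individual assessment results. As a minimal sketch of how such figures can be tabulated (the record layout, names, and values below are assumptions for illustration, not the presenters' actual data):

```python
from collections import defaultdict

# Hypothetical assessment records: (skill area, passed?).
# Layout and values are invented for illustration only.
records = [
    ("Listening/Speaking", True),
    ("Listening/Speaking", True),
    ("Listening/Speaking", False),
    ("Reading", True),
    ("Reading", False),
    ("Writing", True),
    ("Writing", True),
]

totals = defaultdict(int)
passes = defaultdict(int)
for skill, passed in records:
    totals[skill] += 1
    passes[skill] += int(passed)

# Report the pass rate per outcome area.
for skill in sorted(totals):
    rate = 100 * passes[skill] / totals[skill]
    print(f"{skill}: {passes[skill]} of {totals[skill]} passed ({rate:.0f}%)")
```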

  18. Exploring data - Data 101 Principle 1 – Use longitudinal data when possible Principle 2 – Use data in context Principle 3 – Look for both direct and indirect data Principle 4 – Do not oversimplify cause and effect of data Principle 5 – Use appropriate levels of data for appropriate levels of decisions Principle 6 – Perception is the reality within which people operate Principle 7 – Use of data should be transparent Principle 8 – Consider carefully when to aggregate or disaggregate data Principle 9 – Focus on data that are actionable Principle 10 – Consider implications and the “What if?”
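
A short pandas sketch can make Principles 1 and 8 concrete: it looks at a success rate across several terms rather than a single snapshot, then disaggregates the same measure by student group. All column names, groups, and numbers below are invented for illustration.

```python
import pandas as pd

# Invented course-success counts by term and student group (illustration only).
df = pd.DataFrame({
    "term":      ["F07", "F07", "F08", "F08", "F09", "F09"],
    "group":     ["A",   "B",   "A",   "B",   "A",   "B"],
    "enrolled":  [220,   180,   240,   190,   260,   200],
    "succeeded": [154,   108,   170,   112,   185,   118],
})

# Principle 1: longitudinal view -- the overall success rate for each term.
by_term = df.groupby("term")[["succeeded", "enrolled"]].sum()
by_term["success_rate"] = by_term["succeeded"] / by_term["enrolled"]
print(by_term)

# Principle 8: disaggregate the same measure by group; an overall trend
# can hide very different experiences for different student populations.
by_group = df.groupby(["term", "group"])[["succeeded", "enrolled"]].sum()
by_group["success_rate"] = by_group["succeeded"] / by_group["enrolled"]
print(by_group)
```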

  19. Creating Institutional Processes for Conducting Research and Using Data to Inform Practice • Information Capacity Challenges • Research Processes and Procedures • Research Agendas • Action Research Guided Questions

  20. Information Capacity Challenges • Building an Evidence-based Infrastructure: managing and responding to myriad requests; maintaining quality and integrity of the data process; making data and information widely accessible • Keeping Up with the Demand: responding to heightened accountability mandates; linking research to (resource) planning; supporting data-driven decision-making • Turning Data into Action: making data available and applicable at all levels; making sense of and taking action on the data; building a culture of inquiry

  21. Processes & Procedures • Guidelines for use of data and information • Protection of human subjects policy • Review panels and committees • Request and fulfillment procedures • Criteria for prioritizing ad hoc requests • Linking requests to broader goals & initiatives • Creating and using Research Agendas

  22. Research Agendas • College-wide Research Agenda: supports major college-wide initiatives & activities; tied to the college-wide plan (goals & priorities); includes recurring requests • Topical Research Agenda: focused on a single topic or group of interest; tied to a specific initiative or activity; fewer research activities than the college-wide agenda

  23. Action Research Guided Questions Developing the Research Agenda • What and who will be researched? • How is research tied to college plans, goals, initiatives and/or activities? • How will the information be used, by whom and how often? • Which methodology or approach will be used? Turning Data into Information • What do the data tell us? • Which questions were fully answered by the research and which need more exploration? • What are reasonable benchmarks based on the research? Taking Action on the Information • What interventions or strategies do we need to deploy in order to move the needle? • How should this information be shared and applied across the college?

  24. Organizing data: Weave, TracDat, Taskstream: “The New Guys in Town”

  25. Questions? Thank you. Please fill out the evaluations.
