
For Data Geeks…


Presentation Transcript


  1. For Data Geeks…

  2. Connecting Assessment with Practice: Moving Information from Interesting to Valuable Darlena Jones, Ph.D. Director of Research and Development Educational Benchmarking Inc

  3. Assessment to Practice: A Strategy • Assessment Instruments • Defining the goals of an assessment project • Designing assessment instruments • Identifying the various key stakeholders • Reporting Strategies • Comprehending the range of reporting methods • Linking effective reporting methods and key institutional stakeholders • Discussing how effective reporting can promote changes in institutional practice • Examples in Action • MAP-Works Student Reporting • MAP-Works Faculty/Staff Reporting

  4. Assessment Instruments

  5. Keys to Successful Assessment Instruments • Step 1: Understanding the Outcomes of the Assessment • Step 2: Breadth or Depth of Assessment • Step 3: Use of Assessment Information • Step 4: Appropriate Survey Items

  6. Keys to a Successful Assessment Instrument • Step 1: What are the outcomes of the assessment? • Program improvement? • Measuring the climate? • Measuring student learning? • Support accreditation / program review?

  7. Keys to a Successful Assessment Instrument • Step 2: What is the breadth or depth of your assessment? • To thoroughly educate your audience on a range of items (diagnostic assessment)? • Strengths and weaknesses • Participant needs • Areas of improvement • Longitudinal trends • Implications and suggestions • To quickly provide “just the facts” (targeted to a specific topic)? • Summaries • Focus on Outcomes

  8. Keys to a Successful Assessment Instrument • Step 3: What will your assessment inform? • Internal to Institution • Decision making • Program evaluation and improvement • Budget allocations • Marketing and education • Interventions • Program development • External to Institution • Accreditation reports • Grant reports • Benchmarking • Professional development • Publicity for alumni news, local media, etc.

  9. Keys to a Successful Assessment Instrument • Step 4: What questions are on your instrument? • Potential Problem: The survey is long and confusing • Solution: Create a survey that is focused on: • Improvement, not marketing • Performance, not activity (measure outputs, not activity) • But, most important, MISSION • Potential Problem: The survey design is poor; factors have low reliability; results show poor validity • Solution: Have “survey experts” design the survey, or involve people closely connected to the issue in designing it • Potential Problem: You don’t know where to begin improvement once the analysis is finished • Solution: Use analyses and systems that help guide action planning
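The reliability problem above has a standard diagnostic: Cronbach's alpha, which measures the internal consistency of the items that make up a survey factor. Below is a minimal Python sketch; the factor name, the sample responses, and the ~0.7 acceptability threshold are illustrative assumptions, not values from any EBI instrument.

```python
import numpy as np

def cronbachs_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for one survey factor.

    `items` is a respondents-by-items matrix of numeric responses
    (e.g., a 1-7 Likert scale). Values near or above ~0.7 are a common
    rule of thumb for acceptable internal consistency.
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # items in the factor
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses to a three-item "academic self-efficacy" factor
responses = np.array([
    [6, 5, 6],
    [4, 4, 5],
    [7, 6, 7],
    [3, 4, 3],
    [5, 5, 6],
])
print(f"alpha = {cronbachs_alpha(responses):.2f}")  # ~0.94 for this sample
```

A factor that scores well below that threshold is a signal to rewrite or drop items before the survey is fielded, which is exactly the “survey experts” conversation the slide recommends.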

  10. Reporting Strategies

  11. Keys to Successful Reporting • Step 1: Understanding the Audience • Step 2: Appropriate Form for Information • Step 3: Using Valuable Information • Step 4: Choosing the Best Delivery Method

  12. Understanding the Audience • Step 1: Who will be reading your assessment report? • Internal Stakeholders • Administrative decision-makers • Boards of trustees • Budget administrators • Faculty or staff • Internal governing bodies (Faculty/Staff Senate, Unions, etc.) • Students • External Stakeholders • Accreditation bodies & reviewers • Alumni • Community members • Donors • Grant reviewers • Prospective students & parents • State and federal governments

  13. Understanding the Audience • Step 1 (cont.): What is your audience’s experience with information? • Quantitative vs. Qualitative • How comfortable are they with statistics? • Do they prefer narratives or numbers? • Interest and experience – How much explanation regarding the… • Topic? • Assessment methods? • Results? • Implications? • Time available – How much time can or will they spend reading the results? • Level – Will they use university-level data? College level? Department level? Individual level?

  14. Appropriate Form for Information • Step 2: What is the best form for the information? • Types of reports • Executive summaries • Comprehensive reports • Assessment summaries • Notes, brochures, flyers, and memos • Institutional snapshots • Interactive data • Easily read? • What does the report look like (size of font, appearance, visuals, etc.)? • Do they want to read this? Does it draw them in? Does it intimidate them or overwhelm them?

  15. Using Valuable Information • Step 3: What is the content of the report? • Importance: Does this report… • Include important issues? • Highlight and emphasize the important results? • Differentiate between important and unimportant results? • Usefulness: Does this report… • Discuss the implications of the results? • Clearly link the results to practice? • Help practitioners determine what should be done? • Differentiate between useful results and interesting results?

  16. Choosing the Best Delivery Method • Step 4: How will you deliver the assessment results? • Media formats • Paper • Electronic (websites, downloadable files, CDs, emails, etc.) • Oral presentations • Combinations • Easily accessible? • How hard is the information for Stakeholders to access or find? • Can they find what they need quickly?

  17. Examples in Action: MAP-Works Student Report EBI’s “best practices” model of reporting

  18. Who is Responsible? • Who is responsible for student success on your campus? • Enrollment Management/Retention? • Student Affairs? • Academic Affairs? • What information do you know about this first-year student? (Enrollment Management/Retention • Academic Affairs • Student Affairs) Student ID: YD252952HS; GPA: 3.93; SAT Verbal: 29; Location: In state; Gender: Female; Race: African American; Age: 18; Major: Undecided. Do you really know them?

  19. Paradigm Shift (diagram) • Offices: Enrollment Management/Retention, Academic Advisor, First-Year Seminar Instructor, Academic Department Heads, Financial Aid, Minority Student Affairs, Student Affairs, Academic Affairs, Residence Hall Staff • Student concerns: “I’m struggling in my math class,” “I’m thinking about transferring,” “I’m really homesick,” “I don’t think I can afford college,” “My roommate and I argue all the time” • What would happen if… • ALL faculty/staff were responsible for student success? • YOU knew a student was struggling? • Could you do something about it before it was too late?

  20. MAP-Works Mission – 4 Ways • Academic Success: Improve students' ability to succeed academically by realigning behavior with grade expectations and focusing on elements of academic success • Student Development: Facilitate the establishment of relationships, address homesickness, identify residence hall living issues • Retention: Minimize the percentage of capable students who drop out due to issues that could have been addressed by self-awareness or timely intervention by staff/faculty • Student Involvement: Connect students with campus resources to facilitate involvement with student organizations and campus programming

  21. MAP-Works History • In 1988, Ball State had a number of concerns… • First-year students arrived with unrealistic expectations (academics, grades, housing, etc.) • Retention rates were not as high as they wanted them to be • Faculty and staff were concerned that identifying student issues at mid-term was too late • Faculty and staff wanted better data about incoming students • Timeline: 1988 – Ball State developed the concept; 1989 to 2004 – Ball State used MAP in-house; 2005 – Ball State partnered with EBI to create MAP-Works; Fall 2008 – approximately 40 campuses will use MAP-Works!

  22. MAP-Works Process (diagram) • Expectations • Behaviors • Student Profile • Institution Profile • Campus Resources • Student Summary (Social Norming, Expectations, Campus Resources) • Scan Students

  23. Survey and Profile Items Understanding the Student’s Experience • Profile Information • Student information like gender and race/ethnicity • Entrance exam scores • # credit hours enrolled • Academic Integration • Academic Self-Efficacy • Core Academic Behaviors • Advanced Academic Behaviors • Commitment to Higher Education • Self-Assessment • Communication Skills • Analytical Skills • Personal Management • Time Management • Health and Wellness • Potential Issues (stress, financial, etc.) • Social Integration • Campus Relationships • Living Environment (on/off campus) • Roommate Relationships • Homesickness

  24. Who Benefits from MAP-Works? • Who benefits from MAP-Works? • First-Year Students • Departments like… • Housing & Residence Life • Academic Advising • Enrollment Management / Retention • First-Year Seminar Instructors • Minority Student Affairs • Athletic Department • Student Activities • Academic Assessment & Institutional Research • Upper Administration

  25. MAP-Works Student Reports • Student Reporting • Individualized on-line report provided directly to students within days of assessment • 3 Main Reporting Purposes • Purpose 1: Realign expectations • Purpose 2: Information to help them plan for their success • Purpose 3: Connect with appropriate campus resources

  26. Purpose 1: Realign Expectations (Example 1) Report provides benchmark information to help students realign expectations REPORTING TIP: Only provide the most valuable information in a summary report. Keep the “interesting” information for the larger report.

  27. Purpose 1: Realign Expectations (Example 2) Report provides benchmark information to help students realign expectations REPORTING TIP: Know your audience! Consider using very simple charts to relay complex information (most students don’t have experience reading complex charts/tables).
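A minimal sketch of what such a simple chart can look like: one bar for the student, one for the cohort, with the values written directly on the bars so no chart-reading skill is needed. The numbers, labels, and file name are hypothetical, not taken from the actual MAP-Works report.

```python
import matplotlib.pyplot as plt

# Hypothetical benchmark: the student's expected weekly study hours
# versus the average reported by first-year students on campus.
labels = ["You", "First-year average"]
study_hours = [8, 17]

fig, ax = plt.subplots(figsize=(4, 3))
ax.bar(labels, study_hours, color=["#4c72b0", "#c0c0c0"])
ax.set_ylabel("Expected study hours per week")
ax.set_title("How your expectations compare")

# Label each bar directly so the reader never has to interpret an axis.
for i, hours in enumerate(study_hours):
    ax.text(i, hours + 0.3, str(hours), ha="center")

fig.tight_layout()
fig.savefig("benchmark_study_hours.png", dpi=150)
```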

  28. Purpose 2: Plan for Success REPORTING TIP: Consider providing written explanations to help the reader draw conclusions. Report provides feedback to help students understand the need to plan for their future success

  29. Purpose 3: Connect with Resources School lists campus resources/offices that link with reporting area REPORTING TIP: Consider providing additional information the reader may want to access.

  30. Evaluating MAP-Works Student Reporting Keys to Successful Reporting – Step 1: Understanding the Audience – Students have less experience with information; reporting is easy to read. Step 2: Appropriate Form for Information – Extremely visual and colorful; minimal information per slide. Step 3: Using Valuable Information – Targeted to helping them be successful in school. Step 4: Choosing the Best Delivery Method – Web-based is the delivery method most expected by this generation.

  31. Examples in Action: MAP-Works Faculty/Staff Report

  32. MAP-Works Faculty/Staff Reporting • Interactive system for faculty and staff • Residence hall staff • Academic advisors • First-Year Seminar Instructors • Other staff positions? • 3 Main Reporting Purposes • Purpose 1: Identify students who may benefit from personalized attention • Purpose 2: Provide information for one-on-one meetings with students • Purpose 3: Provide input regarding programming and training needs

  33. Purpose 1: Identifying Students (Example 1) REPORTING TIP: Consider creating high-level “dashboards” that provide a lot of information in an easy-to-read format. These students are not adjusting academically or socially and are not committed to their institution.

  34. Purpose 1: Identifying Students (Example 2) REPORTING TIP: Consider using simple color codes instead of statistical information to relay information. These students are very interested in leadership positions – Recruit them! These students are moderately interested in a leadership position – Convince them? These students are definitely not interested in a leadership position – Why? (Leadership experience is important on a résumé.)
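One way to implement the color-code idea is to hide the factor score entirely and show only a small set of bands. A minimal sketch follows; the 1-7 scale, the cut points, and the student names are illustrative assumptions, not the actual MAP-Works thresholds.

```python
def color_band(score: float) -> str:
    """Map a 1-7 factor score to a traffic-light band for a dashboard cell."""
    if score >= 5.5:
        return "green"   # strong - e.g., very interested: recruit them
    if score >= 4.0:
        return "yellow"  # moderate - worth a conversation
    return "red"         # low - follow up and ask why

# Hypothetical "interest in a leadership position" scores for one floor
students = {"A. Lee": 6.4, "B. Ortiz": 4.7, "C. Nagy": 2.1}
for name, score in students.items():
    print(f"{name:10s} -> {color_band(score)}")
```

Staff then scan a grid of colors instead of a table of means and standard deviations, which is what makes the report usable without statistical training.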

  35. Purpose 2: Individual Meetings (Example 1) REPORTING TIP: Consider creating “dashboard” reports that give a 30,000-foot look at the data. Kimberly is having issues adjusting to college (both academically and socially) and early warning indicators are not good.

  36. Purpose 2: Individual Meetings (Example 2) REPORTING TIP: Consider providing easy access to in-depth information if the reader needs more clarification.

  37. Purpose 3: Programming REPORTING TIP: Consider that the information you provide can be used in multiple ways, and how that information can be presented to best tell the story. These students have rated themselves low in Public Speaking skills. How could this be addressed through programming?

  38. Evaluating MAP-Works Faculty/Staff Reporting Keys to Successful Reporting – Step 1: Understanding the Audience – Reporting is color-coded for readability; no special training required. Step 2: Appropriate Form for Information – Extremely visual and colorful; easy navigation. Step 3: Using Valuable Information – Highlights students with more critical issues; dashboards are focused on critical transition issues. Step 4: Choosing the Best Delivery Method – Web-based delivery provides the most flexibility in searching through information.

  39. Additional Questions and Discussion… Darlena Jones, Ph.D. Director of Research and Development, Educational Benchmarking Inc Darlena@webebi.com For more information about MAP-Works, contact Todd Pica, Todd@webebi.com Or, visit www.MAP-Works.com
