
Assessing The Effectiveness Of Your Academic Advising Program


Presentation Transcript


  1. Assessing The Effectiveness Of Your Academic Advising Program Tom Grites Assistant Provost The Richard Stockton College Tom.Grites@stockton.edu

  2. Audience Poll What are your expected learning outcomes for this webinar? A. Why do we assess? B. What do we assess? C. Where do we begin? D. What tools do I use? E. Who defines our success? F. What do I need to do tomorrow?

  3. Overview • Terminology and Rationale • Assessment as a Process • Focus on the Contexts • Non-Assessment • Summary

  4. Terminology • Assessment • Evaluation • Measurement (the tools)

  5. Assessment (re: academic advising) “Assessment is the process through which we gather evidence about the claims we are making with regard to student learning and the process/delivery of academic advising in order to inform and support improvement” (Campbell, 2008)

  6. Uses/Contexts Assessment – tends to be more related to programmatic issues and outcomes Evaluation – tends to be more related to people (advisor) skills, performance, and outcomes It’s OK to use evaluation as part of the assessment process

  7. Intentions (related to both) • Formative – more associated with assessment; includes a wider range of efforts; requires more analysis; provides a broader perspective; focus on improvement • Summative – more associated with evaluation; more limited effort; focus on “Does it work?” or “How well was job performed?”

  8. The Rationale • “…a lack of assessment data can sometimes lead to policies and practices based on intuition, prejudice, preconceived notions, or personal proclivities – none of them desirable bases for making decisions” • (Upcraft and Schuh, 2002, p. 20)

  9. More Rationale “In God we trust; all others bring data.” “An ounce of data is worth a pound of opinion.” (Magoon, c. 1975)

  10. Other Reasons • Accountability • Effectiveness • Accreditation • Trustees/Regents • Legislators • Program Improvement (to monitor and improve student success) – the most important reason

  11. The Assessment Process: A Cycle • Resources: • Assessment of Academic Advising Package (3 CDs available from NACADA via www.nacada.ksu.edu) • Assessment of Academic Advising Institute (Feb. 12-14, 2014, Albuquerque, NM)

  12. Getting Started: Identify Stakeholders • Complete set of advising constituents (students, staff and faculty advisors) • Broad range of key offices (Registrar, Enrollment Management, similar advising units, certain campus referral resources, IR office) • Critics, Antagonists, and Naysayers • FYIs – Faculty Senate, Deans Council, Retention Committee, others as appropriate

  13. The Advising Hub

  14. What Do We Want To Know or Demonstrate as a Result of Academic Advising? • Focus on student learning • Connect learning to mission, vision, values, goals in your advising program • How will your program contribute to student learning? • Who, what, where, when, how will learning take place? • Define measures of student learning • Gather evidence, set levels of expected performance

  15. The Assessment Process/Cycle • Alignment with institutional and unit missions • Specify goals and/or objectives • Identify the outcomes expected (student learning and/or programmatic) • Gather evidence (the measurements) • Share findings, interpretations, and recommendations • Begin implementation and re-start the cycle

  16. Mission/Purpose • A working model… • Academic advising is integral to fulfilling the teaching and learning mission of higher education. Through academic advising, students learn to become members of their higher education community, to think critically about their roles and responsibilities as students, and to prepare to be educated citizens of a democratic society and a global community. Academic advising engages students beyond their own world views, while acknowledging their individual characteristics, values, and motivations as they enter, move through, and exit the institution. • (Preamble, Concept of Academic Advising, NACADA, 2006)

  17. Goals/Objectives (how we intend to achieve our mission) These need to emanate from and reflect the nature of the unit to be assessed (total institution, Advising Center and its clientele, College Dean’s Office, etc.) Examples: • To assist students to become independent and lifelong learners • To assist students in understanding the relevance of the total curriculum • To assist students in making good decisions based on their own evidence (e.g., selecting a major)

  18. Identify Outcomes Student Learning Outcomes – examples • All students will select an appropriate major by the end of their third semester. • All students will become engaged in at least one co-curricular activity each semester. • All students will be able to identify and will select courses that enhance their human capital. • At least 30% of the students will choose to participate in a service learning course. • All (CC) students will be able to distinguish among the A.A., A.S., and A.A.S. degree programs.

  19. A Task For You…re: Course Selection • How many courses are in your Catalog? (A) • How many courses are required to earn a degree from your institution? (B) • What percentage of what your institution offers do students actually take in order to earn a degree? (B/A) • Now, for each course a student takes, how many are eliminated?
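The arithmetic behind this task can be sketched in a few lines. The figures below are hypothetical, purely to illustrate the B/A calculation; substitute your institution's actual catalog and degree-requirement counts.

```python
# Hypothetical figures for illustration only; use your institution's real counts.
catalog_courses = 2000   # (A) courses listed in the catalog
degree_courses = 40      # (B) courses required to earn a degree

share_taken = degree_courses / catalog_courses   # (B/A)
print(f"Students take {share_taken:.1%} of the catalog to earn a degree")

# Each course a student takes fills one of only B slots, so the rest of
# the catalog is effectively eliminated from that student's degree path.
never_taken = catalog_courses - degree_courses
print(f"{never_taken} catalog courses go untaken by that student")
```

With these sample numbers, a student earning a degree samples only 2% of what the institution offers, which is the point of the exercise: course selection is a high-stakes filtering problem that advising helps manage.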

  20. Outcomes (continued) • Programmatic/Process Outcomes – examples • As a result of our advising services, the retention/persistence rate of first-year students will increase by 10% in the next 3 years. • As a result of our intervention strategies, the percentage of students who are removed from academic probation will increase by 10% in the next academic year. • After two advising sessions, all students will come to their future sessions with a degree audit already run and with a plan for meeting outstanding requirements

  21. Everybody’s Favorite All students will be able to understand, appreciate, and articulate the value of general education.

  22. Gather Evidence Mapping the Experience (Maki, 2004)* • Not all outcomes will necessarily occur as a direct result of what we do as advisors, so we need to know what other learning opportunities exist in order for the students to meet our stated goals/objectives. • WHAT learning is to occur? • WHERE might it be learned? • By WHEN should it be learned? *This process can also inform the kinds of evidence that need to be gathered for appropriate assessment.

  23. The Advising Hub

  24. Types of Measurement and Data Qualitative – open-ended survey questions; focus groups; in-depth responses, but small N Quantitative – descriptive, structured, numbers and statistics from surveys, demographics, etc.; limited content responses, but large N Direct – observations; recorded data; pre-post information Indirect – perceptions, inferences, even “inclinations” Use Multiple Measures!!!

  25. Gather (Multiple) Evidence • Satisfaction Surveys (OK, but not enough) • Institutional Data (changes of major, drop/add transactions, grades in gateway courses, retention and graduation rates, use of services provided elsewhere, advisor : advisee ratios, NSSE, etc.) • Office Data (number of appointments vs. walk-ins, nature of sessions, results of sessions, transcript analyses, other advisor tasks/activities; “What did you learn?”) • Focus groups (of clients, of faculty advisors, others – a qualitative measure) • The Advising Syllabus* can inform what evidence should be collected *http://www.nacada.ksu.edu/Clearinghouse/AdvisingIssues/syllabus101.htm http://intraweb.stockton.edu/eyos/page.cfm?siteID=123&pageID=42#syllabus

  26. Share the Results Tips… • Be sure that the stakeholders you identified earlier are informed throughout the process in order to enable their support in the decision-making for implementation of your recommendations. • Academics have a preferred method of review, so it makes sense to conform to their expectations.

  27. Sharing the Results (Format and Content) These elements are often best provided in a standard research report or journal format… • Purpose of the assessment project • Method of data collection • Results found • Interpretation of the results • Recommendations with timetable for and anticipated cost of implementation • Executive Summary or Abstract

  28. How Results Will Inform Decision-Making • Revise pedagogy or curriculum or policy/procedure • Develop/revise advisor training programs • Design more effective programming • advising, orientation, mentoring, etc. • Increase out-of-class learning opportunities • Shape institutional decision making • planning, resource allocation

  29. Sample Implementation Recommendations • Redesign the advising effort in the Orientation Program • Develop a peer advising/mentoring program • Streamline office procedures • Initiate proposals for policy changes • Improve communication with other service offices and personnel • Request/Reallocate resources (human, fiscal, and/or physical)

  30. You Did It!! • This will complete the assessment cycle, which provides the evidence for change and improvement. • Completion of the cycle may also provide new goals and objectives, new assessment strategies and tools, and other aspects that will need to be included in beginning the next cycle. (See Darling, 2005 handout)

  31. You’ve Earned a Break Please take a few minutes to submit any questions you may have at this point via the chat function.

  32. Back to the Original Contexts People… Academic advising, as a teaching and learning process, requires a pedagogy that incorporates the preparation, facilitation, documentation, and assessment of advising interactions. Although the specific methods, strategies, and techniques may vary, the relationship between advisors and students is fundamental and is characterized by mutual respect, trust, and ethical behavior. (Concept of Academic Advising, NACADA, 2006)

  33. NACADA Core Values Academic Advisors are responsible • to the individuals they advise • for involving others, when appropriate, in the advising process • to their institutions • to higher education in general • to their educational community • for their professional practices and for themselves personally

  34. Assessment (Evaluation) of Advisors • SELECTION • TRAINING • EVALUATION • RECOGNITION/REWARD

  35. Selection of Academic Advisors • Use the best • Add from other resources/units • Target specific populations • Cross disciplinary lines • Develop mentors • Use other skills/expertise

  36. Potential Pitfalls • Making a distinction: Faculty Advising (programmatic; assessment) vs. Faculty Advisors (personal; evaluation) • Inappropriate comparisons: Professional Academic Advisors vs. Peer Advisors • No improvement plan • Training

  37. Faculty vs. Professional Staff Advisors Faculty advisors: • Too often all are expected or required to advise, but also teach, publish, seek grants, etc. – no selection • Training ranges from nearly nothing to perhaps a day or two, but usually only a few hours • Evaluation is not systematic • Recognition/reward is very limited in the tenure and promotion process; mostly intrinsic; can also be a reverse structure (better = more) Professional staff advisors: • They are hired via a search process and have specific job descriptions – they are selected • Their training is systematic, intentional, and ongoing; staff development is expected • They are evaluated through annual performance reviews • They are rewarded with salary and benefits

  38. Assessment (Evaluation) • 37% of all institutions had no performance evaluation measures for faculty in their academic advising role • 44% in 2-year public institutions • 25% in 4-year public institutions • 39% in 4-year private institutions (Habley, 2004)

  39. PARAMETERS (faculty advisors) • Faculty Contract • List of Responsibilities • Availability of Resources • Assignment of Advisees • Recognition/Reward

  40. Tools for Assessment (and/or Evaluation) OfAdvisors • Self evaluation • Student surveys (locally designed) • Survey of Academic Advising (ACT) • Academic Advising Inventory (NACADA) • Student Satisfaction Inventory (Noel-Levitz) • NACADA Clearinghouse

  41. Back to the Original Contexts • Program… • “…a lack of assessment data can sometimes lead to policies and practices based on intuition, prejudice, preconceived notions, or personal proclivities – none of them desirable bases for making decisions” • (Upcraft and Schuh, 2002, p. 20)

  42. Other Tools and Strategies • Satisfaction Surveys • Institutional Data • Office Data • Focus groups • The Advising Syllabus • External Reviews • CAS Standards • Others…

  43. CAS Assessment Worksheet

  44. An Economic Model • Though not an outcomes-based model per se, this approach to assessment is a functional analysis based on the premise that every task an advisor performs and every operation that an advising unit conducts has some monetary value attached to it. • The analysis results in a comparison of the fiscal expenditures required to perform the tasks with the cost benefits that result. • The model operates from the perspective of a threat to the existence of an advising unit, function, or personnel. A quick example…

  45. Determining Your Worth • Identify every function the unit performs • Identify all possible alternatives for each function, if the unit were dissolved • Determine the cost of those functions that cannot be replaced and who would perform them; estimates will sometimes be required • Determine the cost of those functions that could be eliminated (In Markee and Joslin, 2011)
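A minimal sketch of this worksheet might look like the following. The function names and dollar figures are entirely hypothetical (the slides give none); the point is the structure of the comparison: current cost vs. replacement cost, plus the functions that simply disappear.

```python
# Hypothetical functions and costs for illustration; the presentation
# supplies no actual figures.
functions = {
    # name: (current annual cost, replacement cost if the unit dissolved;
    #        None means the function cannot be replaced and is simply lost)
    "new-student advising": (60_000, 80_000),  # e.g., shifted to faculty overload pay
    "degree audits":        (20_000, 20_000),  # e.g., absorbed by the Registrar
    "probation outreach":   (15_000, None),    # no alternative provider exists
}

current_total = sum(cost for cost, _ in functions.values())
replacement_total = sum(r for _, r in functions.values() if r is not None)
lost = [name for name, (_, r) in functions.items() if r is None]

print(f"Current cost of the unit:  ${current_total:,}")
print(f"Cost if dissolved:         ${replacement_total:,}")
print(f"Functions lost outright:   {lost}")
```

In this sample, dissolving the unit would cost more in replacements than the unit costs now, and one function would vanish entirely, which is exactly the kind of evidence the economic model is meant to surface.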

  46. Where are the data? Bill Gates – “colleges today know more about how many kids attend basketball games and which alumni give money than how many students showed up for economics class during the week…” (in a review of Academically Adrift).

  47. Where are the Data? Jeff Selingo – “Think about it. Before we buy a car, we can find various measures on everything from gas mileage to results of safety tests. We can turn to objective sources to check comparisons of similar vehicles and see which cars hold their value over time. But when it comes to potentially one of the most expensive purchases in a lifetime, the attitude from colleges has always been that we should just trust them on the quality of their product.” (p. 25)

  48. What Are We Not Assessing…And Should We Be? • Student expectations, intentions • Whether advising strategies actually can be attributed to different types of student success (removed from probation, successful choice of major, overcome a skills deficiency or harmful social habit, etc) • Retention and graduation rates of transfer students

  49. Expectations vs. Experience (Habley, 2011)

                                   Expect    Experience
  Be undecided                       7%         20%
  Change majors                     12%       65-85%
  Fail a course                      1%         16%
  Extra time to complete degree      8%         60%
  Drop out                           1%         40%
  Transfer institutions             12%         28%
  Work while in school              36%         60%
  Seek personal counseling           6%         27%
  Need tutoring                     15%         20%
  Seek career guidance               5%         25%

  50. Non-Assessment (continued) • Use and value of articulation agreements – how many students use them, whether they are kept up to date • Currency of academic policies, e.g., course repeats, course pre-requisite criteria, drop/add/withdrawal processes, academic warning, probation, and suspension policies • Does advisor training result in better advising?
