
Program Level SLOs and Assessment Plans






Presentation Transcript


  1. Program Level SLOs and Assessment Plans Kim Anderson ASLO Subcommittee Chair Spring 2010

  2. Introduction • Instructional programs are more than a collection of random courses. • Each program prepares students for a goal, such as transfer to a university or entering the workforce. • Each program provides students with a definite set of skills, knowledge, and attitudes. • Instructional program level student learning outcomes state these results in measurable terms.

  3. Purpose • Instructional program assessment • finding out systematically what we often know (or think we know) informally • what impact our programs are having on students • whether our programs are achieving their desired learning outcomes • Having a written plan in place helps • enables faculty to play a central role in managing student learning • facilitates the program review process • keeps everyone on the same page and serves as a management guideline • ensures that the program can continue if a key individual leaves • documents the nature of the assessment program for outside agencies • facilitates periodic, not episodic, assessment of student learning and program outcomes

  4. Alignment • Department vs. Program • Departments constitute the organizational management structure; primarily used for fiscal and resource planning; business information for the Program Plan/Program Review Process • Programs are the curricular management structure of the instructional component for the Outcomes Assessment Process • Mission > Goals > Program SLOs, within a department • Department mission statement expresses the principal focus, values, and purpose (Program Plan/Program Review) • Department goals are overarching general statements that describe the department's strategic direction; what the department plans to do, reflective of long-term priorities (Program Plan/Program Review) • Program mission statement is a broad statement of the program's direction. It should reflect the program's contribution to the educational and career paths of the students who encounter the program specifically (Outcomes Assessment) • Program SLOs identify concrete statements of the most important general knowledge, abilities, and/or attitudes that students obtain upon completion of an instructional program's field of study (Outcomes Assessment)

  5. Alignment • Instructional Levels/integration • Course level: outcomes assessment examines the degree to which the SLOs are evidenced in demonstrated student learning. • Program level: outcomes assessment seeks to determine the degree to which programmatic learning outcomes are being met. • Institution level: outcomes assessment seeks to determine the degree to which broad institutional outcomes are being met.

  6. Alignment & Integration

  7. Alignment by Curriculum Mapping • Course SLOs > Program SLOs • Program SLOs > Instructional Program Outcomes • Instructional Program Outcomes > college’s mission • GEOs > Plan A > GE Philosophy

  8. Benefits of Alignment When these five elements • student expectations, • faculty expectations, • curriculum SLOs & content, • institutional support, • governance for assessment are in alignment, courses, programs, and student learning are likely to be more effective and to run more smoothly.

  9. Benefits of SLO Assessment • Documentation of best practices: • Share innovative teaching practices • Develop proper curriculum • Create student success initiatives • All in a convincing and reputable manner • Better evidence on outcomes: • Reliance on standard/broad measures of program effectiveness has difficulties—program SLO assessment can put such data in perspective. • Systematic program assessment can inform better decisions and target student success efforts more effectively.

  10. Definition of a Program • Concept: Addresses learning outcomes that are accomplished across multiple courses in the program’s core curriculum. Cumulative learning approaches outcomes assessment from a more holistic mastery expectation than at the course level. • LBCC: Curriculum Guides; Precollegiate/Noncredit/Stand Alone; Hybrid; Cross-curricular

  11. Further definition specifics • The college’s curriculum guides will define a program (degrees). • Precollegiate (800), noncredit (600), and stand-alone courses align with the originating department, and the department will be defined as a program. • Hybrid programs encompass student experiences delivered by instructional and service components within a department, and the department will be defined as a program (e.g. LAR, ASD, Library, Counseling). • Cross-curricular programs encompass integrated student experiences that extend beyond the instructional component and across departments (e.g. Honors, Athletics, Study Abroad).

  12. Program Representation • Clear and accurate information to stakeholders (in and out of the college) • Inclusion of the program’s mission and SLOs on: • curriculum guides • program websites • other official materials

  13. Entire Outcomes Assessment Process Overview 1. Outcomes: Describe what students must know, do, and value at the conclusion of the program. 2. Assessment: Indicate how the department will determine whether learning outcomes have been met, including methods, who is responsible for the assessment, and when the data will be collected. 3. Criteria: Establish the expected level of achievement (based on previous data if possible), the target groups, and how the program faculty will determine whether the outcome is successful or whether change is required to improve student learning. 4. Results: Indicate who, when, where, and how the results will be collected, aggregated, analyzed, and reported (actual results, key findings, and supportable conclusions). 5. Actions: Describe the changes made to the program based on this information, provisions for sharing the plan with internal and external audiences, what new support mechanisms were required, if any, and the time frame in which these actions will be re-evaluated.

  14. Process Preparation • Time & organization of personnel • Aim for the program-level outcomes assessment plan to develop over the period of a semester; do not try to do it more quickly • Consultation & research, possibly • Thinking & planning • Unanimity is not a goal; find decisions that most people can live with • If looking at student development, then the work of individual students has to be tracked over a period of time (safeguard privacy) • If looking at program efficacy, then assess a random sample at the beginning and another random sample at the end of the program • If looking at acquisition of program learning, then assess a random sample at the end of the program (course or experience) • Philosophy of a program/Purpose and goals of a program (mission statement) • Coordinated with the program plan/program review process • Start where you are now, even if you see potential changes in the near term; build on those insights as you develop this process • Resources required to carry out an assessment plan should be considered

  15. Plan Development Only 1. Outcomes: Describe what students must know, do, and value at the conclusion of the program. 2. Assessment: Indicate the task/tool the department will use to determine whether learning outcomes have been met, including what assessment methods will be used, how the assessment will be conducted, who is responsible for the assessment, when the data will be collected and reported, and where this task will occur. 3. Criteria: Establish the expected level of achievement (based on previous data if possible), how the program faculty will determine whether the outcome is successful or whether change is required to improve student learning, and identify the students included in the assessment.

  16. Program Sample Assessment Plan Layout

  17. 1. SLOs • focus on a few key learning outcomes • touch on a range of knowledge, skills, and attitudes that represent what faculty, employers, students, and other stakeholders will value and consider essential • impossible to measure everything • Categories: • Knowledge and Skills: Declarative (what), procedural (how), and conditional (when & why), or Cognitive (Bloom’s) • Attitudes and Values: employers talk about “soft skills”; educators disagree about whether assessing them is desirable, or even possible; barriers to this learning are also likely to be different; the need for an educated workforce and civil society

  18. SLOs continued • Typically broader than course level • Tend to emphasize integrating skills into an interrelated set • Often put more stress on real world applications/student’s next experience • Identification: What do we want students to know and do when they successfully complete a program’s curriculum? • Number: 2-5 • should include learning obtained from both degree and certificate of achievement (18+ units)—these are on student transcripts • Development options: • the mission statement for the program; • learning outcomes published through a professional organization; • the required courses in the major; • a capstone course in the program’s curricular sequence; • areas of concentration within the program; • all courses within a program; • a program’s own curricular map. • General Rules: • approval of most faculty; • connected meaningfully with the required curriculum; • at least one distinct program‐level student learning outcome (SLO) must be identified for each program--the Outcomes Assessment Process must be engaged to validate each program’s unique student learning; • align with the outcomes of the institution (instructional program outcomes and mission).

  19. SLO Development Options • Begin with the Major: Focus on the foundational (required) courses • Begin with the Capstone Course or Experience: Focus on the program’s capstone expectations to derive SLOs for the entire program • Begin with Areas of Concentration: Typically these manifest as certificates or groupings of courses • Begin with the Courses: Tracks of study, which build from the program’s foundational (required) courses; find commonalities • Begin with a Curriculum Map: Articulate the courses within the program as well as the requisite requirements in a matrix format then review this map to extrapolate general statements of expected learning from decided key components

  20. 2. Assessment Tools • Curriculum map: illustrate where program-level outcomes are being addressed across the program’s courses. • identify where there are gaps • identify curricular needs and resource priorities • Students’ exit knowledge, skills, and attitudes: as the culmination of the program’s series of courses or holistic experience • Students’ development: by longitudinal tracking typically of majors • Program specific learning: through pre- and post-assessment and analysis of groups of students • Outside agency data • licensing • testing • College evidence • ARCC data

  21. Assessment Tools:Considerations • Direct vs. indirect: data that measures the exact value vs. data that measures a variable related to the intended value • Course-embedded vs. programmatic: student work in the capstone course vs. conducted outside of regular courses, periodically towards the end of a student’s program • Performance vs. self-reporting: students actually demonstrate their learning vs. students describe what they have learned and evaluate it from their subjective experience • Limited-response vs. open-ended: typically an exam or survey vs. papers/projects/performance tasks with a common rubric • Use all direct or a combination of direct and indirect

  22. Assessment Tools: Examples Direct Assessment Methods • Capstone Course Evaluation • Collective Portfolios • Embedded Questions on Assignments or Exams • Capstone Student Presentations Indirect Assessment Methods • Alumni Surveys • Employer Surveys • Student Exit Interviews/Surveys • Analysis of college or departmental records • Pass rates or scores on licensure

  23. 3. Criteria and Expectations • Expectations for achievement: Indicate the number (%, fraction, actual number) of students who are expected to meet this minimum, or base projections on previous assessment findings so that realistic expectations may be established • Success criteria: Establish a minimum score for success in conjunction with the assessment tool being used • Who will be assessed: decide which students to assess in a way that will be informative • all students who complete the program requirements (i.e., exiting students) • all students who started the program • a truly random sample of students enrolled in the program (use student identification numbers and pick a certain number) • a representative sampling of students (day/night; male/female; major/non-major; LAC/PCC) • one student who will be representative of the student population in the program
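The last two sampling options on the slide above, a truly random sample drawn by student identification number and a representative (stratified) sample, can be sketched in a few lines of Python. This is a hypothetical illustration only: the student records, the 500-student roster, the sample sizes, and the use of the LAC/PCC campus codes as strata are all invented for the example, not part of the plan.

```python
import random

random.seed(42)  # fixed seed so the draw is reproducible

# Hypothetical enrollment records: (student ID, campus) pairs.
# LAC/PCC are the campus codes mentioned on the slide; the roster is invented.
students = [("S%04d" % n, random.choice(["LAC", "PCC"])) for n in range(500)]

# A truly random sample: pick a certain number of students by ID.
random_sample = random.sample(students, k=50)

# A representative (stratified) sample: equal draws from each campus group.
by_campus = {}
for record in students:
    by_campus.setdefault(record[1], []).append(record)
stratified_sample = [record
                     for group in by_campus.values()
                     for record in random.sample(group, k=25)]
```

Stratifying the draw (the second option) guards against one campus, shift, or major dominating the sample by chance, which is the point of the "representative sampling" bullet.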

  24. Management • Alignment with Program Review 3-year cycle • complete assessing all SLOs at least once within that cycle • Responsibilities within the program/department • Cost and time are major issues. Be prepared to compromise on the sample size in order to attain feasibility in implementation. • Reporting occurs in TracDat

  25. Outcomes Assessment Plans: dynamic and ongoing • Provides structure for the outcomes assessment process to capture discussions and actions • Can always be reviewed and revised as the need arises or as new developments occur • Report results and actions taken • Go through the cycle again to re-evaluate • Continuous refinements to improve student learning and program development
