
Student Learning Outcomes

Strategies, Tips, and Tools for Facilitating Learning Outcomes Assessment in Student Services

Jerry Rudmann, Irvine Valley College, February 2008


Overview - Student Services

  • Fine-tuning assessment

    • Tips for writing survey items

    • Focus groups

  • Helpful technology tools

    • Clickers - promote active learning and record SLO information

    • Rubric generators - a way to measure most anything

    • PDF Acrobat forms - autoscoring and recording student input

    • Portfolios - making students responsible and reflective

    • Scanning - some ideas

    • Tracking software - organizing all this stuff

  • Several options / strategies for making SLOs meaningful

    • SSO versus SLO

    • Problem focus

    • Less is better

    • Use what you already have

    • Think of SLOs in the context of student development

    • Qualitative assessment is OK

    • Other…?


Some Options / Strategies for Making SLOs Meaningful

  • Address “robust” SLOs (overarching outcomes)

  • Problem focus

  • Less is better

  • Share SLOs with students

  • Use what you already have

  • Think of SLOs in the context of student development

  • Qualitative assessment is OK

  • SSOs vs. SLOs…


General Tip 1: Problem Focus Approach

  • What competencies do students have difficulty mastering?

  • Focus SLO activities on problem areas.


General Tip 2: Keep It Simple But Meaningful

  • Corollary - Often, less is better.


General Tip 3: Student Development Approach

  • Student development

    • Academic self-efficacy (Bandura)

    • Academic self-regulation

    • Campus involvement (Astin)

  • Mentoring professor studies

  • Student Services DO help student success


Surveys


Surveys - SLO Uses

  • Students self-rate their competencies on program- or college-level learning outcomes.

  • Students’ satisfaction with various student services.


Types of Questions

Open-ended – respondents answer in their own words

Closed-ended – respondents are limited to a finite range of choices


Types of Questions

Open-ended

  • Flexible

  • Hard to code answers

  • Good for preliminary work to finalize a survey

Closed-ended

  • Easier to code answers, process, and analyze

  • Hard to write good closed-ended items


Item Format

Visual Analogue Scale

Food in the cafeteria is…

Poor _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ Excellent

Likert Scale

Food in the cafeteria is outstanding!

SD   D   N   A   SA

(Strongly Disagree) (Disagree) (Neutral) (Agree) (Strongly Agree)


Nine Tips for Designing and Deploying a Survey

Don’t call it a survey

Provide a carefully worded rationale or justification at the beginning

Group items by common format

Start with more interesting items

Put demographic items last

Mix in negative wording to catch acquiescence (aka “response set”)

Automate scoring when possible (a minimal scoring sketch follows this list)

If asking for sensitive information, use procedures designed to assure anonymity

Always, always, always pilot test first
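
A minimal sketch of the reverse-scoring and automated-total idea behind the negative-wording and automate-scoring tips above; the item names and the 1-5 Likert coding are hypothetical, not taken from the presentation.

```python
# Minimal sketch: reverse-score negatively worded Likert items and total each
# respondent's score. Item names and the 1-5 coding are hypothetical.

# Items marked True are negatively worded and must be reverse-scored.
ITEMS = {
    "q1_enjoyed_service": False,
    "q2_service_wasted_my_time": True,   # negative wording (catches acquiescence)
    "q3_would_recommend": False,
}
SCALE_MAX = 5  # 1 = Strongly Disagree ... 5 = Strongly Agree

def score_respondent(responses: dict) -> int:
    """Return the total score, reverse-scoring negatively worded items."""
    total = 0
    for item, is_negative in ITEMS.items():
        raw = responses[item]
        total += (SCALE_MAX + 1 - raw) if is_negative else raw
    return total

if __name__ == "__main__":
    respondent = {"q1_enjoyed_service": 4, "q2_service_wasted_my_time": 2, "q3_would_recommend": 5}
    print(score_respondent(respondent))  # 4 + (6 - 2) + 5 = 13
```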


Survey Administration Methods

Face to Face

Written

Group administration

Mail

Computerized http://research.ccc.cccd.edu

Password protected

Validation rules

Branching and piping (see the sketch after this list)

Telephone
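
The validation, branching, and piping features of a computerized survey can be pictured with a small sketch; the questions, choices, and rules below are hypothetical, not the college's actual instrument.

```python
# Minimal sketch of three computerized-survey features mentioned above:
# validation rules, branching (skip logic), and piping (inserting an earlier
# answer into later question text). All questions are hypothetical.

def ask(prompt: str, valid: set) -> str:
    """Validation rule: re-ask until the answer is one of the allowed choices."""
    answer = input(f"{prompt} {sorted(valid)}: ").strip()
    while answer not in valid:
        answer = input(f"Please choose one of {sorted(valid)}: ").strip()
    return answer

def run_survey() -> dict:
    answers = {}
    answers["used_counseling"] = ask("Did you meet with a counselor this semester?", {"yes", "no"})
    # Branching: the follow-up questions are shown only if the answer was "yes".
    if answers["used_counseling"] == "yes":
        answers["visits"] = ask("How many times?", {"1", "2-3", "4+"})
        # Piping: reuse the earlier answer inside the next question's wording.
        answers["helpful"] = ask(
            f"Were those {answers['visits']} visit(s) helpful?", {"yes", "somewhat", "no"}
        )
    return answers

if __name__ == "__main__":
    print(run_survey())
```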


Focus Groups

Focus groups can be especially insightful and helpful for program and institutional level learning outcome assessment.

Have your college researcher provide some background materials.

Focus Groups: A Practical Guide for Applied Research

By Richard A. Krueger, Mary Anne Casey

The RP Group sponsored several “drive in” workshops over the last few years.


Goal for This Section

  • Technology Uses

  • Technology Tools

Expected Outcome: Be able to select and use technology-based approaches to assess student learning outcomes


Assessment Challenges

  • Engaging Students in Self-Evaluation

  • More Efficient Assessment


Some Technology Tools

  • Online Rubric Builders

  • eLumen (SLO Assessment/Tracking)

  • Classroom Responders (“Clickers”)

  • Scannable and Online Tests and Surveys

  • ePortfolios

  • Adobe Acrobat Forms

  • Excel Spreadsheets


Rubrics

  • A way to measure the heretofore immeasurable: products and performances.

  • A rubric breaks the assessment into important components.

  • Each component is rated along a well-labeled scale (a minimal scoring sketch follows this list).
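
A minimal sketch of a rubric as a data structure: named components, each rated on a labeled scale, with a summary score. The resume-rubric components and scale labels are hypothetical examples, not the worksheet from the packet.

```python
# Minimal sketch: a rubric as named components, each rated on a labeled 1-4 scale.
# The components and labels are hypothetical examples for a resume rubric.

SCALE_LABELS = {1: "Beginning", 2: "Developing", 3: "Proficient", 4: "Exemplary"}

RESUME_RUBRIC = ["Formatting", "Relevant experience", "Spelling and grammar"]

def score_work(ratings: dict) -> dict:
    """Check that every component is rated on the scale, then summarize."""
    for component in RESUME_RUBRIC:
        if ratings.get(component) not in SCALE_LABELS:
            raise ValueError(f"Missing or out-of-range rating for {component!r}")
    total = sum(ratings[c] for c in RESUME_RUBRIC)
    return {
        "per_component": {c: SCALE_LABELS[ratings[c]] for c in RESUME_RUBRIC},
        "total": total,
        "max_possible": len(RESUME_RUBRIC) * max(SCALE_LABELS),
    }

if __name__ == "__main__":
    print(score_work({"Formatting": 3, "Relevant experience": 4, "Spelling and grammar": 2}))
```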


Let’s Develop an Assessment Rubric for a Resume


Chocolate Chip Cookie Rubric


Rubrics are Good!

  • Facilitate staff dialogue regarding satisfactory performance.

  • Create a more objective assessment.

  • Make expectations more explicit to the student.

  • Encourage the metacognitive skill of self-monitoring one’s own learning.

  • Facilitate scoring and reporting of data.


Online Discussion Rubric

http://www.uas.alaska.edu/sitka/IDC/resources/onlineDiscussionRubric.pdf


Design Your Own Rubric

  • Please work in groups and use the worksheet in your packet to design a scoring rubric for assessing one of the following:

    • Coffee shops

    • Syllabi

    • Customer service at retail stores

    • Grocery stores

    • Online courses


Online Rubric Builders

  • Rubrics to guide and measure learning

  • Tools

    • Rubistar http://rubistar.4teachers.org

    • Landmark Rubric Machine http://landmark-project.com/rubric_builder


Rubistar Art History Rubric


Rubric Builder Screen Shot


Adobe Acrobat Forms

  • Make form using MS Word

  • Import form and save as PDF form

  • Adjust the fields

  • Add fields to tally subscores and total scores (a field-tally sketch follows this list)
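
Acrobat's own calculated fields can do the tallying; as a complementary illustration, the sketch below reads the filled-in field values back out of a PDF form with the pypdf library and totals them in Python. The file name and field names are hypothetical.

```python
# Minimal sketch: read filled-in field values from a PDF form and tally a total.
# Requires the pypdf package; the file name and field names are hypothetical.
from pypdf import PdfReader

SCORED_FIELDS = ["item1", "item2", "item3"]  # hypothetical form field names

def tally_pdf_form(path: str) -> int:
    reader = PdfReader(path)
    fields = reader.get_fields() or {}
    total = 0
    for name in SCORED_FIELDS:
        value = fields.get(name, {}).get("/V")  # the field's filled-in value
        if value is not None:
            total += int(value)
    return total

if __name__ == "__main__":
    print(tally_pdf_form("completed_rubric_form.pdf"))
```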


How Do You Report Results?


eLumen to Assess SLOs

  • Reduce Time Spent Creating Reports

  • Assess Course, Program, and/or Degree-Level Outcomes

  • Share Assessment Rubrics Across Classes and Programs

  • View Individual or Aggregated Results

  • Use Online or Offline

http://www.elumen.info


Use Online or Offline


Criterion-Based Assessment

  • Rubrics are attached to each SLO

Excerpted from eLumen: A Brief Introduction by David Shupe, July 2007


Rubrics Describe Criteria

  • Writes prose clearly

Excerpted from eLumen: A Brief Introduction by David Shupe, July 2007


Library of Degree-Level SLOs

Excerpted from eLumen: A Brief Introduction by David Shupe, July 2007


And Rubrics Link to SLOs

Excerpted from eLumen: A Brief Introduction by David Shupe, July 2007


Science and Gen Ed SLOs/Rubrics

[Diagram not reproduced: example rubrics contributed from the Science committee, the Biology Department, the faculty committee on critical thinking, and the faculty committee on communication skills.]

Excerpted from eLumen: A Brief Introduction by David Shupe, July 2007


Scorecard for All Students in the Course

Excerpted from eLumen: A Brief Introduction by David Shupe, July 2007


Class Scores by Student

Excerpted from eLumen: A Brief Introduction by David Shupe, July 2007


Aggregated Data for Course

Excerpted from eLumen: A Brief Introduction by David Shupe, July 2007


Course Aggregates by Program

Excerpted from eLumen: A Brief Introduction by David Shupe, July 2007


Classroom Responders

  • Engage students

  • Monitor student understanding

  • Quickly and easily collect and store assessment data

  • Use publisher item banks or create your own


Renaissance Classroom Response System

PBS Demo


Most valuable tip is…

  • Finding ways to use technology to make SLO and SSO assessment easier and more efficient

  • Concentrating SLO work on skills students have difficulty mastering

  • Building SLOs around student development (self-efficacy, goal clarity, etc.)


Renaissance Learning (for clicker training resources)

http://www.renlearn.com


Scanning Technology

  • A way to gather survey input from students

  • A way to test students’ knowledge

http://www.scantron.com and http://www.renlearn.com


Surveys and Tests

  • Online or Scannable

  • Surveys

    • Pre- and post-surveys of students’ self-evaluation of progress

    • Gather stakeholder (faculty, business community leaders, advisory groups) input on expected learning outcomes

    • Student satisfaction with service (SSO)

  • Quizzes/Tests

    • Practice and graded


Some Survey Software Options

  • Scannable surveys and quizzes - Remark Office OMR (optical mark recognition software)

    • Requires the software and a Fujitsu scanner

    • Use a word processor to create scannable bubble-in surveys or answer sheets.

    • Produces item analysis output.

  • Online survey tools

    • eListen (Scantron Co.)

    • SelectSurvey.net http://www.classapps.com/SelectSurveyNETOverview.asp


Excel Spreadsheets

Example of autoscoring and record keeping in a Japanese program (a spreadsheet-scoring sketch follows).
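
The Japanese-program spreadsheet itself is not reproduced here. The sketch below shows the same autoscoring idea (compare each student's answers to a key and record a score) using pandas to read an Excel file; the file and column names are hypothetical.

```python
# Minimal sketch of spreadsheet-style autoscoring: compare each student's answers
# to an answer key and record a score. File and column names are hypothetical.
import pandas as pd

ANSWER_KEY = {"q1": "b", "q2": "d", "q3": "a"}  # hypothetical quiz key

def autoscore(xlsx_path: str) -> pd.DataFrame:
    # One row per student; columns "student", "q1", "q2", "q3" assumed.
    df = pd.read_excel(xlsx_path)
    df["score"] = sum((df[q] == key).astype(int) for q, key in ANSWER_KEY.items())
    return df[["student", "score"]]

if __name__ == "__main__":
    print(autoscore("japanese_quiz_responses.xlsx"))
```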


ePortfolios

  • Advantages

    • Document artifacts of learning

    • Support diverse learning styles

    • Authentic assessment

    • Course, program, or degree-level tracking

    • Job skill documentation

  • Proprietary or Open Source

    • ePortfolio and Open Source Portfolio


ePortfolio.org Assessment

Lock assignments after submission

Random selection of assignments by learning objective

Anonymity for both the student who produced the assignment and the instructor

Access to the work and the scoring rubrics

Reports aggregate scores and generate frequencies/means

Ability to download raw data for analysis in other software

http://www.eportfolio.org


Open Source Portfolio

  • Aligned with Sakai

  • Administrators or faculty can structure and review work

  • Learning matrix documents levels of work

http://www.osportfolio.org


Resources

  • eListen: http://www.elisten.com

  • eLumen: http://www.elumen.info

  • ePortfolios:

    • ePortfolio.org: http://eportfolio.org

    • Open Source Portfolio: http://www.osportfolio.org/

    • For others, see EduTools ePortfolio product comparison: http://eportfolio.edutools.info/item_list.jsp?pj=16

  • Online Rubric Builders

    • Rubistar: http://rubistar.4teachers.org

    • Landmark Rubric Machine: http://landmark-project.com/rubric_builder/index.php

    • Coastline Rubric Builder: http://rubrics.coastline.edu

  • Remark Survey Software: http://www.principiaproducts.com/web/index.html

  • Renaissance Classroom Responders: http://www.renlearn.com/renresponder/

  • SelectSurvey.Net: http://www.classapps.com/SelectSurveyNETOverview.asp


Contact Info & Acknowledgements

Dr. Jerry Rudmann
Professor of Psychology, Irvine Valley College
[email protected]

Much of this slide show was adapted (with express written permission) from Pat Arlington, Instructor/Coordinator, Instructional Research, Coastline Community College, [email protected]


Development of New SLO Measures: Procedure, Findings, Conclusions, and Recommendations from a Recent Exploratory Study


Purpose of the Study

The study was designed to explore whether assessment tools used to measure cognitive variables -- e.g., goal clarity, self-efficacy -- could serve as learning outcome measures in Student Services.


The Spark for This Study

  • The need for appropriate, genuinely useful assessment measures in Student Services.

  • Ideas generated by interviews with counselors.



Possible Relationships

  • Students’ Academic Outcomes

    • Short-term outcomes: semester GPA, units earned, % of units completed, return next semester

    • Long-range outcomes: GPA, units earned, certificate, degree, and/or transfer

  • Attributes of New Students: confidence level, goals, motivation, study skills and habits

  • Student Services

    • College success courses

    • Academic counseling

    • Career Center presentations

    • Career counseling

    • Career course

    • Club, team, chorus, band, student government, or other form of social connectedness

    • Formal and informal recognition for progress

    • Non-academic counseling

    • Transfer Center programs

    • Peer advisors

    • Tutoring center

    • University tours


Procedure

  • Counselor interviews (preliminary brainstorming)

  • Literature survey for promising assessment tools

  • Recruitment presentations at Region 8 DSPS and EOPS meetings

  • A website was created hosting all of the assessments online


Study Website


Measures We Tried

  • Academic and Career Goal Clarity

  • Academic Self-Efficacy

  • Dispositional Hope

  • Self-Regulation

  • Optimism

  • Positive Affect

  • Negative Affect


Summary and Examples of Measures Found Most Useful in This Study

  • Academic Self-Efficacy

    • Beliefs about one’s capabilities to learn or perform at designated levels. Compared with students who doubt their learning capabilities, those who feel efficacious for learning or performing a task participate more readily, work harder, persist longer when they encounter difficulties, and achieve at a high level.

      • I know how to schedule my time to accomplish tasks.

      • I know how to study to perform well on tests.

  • Academic Self-Regulation

    • Confidence in ability to perform various academic tasks.

      • I can take notes of class instruction.

      • I know how to use the library to get information for assignments.

  • Academic and Career Goal Clarity

    • Measures the clarity of immediate and long-range academic plans and the extent to which the student has career clarity.

      • I have worked with a counselor to develop a plan listing the courses I need to complete my lower division coursework.

      • I have decided on an academic major.

      • I am familiar with the daily work routine for people working in my desired career.


Participation

  • DSPS study

    • Seven colleges

    • Students

      • Pre-test - 142

      • Post-test - 127

  • EOPS study

    • Six colleges

    • Students

      • Pre-test - 276

      • Post-test - 154


Descriptive Statistics from the DSPS Study


Intercorrelations Among Scales and Academic Outcomes (DSPS study)


Predicting Academic Outcomes (Based on Correlation Matrix and Stepwise Regression Analyses)
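
The slide's correlation and regression tables are not reproduced here. As a rough illustration of the analysis named in the title, the sketch below computes a correlation matrix and a simple forward stepwise OLS regression with pandas and statsmodels; the file and column names are hypothetical stand-ins for the study's scales and outcomes.

```python
# Rough sketch of the analysis named in the title: a correlation matrix plus a
# simple forward stepwise OLS regression. File and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

def forward_stepwise(df, outcome, predictors, p_enter=0.05):
    """Add, one at a time, the candidate predictor with the smallest p-value below p_enter."""
    selected = []
    while True:
        candidates = [p for p in predictors if p not in selected]
        pvals = {}
        for cand in candidates:
            X = sm.add_constant(df[selected + [cand]])
            pvals[cand] = sm.OLS(df[outcome], X).fit().pvalues[cand]
        if not pvals or min(pvals.values()) >= p_enter:
            break
        selected.append(min(pvals, key=pvals.get))
    return sm.OLS(df[outcome], sm.add_constant(df[selected])).fit()

if __name__ == "__main__":
    data = pd.read_csv("dsps_study.csv")  # hypothetical file of scale scores and outcomes
    print(data.corr(numeric_only=True))   # intercorrelations among scales and outcomes
    model = forward_stepwise(data, "semester_gpa",
                             ["goal_clarity", "self_efficacy", "self_regulation"])
    print(model.summary())
```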


Impact of Services on Student Outcomes


DSPS Study: Self-Regulation, Receipt of Counseling, and Semester GPA

[Bar chart not reproduced. Group sizes: Low SR with counseling = 32; Low SR without counseling = 31; High SR with counseling = 20; High SR without counseling = 20; N = 103.]
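
A minimal sketch, with hypothetical file and column names, of how the four group means behind a chart like this could be computed:

```python
# Minimal sketch: mean semester GPA for the four self-regulation x counseling
# groups summarized above. File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("dsps_study.csv")  # columns assumed: self_regulation_group, counseling, semester_gpa
group_means = (
    df.groupby(["self_regulation_group", "counseling"])["semester_gpa"]
      .agg(["mean", "count"])
)
print(group_means)
```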


DSPS Study: Self-Regulation, Receipt of Counseling, and Percentage of Units Earned for Units Attempted

[Bar chart not reproduced. Group sizes: Low SR with counseling = 32; Low SR without counseling = 32; High SR with counseling = 20; High SR without counseling = 20; N = 103.]


EOPS Study: Self-Regulation, Peer Advising, and Semester GPA


Changes in Goal Clarity and Receipt of Transfer Assistance


Changes in Goal Clarity and Receipt of Peer Advisement


Limitations of Study

  • Lack of random selection and assignment to treatments

  • Self-selection bias

  • Results are correlational, not causal

  • Data are an aggregate from the participating colleges, but there may be significant differences among colleges, procedures, services, personnel, etc.


Thoughts…

The instruments:

  • Are inexpensive and easy to complete and score

  • Can help identify “at risk” students

  • Can help formulate appropriate ways to assist students

  • Yield gain scores (from pre- to post-test) that can be useful (a gain-score sketch follows this list)

  • Can serve as SLO assessment instruments that are good matches to the services provided within Student Services
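
A minimal sketch of the gain-score computation mentioned above, assuming hypothetical file and column names:

```python
# Minimal sketch: pre- to post-test gain scores on a scale, per student.
# File and column names are hypothetical.
import pandas as pd

pre = pd.read_csv("pretest_scores.csv")    # columns assumed: student_id, goal_clarity
post = pd.read_csv("posttest_scores.csv")  # same columns, collected at end of term

merged = pre.merge(post, on="student_id", suffixes=("_pre", "_post"))
merged["goal_clarity_gain"] = merged["goal_clarity_post"] - merged["goal_clarity_pre"]

print(merged["goal_clarity_gain"].describe())  # mean gain, spread, etc.
```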


Recommendations

  • “Map” your services to the constructs measured by these instruments.

  • Develop new interventions where none currently exist.

  • Create an assessment referral system.


The Three Instruments Found Most Useful

  • Academic and Career Goal Clarity (Tucker & Rudmann, 2006)

    • Measures overall clarity and sub-components of goal clarity

  • Academic Self-Efficacy (Chemers, Hu, & Garcia, 2001)

    • Measures confidence in reaching positive academic outcomes

  • Efficacy for Self-Regulated Learning (Zimmerman, Bandura, & Martinez-Pons, 1992)

    • Measures confidence in one’s ability to manage and regulate the academic tasks students face in college


Goal Clarity Instrument Structure


One Interpretation


Converting to Adobe Acrobat Interactive Form


Current & Potential Services for Enhancing These Important Student Learning Outcomes


Research Team

Jerry Rudmann, PhD

Professor of Psychology

Irvine Valley College

[email protected]

Kari Tucker, PhD

Professor of Psychology,

Department Chair

[email protected]

Shañon Gonzalez, MA

Coastline College

Research Assistant III

[email protected]


Four Sources of Efficacy Beliefs

  • Mastery experiences – outcomes interpreted as successful raise efficacy; those interpreted as failures lower it

  • Vicarious experiences – observing the success or failure of models

  • Verbal persuasion – positive or negative appraisals by others

  • Physiological states (e.g., anxiety, stress, arousal, fatigue, mood) act as information about efficacy beliefs and can raise or lower efficacy

