1. Program Assessment: Tools that Work
FOCUS-IR Practicum, National Defense University, Feb 2003
Lt Col Marie Revak, PhD (MarieARevak@aol.com)
Mr. Curtis Hughes (curtis.hughes@usafa.af.mil)
www.usafa.af.mil/dfe/assessment.htm
United States Air Force Academy
2. Purpose
To build a "toolbox" of effective methods for building "patterns of evidence"
Courses
Programs
Surveys - A New Perspective (Marie)
Knowledge Probes (Curt)
Archived Course Records (Marie)
Student Management Teams (Curt)
Commercial Exams (Marie)
3. Please complete the green survey (leave the empty boxes blank) and the pink Knowledge Probe
4. Assessment Tools & Assessment Data
Quantitative and Qualitative
Direct and Indirect
Internal and External Data Sources
Product and Process
5. Think, Pair, Share
What particular objective or outcome presents an assessment challenge at your own institution?
Write down your biggest challenge
Share with your neighbor
6. The USAFA Assessment Catalog
Not an assessment tool, but a catalog of tools
Encourages cross-flow of ideas among departments and agencies
Allows for easy identification of
Data sources
Imbalances
Key decisions
New tools
Helps us examine
Low-utility, high-cost, or outdated practices
Assessment cycles
7. The USAFA Assessment Catalog
Available at www.usafa.af.mil/dfe -- go to Academic Assessment and click on Assessment Catalog
See sample page from the Department of Law
8. Surveys - A New Perspective: Purpose
To collect data on
Student opinions
Student perceptions
Student attitudes
9. Surveys - A New Perspective: Strengths and Weaknesses
Strengths
Students feel that they have a voice
Easy to administer
Popular method
Students familiar with the method
Weaknesses
Overuse
Low response rates
Voluntary response
Design principles often ignored
Can be difficult to analyze
10. Surveys - A New Perspective: Strategies and Guidelines
Select topics and word items carefully
Think about the analysis while designing the items
Ask only what you really need to know
Conduct a pilot study
Maximize your response rate
11. Surveys - A New Perspective: Strategies and Guidelines
Pay attention to human subjects issues
Basic Design Principles
Place demographic items at the end of the survey
Too much demographic information threatens anonymity
List all response choices for each item
Group items with the same response choices
Make sure the response choices give you the information you need
Use italics, underlines, and boldface type to stress important ideas
Keep it short
Make it appealing to the eye
Allow a space for open-ended comments
Offer respondents the opportunity to see the aggregate results
12. Surveys - A New Perspective: Strategies and Guidelines
Response rates
70%-80% is best; 50% or higher is acceptable
Quality is more important than quantity
More important to have a representative sample
Can use demographics to check representativeness (see the sketch after this list)
Sensitive topics get lower response rates
Be considerate to your respondents
Pilot test your survey
Provide postage
Offer incentives
Keep it short
Provide clear instructions
Survey should look professional
Ask important questions
Follow-up
Another survey packet
Telephone call
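For the representativeness check suggested above, a minimal sketch in Python using a chi-square goodness-of-fit test from SciPy; the class-year categories, respondent counts, and population proportions are hypothetical, and in practice the population figures would come from institutional records.

```python
# Sketch: checking whether survey respondents are representative of the
# population on one demographic (class year). All counts are hypothetical.
from scipy.stats import chisquare

# Known population proportions (assumed for illustration)
population_props = {"freshman": 0.28, "sophomore": 0.26,
                    "junior": 0.24, "senior": 0.22}

# Observed respondent counts from the returned surveys (hypothetical)
respondent_counts = {"freshman": 90, "sophomore": 70, "junior": 55, "senior": 35}

n = sum(respondent_counts.values())
observed = [respondent_counts[g] for g in population_props]
expected = [population_props[g] * n for g in population_props]

stat, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p:.3f}")
# A small p-value suggests the respondents are NOT representative on this
# demographic, so results should be interpreted with caution.
```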
13. Surveys - A New Perspective: Data Analysis
Experiment with different graphical methods (see the sketch after this list)
Resist the urge to calculate averages
Don't cut the data too many ways
Present anecdotes from the entire range of responses
Think about longitudinal analysis
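As one example of a graphical method that sidesteps averaging, a minimal Python/matplotlib sketch that shows each item's full response distribution as a stacked horizontal bar; the item wording and counts are hypothetical.

```python
# Sketch: plot the full response distribution per survey item instead of
# a single average. Items and counts are hypothetical.
import matplotlib.pyplot as plt
import numpy as np

items = ["Workload was reasonable", "Feedback was timely", "Labs aided learning"]
choices = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]
# counts[i][j] = number of respondents choosing choices[j] on items[i]
counts = np.array([
    [5, 12, 20, 40, 23],
    [2,  8, 15, 45, 30],
    [10, 18, 22, 30, 20],
])

# Percentages keep items with different response totals comparable
pct = counts / counts.sum(axis=1, keepdims=True) * 100

left = np.zeros(len(items))
fig, ax = plt.subplots(figsize=(8, 3))
for j, choice in enumerate(choices):
    ax.barh(items, pct[:, j], left=left, label=choice)
    left += pct[:, j]
ax.set_xlabel("Percent of respondents")
ax.legend(loc="lower right", fontsize="small")
plt.tight_layout()
plt.show()
```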
14. Surveys - A New Perspective: Sample Graphic
15. Surveys - A New Perspective: Sample Graphic
16. Surveys - A New Perspective: Sample Graphic
17. Surveys - A New Perspective: Anecdotal Data
18. Knowledge Probes: Purpose
To obtain both pre and post data on conceptual knowledge and skills
To measure changes in conceptual knowledge and skills
To include student self-assessment in the assessment mix
19. Knowledge Probes: Strengths
Provides students with a snapshot of course topics
Increases student awareness of their knowledge and skills
Helps instructor plan the course
Serves as a diagnostic tool
Indicator of overlooked or poorly covered material
Can serve as a wake-up call for the final exam
Less time-consuming than an actual exam
20. Knowledge Probes: Weaknesses
Not a direct measure of conceptual knowledge or skills
Requires comprehensive course goals and objectives
Students may not be mature enough to self-assess
21. Knowledge Probes: Strategies and Guidelines
Prepare comprehensive course objectives and a series of self-assessment items
Scale is: no knowledge, some knowledge, full knowledge
Administer the knowledge probe at the first class meeting
Analyze resultant data
Repeat knowledge probe at the end of the semester
Compare post results with pre results (see the tabulation sketch after this list)
Compare results with final exam results
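A minimal sketch of the pre/post comparison step, assuming the no/some/full knowledge scale is coded 0/1/2; the objectives and ratings shown are hypothetical.

```python
# Sketch: tabulate pre- and post-probe self-ratings per course objective.
# The 0/1/2 coding (no/some/full knowledge) and all data are hypothetical.
from collections import Counter

SCALE = {0: "no knowledge", 1: "some knowledge", 2: "full knowledge"}

# ratings[objective] = one rating per student
pre  = {"Obj 1: recursion": [0, 0, 1, 1, 0], "Obj 2: sorting": [1, 1, 2, 0, 1]}
post = {"Obj 1: recursion": [1, 2, 2, 2, 1], "Obj 2: sorting": [2, 2, 2, 1, 2]}

for obj in pre:
    pre_c, post_c = Counter(pre[obj]), Counter(post[obj])
    print(obj)
    for code, label in SCALE.items():
        print(f"  {label:>14}: pre {pre_c[code]}  post {post_c[code]}")
```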
22. Knowledge Probes: Data Analysis
Create bar graphs for each objective (see the plotting sketch after this list)
Pre results
Post results
Final Exam results
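A minimal plotting sketch for the bar graphs described above, assuming pre-probe, post-probe, and final-exam results have already been put on a common 0-100 scale; all numbers are hypothetical.

```python
# Sketch: grouped bar chart per objective comparing mean pre-probe,
# post-probe, and final-exam scores (all 0-100). Data are hypothetical.
import matplotlib.pyplot as plt
import numpy as np

objectives = ["Obj 1", "Obj 2", "Obj 3", "Obj 4"]
pre_scores   = [35, 20, 45, 30]   # mean pre-probe self-rating, percent of max
post_scores  = [80, 70, 85, 60]   # mean post-probe self-rating
final_scores = [75, 65, 90, 55]   # mean final-exam score on matching items

x = np.arange(len(objectives))
width = 0.25
fig, ax = plt.subplots()
ax.bar(x - width, pre_scores, width, label="Pre probe")
ax.bar(x, post_scores, width, label="Post probe")
ax.bar(x + width, final_scores, width, label="Final exam")
ax.set_xticks(x)
ax.set_xticklabels(objectives)
ax.set_ylabel("Mean score (percent of maximum)")
ax.legend()
plt.show()
```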
23. Knowledge Probes: Sample Graphic
24. Start, Stop, Continue
Complete the yellow Start, Stop, Continue feedback form
Pass it to the designated Student Management Team
The SMT will summarize the results and present the summary after the break
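If the transcribed forms are first hand-coded into short theme labels, a sketch like the following can rank the recurring themes in each category; the entries below are hypothetical.

```python
# Sketch: summarize Start/Stop/Continue forms into ranked themes.
# The form entries are hypothetical; free-text responses would first be
# hand-coded into short theme labels like these.
from collections import Counter

forms = [
    {"start": "post slides online", "stop": "pop quizzes", "continue": "group work"},
    {"start": "post slides online", "stop": "late labs", "continue": "group work"},
    {"start": "more examples", "stop": "pop quizzes", "continue": "office hours"},
]

for category in ("start", "stop", "continue"):
    tally = Counter(form[category] for form in forms)
    print(f"{category.upper()}:")
    for theme, n in tally.most_common():
        print(f"  {n}x {theme}")
```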
25. Ten Minute Break
26. Start, Stop, Continue Results
27. Archived Course Records: Overview
Purpose: To capture a snapshot of the entire course and document changes to the course over several semesters.
Strategies and Guidelines: Collect and assemble course-related documents (a checklist sketch follows the contents list on slide 29).
Data Analysis: Qualitative analysis of changes, trends, and themes.
28. Archived Course Records: Strengths and Weaknesses
Strengths
Helps maintain consistency of course content and delivery
Helps contain content creep
May help faculty update their portfolios
Provides corporate knowledge
Easily accomplished immediately after close-out of the semester
Weaknesses
May challenge the perception of academic freedom
Can be time-consuming if not accomplished immediately
Document storage space required
29. Archived Course Records: Contents
Syllabus
Handouts
Assessments
Formative
Summative
Student ratings and feedback
Textbook information
List of instructors
Summary of curricular changes
Grade summary
Reflective commentary
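To make the assembly step routine, a minimal sketch that checks a semester's archive folder against the contents list above; the folder layout, file names, and the check_archive helper are all hypothetical.

```python
# Sketch: verify that a semester's course archive folder contains the
# documents listed above. Folder layout and file names are hypothetical.
from pathlib import Path

REQUIRED = [
    "syllabus.pdf", "handouts", "assessments", "student_ratings.pdf",
    "textbook_info.txt", "instructors.txt", "curricular_changes.txt",
    "grade_summary.csv", "reflective_commentary.txt",
]

def check_archive(archive_dir: str) -> list[str]:
    """Return the required items missing from the archive folder."""
    root = Path(archive_dir)
    # exists() covers both files and subfolders (e.g. handouts/)
    return [name for name in REQUIRED if not (root / name).exists()]

missing = check_archive("archives/CS456_fall2002")
print("Archive complete." if not missing else f"Missing: {missing}")
```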
30. Student Management Teams: Overview
Purpose: To involve students in course assessment and problem solving while the course is ongoing.
Strategies and Guidelines:
Use volunteers, appoint, or have students select representatives (3-4 students).
Teams meet once per week with the instructor and may schedule additional meetings.
Data Analysis:
Qualitative summary of the accomplishments of the team.
Possible correlation with student satisfaction ratings or other data.
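For the correlation idea above, a minimal sketch using SciPy's Pearson correlation across course sections; all numbers are hypothetical, and with this few sections the result is descriptive rather than confirmatory.

```python
# Sketch: correlate SMT activity with end-of-course satisfaction across
# sections. Both data series are hypothetical.
from scipy.stats import pearsonr

smt_issues_resolved = [2, 5, 1, 7, 4, 6]               # per section
satisfaction_rating = [3.4, 4.1, 3.2, 4.5, 3.9, 4.3]   # mean rating, 5-pt scale

r, p = pearsonr(smt_issues_resolved, satisfaction_rating)
print(f"r = {r:.2f}, p = {p:.3f}")
# With this few sections the test has little power; treat the result as
# descriptive rather than confirmatory.
```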
31. Student Management Teams: Strengths and Weaknesses
Strengths
Continual feedback allows for continuous improvement
Feedback mechanism in place all semester
Students have a voice
Richer, deeper data
Instructor may give the team problems to tackle
Weaknesses
Time commitment
Requires mature, responsible student participation
May require a reward mechanism
Students may suggest unreasonable changes
32. Student Management Teams: Case Study
Implemented in a senior-level computer science course
Team consisted of a leader, recorder, and two other members
Team recorder posted minutes on web site
Met about once every 2 weeks during a free lab period (consider having more frequent meetings at the beginning of the semester)
Provided an agenda beforehand and brought copies, posted agenda on web site
Team had the option of calling their own team meeting, with or without the instructor
33. Student Management Teams: Case Study
Team accomplishments
Polled the class for the top 3 complaints and the top 3 things that shouldn't change
Collected and analyzed start-stop-continue data
Tackled some on-going concerns:
Recommendations on weighting individual student contributions to a group project grade (see the sketch after this list)
Advice on forming project groups for next semester (this was the first half of a 2 semester course)
Instructor briefed the rest of the class on each team meeting so everyone knew what the team was doing
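One common weighting scheme a team like this might consider (not the team's actual recommendation): scale the group grade by each member's mean peer rating relative to the group average. The rule, names, and numbers below are hypothetical.

```python
# Sketch: weight individual grades within a group project using peer
# ratings. The weighting rule and all data are hypothetical.
group_grade = 88.0
peer_ratings = {"Adams": 4.5, "Baker": 3.0, "Chen": 4.5}  # mean peer rating, 5-pt scale

mean_rating = sum(peer_ratings.values()) / len(peer_ratings)
for student, rating in peer_ratings.items():
    weight = rating / mean_rating                   # 1.0 = average contributor
    individual = min(100.0, group_grade * weight)   # cap at 100
    print(f"{student}: weight {weight:.2f} -> grade {individual:.1f}")
```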
34. Please Complete the Blue Post Knowledge Survey
35. Knowledge Survey Data Analysis
36. Summary
By employing multiple assessment techniques, collecting both quantitative and qualitative data, and using both direct and indirect measures, assessors can identify patterns of convergence in the data as a basis for programmatic changes.
Assessment must actively involve faculty and students and must be tailored to meet specific program goals.
37. References and Sources
American Statistical Association: http://www.amstat.org
Phillip W. Gronseth, "Course Diary: A Valuable Information Source," Mathematics Teacher, Vol. 92, No. 6, September 1999, pp. 496-497
Student Management Teams: nuhfed@isu.edu
Structured focus groups: www.usafa.af.mil/dfe (go to Academic Assessment, Tools/Resources, Tools for Assessing Courses and Programs, Focus Group FAQ)
38. Questions?