Designing and Evaluating Assessments for Introductory Statistics

Minicourse #1

Beth Chance

Bob delMas

Allan Rossman

NSF grant PI: Joan Garfield

Outline

  • Today: Overview of assessment
    • Introductions
    • Assessment goals in introductory statistics
    • Principles of effective assessment
    • Challenges and possibilities in statistics
  • Overview of ARTIST database
  • Friday: Putting an assessment plan together
    • Alternative assessment methods
    • Nitty Gritty details, individual plans
  • Assessment = on-going process of collecting and analyzing information relative to some objective or goal
    • Reflective, diagnostic, flexible, informal
  • Evaluation = interpretation of evidence, judgment, comparison between intended and actual, use information to make improvements
Dimensions of Assessment
  • Evaluation of program
    • Evaluate curricula, allocate resources
  • Monitoring instructional decisions
    • Judge teaching effectiveness
  • Evaluating students
    • Give grades, monitor progress
  • Promoting student progress
    • Diagnose student needs
Types of Assessment
  • Formative Assessment
    • In-process monitoring of on-going efforts in an attempt to make rapid adjustments
  • Summative Assessment
    • Record impact and overall achievement, compare outcomes to goals, decide next steps
  • Example: teaching
  • Example: learning
Bloom’s Taxonomy
  • Knowledge
  • Comprehension
  • Application
  • Analysis
  • Synthesis
    • Interrelationships
  • Evaluation
An Assessment Cycle
  • Set goals
  • Select methods
  • Gather evidence
  • Draw inference
  • Take action
  • Re-examine goals and methods

Example: Introductory course

Example: Lesson on sampling distributions

Reflect on Goals
  • What do you value?
    • Instructor and student point of view
    • Content, abilities, values
  • At what point in the course should they develop the knowledge and skills?
  • Translate into learning outcomes/objectives
    • What should students know and be able to do by the end of the course
    • Must be measurable!
Some of My Course Goals
  • Understand basic terms (literacy)
  • Understand the statistical process
    • Not just the individual pieces, be able to apply
  • Be able to reason and think statistically
    • role of context, effect of sample size, caution when using procedures, belief in randomness, association vs. causation
  • Communication and collaboration skills
  • Computer literacy
  • Interest level in statistics
Possible Future Goals
  • Process (not just product) of collaboration
  • Learn how to learn
  • Appreciate learning for its own sake
  • Develop the necessary skills to understand both what they have learned and what they do not understand
Assess what you value

Students value what they are assessed on
  • Given the numbers 5, 9, 11, 14, 17, 29

    (a) Find the mean

    (b) Find the median

    (c) Find the mode

    (d) Calculate a 95% confidence interval for μ
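The computations in this sample item can be sketched in Python with only the standard library. This is an illustrative sketch, not part of the original slides; the t critical value for df = 5 (2.571) is hard-coded rather than computed:

```python
import statistics
import math

data = [5, 9, 11, 14, 17, 29]

mean = statistics.mean(data)        # (a) 85/6 ≈ 14.17
median = statistics.median(data)    # (b) average of 11 and 14 = 12.5
modes = statistics.multimode(data)  # (c) every value occurs once: no meaningful mode

# (d) 95% t-interval for the population mean μ (σ unknown, n = 6)
n = len(data)
s = statistics.stdev(data)          # sample standard deviation
t_star = 2.571                      # t quantile for 95% confidence, df = n - 1 = 5
margin = t_star * s / math.sqrt(n)
ci = (mean - margin, mean + margin)
```

Part (c) is itself a teaching point: `multimode` returns all six values, signaling that "the mode" is not well defined for data with no repeats.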

“Traditional” Assessment
  • Good for assessing:
    • Isolated computational skills, (short-term) memory retrieval
    • Use and tracking of common misconceptions
    • How many right answers?
  • Provides us with:
    • Consistent and timely scoring
    • Predictor of future performance
“Traditional” Assessment
  • Less effective at assessing:
    • Can they explain their knowledge?
    • Can they apply their knowledge?
    • What are the limitations in their knowledge?
    • Can they make good decisions?
    • Can they evaluate?
    • Can they deal with messy data?
    • Role of prior knowledge
Nine Principles (AAHE)
  • Start with educational values
  • Multi-dimensional, integrated, over-time
  • Clearly stated purposes
  • Pay attention to outcomes and process
  • On-going
  • Student representation
  • Important questions
  • Support change
  • Accountability
Select Methods
  • Need multiple, complementary methods
    • observable behavior
    • adequate time
  • Need to extend students
    • less predictable, less discrete
  • Needs to provide indicators for change
  • Need prompt, informative feedback loop
    • On-going, linked series of activities over time
    • Continuous improvement, self-assessment
  • Students must believe in its value
Repeat the Cycle
  • Focus on the process of learning
    • Feedback to both instructors and students
    • Discuss results with students, motivate responsibility for their own learning
    • Consider other factors
  • Collaborate
    • External evaluation
  • Continual refinement
    • Consider unexpected outcomes
  • Don’t try to do it all at once!
Challenges in Statistics Education
  • Doing statistics versus being an informed consumer of statistics
  • Statistics vs. mathematics
    • Role of context, messiness of solutions, computers handling the details of calculations, need to defend argument, evaluate based on quality of reasoning, methods, evidence used
  • Have become pretty comfortable with lecture/reproduction format
    • Traditional assessment feels more objective
Challenges in Statistics Education
  • Reduce focus on calculation
  • Reveal intuition, statistical reasoning
  • Require meaningful context
    • Purpose, statistical interest
    • Meaningful reason to calculate
    • Careful, detailed examination of data
  • Use of statistical language
  • Meaningful tasks, similar to what will be asked to do “in real life”
Some Techniques
  • Multiple choice
    • with identification of false response
    • with explanation or reasoning choices
    • with judgment, critique (when is this appropriate)
  • “What if”, working backwards, “construct situation that”
  • Objective-format questions
    • e.g., comparative judgment of strength of relationship
    • e.g., matching boxplot with normal prob plots
  • Missing pieces of output, background
Some Techniques
  • Combine with alternative assessment methods,
    • e.g., projects: see entire process, messiness of real data collection and analysis
    • e.g., case studies: focus on real data, real questions, students doing and communicating about statistics
  • Self-assessment, peer-evaluation
What can we learn?
  • Sampling Distribution questions

Which graph best represents a distribution of sample means for 500 samples of size 4?
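The simulation behind this kind of item can be sketched as follows. The slides do not show the population, so the right-skewed exponential with mean 10 used here is an assumption for illustration only:

```python
import random
import statistics

random.seed(1)

# Hypothetical population: a right-skewed exponential distribution with
# mean 10 (its standard deviation is also 10). Not from the original slides.
population_mean = 10.0

# Draw 500 samples of size 4 and record each sample's mean.
sample_means = [
    statistics.mean(random.expovariate(1 / population_mean) for _ in range(4))
    for _ in range(500)
]

# The distribution of these sample means centers near the population mean,
# with standard deviation near sigma / sqrt(n) = 10 / 2 = 5 — narrower and
# less skewed than the population itself.
center = statistics.mean(sample_means)
spread = statistics.stdev(sample_means)
```

Plotting `sample_means` as a histogram produces the kind of graph students are asked to pick out: roughly centered at the population mean but with visibly reduced spread.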


What can we learn?
  • Asking them to write about their understanding of sampling distributions
    • Now place more emphasis in my teaching on labeling horizontal and vertical axes, considering the observational unit, distinguishing between symmetric and even, and spending much more time on the concept of variability
  • Knowing better questions to ask to assess their understanding of the process
HW Assignment
  • Assessment Framework
    • WHAT: concept, applications, skills, attitudes, beliefs
    • PURPOSE: why, how used
    • WHO: student, peers, teacher
    • METHOD
    • ACTION/FEEDBACK: and so?
HW Assignment
  • Suggest a learning goal, a method, and an action
    • Be ready to discuss with peers, then class, on Friday
  • Sample Final Exam (p. 17)
    • Skills/knowledge being assessed
    • Conceptual/interpretative vs. mechanical/computational
  • Quick leftovers on ARTIST database?
  • Critiquing sample final exam
  • Implementation issues (exam nitty gritty)
  • Additional assessment methods
  • Holistic scoring/Developing rubrics
  • Your goal/method/action
  • Developing assessment plan
  • Wrap-up/Evaluations
Sample Final Exam
  • In-class component (135 minutes)
  • What skills/knowledge are being assessed?
  • Conceptual/interpretative vs. Computational/mechanical?
Sample Exam Question 1
  • Stemplot
  • Shape of distribution
  • Appropriateness of numerical summaries
  • C/I: 5, C/M: 3
Sample Exam Question 2
  • Bias
  • Precision
  • Sample size
  • C/I: 8, C/M: 0
  • No calculations
  • No recitation of definitions
Sample Exam Question 3
  • Normal curve
  • Normal calculations
  • C/I: 4, C/M: 3
Sample Exam Question 4
  • Sampling distribution, CLT
  • Sample size
  • Empirical rule
  • C/I: 4, C/M: 0
  • Students would have had practice
  • Explanation more important than selection
Sample Exam Question 5
  • Confidence interval
  • Significance test, p-value
  • Practical vs. statistical significance
  • C/I: 7, C/M: 2
  • No calculations needed
  • Need to understand interval vs. test
Sample Exam Question 6
  • Experimentation
  • Randomization
  • Random number table
  • C/I: 4, C/M: 4
  • Tests data collection issue without requiring data collection
Sample Exam Question 7
  • Experimental design
  • Variables
  • Confounding
  • C/I: 13, C/M: 0
  • Another question on data collection issues
Sample Exam Question 8
  • Two-way table
  • Conditional proportions
  • Chi-square statistic, test
  • Causation
  • C/I: 5, C/M: 9
  • Does not require calculations to conduct test
Sample Exam Question 9
  • Boxplots
  • ANOVA table
  • Technical assumptions
  • C/I: 7, C/M: 3
  • Even calculations require understanding table relationships
Sample Exam Question 10
  • Scatterplot, association
  • Regression, slope, inference
  • Residual, influence
  • Prediction, extrapolation
  • C/I: 15, C/M: 0
  • Remarkable in a regression question!
Sample Exam Question 11
  • Confidence interval, significance test
  • Duality
  • C/I: 9, C/M: 2
  • Again no calculations required
Sample Exam
  • C/I: 79, C/M: 28 (74% conceptual)
  • Coverage
    • experimental design, randomization
    • bias, precision, confounding
    • stemplot, boxplots, scatterplots, association
    • normal curve, sampling distributions
    • confidence intervals, significance tests
    • chi-square, ANOVA, regression
Nitty Gritty
  • External aids…
  • Process of constructing exam…
  • Timing issues…
  • Student preparation/debriefing…
Beyond Exams
  • Combine with additional assessment methods,
    • e.g., projects: see entire process, messiness of real data collection and analysis
    • e.g., case studies: focus on real data, real questions, students doing and communicating about statistics
    • generation instead of only validation…
Beyond Exams (p. 8)…
  • Written homework assignments/lab assignments
  • Minute papers
  • Expository writings
  • Portfolios/journals
  • Student projects
  • Paired quizzes/group exams
  • Concept Maps
Student Projects
  • Best way to demonstrate to students the practice of statistics
  • Experience the fine points of research
  • Experience the “messiness” of data
  • Statistician’s role as team member
  • From beginning to end
    • Formulation and Explanation
    • Constant Reference
Student Projects
  • Choice of Topic
    • Ownership
  • Choice of Group
    • In-class activities first
  • Periodic Progress Reports
  • Peer Review
  • Guidance/Interference
    • Early in process
    • Presentation (me, alum, fellow student)
    • Full Lab Reports
Project Issues
  • Assigning Grades, individual accountability
  • Insignificant/Negative Results
  • Reward the Effort
  • Iterative
    • Encourage/expect revision
  • Long vs. Short projects
    • Coverage of statistical tools
  • Workload
Holistic Scoring
  • Not analytic - each part = X points
  • Problem is graded as a whole
  • Calculations are one of many parts
  • Strengths in one section can balance weaknesses in another
Holistic Scoring
  • Did the student demonstrate knowledge of the statistical concept involved?
  • Did the student communicate a clear explanation of what was done in the analysis and why?
  • Did the student express a clear statement of the conclusions drawn?
Holistic Scoring
  • May lose points if they don’t clearly explain
    • why method was chosen
    • assumptions of method
    • line of reasoning
    • final conclusion in context
Developing Rubrics
  • Scoring guide/plan
    • Consistency, inter-rater reliability
  • Focus on goal/purpose of the question
    • What information do you hope to learn based on the student’s performance?
  • Identify valuable student behaviors/correct characteristics
  • List those characteristics in observable terms (Best/Good/Fair/Poor)
Developing Rubrics
  • Multiple procedures, variety of techniques
  • Can you see the students’ thought processes and knowledge of assumptions?
Developing Assessment Plan
  • Match (most important) instructional goals
    • Start with learning outcomes, own questions
  • Multiple and varied indicators
    • Inter-related, complementary
  • Well-defined, well-integrated throughout course
    • Detailed expectations, part of learning process
  • Goals understood by students
    • Promote self-reflection, responsibility, trust
Developing Assessment Plan
  • Timely, consistent feedback
    • indicators for change, feedback loop, reinforcement
  • Individual and group accountability
  • Openness to other (justified) interpretations, reward thoughtfulness, creativity
  • Not all at once, Not too much
  • Collaborate
  • Continual reflection, refinement
  • Assess what you value
  • Consider time requirements for students and instructor!
    • Easier to solve than to explain
    • With experience, become more efficient
  • Provide sufficient guidance
    • Provide students with familiarity and clear understanding of your expectations
    • May not be used to being required to think!
    • Less comfortable writing in complete sentences