Designing and evaluating assessments for introductory statistics

Designing and Evaluating Assessments for Introductory Statistics

Minicourse #1

Beth Chance ([email protected])

Bob delMas

Allan Rossman

NSF grant PI: Joan Garfield


Outline

  • Today: Overview of assessment

    • Introductions

    • Assessment goals in introductory statistics

    • Principles of effective assessment

    • Challenges and possibilities in statistics

  • Overview of ARTIST database

  • Friday: Putting an assessment plan together

    • Alternative assessment methods

    • Nitty Gritty details, individual plans


Overview

  • Assessment = on-going process of collecting and analyzing information relative to some objective or goal

    • Reflective, diagnostic, flexible, informal

  • Evaluation = interpretation of evidence, judgment, comparison between intended and actual, use information to make improvements


Dimensions of Assessment

  • Evaluation of program

    • Evaluate curricula, allocate resources

  • Monitoring instructional decisions

    • Judge teaching effectiveness

  • Evaluating students

    • Give grades, monitor progress

  • Promoting student progress

    • Diagnose student needs


Types of Assessment

  • Formative Assessment

    • In-process monitoring of on-going efforts in an attempt to make rapid adjustments

  • Summative Assessment

    • Record impact and overall achievement, compare outcomes to goals, decide next steps

  • Example: teaching

  • Example: learning


Bloom’s Taxonomy

  • Knowledge

  • Comprehension

  • Application

  • Analysis

  • Synthesis

    • Interrelationships

  • Evaluation


An Assessment Cycle

  • Set goals

  • Select methods

  • Gather evidence

  • Draw inference

  • Take action

  • Re-examine goals and methods

    Example: Introductory course

    Example: Lesson on sampling distributions


Reflect on Goals

  • What do you value?

    • Instructor and student point of view

    • Content, abilities, values

  • At what point in the course should they develop the knowledge and skills?

  • Translate learning outcomes/objectives

    • What should students know and be able to do by the end of the course

    • Must be measurable!


Some of My Course Goals

  • Understand basic terms (literacy)

  • Understand the statistical process

    • Not just the individual pieces, be able to apply

  • Be able to reason and think statistically

    • role of context, effect of sample size, caution when using procedures, belief in randomness, association vs. causation

  • Communication and collaboration skills

  • Computer literacy

  • Interest level in statistics


Possible Future Goals

  • Process (not just product) of collaboration

  • Learn how to learn

  • Appreciate learning for its own sake

  • Develop the necessary skills to understand both what they have learned and what they do not understand


Assess What You Value

Students value what they are assessed on.


Example

  • Given the numbers 5, 9, 11, 14, 17, 29

    (a) Find the mean

    (b) Find the median

    (c) Find the mode

    (d) Calculate a 95% confidence interval for μ
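For reference, parts (a)–(d) of this example can be reproduced in a few lines of Python using only the standard library; the t critical value 2.571 (for 5 degrees of freedom at 95% confidence) is taken from a standard table:

```python
import math
import statistics

data = [5, 9, 11, 14, 17, 29]
n = len(data)

mean = statistics.mean(data)        # (a) sample mean
median = statistics.median(data)    # (b) middle of the ordered values
modes = statistics.multimode(data)  # (c) every value occurs once, so all six are modes

# (d) 95% t-interval for the population mean mu:
#     mean +/- t * s / sqrt(n), with t_{0.025, df=5} ~= 2.571
s = statistics.stdev(data)
half_width = 2.571 * s / math.sqrt(n)
ci = (mean - half_width, mean + half_width)
```

The point of the slide stands out in the code: all four parts are pure computation, so a correct answer reveals nothing about whether the student understands what the interval means.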


“Traditional” Assessment

  • Good for assessing:

    • Isolated computational skills, (short-term) memory retrieval

    • Use and tracking of common misconceptions

    • How many right answers?

  • Provides us with:

    • Consistent and timely scoring

    • Predictor of future performance


“Traditional” Assessment

  • Less effective at assessing:

    • Can they explain their knowledge?

    • Can they apply their knowledge?

    • What are the limitations in their knowledge?

    • Can they make good decisions?

    • Can they evaluate?

    • Can they deal with messy data?

    • Role of prior knowledge



Nine Principles (AAHE)

  • Start with educational values

  • Multi-dimensional, integrated, over-time

  • Clearly stated purposes

  • Pay attention to outcomes and process

  • On-going

  • Student representation

  • Important questions

  • Support change

  • Accountability


Select Methods

  • Need multiple, complementary methods

    • observable behavior

    • adequate time

  • Need to extend students

    • less predictable, less discrete

  • Needs to provide indicators for change

  • Need prompt, informative feedback loop

    • On-going, linked series of activities over time

    • Continuous improvement, self-assessment

  • Students must believe in its value



Repeat the Cycle

  • Focus on the process of learning

    • Feedback to both instructors and students

    • Discuss results with students, motivate responsibility for their own learning

    • Consider other factors

  • Collaborate

    • External evaluation

  • Continual refinement

    • Consider unexpected outcomes

  • Don’t try to do it all at once!



Challenges in Statistics Education

  • Doing statistics versus being an informed consumer of statistics

  • Statistics vs. mathematics

    • Role of context, messiness of solutions, computers handling the details of calculations, need to defend argument, evaluate based on quality of reasoning, methods, evidence used

  • Have become pretty comfortable with lecture/reproduction format

    • Traditional assessment feels more objective


Challenges in Statistics Education

  • Reduce focus on calculation

  • Reveal intuition, statistical reasoning

  • Require meaningful context

    • Purpose, statistical interest

    • Meaningful reason to calculate

    • Careful, detailed examination of data

  • Use of statistical language

  • Meaningful tasks, similar to what students will be asked to do “in real life”


Some Techniques

  • Multiple choice

    • with identification of false response

    • with explanation or reasoning choices

    • with judgment, critique (when is this appropriate)

  • “What if”, working backwards, “construct situation that”

  • Objective-format questions

    • e.g., comparative judgment of strength of relationship

    • e.g., matching boxplots with normal probability plots

  • Missing pieces of output, background


Some Techniques

  • Combine with alternative assessment methods,

    • e.g., projects: see entire process, messiness of real data collection and analysis

    • e.g., case studies: focus on real data, real questions, students doing and communicating about statistics

  • Self-assessment, peer-evaluation


What Can We Learn?

  • Sampling Distribution questions

Which graph best represents a distribution of sample means for 500 samples of size 4?

[Five candidate graphs, labeled A–E, appear on the slide]
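A question like this can also be checked empirically. The sketch below (in Python, with an assumed right-skewed population, since the slide's actual population graph is not reproduced here) simulates 500 samples of size 4 and summarizes the resulting distribution of sample means:

```python
import random
import statistics

random.seed(1)

# Hypothetical population: exponential with rate 1, so population
# mean = 1 and population standard deviation = 1 (a skewed population
# chosen for illustration -- the slide's population is not shown here).
sample_means = []
for _ in range(500):                                      # 500 samples...
    sample = [random.expovariate(1.0) for _ in range(4)]  # ...each of size 4
    sample_means.append(statistics.mean(sample))

# The empirical distribution of sample means centers near the population
# mean, with spread close to sigma / sqrt(n) = 1 / sqrt(4) = 0.5.
center = statistics.mean(sample_means)
spread = statistics.stdev(sample_means)
```

With only n = 4, the simulated distribution keeps some of the population's skew, which is exactly the kind of feature the A–E choices are designed to probe.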


What Can We Learn?

  • Asking them to write about their understanding of sampling distributions

    • I now place more emphasis in my teaching on labeling horizontal and vertical axes, considering the observational unit, distinguishing between “symmetric” and “even,” and spending much more time on the concept of variability

  • Knowing better questions to ask to assess their understanding of the process


ARTIST Database

  • First…


HW Assignment

  • Assessment Framework

    • WHAT: concept, applications, skills, attitudes, beliefs

    • PURPOSE: why, how used

    • WHO: student, peers, teacher

    • METHOD

    • ACTION/FEEDBACK: and so?


HW Assignment

  • Suggest a learning goal, a method, and an action

    • Be ready to discuss with peers, then class, on Friday

  • Sample Final Exam (p. 17)

    • Skills/knowledge being assessed

    • Conceptual/interpretative vs. mechanical/computational


Day 2


Overview

  • Quick leftovers on ARTIST database?

  • Critiquing sample final exam

  • Implementation issues (exam nitty gritty)

  • Additional assessment methods

  • Holistic scoring/Developing rubrics

  • Your goal/method/action

  • Developing assessment plan

  • Wrap-up/Evaluations


Sample Final Exam

  • In-class component (135 minutes)

  • What skills/knowledge are being assessed?

  • Conceptual/interpretative vs. Computational/mechanical?


Sample Exam Question 1

  • Stemplot

  • Shape of distribution

  • Appropriateness of numerical summaries

  • C/I: 5, C/M: 3


Sample Exam Question 2

  • Bias

  • Precision

  • Sample size

  • C/I: 8, C/M: 0

  • No calculations

  • No recitation of definitions


Sample Exam Question 3

  • Normal curve

  • Normal calculations

  • C/I: 4, C/M: 3


Sample Exam Question 4

  • Sampling distribution, CLT

  • Sample size

  • Empirical rule

  • C/I: 4, C/M: 0

  • Students would have had practice

  • Explanation more important than selection


Sample Exam Question 5

  • Confidence interval

  • Significance test, p-value

  • Practical vs. statistical significance

  • C/I: 7, C/M: 2

  • No calculations needed

  • Need to understand interval vs. test


Sample Exam Question 6

  • Experimentation

  • Randomization

  • Random number table

  • C/I: 4, C/M: 4

  • Tests data collection issue without requiring data collection


Sample Exam Question 7

  • Experimental design

  • Variables

  • Confounding

  • C/I: 13, C/M: 0

  • Another question on data collection issues


Sample Exam Question 8

  • Two-way table

  • Conditional proportions

  • Chi-square statistic, test

  • Causation

  • C/I: 5, C/M: 9

  • Does not require calculations to conduct test


Sample Exam Question 9

  • Boxplots

  • ANOVA table

  • Technical assumptions

  • C/I: 7, C/M: 3

  • Even calculations require understanding table relationships


Sample Exam Question 10

  • Scatterplot, association

  • Regression, slope, inference

  • Residual, influence

  • Prediction, extrapolation

  • C/I: 15, C/M: 0

  • Remarkable in a regression question!


Sample Exam Question 11

  • Confidence interval, significance test

  • Duality

  • C/I: 9, C/M: 2

  • Again no calculations required


Sample Exam

  • C/I: 79, C/M: 28 (74% conceptual)

  • Coverage

    • experimental design, randomization

    • bias, precision, confounding

    • stemplot, boxplots, scatterplots, association

    • normal curve, sampling distributions

    • confidence intervals, significance tests

    • chi-square, ANOVA, regression


Nitty Gritty

  • External aids…

  • Process of constructing exam…

  • Timing issues…

  • Student preparation/debriefing…


Beyond Exams

  • Combine with additional assessment methods,

    • e.g., projects: see entire process, messiness of real data collection and analysis

    • e.g., case studies: focus on real data, real questions, students doing and communicating about statistics

    • generation instead of only validation…


Beyond Exams (p. 8)…

  • Written homework assignments/lab assignments

  • Minute papers

  • Expository writings

  • Portfolios/journals

  • Student projects

  • Paired quizzes/group exams

  • Concept Maps


Student Projects

  • Best way to demonstrate to students the practice of statistics

  • Experience the fine points of research

  • Experience the “messiness” of data

  • Statistician’s role as team member

  • From beginning to end

    • Formulation and Explanation

    • Constant Reference


Student Projects

  • Choice of Topic

    • Ownership

  • Choice of Group

    • In-class activities first

  • Periodic Progress Reports

  • Peer Review

  • Guidance/Interference

    • Early in process

    • Presentation (me, alum, fellow student)

    • Full Lab Reports

    • statweb.calpoly.edu/chance/stat217/projects.html


Project Issues

  • Assigning Grades, individual accountability

  • Insignificant/Negative Results

  • Reward the Effort

  • Iterative

    • Encourage/expect revision

  • Long vs. Short projects

    • Coverage of statistical tools

  • Workload


Holistic Scoring

  • Not analytic - each part = X points

  • Problem is graded as a whole

  • Calculations are one of many parts

  • Strengths in one section can balance weaknesses in another


Holistic Scoring

  • Did the student demonstrate knowledge of the statistical concept involved?

  • Did the student communicate a clear explanation of what was done in the analysis and why?

  • Did the student express a clear statement of the conclusions drawn?


Holistic Scoring

  • Students may lose points if they don’t clearly explain

    • why method was chosen

    • assumptions of method

    • line of reasoning

    • final conclusion in context


Developing Rubrics

  • Scoring guide/plan

    • Consistency, inter-rater reliability

  • Focus on goal/purpose of the question

    • What information do you hope to learn based on the student’s performance

  • Identify valuable student behavior/correct characteristics

  • List those characteristics in observable terms (Best/Good/Fair/Poor)
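One way to make the “observable terms” concrete is to write the rubric down as data so it can be applied consistently across graders. The sketch below is purely illustrative; the level descriptions and point values are hypothetical, not taken from the presentation:

```python
# A hypothetical holistic rubric for one open-ended exam question.
# Each level is described in observable terms (what the student's work
# shows), following the Best/Good/Fair/Poor scheme from the slide.
RUBRIC = {
    "Best": "Appropriate method, assumptions checked, clear reasoning, conclusion stated in context",
    "Good": "Appropriate method and conclusion; reasoning mostly clear with minor omissions",
    "Fair": "Appropriate method chosen, but reasoning or conclusion incomplete",
    "Poor": "Inappropriate method, or no supporting reasoning given",
}

# Point values for the grade book (also hypothetical).
POINTS = {"Best": 4, "Good": 3, "Fair": 2, "Poor": 1}

def score(level: str) -> int:
    """Map a holistic level to points; raises KeyError for unknown levels."""
    return POINTS[level]
```

Writing the levels out this way supports inter-rater reliability: two graders can compare the work against the same observable descriptions rather than against a per-part point breakdown.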


Developing Rubrics

  • Multiple procedures, variety of techniques

  • Can you see the students’ thought processes, knowledge of assumptions


Developing Assessment Plan

  • Match (most important) instructional goals

    • Start with learning outcomes, own questions

  • Multiple and varied indicators

    • Inter-related, complementary

  • Well-defined, well-integrated throughout course

    • Detailed expectations, part of learning process

  • Goals understood by students

    • Promote self-reflection, responsibility, trust


Developing Assessment Plan

  • Timely, consistent feedback

    • indicators for change, feedback loop, reinforcement

  • Individual and group accountability

  • Openness to other (justified) interpretations, reward thoughtfulness, creativity

  • Not all at once, Not too much

  • Collaborate

  • Continual reflection, refinement

  • Assess what you value


Cautions!

  • Consider time requirements for students and instructor!

    • Easier to solve than to explain

    • With experience, become more efficient

  • Provide sufficient guidance

    • Provide students with familiarity and clear understanding of your expectations

    • May not be used to being required to think!

    • Less comfortable writing in complete sentences


Wrap-Up

