Presentation Transcript


  1. Applying an Action Research Model to Assess Student Understanding of the Central Limit Theorem in Post-Calculus Probability and Statistics Courses
JSM, Minneapolis, August 7, 2005
M. Leigh Lunsford, Longwood University
Ginger Holmes Rowell, Middle TN State University
Tracy Goodson-Espy, Appalachian State University

  2. Action (Classroom) Research
• Use our classes as "mini-labs" in which we can gather data and conduct experiments that will (hopefully!) improve our teaching.
• NOT a randomized study!
• Research done in post-calculus mathematical probability and statistics courses.
• Building on previous researchers' work, not reinventing the wheel!
• Comparison of our results to previous results.
• Conclusions and conjectures based on our data and classroom observations may be interesting to other educators.
Our Ultimate Goal: To Improve Our Teaching!

  3. The Big Picture
• Work done via our NSF DUE A&I Collaborative Research Award*: "Adaptation & Implementation of Activity & Web-Based Materials in Post-Calculus Introductory Probability & Statistics Courses"
• Materials for A&I:
  • "A Data-Oriented, Active Learning, Post-Calculus Introduction to Statistical Concepts, Methods, and Theory (SCMT)," A. Rossman, B. Chance, K. Ballman, NSF DUE-9950476
  • "Virtual Laboratories in Probability and Statistics (VLPS)," K. Siegrist, NSF DUE-9652870
• Use of activities and simulation (in-class and out-of-class) throughout the semester to improve teaching and student understanding in our post-calculus introductory probability and statistics courses.
*This project was partially supported by the National Science Foundation. The project started in June 2002 and continued through August 2004.

  4. Today's Topic! Coverage and Assessment of Student Understanding of Sampling Distributions and the CLT in "Math 300 – Introduction to Probability"
• First semester of the typical post-calculus mathematical probability and statistics sequence
• Prerequisite: Calculus II (through sequences and series)
• Student population: approximately 50% Computer Science majors, 20% Engineering, 10% Mathematics, 20% other science majors and graduate students
• Traditional text: Probability and Statistical Inference, 6th Edition, Hogg and Tanis

  5. Sampling Distributions and CLT Coverage
• Less time spent on multivariate distributions (in order to get to the CLT and its applications sooner).
• Text supplemented with:
  • In-class activity using the Sampling SIM* software, modified from an early SCMT activity (a simulation sketch of this kind of activity follows this slide).
  • Written report based on an extension of the activity (including use of Sampling SIM outside of class).
• Supplemental materials placed more emphasis on graphical understanding of sampling distributions and the CLT.
• Also briefly covered (via the textbook) an application of the CLT to confidence intervals and sample sizes for proportions.
*2001 - delMas, R. Sampling SIM. Online at http://www.gen.umn.edu/research/stat_tools/
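The in-class activity and written report center on simulating the sampling distribution of the sample mean. Below is a minimal Python/NumPy sketch of that kind of simulation, not the Sampling SIM software itself; the exponential population, the `sample_means` helper, and the sample sizes shown are illustrative choices that mirror the n = 4 and n = 25 items on the assessment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative skewed "population"; in the actual activity students build
# irregular and skewed populations interactively in Sampling SIM.
population = rng.exponential(scale=2.0, size=100_000)

def sample_means(pop, n, reps=500):
    """Draw `reps` random samples of size n from pop and return their means."""
    samples = rng.choice(pop, size=(reps, n), replace=True)
    return samples.mean(axis=1)

for n in (4, 25):
    means = sample_means(population, n)
    print(f"n = {n:2d}: mean of sample means = {means.mean():.3f}, "
          f"SD of sample means = {means.std(ddof=1):.3f}, "
          f"sigma/sqrt(n) = {population.std() / np.sqrt(n):.3f}")
```

Even at n = 4 the spread of the 500 simulated means is roughly half the population spread, and by n = 25 the histogram of means is close to normal; these are exactly the features the assessment items on the following slides probe.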

  6. Assessment Tools
Quantitative:
• Used an assessment tool developed by delMas et al.* that measures student understanding and reasoning regarding sampling distributions and the CLT.
• Assessment items roughly characterized as:
  • Graphical (the focus of this presentation)
  • Fact (theory) recollection / computational (please see our paper)
• The same assessment tool was used as both a pre-test and a post-test, given before (week 11) and after (week 14) coverage of sampling distributions and the CLT.
• Data exploration of student responses to examine student understanding of the CLT and related concepts.
Qualitative:
• Students' self-perception of learning of concepts, use of technology, etc.
• Survey administered at the beginning and at the end of the semester.
*Sampling Distributions Post Test - 2002 - delMas, R., Garfield, J., and Chance, B.

  7. Assessment Tool Graphical Item Example* (Irregular Population)
• 1*. Which graph best represents a distribution of sample means for 500 samples of size n = 4? (Choices A, B, C, D, E)
• 2(a)*. What do you expect for the shape of the sampling distribution? More like a normal distribution, or more like the population?
• 2(b)*. I expect the sampling distribution to have (choose one) less / the same / more variability than/as the population.
• Same questions asked for n = 25 (items 3* and 4*).
• Similar questions asked for a "skewed" population.
*Please see your handout!

  8. Graphical Measures from Assessment Item
• Percent of students with correct identification of the sampling distribution for large and small n (sample size).
• Distribution of students into "reasoning" categories by considering answer pairs for choices of the sampling distribution histogram (answer for n = 4, answer for n = 25). Measures reasoning about variability of the sampling distribution and the CLT effect on shape as n increases (delMas et al.).
• Percent of students who showed "consistent graphical reasoning" regarding their stated belief about the shape and variance of the sampling distribution versus their choice of sampling distribution.
Measures were taken for both "irregular" and "skewed" populations.

  9. Correct Identification of Sampling Distribution *Please see your handout!

  10. Classification of Student Response Pairs into Reasoning Categories
*2002 – delMas, R., Garfield, J., and Chance, B.

  11. A Few Observations
• Students classified as having correct, good, or L-S normal reasoning for the irregular population increased from 11% on the pre-test to 78% on the post-test (N = 18). Consistent with the results of delMas et al.*
• Majority of students fell in the L-S Normal category, mainly because they missed the sampling distribution for n = 4.
• Responses for the sampling distribution when n = 4: A (8, or 44.4%), B (2, or 11.1%), C (6, or 33.3%), E (2, or 11.1%)
• Responses for the sampling distribution when n = 25: B (1, or 5.6%), C (2, or 11.1%), D (2, or 11.1%), E (14, or 77.8%)
WHY? Possible explanations:
• Incorrect graphical interpretation (i.e., confusing variation with frequency, improper knowledge of histogram shape)?
• Incorrect knowledge of the CLT (averaging effect on shape)?
• Incorrect understanding of the variability of the sampling distribution of sample means (the exact σ/√n relationship is restated after this slide)?
*2002 - delMas et al.
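For reference, the exact variability result covered in the course via mathematical expectation (standard textbook material, restated here rather than quoted from the slides) already dictates how much narrower the sampling distribution should appear at the two sample sizes on the item:

```latex
% For an i.i.d. sample of size n with population standard deviation \sigma:
\sigma_{\bar{X}} = \frac{\sigma}{\sqrt{n}}
  \qquad\Longrightarrow\qquad
\sigma_{\bar{X}} = \frac{\sigma}{2} \ \text{for } n = 4,
  \qquad
\sigma_{\bar{X}} = \frac{\sigma}{5} \ \text{for } n = 25 .
```

So a correct choice for n = 4 must already show noticeably less variability than the population, even though its shape is only partially "normalized."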

  12. Measure of Consistent Graphical Reasoning
A student was considered to have "consistent graphical reasoning" if the sampling distribution chosen was consistent with the stated variance and shape (regardless of correctness!).

  13. Details of Consistent Graphical Reasoning *Please see your handout!

  14. Conclusions/Conjectures
Our students are generally displaying "consistent graphical reasoning." So where are they going wrong?
• Not recognizing that the averaging effect of the CLT on shape occurs quickly (waiting for the magic number n = 30).
• Not able to graphically estimate the magnitude of the standard deviation; some may be confusing variability with frequency.
• May be confusing the limiting shape result of the CLT with the fixed variability result for the sampling distribution obtained via mathematical expectation (the two results are contrasted after this slide).
• We cannot necessarily expect upper-level students to extend their computational and/or theoretical knowledge to the graphical realm.
• Qualitative results show students generally like using technology and believe that group work and activities contribute to their learning.
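For reference, here is a statement of the two results that appear to be conflated, phrased in standard textbook notation rather than quoted from the slides. For an i.i.d. sample X_1, ..., X_n with mean μ and variance σ²:

```latex
% Fixed (exact, non-asymptotic) result, obtained via mathematical expectation:
\mathbb{E}[\bar{X}_n] = \mu,
\qquad
\operatorname{Var}(\bar{X}_n) = \frac{\sigma^2}{n}
\quad \text{for every } n \ge 1.

% Limiting shape result (Central Limit Theorem):
\frac{\bar{X}_n - \mu}{\sigma/\sqrt{n}} \;\xrightarrow{\;d\;}\; N(0,1)
\quad \text{as } n \to \infty .
```

The first result pins down the center and spread of the sampling distribution exactly for every sample size; only the second concerns shape, and it is asymptotic.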

  15. What should be done next, based on what was learned?
• Focus more on graphical estimation and interpretation skills earlier in the semester.
• Improve lectures, the activity, and homework to make a clearer distinction between the fixed variance result and the limiting shape result for sampling distributions of sample means.
• Continue to use technology wisely (especially computer simulations) to enhance teaching.
• Modify assessment questions to more easily target where and how students are reasoning incorrectly/correctly.
• Compare our Math 300 students with our Math 400 students via the same assessment tool (see our paper*).
*2005 – Lunsford, Rowell, and Goodson-Espy, in preparation.

  16. More Results! We have shown only a small portion of our results! Please see our soon-to-be-submitted paper* for:
• More graphical results, in more detail
• Results from other parts of the assessment tool (fact recollection/computational)
• Comparison of our Math 300 and Math 400 students using the same assessment tool (both courses taught in Spring '04)
• More observations and conjectures from our data!
*2005 – Lunsford et al., in preparation.

  17. References
• delMas, R., Garfield, J., and Chance, B. (1999), "A Model of Classroom Research in Action: Developing Simulation Activities to Improve Students' Statistical Reasoning," Journal of Statistics Education, 7(3), http://www.amstat.org/publications/jse/secure/v7n3/delmas.cfm.
• delMas, R. (2001), Sampling SIM. Online at http://www.gen.umn.edu/faculty_staff/delmas/stat_tools/
• delMas, R., Garfield, J., and Chance, B. (2002), "Assessment as a Means of Instruction," presented at the 2002 Joint Mathematics Meetings, online at http://www.gen.umn.edu/research/stat_tools/jmm_2002/assess_instruct.html
• Hollins, E. R. (1999), "Becoming a Reflective Practitioner," in Pathways to Success in School: Culturally Responsive Teaching, eds. E. R. Hollins and E. I. Oliver, Mahwah, NJ: Lawrence Erlbaum Associates.
• Hopkins, D. (1993), A Teacher's Guide to Classroom Research, Buckingham: Open University Press.
• Lunsford, M. L., Rowell, G. H., and Goodson-Espy, T. J. (2005), "Classroom Research: Assessment of Student Understanding of Sampling Distributions of Means and the Central Limit Theorem in Post-Calculus Probability and Statistics Classes," in preparation.
• Noffke, S., and Stevenson, R. (eds.) (1995), Educational Action Research, New York: Teachers College Press.
• Rowell, G. H., Lunsford, M. L., and Goodson-Espy, T. J. (2003), "An Application of the Action Research Model for Assessment: Preliminary Report," Joint Statistical Meetings Proceedings, Summer 2003.

  18. Contact Information M. Leigh Lunsford: lunsfordml@longwood.edu Ginger Holmes Rowell: rowell@mtsu.edu
