
Are They Learning What (We Think) We’re Teaching?


Presentation Transcript


  1. Are They Learning What (We Think) We’re Teaching? A. David Klappholz, Ph.D. (presenter) Stevens Institute of Technology Steven J. Condly, Ph.D. (presenter) University of Central Florida Vicki L. Almstrum, Ph.D. University of Texas at Austin Peter Henderson, Ph.D. Butler University Presented at the 5th Annual WTST (Workshop on Teaching Software Testing), February 2-5, 2006, Florida Institute of Technology, Melbourne, FL

  2. Discrete Math in CS/SE/IS • Dijkstra, Hoare, Knuth, Parnas, etc. stress its importance – including logic • Many (most?) CS/SE/IS faculty agree • DM is included in • CC2001 • IS2002 • SE2004

  3. But • Faculty do not agree on DM topics, sequencing, or integration with CS/SE/IS topics • Students don’t “get” DM’s relevance (to further studies or career) • Robert Glass says DM is irrelevant for CS/SE/IS majors going into industry (it certainly is relevant for those going into research careers…hmm…) • Does the typical CS/SE/IS student master DM? (Instructors in upper-level courses often think not) • Some say CS/SE/IS curricula are being “dumbed down” math-wise

  4. Questions • 1. How are students’ attitudes toward DM affected by (i) choice of topics, (ii) sequencing, and (iii) integration of DM with introductory CS/SE/IS? • 2. How is students’ learning of DM affected by their attitude toward DM? • 3. How do (i) choice of topics, (ii) sequencing, and (iii) integration affect performance in developing CS/SE/IS skills? (transfer?...assuming relevance) • 4. How do the answers to 1-3 affect • Overall retention • Retention of women & under-represented minorities

  5. Questions (cont.) • Q1. How to study these issues? • Q2. Have such issues been studied before? In other STEM fields? • A. Hestenes’ work on Introductory Physics

  6. The Force Concept Inventory • David Hestenes (physicist, ASU, ca. 1990): Do Intro (HS/College) physics students learn Newtonian Mechanics? • Hestenes didn’t think so! • How to prove it, especially to other physics faculty?

  7. FCI (cont.) • Devise a survey instrument: Force Concept Inventory • Cover basic Intro Physics topics • Multiple-choice (4 choices) • one right answer • three answers corresponding to common misconceptions
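
Nothing in the talk prescribes an implementation, but the item format just described is easy to represent and score mechanically. A minimal sketch, with all names (CIItem, score) hypothetical and no actual FCI items reproduced:

```python
from dataclasses import dataclass

@dataclass
class CIItem:
    """One concept-inventory item: a stem, four options, one keyed answer."""
    stem: str
    options: list[str]          # exactly 4 choices
    key: int                    # index of the correct option
    misconceptions: list[str]   # label per option; "" in the key's slot

def score(items: list[CIItem], answers: list[int]) -> tuple[int, list[str]]:
    """Return the number correct plus the misconception labels a student hit."""
    correct = sum(a == item.key for item, a in zip(items, answers))
    hit = [item.misconceptions[a]
           for item, a in zip(items, answers) if a != item.key]
    return correct, hit
```

Tracking which distractor a student chose, not just correctness, is what lets a CI report misconceptions rather than only scores.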

  8. FCI (cont.) • HS and college faculty didn’t believe Hestenes • Were ultimately convinced after administering FCI to tens of thousands of students • Changed (improved) Intro Physics instruction radically

  9. [FCI sample question; the annotated figure is omitted] Distractors appeal to the “container metaphor” and to “go power” (= the medieval concept of “impetus”). Missed by 42% of junior & senior science majors in a physics class at Harvard (1991).

  10. [FCI sample question; the annotated figure is omitted] Missed by 20% of physics graduate students. Distractors appeal to the “conflict (or war) metaphor” and the “dominance principle”: the most active agents produce the greatest force; only active agents exert forces; obstacles exert no force.

  11. [FCI answer analysis; the annotated figure is omitted] What the answers tell us: one distractor appeals to “buoyancy,” another to “under pressure”; the correct answer is the most informative. The underlying misconception: force requires an active agent, with no “passive forces” (the book just gets in the way!) – the “Force is Action” metaphor vs. the Newtonian “universality of force” as the only causal mechanism. The discriminating power of the FCI comes from the saliency of its distractors.

  12. CIs in General (from Wikipedia) A multiple-choice instrument designed to evaluate whether a person has an accurate and working knowledge of a specific set of concepts. Concept inventories are built in a multiple-choice format to ensure that they can be scored in an objective manner. Unlike a typical multiple-choice test, however, both the question and the response choices are the subject of extensive research designed to determine both what a range of people think a particular question is asking and what the most common answers are. In its final form, the concept question is presented with both a correct answer and distracters, that is, incorrect answers based on commonly held misconceptions.

  13. CIs in STEM Fields • Have been developed in • Astronomy, Chemistry, Geosciences, Dynamics, Electromagnetics, Systems and Signals, Statistics, … • At least one STEM CI attempts to get at higher levels of Bloom’s taxonomy • None yet in CS/SE/IS • Could serve for ABET’s continuous quantitative self-assessment, not just to answer the questions listed above

  14. Developing Concept Inventories • Creating a taxonomy • Develop a list of concepts for the topic (DM) • Validate for acceptance by the general community (conferences: SIGCSE, CSEE&T, ITiCSE, ICER, ICSE, ASEE, FIE; mailing lists: various) • Required background of participating faculty: • Have taught the topic many times • Have graded their own HWs and exams (so they know common misconceptions for designing good distracters) • Participation of a psychologist is crucial • Analyze to determine degree of clustering • Iterate: revise and refine
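
Whether the distracters actually work is an empirical question. As a hedged sketch of standard classical item analysis (general psychometrics, not a procedure from the talk): difficulty is the proportion answering correctly, and discrimination is the point-biserial correlation of each item with the rest-of-test score.

```python
import numpy as np

def item_statistics(responses: np.ndarray):
    """Classical item statistics from a 0/1 response matrix.

    responses: shape (n_students, n_items), 1 = correct, 0 = incorrect.
    Returns per-item difficulty (proportion correct) and point-biserial
    discrimination (correlation of each item with the rest-of-test score).
    """
    n_students, n_items = responses.shape
    difficulty = responses.mean(axis=0)      # proportion correct per item
    total = responses.sum(axis=1)
    discrimination = np.empty(n_items)
    for j in range(n_items):
        rest = total - responses[:, j]       # exclude the item itself
        discrimination[j] = np.corrcoef(responses[:, j], rest)[0, 1]
    return difficulty, discrimination

# Toy data: 6 students x 3 items
resp = np.array([[1, 0, 1],
                 [1, 1, 1],
                 [0, 0, 1],
                 [1, 1, 0],
                 [0, 0, 0],
                 [1, 1, 1]])
diff, disc = item_statistics(resp)
print("difficulty:", diff)
print("discrimination:", disc)
```

An item that strong students miss and weak students hit (negative discrimination) signals a flawed key or an over-attractive distractor.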

  15. “Views About” Surveys (from: Field-Tested Assessment Guide for STEM Instructors) • The Views About Science Survey (VASS) • surveys student views about knowing and learning science • assesses the relation of these views to student understanding of science and course achievement (grades 8-16) • Probes student views along scientific and cognitive dimensions: • Three scientific dimensions pertain to the structure and validity of scientific knowledge and to scientific methodology • Three cognitive dimensions pertain to learnability of science, reflective thinking, and personal relevance of science • In each VASS item • respondents are asked to balance two contrasting alternatives on a five-point scale, a format called “contrasting alternative design” • to assess variability in student views across disciplines, there are parallel forms of VASS for physics, chemistry, biology, general science, and mathematics

  16. Developing Views About Surveys • Far easier than developing CIs: • Base the new VA on VASS (Halloun’s Views About Science Survey) • No right/wrong responses • “What do you think?” “What’s your perception?” “How do you feel about…?” • Perform confirmatory factor analysis • Compute Cronbach’s alpha • Revise and refine
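
For concreteness, Cronbach’s alpha for k items is alpha = k/(k-1) * (1 - (sum of item variances) / (variance of the total score)). A minimal sketch (the data and names are illustrative):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (n_respondents, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)        # per-item variance
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example: 5 respondents answering 4 Likert-style items
likert = np.array([[4, 5, 4, 4],
                   [3, 3, 3, 4],
                   [5, 5, 5, 5],
                   [2, 2, 3, 2],
                   [4, 4, 4, 5]])
print(f"alpha = {cronbach_alpha(likert):.3f}")
```

Values near 1 indicate the items hang together as a single scale; low or negative alpha suggests items measuring different constructs.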

  17. A CAD (Contrasting Alternative Design) Item in VASS Form P11 • The first thing I do when solving a physics problem is: (a) represent the situation with sketches and drawings. (b) search for formulas that relate givens to unknowns. • Answer Options: 1 Only (a), Never (b); 2 Mostly (a), Rarely (b); 3 More (a) Than (b); 4 Equally (a) & (b); 5 More (b) Than (a); 6 Mostly (b), Rarely (a); 7 Only (b), Never (a); 8 Neither (a) Nor (b)

  18. Sample VASS Item Learning physics requires: (a) serious effort. (b) a special talent. What would each one of the five choices mean? 1. Mostly (a), rarely (b): Learning physics requires mostly a serious effort and rarely a special talent (or mainly the former and hardly ever the latter). 2. More (a) than (b): Learning physics requires more a serious effort than a special talent. 3. Equally (a) & (b): Learning physics requires as much a serious effort as a special talent. 4. More (b) than (a): Learning physics requires more a special talent than a serious effort. 5. Mostly (b), rarely (a): Learning physics requires mostly a special talent and rarely a serious effort (or mainly the former and hardly ever the latter).
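
The slides do not describe how CAD responses are scored. Purely as an illustrative assumption (not Halloun’s actual scheme), one might code the eight options of Form P11 as positions on the (a)-vs-(b) continuum:

```python
# Hypothetical coding of CAD (contrasting alternative design) responses.
# Maps each answer option to a position on the (a)-vs-(b) continuum,
# from +3 (fully (a)) to -3 (fully (b)); "Neither" is treated as missing.
# This mapping is an illustrative assumption, not Halloun's scoring rules.
CAD_CODES = {
    1: +3,    # Only (a), Never (b)
    2: +2,    # Mostly (a), Rarely (b)
    3: +1,    # More (a) Than (b)
    4:  0,    # Equally (a) & (b)
    5: -1,    # More (b) Than (a)
    6: -2,    # Mostly (b), Rarely (a)
    7: -3,    # Only (b), Never (a)
    8: None,  # Neither (a) Nor (b) -- coded as missing here
}

def code_responses(raw: list[int]) -> list[int]:
    """Translate raw option numbers (1-8) into continuum scores."""
    return [CAD_CODES[r] for r in raw if CAD_CODES[r] is not None]

print(code_responses([1, 4, 7, 8, 2]))  # -> [3, 0, -3, 2]
```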

  19. VASS Structure Scientific Dimensions 1. Structure. Science is a coherent body of knowledge about patterns in nature revealed by careful investigation –– rather than a loose collection of directly perceived facts. 2. Methodology. The methods of science are systematic and generic –– rather than idiosyncratic and situation specific. Mathematics is a tool used by scientists for describing and analyzing ideas –– rather than a source of factual knowledge. Mathematical modeling for problem solving involves more –– than selecting mathematical formulas for number crunching. 3. Validity. Scientific knowledge is approximate, tentative, and refutable –– rather than exact, absolute and final.

  20. VASS Structure Cognitive Dimensions 4. Learnability. Science is learnable by anyone willing to make the effort –– not just by a few talented people. Achievement depends more on personal effort –– than on the influence of teacher or textbook. 5. Reflective thinking. For meaningful understanding of science, one needs to: (a) concentrate more on the systematic use of principles –– than on memorizing facts; (b) examine situations in many ways –– instead of following a single approach from an authoritative source; (c) look for discrepancies in one’s own knowledge –– instead of just accumulating new information; (d) reconstruct new subject knowledge in one’s own way –– instead of memorizing it as given. 6. Personal relevance. Science is relevant to everyone’s life –– it is not of exclusive concern to scientists. Science should be studied more for personal benefit –– than for fulfilling curriculum requirements.

  21. Work in Progress (funded) • Develop DMCI and VADM • Research group: Klappholz (CS/SE), Henderson (CS/SE), Almstrum (CS/SE/Computing Education), Condly (Educational Psychology) • Four Advisory Board members, including one who developed a CI in another field • Three additional DM Subject Matter Experts (Almstrum and Henderson have recently taught and graded DM) • Begin studies to investigate questions • Choice of topics, sequencing and integration • Contribution of DM to desired CS/SE/IS skills

  22. Ideas for DMCI Questions (courtesy of Peter Henderson) • Basic Set Notation. This question deals with understanding basic set notation, associated concepts, and the empty set. • Let S be any well-defined set and { } be the set containing no elements. Which of the following statements is always true? • (a) { } ∈ S • (b) { } ⊆ S • (c) { } ⊂ S • (d) { } = { { } }

  23. Ideas for DMCI Questions (cont.) • Comment: • An open-book, open-notes final for a Foundations of Computing II course included a similar question where students could select more than one answer • Of the 11 students in the class • 2 gave the right answer, (b) only • For the incorrect answers • 7 of 11 selected (a) • 6 of 11 selected (c) • 1 of 11 selected (d) • These results demonstrate misconceptions • That students were rusty was not surprising, since sets were covered in the prerequisite course, Foundations of Computing I.
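
Python’s built-in sets make the distinctions behind these distracters concrete. A quick illustrative check (not part of the original exam) of why (b) is the only statement that is always true:

```python
# Checking the four candidate statements with Python's built-in sets.
# (Python has no literal for the empty set, so set() is used; a set
# stored inside a set must be a frozenset, since plain sets are unhashable.)
empty = set()

for S in [set(), {1}, {1, 2}, {frozenset()}]:
    print(f"S = {S if S else 'set()'}")
    print("  (a) {} in S:", empty in S)   # element of: depends on S
    print("  (b) {} <= S:", empty <= S)   # subset: True for every S
    print("  (c) {} <  S:", empty < S)    # proper subset: False when S == set()

# (d) compares a 0-element set with a 1-element set, so it is never true:
print("(d) {} == {{}}:", empty == {frozenset()})  # False
```

Membership and proper subset both fail when S is itself empty, so only the (non-strict) subset relation holds unconditionally.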

  24. Sample DMCI Questions (cont.) • Logical Implication. Students often struggle with the meaning of logical implication. • A teacher said to a student, “If you receive an A on the final exam, then you will pass the course.” The student did not pass the course. Which of the following conclusions are valid? • (a) The student received an A on the final exam. • (b) The student did not receive an A on the final exam. • (c) The student flunked the final exam. • (d) If the student passed the course, then he or she received an A on the final exam. • (e) None of these conclusions is valid.
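
The slides leave the answer open, but validity here is checkable by brute force: a conclusion follows iff it holds in every truth assignment satisfying the premises. An illustrative sketch (option letters refer to the choices above):

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    """Material implication: p -> q is false only when p is true and q is false."""
    return (not p) or q

# A = "received an A on the final exam", P = "passed the course".
# Premises: A -> P, and not P.
models = [(A, P) for A, P in product([True, False], repeat=2)
          if implies(A, P) and not P]          # only model: A=False, P=False

conclusions = {
    "(a) received an A":           lambda A, P: A,
    "(b) did not receive an A":    lambda A, P: not A,
    "(d) passed -> received an A": lambda A, P: implies(P, A),
}

for name, concl in conclusions.items():
    valid = all(concl(A, P) for A, P in models)
    print(name, "->", "valid" if valid else "not valid")

# (b) is valid by modus tollens; (a) is not.  (c) involves "flunked",
# a proposition absent from the premises, so it cannot follow.  Note a
# classical-logic subtlety: (d), the converse, also comes out vacuously
# true here, because the only model of the premises makes P false.
```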

  25. Future Work • Develop instruments to study early core CS/SE/IS topic areas (see CC 2001): • three or more CIs for • Algorithmic thinking (AT) • Programming fundamentals (PF) • Computing environment (CE) • Three or more VAs • A series of larger research studies, using the full suite of CI and VA instruments (i.e., for DM, AT, PF, and CE)
