
Using Structured Inquiry Based Assessments in a Quality Assurance Course


Presentation Transcript


  1. Using Structured Inquiry-Based Assessments in a Quality Assurance Course – Assessing Connected Knowledge. Murray Black, Auckland University of Technology

  2. Fragmented Knowledge Domain – Comparison of Statistical Techniques • Incomplete statistical literacy – lack of knowledge and application of statistical techniques to reasoning and interpretation. • Statistical literacy – knowledge and application of statistical techniques to reasoning and interpretation.

  3. Fragmented Knowledge Domain – Example (diagram): Comparison of Statistical Techniques, contrasting incomplete and complete statistical literacy across the techniques standard deviation, significance testing, mean and boxplot.

  4. Ideal Linkages – Statistical Literacy and Statistical Reasoning • Boxplot – applied in context to show appropriate data visualisation. • Mean and standard deviation – calculated and interpreted correctly in context. • Significance testing – appropriate test chosen and applied correctly in context.
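To make the linkage concrete, here is a minimal sketch in Python (chosen only for illustration; the presentation contains no code) that applies the three linked techniques to a made-up turn-around-time data set: a boxplot for visualisation in context, the mean and standard deviation as summaries, and a significance test – a Welch two-sample t-test here, as one possible choice. All data and labels are hypothetical.

```python
# Illustrative sketch only: boxplot, mean/standard deviation and a significance
# test linked in one hypothetical turn-around-time (TAT) context.
import statistics
from scipy import stats
import matplotlib.pyplot as plt

lab_a = [2, 5, 6, 7, 8, 9, 11]    # hypothetical TATs in hours for one lab
lab_b = [4, 6, 8, 9, 10, 12, 13]  # hypothetical TATs in hours for a second lab

# Boxplot: applied in context to show appropriate data visualisation.
plt.boxplot([lab_a, lab_b])
plt.xticks([1, 2], ["Lab A", "Lab B"])
plt.ylabel("Turn-around time (hours)")
plt.savefig("tat_boxplot.png")

# Mean and standard deviation: calculated and interpreted in context.
print("Lab A mean:", statistics.mean(lab_a), "hours, sd:", round(statistics.stdev(lab_a), 2))

# Significance testing: an appropriate test chosen and applied in context
# (here a Welch t-test comparing the two labs' mean TATs).
print(stats.ttest_ind(lab_a, lab_b, equal_var=False))
```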

  5. Thought Process • Concept applied to a context • Context to which an appropriate concept is applied

  6. SCHEMA as Structural Knowledge (Skemp 1987) • Schema – a mental storage mechanism that is structured as a network of knowledge (Marshall 1995). • Feature of a schema – the presence of connections. • A schema results from repeated exposure to problem-solving situations that have features in common. • In an assessment the student extracts the most relevant of those features and either assimilates these features into existing schemas or creates new schemas.

  7. Levels of Statistical Reasoning Applied to a Structured Inquiry • Idiosyncratic reasoning • Verbal reasoning • Transitional reasoning • Procedural reasoning • Integrated process reasoning (Garfield 2002)

  8. Task Word Linkages • Idiosyncratic reasoning – remembering or recognising a key word • Verbal reasoning – classifying with meaning • Transitional reasoning – interpreting locally in context • Procedural reasoning – interpretations partially integrated • Integrated reasoning – interpretations fully integrated

  9. Link to SOLO Taxonomy – Level of Statistical Reasoning (Biggs and Collis 1982) • Multi-structural – independent aspects treated separately. • Relational – integrated aspects treated as a whole.

  10. Assessment Process – Independent, Using a Concept: Knowledge → Use → Objectives → Choice of Method. Example: Calculate the mean turn-around time (TAT) of the following times (all in hours): 2, 5, 6, 7 and 8. Interpret your answer.
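As a quick numerical check of the worked example above, a minimal sketch in Python (used purely for illustration; the course does not prescribe a language):

```python
# Worked example from the slide: mean turn-around time (TAT) of five sample times.
tat_hours = [2, 5, 6, 7, 8]
mean_tat = sum(tat_hours) / len(tat_hours)
print(f"Mean TAT = {mean_tat:.1f} hours")  # 5.6 hours
```

Interpretation, as the question requires: on average, a sample takes about 5.6 hours to turn around.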

  11. Assessment Process – Integrated, Using Context: Objectives → Choice of Method → Knowledge → Use. Example: How could we measure and interpret the typical turn-around time (TAT) for a blood sample diagnosis?

  12. The Study – Comparative Results • Traditional structure: Assignments 1 and 3 with an exam. • Inquiry approach: Assignment 2 and a test.

  13. Levels of Inquiry (Banchi and Bell 2008) • Confirmation – students practise a specific inquiry skill, e.g. students collect and record data. • Structured – teacher provides the question and the procedure; students generate an explanation supported by the evidence. • Guided – teacher provides the research question and students design the procedure (method) to test the question. • Open – students derive the question, design and carry out the investigation, then communicate their results.

  14. Types of Inquiry • Assignment: Identify all the quality control at LabPlus. (Guided) Note: students need to identify and describe the following: types of quality tools; use of quality tools; how the tools are used. • Test: Examine all the quality control at the packinghouse over the stages of raw materials control, in-process control and finished product inspection. (Structured) Note: some evidence is provided at each stage.

  15. Assessments – Comparison of Two Approaches • Traditional structure: specific testing of concepts in the main areas of graphs, measures and significance testing. • Inquiry approach: testing the best answer to a question involving aspects of a quality assurance programme, the functioning of LabPlus and elements of kiwifruit packing.

  16. Characteristics of the Sample • Background of students: Level 2 Mathematics, achievement standards, gaps in knowledge of statistics. • Student cohort: science majors in Food Safety or Food Science, a class in the second year of the BSc, a balance of genders.

  17. Analysis of Results – Statistical Measures

  18. Observations • The mean for Assignment 2 was much higher, and the scores more consistent, than for Assignments 1 and 3. • The mean for the test was higher than for the exam; however, the exam results were more consistent than the test results. • The means and medians were similar, which indicated very little skew.

  19. Analysis of Results – Statistical Tests of Hypotheses • In comparing assignments, the Wilcoxon signed-rank test confirmed that the mean of Assignment 2 was significantly higher than that of Assignments 1 and 3 combined, to a high level of significance (V = 131, p = 0.0001526 < 0.01). • In comparing the test with the exam, the Wilcoxon signed-rank test confirmed that the mean of the test was significantly higher than that of the exam, to a high level of significance (V = 103, p = 0.007278 < 0.01). • For the controlled assessments (test and exam), the scores were significantly correlated (r = 0.62925 > 0.4972 at the 5% level of significance), which reflects reliability with respect to ability in statistics. • However, for the uncontrolled assessments, namely the assignments, the scores were not significantly correlated (r = 0.37123 < 0.4972 at the 5% level of significance). This reflects the differing conditions under which uncontrolled assessments are attempted by students compared with controlled assessments. Students tended to do better on some parts of assignments than on others, so there was a lack of consistency.
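The V statistics and p-values above are in the style of R's wilcox.test output. As an illustration only, and not the author's analysis code, the same two kinds of comparison can be sketched in Python with scipy; the score vectors below are placeholders, since the students' actual marks are not reproduced in the presentation.

```python
# Illustrative sketch of the analyses described above; all marks are placeholders.
from scipy.stats import pearsonr, wilcoxon

assignment_2   = [78, 85, 90, 72, 88, 81, 79, 93]  # placeholder paired marks
assignments_13 = [65, 70, 74, 60, 72, 68, 66, 80]  # placeholder marks, Assignments 1 and 3 combined

# Paired, non-parametric comparison of the two assessment conditions.
stat, p_value = wilcoxon(assignment_2, assignments_13)
print(f"Wilcoxon signed-rank: statistic = {stat}, p = {p_value:.4f}")

# Correlation between the two controlled assessments (test and exam).
test_scores = [55, 62, 70, 48, 66, 59, 73, 64]  # placeholder marks
exam_scores = [50, 60, 72, 45, 63, 57, 70, 61]  # placeholder marks
r, p = pearsonr(test_scores, exam_scores)
print(f"Pearson r = {r:.3f} (p = {p:.3f})")
```

Note that the slide compares r with a tabulated 5% critical value, whereas pearsonr reports a p-value directly; for a given data set the two approaches lead to the same decision.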

  20. Evaluation of Other Outcomes – Affective Domain • “Pre” and “post” versions of the SATS (Survey of Attitudes Towards Statistics). • Twenty-eight items assessed on a Likert scale ranging from 1 (strongly disagree) through 4 (neither disagree nor agree) to 7 (strongly agree). • Subscale scores are formed by summing the items belonging to each subscale. • Subscales are: Affect, Cognitive Competence, Value and Difficulty (Schau, C., Dauphinee, T., & Del Vecchio, A., 1992).

  21. Subscales • Affect – I am scared by statistics; I will like statistics. • Cognitive Competence – I can learn statistics; I will have no idea of what’s going on in statistics. • Value – I use statistics in everyday life; statistics is irrelevant in my life. • Difficulty – Statistics is highly technical; statistics formulas are easy to understand.
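A minimal scoring sketch for the SATS subscales described on the last two slides. The item-to-subscale mapping, the choice of reverse-coded items and the response vector below are all hypothetical placeholders; the published SATS instrument defines its own item groupings and negatively worded items (such as “I am scared by statistics”), which are reverse-coded before summing.

```python
# Illustrative SATS-style subscale scoring; item groupings and data are hypothetical.
SCALE_MAX = 7  # 7-point Likert scale: 1 = strongly disagree ... 7 = strongly agree

# Hypothetical 0-based item positions for each subscale.
subscale_items = {
    "Affect":               [0, 1, 2],
    "Cognitive Competence": [3, 4, 5],
    "Value":                [6, 7, 8],
    "Difficulty":           [9, 10, 11],
}
reverse_coded = {1, 4, 7, 10}  # hypothetical negatively worded items

responses = [5, 2, 6, 7, 3, 6, 5, 2, 6, 4, 3, 5]  # one student's made-up answers

def subscale_score(name: str) -> int:
    """Sum the (reverse-coded where needed) item responses for one subscale."""
    total = 0
    for i in subscale_items[name]:
        r = responses[i]
        total += (SCALE_MAX + 1 - r) if i in reverse_coded else r
    return total

for name in subscale_items:
    print(name, subscale_score(name))
```

The “pre” and “post” subscale totals are then compared, as in the outcomes slides that follow.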

  22. Outcomes

  23. Conclusions from the Outcomes • The changes in the mean values of Value and Difficulty were significant. • The changes in the mean values of Affect and Cognitive Competence were not significant. • The real-life links were valued by the students. • The significant difference in the means for Difficulty was consistent with the scores.

  24. Conclusions • This paper contained several real-world applications where a process could be followed through completely and several statistical concepts were “in play”. This meant that the teaching which preceded the assessments could follow an inquiry-based approach. • This research showed that the results were significantly better when structured inquiry-based approaches were used in both the assignment and the test assessments. • It can be concluded that contextual links play a positive role in a student’s ability to recall information about statistical concepts and hence increase their ability to apply that information to the problem at hand. • We have evidence that these contextual links allow a student to see the purpose before the application is decided upon; hence an appropriate method is more likely to be remembered by the student when completing another problem.

  25. Implications for the Future • Preparation for university study – certificate course; secondary school with the current Level 1 mathematics review. • Teaching strategies – can achievement standards be linked? An opportunity presented itself with the NZ Scholarship Statistics examiner position (2004 to 2018). • Types of assessment – inquiry-based, portfolio, tests, assignments; develop own assignments, e.g. queue data. • Assessment strategies – theme-paper strategies. • Extending the study – include a range of subjects, with or without team teaching.

  26. Issues Raised • Transmission of the right method and interpretation to other problems. • Use of links such as quality tools, and common headings like raw materials, in-process control and finished product inspection. • Meaning of “structured” – a set of questions leading to a conclusion – is this the best product? • How do you cope with someone who prefers and defends the traditional thinking model? • Maths issues in Australia raised by Alan Finkel (Chief Scientist) resonated with my description of the NZ situation, with a plea for the strength of the STEM disciplines to be maintained. • What are your favourite examples to use? E.g. kiwifruit from orchard to Europe.

  27. Observations from the STEM Conference • Held every two years between QUT, Beijing and UBC (University of British Columbia). • The main theme was “Integration with STEM”. • Participants were from primary and secondary schools along with tertiary institutions. • Tertiary participants were mainly teaching first-year education papers. • Presentations were timed at 25 minutes including questions; they were education based, with appropriate methodologies. • STEM participants were all based in education departments, which gives us a point of distinction and focus. • All research presented was about the design and delivery of both the curriculum and the assessments, not the curriculum content itself.

  28. Further Observations • We were challenged to look for and use opportunities for STEM integration. • A degree option for an engineering degree was being trialled in Edinburgh between Napier University and industry; its effectiveness was being evaluated using research tools. • One of the main speakers, Felicity Furey, divided us into groups of three during her lecture and had us work on a problem (designing the path of a new road) that illustrated the use of critical and creative problem-solving skills. • Lyn English – development of content knowledge and the adaptation and application of this content knowledge to the solution of new problems. • Felicity Furey – engaging students by bringing the real world into the classroom; we need to communicate the “why” of what we do in STEM and not just the “how”. • Excellent facilities – layout of rooms, project poster screens.

  29. Thank You – Any Questions? • murray.black@aut.ac.nz
