
Task-Based Assessment of Students' Computational Thinking Skills

This study aims to assess the computational thinking skills of 6th- and 9th-grade students through task-based assessment in visual programming or tangible coding environments. The study explores the difficulty level and targeted concepts of each task, as well as the validity of the assessment tool.


Presentation Transcript


  1. Task-based assessment of students' computational thinking skills developed through visual programming or tangible coding environments — Takam Djambong (Université de Moncton), Viktor Freiman (Université de Moncton)

  2. Objectives • Assess to what extent the scores of 6th- and 9th-grade students from one middle school could be used to discriminate among different tasks measuring computational thinking skills in programming learning environments, with respect to: • the estimated level of difficulty of each task; • the type of targeted computational thinking concepts present in each task.

  3. Context • This study derives from an initiative launched in Canada in 2011 by the Social Sciences and Humanities Research Council (SSHRC), calling for the identification of new ways of learning that Canada will need to succeed in a future society and labour market that will be increasingly • digital, • complex, • and interconnected (Beaudoin et al., 2014, p. 40).

  4. Rationale of the Study • This study seeks to contribute to the need for a better conceptualization of the construct of computational thinking in terms of: • its conceptual and operational definition; • its mode of K-12 curricular integration (educational practices, devices, and content); • assessment tools or instruments. The study is discussed only in terms of the analysis of one example of an assessment tool used to measure computational thinking skills, in order to anticipate its possible validity.

  5. Theoretical framework • This study relied on the articulation of three main concepts: • Computational thinking (according to the definition proposed in 2009 by the CSTA); • Technology-rich learning environments (TRE) (in the sense given in the context of the 2012 PIAAC survey); • Problem-solving tasks (according to the task-based assessment methodology used in the design of Bebras International Challenge tasks, whose construct validity is supported by several previous empirical studies) (Dagiene & Stupuriené, 2015; Dolgopolovas et al., 2016).

  6. Research questions • Two research questions guided our study: • How does the estimated level of difficulty of each task reflect the scores obtained by 6th- and 9th-grade students during pre- and post-tests? • How do the computational thinking concepts (or their combination) present in each task reflect the scores obtained by 6th- and 9th-grade students during pre- and post-tests?

  7. Methodology The methodology was that of a quasi-experimental exploratory study with pre- and post-test and a single group; 9th-grade students (n = 14) and 6th-grade students (n = 10) participated in the experiment; the experiment lasted six weeks, during which students were exposed to problem-solving activities in the Scratch environment (Grade 6) and with the EV3 Lego Robotics kit (Grade 9); 23 tasks were used to measure CT skills; descriptive methods were used to analyze the results.
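The descriptive analysis mentioned above can be sketched as follows. This is a minimal illustration only: the task IDs, difficulty labels, and score values are hypothetical placeholders, not the study's actual data.

```python
from statistics import mean

# Hypothetical task records: (task_id, estimated difficulty,
# mean pre-test score, mean post-test score). Illustrative values only.
tasks = [
    ("T01", "easy",   0.80, 0.85),
    ("T02", "medium", 0.55, 0.65),
    ("T03", "hard",   0.30, 0.45),
    ("T04", "easy",   0.75, 0.90),
]

def mean_scores_by_difficulty(tasks):
    """Group tasks by estimated difficulty and average pre/post scores."""
    groups = {}
    for _, difficulty, pre, post in tasks:
        groups.setdefault(difficulty, []).append((pre, post))
    return {
        d: (round(mean(p for p, _ in scores), 2),
            round(mean(q for _, q in scores), 2))
        for d, scores in groups.items()
    }

print(mean_scores_by_difficulty(tasks))
```

Comparing the per-difficulty pre/post means in this way is one simple check of whether observed scores track the designers' estimated difficulty levels.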

  8. Results (1)

  9. Results (2)

  10. Results (3)

  11. Results (4)

  12. Discussion (1) The results of both the pre-test and the post-test show some alignment between the scores obtained for a given task and the level of difficulty estimated by the designers of each of the proposed Bebras tasks. The tasks proposed in this study differ from each other by: the type (AB, AL, DE or PR) of computational thinking concept present in each task; the number of such concepts per task (1, 2, 3 or 4); the estimated level of difficulty (easy, medium or hard) of the task; and the nature of the task to solve (checking or performing a procedure).
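The task typology described above can be sketched as a small data structure. The concept codes (AB, AL, DE, PR) are taken from the slide as-is; the task IDs and their concept/difficulty assignments below are hypothetical examples, not the study's actual classification.

```python
# Hypothetical Bebras-style task classification. Concept codes follow the
# slide's labels (AB, AL, DE, PR); assignments here are illustrative only.
tasks = {
    "T01": {"concepts": ["AL"],                   "difficulty": "easy"},
    "T02": {"concepts": ["AB", "AL"],             "difficulty": "medium"},
    "T03": {"concepts": ["AB", "DE", "PR"],       "difficulty": "medium"},
    "T04": {"concepts": ["AB", "AL", "DE", "PR"], "difficulty": "hard"},
}

def concept_count_by_difficulty(tasks):
    """List how many CT concepts each task targets, grouped by difficulty."""
    out = {}
    for info in tasks.values():
        out.setdefault(info["difficulty"], []).append(len(info["concepts"]))
    return out

print(concept_count_by_difficulty(tasks))
```

A table like this makes it easy to test the conclusion drawn later in the slides: whether the number of concepts per task actually co-varies with its estimated difficulty.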

  13. Discussion (2) Other factors not identified in this study may better explain the results, which require stronger empirical evidence, given the following limitations: the small sample size does not allow the observations to be generalized; participants were not randomly assigned, which may have influenced and biased the results; the timing of the study (end of the school year) may have had an influence; the learning experience acquired during the project may have had an influence.

  14. Conclusion (1) In conclusion, this study shows that: there could be a link between students' ability to solve the proposed tasks, the type of targeted skills related to computational thinking, and the degree of difficulty or complexity of the proposed tasks, but this needs to be validated by more solid empirical evidence; there would not necessarily be a link between the level of difficulty of a task and the number of computational thinking concepts it involves.

  15. Conclusion (2) However, this study justifies the need for further studies to validate the proposed tasks on the basis of more solid empirical evidence. It might therefore be useful to examine, in future research, the effect that the nature of the pedagogical intervention in programming environments (visual compared to tangible) could have on achievement on the proposed set of tasks.

  16. Acknowledgements This ongoing study is being conducted with the support of the Social Sciences and Humanities Research Council of Canada (SSHRC/CRSH; Partnership Development Grant #890-2013-0062), the New Brunswick Innovation Foundation (2016 Research Assistantship Program), and le Secrétariat aux Affaires Intergouvernementales Canadiennes du Québec (Programme de soutien à la Francophonie canadienne).
