
Lecture overview & topics



  1. Lecture overview & topics • Blended learning = face-to-face learning + e-learning • PBL: problem-based learning • Formative assessment = test-steered e-learning (ALEKS, MyLabs) • Learning analytics => dispositional LA

  2. Formative assessment: the Dutch SURF project on testing & test-driven learning • National goal: improve the transfer from high school to university and reduce first-year dropout for math & stats, by creating adaptive learning paths that use e-tutorials/online platforms for practicing & formative assessment. • Local goal (Maastricht): do so for a large (1,000/year), very international (75%) and very heterogeneous population of first-year business & economics students.

  3. Maastricht’s blended learning • Face-to-face component: problem-based learning (adapted from McMaster University): collaborative learning based on social-constructivist principles (with the support of lecture cycles) • E-component: several test-based learning environments, such as: • the adaptive tutorial ALEKS: individual learning in an e-learning environment based on Knowledge Space theory (AI) • MyStatLab, Pearson • BlackBoard-based tools • This blended learning environment is our ‘math & stats buffet’ for the students. Its prime characteristic is that students continually choose from the buffet: it is not a one-time allocation based on student characteristics at the start of the course, but a repeated choice over an 8-week course period. • The several studies use data from seven relatively large (900-1,000 students) cohorts of business/economics freshmen.

  4. Face-to-face component: the PBL cycle • Problem: description of phenomena, prepared by a team of teachers; directs the learning activities • Small-group discussion: what do we already know about the problem? What do we still need to know about the problem? Using a specific problem-solving technique (the seven-jump) • Self-study: learning resources; integration of knowledge from different disciplines • Exchange of information: did we acquire a better understanding of the processes involved in the problem?

  5. How PBL? Seven-Jump • Step 1: Read: clarify terms and concepts • Step 2: Problem definition • Step 3: Brainstorm • Step 4: Systematic inventory • Step 5: Formulate learning goals • Step 6: Self-study • Step 7: Report and synthesize

  6. Roles • Tutor: monitors the process and the content • Discussion leader: leads the discussion/process: summarises, activates, asks questions • Secretary: the “memory” of the group, takes minutes • Group members: participate and prepare!

  7. The e-learning component of the blend: the adaptive e-tutorial ALEKS, in use from 2003 on

  8. Knowledge Space theory shaping the “ideal” individual learning path • Based on the outcome of an entry assessment, a student can be positioned at any point in the knowledge space of topic X. • Student A can follow a different learning path than Student D to reach point f. • Ideally, the learning materials and teaching methods adapt to the knowledge/skills of each individual student.

  9. ALEKS learning path: choosing from the outer and inner fringes • A knowledge state can be described by • all mastered items • the outer fringe (= ready to learn) + the inner fringe (= most recently learned)
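To make the fringe mechanism concrete, below is a minimal sketch of how outer and inner fringes can be computed from a knowledge structure. The toy items a-f and the structure itself are illustrative assumptions, not ALEKS’s actual content or algorithm.

```python
# Minimal sketch of Knowledge Space Theory fringes (toy example; the items
# and the structure are illustrative, not ALEKS's actual content).
ITEMS = set("abcdef")
# A knowledge structure: the family of feasible knowledge states.
STRUCTURE = [
    set(), {"a"}, {"b"}, {"a", "b"}, {"a", "b", "c"}, {"a", "b", "d"},
    {"a", "b", "c", "d"}, {"a", "b", "c", "d", "e"},
    {"a", "b", "c", "d", "f"}, ITEMS,
]

def outer_fringe(state, structure):
    """Items the student is 'ready to learn': adding any one of them
    yields another feasible state."""
    return {x for x in ITEMS - state if (state | {x}) in structure}

def inner_fringe(state, structure):
    """'Most recently learned' items: removing any one of them
    yields another feasible state."""
    return {x for x in state if (state - {x}) in structure}

state = {"a", "b", "c"}
print("outer fringe:", outer_fringe(state, STRUCTURE))  # {'d'}
print("inner fringe:", inner_fringe(state, STRUCTURE))  # {'c'}
```

In this view, the outer fringe drives the “ready to learn” offerings and the inner fringe summarizes recent progress, which is what the ALEKS reports in the following slides visualize.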

  10. Sample of an ALEKS assessment item

  11. Partial sample of an ALEKS learning report

  12. ALEKS: learning pie

  13. ALEKS: Ready to learn & Log

  14. ALEKS: Quiz report

  15. ALEKS: Question

  16. ALEKS: Explanation

  17. ALEKS: Question

  18. ALEKS: Explanation

  19. ALEKS: lecturer feedback

  20. Role of the e-component in math/stats education • Part of the Dutch SURF projects ‘Digital testing’ and ‘Testing and test-directed learning’, which stimulate the use of digital tools for placement, diagnostic and formative testing, using both external software, such as ALEKS and MyLabs, and tests from the national SURF test bank, especially for entry testing. • Replaces all ‘practicals’. • It adapts to students’ level of mastery, and thus takes prior statistics schooling into account, in particular the lack of any prior schooling. • Participation is optional, and most strongly advised for students with no prior schooling in stats and weak prior schooling in math. • Mastery is assessed in three quizzes that allow students to achieve ‘bonus scores’ for their final exam. Strong students do not need such a bonus, but for weaker students it can be the difference between passing and failing.

  21. MyMathLab & MyStatLab learning environments • MyMathLab and MyStatLab belong to Pearson’s MyLabs: test-steered digital learning and practice programs with several types of feedback & support: check answer, help me solve this, view an example. • In use from 2009 on. • Allows more student regulation of the learning process.

  22. MyLab exercises in Homework mode allow for a wide range of scaffolding types: • View an example: the student can ask for a fully elaborated example • Help me solve this: the student can ask for help with specific steps in solving • Ask my instructor: the student can send the exercise to the instructor, e.g. to report evaluation errors

  23. Both open-answer and multiple-choice questions, parameter-generated
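As an illustration of parameter generation, here is a minimal sketch of how such an item can be templated so that every student (or every attempt) gets fresh numbers; the function and item are hypothetical, not Pearson’s actual authoring format.

```python
# Sketch of a parameter-generated exercise (illustrative only; not the
# MyLab authoring format).
import random

def make_mean_question(rng=random):
    """Generate a fresh 'compute the sample mean' item with random data."""
    data = [rng.randint(10, 99) for _ in range(rng.randint(4, 6))]
    answer = round(sum(data) / len(data), 2)
    stem = f"Compute the sample mean of the observations {data}."
    return stem, answer

stem, answer = make_mean_question()
print(stem)
print("Correct answer:", answer)
```

Parameter generation is what makes unlimited practice and honest retakes possible: the structure of the item stays fixed while the numbers change.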

  24. Learning analytics defined Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.

  25. Learning analytics, academic analytics & data mining

  26. Objectives of learning analytics • Duval distinguishes six objectives of learning analytics: • predicting learner performance and modeling learners, • suggesting relevant learning resources, • increasing reflection and awareness, • enhancing social learning environments, • detecting undesirable learner behaviours, • detecting affects (emotional states) of learners.

  27. SIGNALS http://www.itap.purdue.edu/learning/tools/signals/

  28. Dispositional learning analytics • Buckingham Shum & Deakin Crick propose a learning analytics infrastructure that combines learning-activity-generated data with intentionally collected data, such as self-report data stemming from student responses to surveys. In doing so, they opt for learning dispositions, values and attitudes measured through self-report surveys and fed back to students and teachers through visual analytics.

  29. Measuring dispositions: implicit theories as a meaning system • In line with ‘blended learning student profiling’, we apply social-cognitive learning models (and their causal chains) to operationalize learning power/dispositions: • Implicit theories of intelligence: incremental vs entity view • Effort beliefs: seeing effort as a positive vs a negative thing • Achievement goals: learning or outcome performance goals vs normative performance goals • Academic motivation: autonomous vs controlled motivation • Learning styles: deep vs surface learning, self-regulated vs external • Motivation & engagement: cognitions and behaviours of adaptive vs maladaptive types • Subject attitudes • The meaning system hypothesizes ‘antecedent → consequence’ relations, and a continuum from adaptive to maladaptive.

  30. Dweck’s model of self-theories: Theories of Intelligence scales: Subscale: entity theory • You have a certain amount of intelligence, and you can’t really do much to change it. • Your intelligence is something about you that you can’t change very much. • To be honest, you can’t really change how intelligent you are. • You can learn new things, but you can’t really change your basic intelligence. Subscale: incremental theory • No matter who you are, you can significantly change your intelligence level. • You can always substantially change how intelligent you are. • No matter how much intelligence you have, you can always change it quite a bit. • You can change even your basic intelligence level considerably.
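As a sketch of how such self-report scales turn into disposition scores: the snippet below scores the two subscales from Likert responses. The 1-6 response scale, the example responses, and the convention of folding reverse-scored entity items into one overall incremental index are assumptions for illustration, not necessarily the scoring used in these studies.

```python
# Sketch of scoring Dweck's Theories of Intelligence scale from 1-6 Likert
# responses (assumed convention: subscale means; entity items are
# reverse-scored to form one overall 'incremental' index).
entity = [2, 3, 2, 1]        # responses to the four entity items
incremental = [5, 6, 5, 6]   # responses to the four incremental items

entity_score = sum(entity) / len(entity)
incremental_score = sum(incremental) / len(incremental)
# Reverse-score entity items (7 - x on a 1-6 scale) and fold into one index:
overall = (sum(7 - x for x in entity) + sum(incremental)) / (len(entity) + len(incremental))
print(entity_score, incremental_score, round(overall, 2))  # 2.0 5.5 5.25
```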

  31. Dweck’s views on the role of effort in learning Dweck & Blackwell hypothesize that implicit theories determine how students view effort. In the entity-theory framework, (the need for) effort signals low intelligence, thus effort is viewed as a negative thing. In the incremental-theory framework, effort is the cue to learning, to enlarging one’s intelligence, and thus viewed as a positive thing. Subscale: Effort as a negative thing, exerting effort means you have a low ability • When I work hard at my schoolwork, it makes me feel like I’m not very smart. • It doesn’t matter how hard you work—if you’re not smart, you won’t do well. • If you’re not good at a subject, working hard won’t make you good at it. • If a subject is hard for me, it means I probably won’t be able to do really well at it. • If you’re not doing well at something, it’s better to try something easier. Subscale: Effort as a positive thing, exerting effort activates your ability • When I work hard at my schoolwork, it makes me feel I am learning a lot. • When something is hard, it just makes me want to work more on it, not less. • If you don’t work hard and put in a lot of effort, you probably won’t do well. • The harder you work at something, the better you will be at it. • If an assignment is hard, it means I’ll probably learn a lot doing it.

  32. Intrinsic & extrinsic motivation: the Academic Motivation Scale • The Academic Motivation Scale (AMS; Vallerand et al., 1992) is based upon Ryan and Deci’s (2000) model of intrinsic and extrinsic motivation. The AMS consists of 28 items, to which students respond to the question stem “Why are you going to college?” There are seven subscales on the AMS, of which three belong to the intrinsic motivation scale and three to the extrinsic motivation scale. • For intrinsically motivated learning, the drive to learn derives from the satisfaction and pleasure of the activity of learning itself; no external rewards come into play: • intrinsic motivation to know (learning for the satisfaction and pleasure of understanding something new); • intrinsic motivation to accomplish (learning for the satisfaction and pleasure of accomplishing something); • intrinsic motivation to experience stimulation (learning to experience stimulating sensations). • Extrinsically motivated learning refers to learning that is a means to an end, not engaged in for its own sake. The three extrinsic motivation subscales constitute a motivational continuum reflecting the degree of self-determined behaviour: • identified regulation, the component most adjacent to intrinsic motivation: the student comes to value learning as important and therefore performs it out of choice, but still for extrinsic reasons; • introjected regulation: a formerly external source of motivation is internalized; • external regulation: learning is steered through external means, such as rewards. • The last subscale, amotivation, constitutes the very extreme of the continuum: the absence of regulation, either externally or internally directed.
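One common way to summarize this continuum in a single number is a relative autonomy index (RAI) that weights the subscales by their position on the continuum. The weights below follow a classic self-determination-theory convention and the subscale means are made up; both are assumptions, not necessarily the scoring used in these studies.

```python
# Sketch of summarising AMS subscale scores into a relative autonomy index
# (RAI). Weights (+2 intrinsic, +1 identified, -1 introjected, -2 external)
# are one common SDT convention, assumed here for illustration.
subscale_means = {
    "intrinsic": 5.2,    # mean of the three intrinsic subscales
    "identified": 4.8,
    "introjected": 3.1,
    "external": 4.0,
}
weights = {"intrinsic": 2, "identified": 1, "introjected": -1, "external": -2}
rai = sum(weights[k] * subscale_means[k] for k in weights)
print("Relative Autonomy Index:", rai)  # positive = more autonomous motivation
```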

  33. Self-regulated learning: Vermunt’s learning styles • Learning styles are composed of four components:
• Learning orientations: students’ learning-related attitudes and aims: personally interested, certificate directed, self-test directed, vocation directed, ambivalent
• Learning conceptions: beliefs and views on learning: construction of knowledge, intake of knowledge, use of knowledge, stimulating education, cooperative education
• Cognitive processing strategies: critical processing, relating & structuring (together: deep strategies), analysing, memorising & rehearsing (together: stepwise strategies), concrete processing
• Metacognitive regulation strategies: self-regulation of the learning process, self-regulation of learning content (together: self-regulation), external regulation of the learning process, external regulation of learning content (together: external regulation), lack of regulation
Cognitive processing strategies and metacognitive regulation strategies are hypothesised to distinguish deep learners (deep strategies, self-regulation), stepwise learners (stepwise strategies, external regulation) and undirected learners.

  34. Example: Martin’s ‘Motivation & Engagement Wheel’ • Four quadrants based on: • thoughts (cognitions) vs behaviours • adaptive vs maladaptive

  35. A basic learning analytics type of data: students’ background characteristics • Math-major students need less time to achieve better results • Dutch students profit from a better transfer • The entry test predicts the mastery score • Female students are more active

  36. Student data based on nationality: Hofstede’s cultural dimensions • Power Distance: • Individualism versus Collectivism: • Masculinity versus Femininity: very strong impact on hours, strong impact on scores • Uncertainty Avoidance: weaker impact, hours/score difference • Long- versus Short-Term Orientation: equal impacts on hours and scores • Indulgence versus Restraint: very strong impact on hours, strong impact on scores
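A minimal sketch of the kind of analysis behind such findings: map each student’s nationality to Hofstede dimension scores and regress study hours on those dimensions. The data, the column names, and the restriction to two dimensions are illustrative assumptions, not the study’s actual dataset or model.

```python
# Sketch: link nationality to Hofstede dimension scores and regress study
# hours on them (hypothetical data and column names).
import pandas as pd
import statsmodels.api as sm

students = pd.DataFrame({
    "hours": [42, 55, 38, 60, 47, 51],
    "nationality": ["NL", "DE", "BE", "CN", "FR", "UK"],
})
hofstede = pd.DataFrame({
    "nationality": ["NL", "DE", "BE", "CN", "FR", "UK"],
    "MAS": [14, 66, 54, 66, 43, 66],   # Masculinity
    "UAI": [53, 65, 94, 30, 86, 35],   # Uncertainty Avoidance
})
df = students.merge(hofstede, on="nationality")
X = sm.add_constant(df[["MAS", "UAI"]])
print(sm.OLS(df["hours"], X).fit().params)
```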

  37. Student data: Vermunt’s Learning styles • Learning processing strategies (deep, stepwise, concrete) and learning regulation strategies (self, external, lack of). • Stepwise learners and externally regulated learners are intensive e-learners.

  38. Motivation & engagement wheel: Martin Adaptive and maladaptive thoughts and behaviours => gender differences

  39. Pekrun’s control-value theory of learning emotions • Control & emotions impact the math score more strongly than any other variable. • Positive emotions have a positive effect, negative emotions a negative one. • Causality remains unclear, due to the timing of the measurements.

  40. Relationship between tracking data & performance • Very strong relationships between tool use and course performance: • strongest with quizzes (bonus) as the performance measure • strongest with mastery as the tool-use indicator • Confirms the relevance of clustering students on tool-use behaviour

  41. The ‘system-generated’ data: tracking data • MML and MSL generate: • Mastery%: completion rate of the weekly homework • Time: hours spent to reach weekly mastery • Efficiency: mastery per hour • Using track data to monitor student progress: • Cluster 4: high achievers, 652 students, 66% • Cluster 3: decreasing tool intensity from week 5, 103 students, 10% • Cluster 2: decreasing tool intensity from week 3, 98 students, 10% • Cluster 1: opt out from all tool use, 133 students, 13%
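The slides do not specify the clustering method; below is a minimal sketch of one plausible approach, k-means with k = 4 on weekly mastery trajectories, run on synthetic data. The cluster sizes and shapes here will not match the real ones above.

```python
# Sketch: cluster students on weekly tool-use trajectories with k-means
# (k=4, matching the four clusters above; data is synthetic).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Rows: students; columns: cumulative weekly mastery % over the 8-week course.
mastery = np.clip(np.cumsum(rng.uniform(0, 20, size=(200, 8)), axis=1), 0, 100)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(mastery)
labels = kmeans.labels_
for c in range(4):
    print(f"cluster {c}: {np.sum(labels == c)} students, "
          f"mean final mastery {mastery[labels == c, -1].mean():.0f}%")
```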

  42. Predicting cluster membership • How well can we predict membership of the four clusters on the basis of ‘early’ data: dispositions, early track data, or both combined? • Track data only sets Cluster 1 apart, but cannot distinguish Clusters 2, 3 and 4 • Disposition data does worse for Cluster 1, but somewhat better in predicting Clusters 2 and 3 • Combining the two provides the best results: two rather independent sets of predictors
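One plausible way to run this comparison is a multinomial classifier fitted on each feature set in turn; the sketch below uses logistic regression on synthetic features (the feature names and counts are assumptions). Because the data here is random, all three accuracies sit at chance; on the real data, the pattern described above would emerge.

```python
# Sketch: predict cluster membership from dispositions vs early track data
# vs both, with multinomial logistic regression (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 400
dispositions = rng.normal(size=(n, 5))   # e.g. survey subscale scores
early_track = rng.normal(size=(n, 3))    # e.g. week-1/2 mastery, hours
labels = rng.integers(0, 4, size=n)      # cluster membership (1-4)

for name, X in [("track only", early_track),
                ("dispositions only", dispositions),
                ("combined", np.hstack([dispositions, early_track]))]:
    clf = LogisticRegression(max_iter=1000)
    acc = cross_val_score(clf, X, labels, cv=5).mean()
    print(f"{name}: cross-validated accuracy {acc:.2f}")
```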

  43. Feedback reporting • In the experiment, feedback was organized as follows: • system track data: continuous feedback to students, weekly overviews to tutors • dispositional data: to students after the due date, in absolute and relative format, using ‘simple’ graphical tools • Limitation: feedback is given per data source, not in integrated form
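A sketch of the kind of ‘simple’ graphical disposition feedback described above: one student’s subscale scores plotted against the cohort mean. The scale names, the 1-5 range, and the values are illustrative assumptions.

```python
# Sketch: bar chart comparing one student's disposition subscale scores
# with the cohort mean (illustrative scale names and values).
import matplotlib.pyplot as plt
import numpy as np

scales = ["Deep", "Stepwise", "Self-reg.", "External reg.", "Anxiety"]
cohort_mean = [3.4, 3.1, 3.0, 3.3, 2.8]
student = [3.9, 2.6, 3.5, 2.9, 3.4]

x = np.arange(len(scales))
plt.bar(x - 0.2, cohort_mean, width=0.4, label="cohort mean")
plt.bar(x + 0.2, student, width=0.4, label="your score")
plt.xticks(x, scales, rotation=20)
plt.ylabel("subscale score (1-5)")
plt.legend()
plt.tight_layout()
plt.show()
```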

  44. Simultaneous model: motivation & engagement, self-beliefs • Anxiety vs Self-Handicapping; Adaptive vs Maladaptive behaviours & thoughts

  45. Practical implications • In the experiment, dispositional data from seven different instruments is used. For the learning analytics itself, no such multitude is needed: the data are collinear. The Motivation & Engagement Wheel plus learning styles do a good job of providing relevant learning feedback. For counseling purposes, however, we need all of them (what do implicit theories cause …?) • In the current setup, we can investigate what practicing and formative assessment in e-tools add to learning: how effective digital platforms are in raising performance. However, we cannot investigate the effectiveness of applying learning analytics: what students do with the learning feedback remains mostly unobserved.

  46. Combining self-theories & Pekrun’s control-value theory of achievement emotions

  47. Simultaneous model: emotion & self-beliefs • Blended vs face-to-face learning: emotions impact the first, not the second.

  48. Conclusion • Research aims: providing students with a broad portfolio of learning tools (the blended environment); providing students with LA-based information to help them optimize their learning within the student-centered system; pushing toward more adaptive forms of learning.
Further reading:
• Formative Assessment and Learning Analytics. In D. Suthers & K. Verbert (Eds.), Proceedings of the 3rd International Conference on Learning Analytics and Knowledge, 205-209. New York: ACM. DOI: 10.1145/2460296.2460337.
• How achievement emotions impact students’ decisions for online learning, and what precedes those emotions. Internet and Higher Education, 15 (2012), 161-169.
• How Cultural and Learning Style Differences Impact Students’ Learning Preferences in Blended Learning. In E. Jean Francois (Ed.), Transcultural Blended Learning and Teaching in Postsecondary Education (2013), 30-51.
• The Role of Digital, Formative Testing in e-Learning for Mathematics: A Case Study in the Netherlands. In: “Mathematical e-learning” [online dossier], Universities and Knowledge Society Journal (RUSC), 9(1) (2012), UOC.
• Student Learning Preferences in a Blended Learning Environment: Investigating the Relationship Between Tool Use and Learning Approaches. In P. Van den Bossche, W. H. Gijselaers & R. G. Milter (Eds.), Advances in Business Education and Training, Volume 3 (2011), 195-212. Berlin: Springer-Verlag.
