
Bubble in The System:


Presentation Transcript


  1. Accommodating twice the students … and more than usual who aren’t meeting expectations. Bubble in The System: A look at risk planning, what went wrong, how things progressed regardless, and what the future might hold… Presenters: Simon Winberg & Robyn Verinder Department of Electrical Engineering University of Cape Town November 2009

  2. Outline • The Context • Problems and what was done about them • How they happened • How we planned for these potential risks • What was done in practice • Will these problems go away? • Tracking the bubble • Reflections • Conclusions

  3. The Context • 1st year Computing for Electrical Engineers course • For students who did not do Computer Studies (CS) for matric, OR • For students who did CS but achieved a low CS mark (below a C). Students without CS who did well in math and science have the option to do 1st year computer courses in the CS dept.

  4. Focus of this study [Diagram: the EE 1st year intake feeds either EEE1003W Computing for Electrical Engineers (the focus of this study) or CS100X Computer Science 1st year]

  5. Class composition (2008) Computing for Electrical Engineers: 87 registered (4 repeating); 84 wrote final exam (3 dropped out or transferred)

  6. Class composition (2009) Computing for Electrical Engineers: 85 more students at the start of 2009; 68 more throughout 2009 than in 2008 → 81% more students in 2009. Total 1st year intake for Electrical Engineering was 49% higher this year than the average of 2006-8. 167 registered (3 repeating); 152 wrote final exam (15 dropped out or transferred – 9% dropped vs. 3% in ‘08)

  7. Problem 1: Large intake • Larger than anticipated intake for EE: 1st year intake up by 49% from 2008 • Much of this additional 49% of students seems to have gone into the Computing for Electrical Engineers course

  8. Enrolment per programme • 2006-8: the split between programmes remained fairly consistent • 2009: more mechatronics [chart totals per year: 162, 184, 175, 260]

  9. Enrolment figures: 2006 - 2009

  10. Total EE first year intake by nationality [chart totals per year: 262, 184, 175, 162]

  11. Problem 1 consequences • Finding a big enough lecture venue; • Lab space limitations & availability; • Difficulty finding enough good tutors. • Problems down the line…

  12. Problems down the line… [Cartoon: two identical “Lecture 1” sessions – “I’m John”, “I’m Bob”] • Lab expansions • Size of 2nd & 3rd year laboratories increased • Acquisition of more equipment • Staffing • More contract staff (TAs, tutors, etc.) • More duplicated lectures • More double-period lectures • Administrative headaches • Finding & focusing funding towards these ‘immediate’ needs

  13. Problem 2: Fundamental Problem • “A big bubble in the system” • Significantly weaker students • Possibly a once-off event • A larger portion of students found the coursework more difficult than previously… although the level of the coursework has not changed much

  14. Is it a once-off bubble? But is this a bubble, i.e. a once-off event, or is it the new norm? Will the proportion of weak students return to what it was before? Perhaps… to be discussed later…

  15. 2010 Intake estimates: ‘admission probable’ • The latest estimated intake figures above have been calculated by the EBE faculty based on provisional results. • The actual figures can only be confirmed once this year’s matric marks have been obtained and processed. • It might just be a once-off bubble. *Figures provided by UCT EBE Faculty (20 Nov 2009)

  16. How did it happen? Oops! • Change in matric subjects and examination • Removal of higher grade option • Change to 2 papers for math exam • Removed geometry paper • Matric results appear to be skewed • No longer a good predictor of performance in engineering, which higher grade math used to be.

  17. National Benchmark Tests (NBTs) • National benchmark tests confirm this experience in electrical engineering: • Only 7% proficient in math • 25% proficient in quantitative literacy • 47% proficient in academic literacy in English. Statistics from HEQC (2009). MacGregor, K. ‘South Africa: Shocking results from university tests’, World University News – Africa Edition. Issue 35, August (2009).

  18. NBTs Should these tests still be used just to benchmark matric results, and not influence admission? OR could they be used to decide university admission?

  19. 2008 Risk planning related to Problem 1 (intake) • Faculty provided 2009 estimation: • Plan for a 20% increase, i.e. 100 students (but not an 80% increase!) • Based on total EE intake estimations for the past 5 years (giving an average 10% increase per year over 5 years)* • Risk planning for Problem 1: • Ensuring sufficient budget for two additional tutors • More multiple choice / other techniques for faster marking. * Based on email correspondence, November 2008
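
As a rough illustration of the projection behind that 20% figure (the 2008 baseline of 87 registered students comes from slide 5 and the growth figures from this slide; the faculty’s actual calculation is not shown, so the snippet below is only a hedged sketch):

```python
# Illustrative reconstruction of the 2009 intake projection (not the
# faculty's actual model). Baseline and growth figures are taken from
# the slides; everything else is an assumption for illustration.

baseline_2008 = 87        # students registered in 2008 (slide 5)
historical_growth = 0.10  # ~10% average annual growth over the previous 5 years
planning_margin = 0.20    # margin the course actually planned for

trend_estimate = baseline_2008 * (1 + historical_growth)  # ~96 students
planned_for = baseline_2008 * (1 + planning_margin)       # ~104, i.e. "plan for ~100"

print(f"trend estimate: {trend_estimate:.0f}, planned for: {planned_for:.0f}")
# The 2009 course ended up with 167 registered students (slide 6) --
# roughly double the 2008 class and well beyond the planned margin.
```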

  20. 2008 Risk planning related to Problem 2 (weaker students) • Mostly based on 2008 reflections… • Initial diagnostic assessment (IDA) • Held in 1st week to identify students needing extra support • Bi-weekly extra tuition sessions • TA to assist struggling students • Hot seat during lab times • Chief tutor / tutors’ tutor: helps students with more tricky theory-related questions or assists other tutors.

  21. What was done in practice… Our risk planning was beneficial, but: the problem was outside the envisaged boundaries. (Image: www.drollthings.com/?p=2384)

  22. Initial diagnostic assessment (IDA) Results • Students achieved basic requirements of: • Computer literacy requirements (use of MS Word, Google searches, etc.) • Basic computer skills (using MS Excel, etc.) for basic engineering type problems. • But later showed difficulties in: • Mathematics: difficulty in more complex problem-solving tasks; • Academic English proficiency: understanding problem descriptions; articulating solutions.

  23. What was done • More tutors • Lab assignments made smaller, given more time • 2008: 6 pracs • 2009: 8 pracs • Project separated into two • 2008: 1 project • 2009: 2 projects • Additional tests, to help improve the pass rate for June and Nov exams

  24. What was done … essentially a lot of WORK - chopping and changing aspects of the course …

  25. Throughput • Despite challenges, the pass rate was not too far below previous years’… • Pass rate 2009: 82% (18% failed) • 141 wrote exam (11 did not get a DP) • 125 of 152 students passed. 8 borderline. • Pass rate 2008: 86% (14% failed) • 82 wrote final exam (2 did not get a DP) • 72 of 84 students passed. 2 borderline.
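
The quoted pass rates follow directly from the class totals on this slide; a quick check (figures taken from the slide as given):

```python
# Recomputing the pass rates quoted on the throughput slide from the
# class totals given there (figures taken from the slide as-is).
cohorts = {
    2008: {"in_class": 84, "wrote_exam": 82, "passed": 72},
    2009: {"in_class": 152, "wrote_exam": 141, "passed": 125},
}

for year, c in cohorts.items():
    pass_rate = 100 * c["passed"] / c["in_class"]
    no_dp = c["in_class"] - c["wrote_exam"]  # students who did not get a DP
    print(f"{year}: {pass_rate:.0f}% passed ({c['passed']}/{c['in_class']}), "
          f"{no_dp} without DP")
# 2008: 86% passed (72/84), 2 without DP
# 2009: 82% passed (125/152), 11 without DP
```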

  26. But…Will the problems go away?

  27. Will the problems go away? ±1,500,000 learners started school in 1997; ±500,000 learners (30%) wrote matric in 2008; <100,000 (20%) achieved a university endorsement – this is the bubble entering the university intake. … So, the problem is likely to remain for a while longer. Approximate HEQC figures (2009); diagram legend: one symbol = 100,000 learners.

  28. Students have basic computer ‘skills’ but lack the foundation knowledge (maths, advanced academic literacy) needed for programming-based solving of engineering problems. [Diagram: limited depth of knowledge – computer literacy, basic applications and basic academic English are present; mathematics, problem-solving and high-level academic literacy are lacking. Adapted from: European Science Foundation (2002)]

  29. Tracking the bubble...

  30. Tracking the bubble: how did the students progress? • Which of the weak become strong? • Methodology • Utilizing an adapted form of the skills matrix presented by Nicholls (1995) • Matric results used to determine initial placement on the averaged math & science mark axis (left = higher mark) • Initial diagnostic assessment used to determine initial placement on the computer-skill axis
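
A minimal sketch of that initial placement step, assuming (hypothetically) percentage marks on both axes and a 60% cut-off; the presentation does not state the actual thresholds, and the mapping of the quadrant labels to axis combinations is inferred from the later reflection slides:

```python
# Hypothetical sketch of placing a student into the knowledge-and-skills
# matrix (adapted from Nicholls, 1995). The 60% cut-offs are invented for
# illustration, and the label-to-quadrant mapping is inferred from the
# reflection slides rather than stated explicitly in the presentation.

def place_student(matric_math_sci_avg: float, ida_score: float) -> str:
    """Return the matrix quadrant for a student.

    matric_math_sci_avg: averaged matric math & science mark (%), the
        knowledge axis of the matrix.
    ida_score: initial diagnostic assessment result (%), the
        computer-skills axis of the matrix.
    """
    strong_knowledge = matric_math_sci_avg >= 60  # assumed threshold
    strong_skills = ida_score >= 60               # assumed threshold

    if strong_knowledge and strong_skills:
        return "Full steam ahead!"
    if strong_knowledge:
        return "Guide me!"          # solid foundation, weaker computer skills
    if strong_skills:
        return "Potential traps"    # computer-fluent, shakier foundation
    return "SOS!"

print(place_student(75, 40))  # -> "Guide me!"
print(place_student(50, 80))  # -> "Potential traps"
```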

  31. Knowledge and skills matrix Representation schema adapted from Nicholls, J. (1995)."The MCC decision matrix", Journal of Management Decision: 33(6): 4-5.

  32. Overview of results

  33. Knowledge and skills matrix Full steam ahead! Results based on cohort of 2009 students that exhibited good math & science matric grades, but demonstrated only mediocre computer proficiency in computer test 1.

  34. Knowledge and skills matrix Guide me! Results based on cohort of 2009 students that exhibited good math & science matric grades, but demonstrated only mediocre computer proficiency in computer test 1.

  35. Knowledge and skills matrix Full steam ahead! Potential traps Results based on cohort of 2009 students that exhibited comparatively lower math & science matric grades, but demonstrated good computer proficiency in computer test 1.

  36. Knowledge and skills matrix SOS! Results based on cohort of 2009 students that exhibited comparatively lower math & science matric grades, but demonstrated good computer proficiency in computer test 1.

  37. Tracking the bubble: How did the weaker students’ positions within the skills matrix change?

  38. Knowledge and Skills matrix – starting point [Matrix diagram: quadrant counts 33, 55, 60; Failed IDA: 4; legend: one symbol = 10 students]

  39. Knowledge and Skills matrix – progression [Matrix diagram: quadrant counts 33, 54, 60, plus 1; Failed IDA: 5]

  40. Knowledge and Skills matrix – progression [Matrix diagram: quadrant counts 33, 54, 60; annotation “6 really good”; Failed IDA: 5]

  41. Knowledge and Skills matrix – progression [Matrix diagram: quadrant counts 33, 54, 60; Failed IDA: 5]

  42. Knowledge and Skills matrix – progression [Matrix diagram: quadrant counts 84, 54, 9; annotation “51 passed”; Failed IDA: 5]

  43. Knowledge and Skills matrix – progression [Matrix diagram: quadrant counts 84, 54, 7 (2); Failed IDA: 5]

  44. Knowledge and Skills matrix – progression [Matrix diagram: quadrant counts 41, 127, 13, 7 (5); Failed IDA: 5]

  45. Knowledge and Skills matrix – final point [Matrix diagram: quadrant counts 127, 13, 7 (5); Failed IDA: 5]

  46. Reflections • Students with low math/science and low computer skills generally achieved notably less progression than their classmates • Many reasons for this • Lack of motivation, lack of confidence (e.g., not daring to ask for help) • Insufficient support structures… see: Crooks, T. (1988). “The impact of classroom evaluation practices on students.” Review of Educational Research 58(4): 438; Zimmerman, Bandura, et al. (1992). “Self-motivation for academic attainment: The role of self-efficacy beliefs and personal goal setting.” American Educational Research Journal 29(3): 663.

  47. Reflections • Students showing good math/science and lower computer skills generally performed successfully • Confirmed by many studies, e.g. Baron & Norman (1992)* • This cohort had the necessary foundational knowledge to succeed. • BUT: why were 7 of the 60 left behind? * Baron, J. and M. F. Norman (1992). “SATs, Achievement Tests, and High-School Class Rank as Predictors of College Performance.” Educational and Psychological Measurement 52: 1047.

  48. Reflections • Students showing low math/science but good computer skills generally performed successfully… but some didn’t. • This cohort had to build the necessary foundational knowledge, on which more advanced knowledge depended. • Starting with good computer skills, in the information age, is a likely facilitating factor for learning* • 13 of the 55 students failed. * Bergin, S. and R. Reilly (2005). “Programming: factors that influence success.” ACM SIGCSE Bulletin 37(1): 411-415.

  49. Reflections • Students showing high math/science and good computer skills generally performed excellently. • This cohort had the advantage of the needed foundational knowledge, in addition to good computer skills to help them learn new material. • These students went from good to great! • These are what we all hope for.

  50. Reflections • What could have caused the 7 of the 60 students who did reasonably well in math & science to fail? • I would have expected good math and science marks to demonstrate that the students would quickly learn computer skills • Perhaps 7/60 (12%) isn’t a major concern • Something may be wrong with my approach • Could it be an effect of last year’s matric results?
