EMPHASISED YESTERDAY

Presentation Transcript


  1. The 13th Conference of the Southern Africa Association for Educational Assessment (SAAEA) • Cyprian Cele • May 21, 2019, Gaborone, Botswana

  2. EMPHASISED YESTERDAY • Decisions driven by research and assessment information • Strengthening cooperation and harmonisation • Rising cost of assessment • Quality to back up quantity • Possibilities in assessment for educational quality • Outcome: Implementable solutions

  3. WHY ARE WE HERE? • Share our research and assessment products in relation to their use in improving the quality of education: • better instruction and learning • accountability purposes, to better the performance of education implementers and improve learning • educational policy formulation

  4. WHY HERE (CONT) • We are also looking at • how assessment practices and reports are being used to foster equity and inclusivity of learners • the extent to which ICT has been harnessed for better assessment practice and use • how learners who prefer to follow different educational pathways could be served with alternative assessments.

  5. CONSISTENCY WITH SDGs • The SDGs were born in Rio in 2012 • They replaced the MDGs, effective 2016 • EFA under the MDGs is continued by Goal 4 of the SDGs • EFA emphasised increasing primary enrolment; Goal 4 of the SDGs emphasises the quality of education • Quality cannot be attained without acting, looking back, making the necessary adjustments and moving forward.

  6. DRIVING FORWARD • Vehicle: Curriculum • Road: Institutional infrastructure • Driver: Teacher and manager • Traffic officers: Educational planners/supervisors • Cameras: Researchers and assessors • GPS: ASSESSMENT INFORMATION

  7. WHAT ASSESSMENTS ARE WE TALKING ABOUT? • Summative examinations • Survey studies • Other standardised tests, not discussed in this presentation • As we review them, focus on whether they are best suited to supplying information on our themes.

  8. SUMMATIVE EXAMINATIONS • End of primary • End of Junior Secondary • End of Senior Secondary • Technical • Business • Other relevant tertiary levels • Pre-primary?

  9. PURPOSES OF THE EXAMINATIONS • Gauge achievement levels of individuals • Selection • Certification • Other uses may be argued for

  10. EXAMINATION DESIGN • Written papers (free response or selection) • Practical • Coursework • Projects • Portfolio • Assessment on a continuous basis is being advocated more and more

  11. OUTPUT • All the work done by a learner is processed to obtain scores • Scores may be raw marks or IRT scale scores • Scores are converted to grades or proficiency levels according to rules • Grades in different subjects are combined to obtain a division (some systems have no divisions); a sketch of such a rule-based conversion follows
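
The slide above describes a rule-based pipeline: raw marks become letter grades, and grades are aggregated into a division. The Python below is a minimal sketch of that kind of rule; the grade boundaries, grade points and division cut-offs are hypothetical, not those of any particular examination board.

```python
# Illustrative only: hypothetical grade boundaries, grade points and
# division cut-offs, not those of any particular examination board.

GRADE_BOUNDARIES = [(80, "A"), (70, "B"), (60, "C"), (50, "D"), (40, "E")]
GRADE_POINTS = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5, "F": 9}  # lower is better

def to_grade(raw_mark):
    """Convert a raw mark (0-100) to a letter grade using fixed cut scores."""
    for cut, grade in GRADE_BOUNDARIES:
        if raw_mark >= cut:
            return grade
    return "F"

def to_division(grades, best_n=6):
    """Aggregate subject grades into a division using the best N subjects."""
    points = sorted(GRADE_POINTS[g] for g in grades.values())[:best_n]
    aggregate = sum(points)
    if aggregate <= 12:
        return "Division 1"
    if aggregate <= 24:
        return "Division 2"
    if aggregate <= 36:
        return "Division 3"
    return "Division 4"

candidate = {"Maths": 72, "English": 65, "Science": 58, "History": 81,
             "Setswana": 77, "Agriculture": 49}
grades = {subject: to_grade(mark) for subject, mark in candidate.items()}
print(grades, to_division(grades))
```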

  12. COMMUNICATING TO STAKEHOLDERS AND THE PUBLIC • Individual performance is reported through the school, at subject level and at an overall or specified level • Percentages are computed (a reporting sketch follows) • Subject level • Gender • District/region • Qualitative reports on performance of learners • Statistical analysis of item performance • Comparisons with previous years
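
To illustrate the point about percentages computed by subject, gender and district/region, here is a minimal sketch of disaggregated pass-rate reporting. The record fields and the pass mark of 40 are assumptions made for the example, not any board's actual reporting rules.

```python
# Minimal sketch of disaggregated reporting: pass percentages by subject,
# gender and district. Field names and the pass mark (40) are assumptions.
from collections import defaultdict

results = [
    {"subject": "Maths", "gender": "F", "district": "South", "mark": 55},
    {"subject": "Maths", "gender": "M", "district": "South", "mark": 38},
    {"subject": "English", "gender": "F", "district": "North", "mark": 62},
    {"subject": "English", "gender": "M", "district": "North", "mark": 47},
]

def pass_rates(records, key, pass_mark=40):
    """Return the percentage of records at or above pass_mark, grouped by key."""
    totals, passes = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[key]] += 1
        passes[r[key]] += r["mark"] >= pass_mark
    return {group: 100.0 * passes[group] / totals[group] for group in totals}

for key in ("subject", "gender", "district"):
    print(key, pass_rates(results, key))
```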

  13. SURVEY STUDIES • National Examples • Lesotho National Assessment of Educational Progress (LNAEP) • National Assessment of Progress in Education (NAPE) in Uganda.

  14. REGIONAL EXAMPLES • Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ) • Monitoring of Learning Achievement (MLA)

  15. INTERNATIONAL EXAMPLES • Progress in International Reading Literacy Study (PIRLS) • Trends in International Mathematics and Science Study (TIMSS).

  16. SURVEY STUDY PURPOSES • Reporting achievement at system level • Evaluating programmes • International/regional comparisons • Trend studies

  17. DESIGN • Few subjects per cycle, often one or two • Literacy and numeracy have often been studied • Often a written test plus a questionnaire • Selection and supply-type items (emphasising higher-order thinking skills, HOTS)

  18. PROCESS • A learning level is decided on, e.g. Standard Four • A representative sample is obtained (a minimal sampling sketch follows) • With IRT applications, a large sample is required • Content is agreed upon; regional or international studies do not target a particular curriculum (the content may be a compromise)
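
For the "representative sample is obtained" step, the sketch below draws a stratified random sample of schools proportional to each region's share of the sampling frame. The regions, school counts and sample size are invented for illustration; operational survey studies use more elaborate multi-stage designs.

```python
# A minimal sketch of stratified sampling: schools drawn at random from each
# region in proportion to that region's share of the frame. Numbers invented.
import random

school_frame = {                      # sampling frame: region -> school IDs
    "North": [f"N{i:03d}" for i in range(120)],
    "South": [f"S{i:03d}" for i in range(80)],
    "Central": [f"C{i:03d}" for i in range(200)],
}

def stratified_sample(frame, total_schools, seed=1):
    """Allocate the sample across strata proportionally, then draw at random."""
    rng = random.Random(seed)
    population = sum(len(schools) for schools in frame.values())
    sample = {}
    for region, schools in frame.items():
        n = round(total_schools * len(schools) / population)
        sample[region] = rng.sample(schools, n)
    return sample

for region, schools in stratified_sample(school_frame, total_schools=40).items():
    print(region, len(schools), schools[:3])
```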

  19. PROCESSING • Responses are coded • Coded responses show weaknesses and strengths displayed by learners • Scores (IRT) obtained from the codes (an illustrative sketch follows) • Statistical analysis showing • Overall performance • Subgroup performance • Comparisons: regional, international, trend
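
To make the "scores (IRT) obtained from the codes" step concrete, here is a minimal Rasch (one-parameter IRT) sketch that estimates one learner's ability by maximum likelihood over a grid, given 0/1 coded responses and assumed item difficulties. Operational studies use specialised IRT software and large calibration samples; the numbers here are invented.

```python
# Minimal Rasch-model sketch: estimate a learner's ability (theta) from
# 0/1 coded responses, given assumed item difficulties.
import math

def rasch_prob(theta, difficulty):
    """P(correct) under the Rasch model: logistic in (theta - difficulty)."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def log_likelihood(theta, responses, difficulties):
    """Log-likelihood of the observed 0/1 responses at a given theta."""
    ll = 0.0
    for x, b in zip(responses, difficulties):
        p = rasch_prob(theta, b)
        ll += math.log(p) if x == 1 else math.log(1.0 - p)
    return ll

def estimate_ability(responses, difficulties):
    """Grid-search maximum-likelihood estimate of theta."""
    grid = [i / 100.0 for i in range(-400, 401)]  # -4.00 to 4.00
    return max(grid, key=lambda t: log_likelihood(t, responses, difficulties))

difficulties = [-1.5, -0.5, 0.0, 0.8, 1.6]   # assumed item difficulties
responses = [1, 1, 1, 0, 0]                  # coded learner responses
print(estimate_ability(responses, difficulties))
```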

  20. OUTPUT • Scores (IRT) obtained from the codes • Statistical analysis showing • Overall performance • Subgroup performance • Comparisons: regional, international, trend • Profile of learners

  21. DO THE PROCESS AND PRODUCT HELP IN • improving instruction and learning • making accountability decisions • fostering equity and inclusivity • This presentation will concentrate on these themes • RESERVATIONS ON RELIABILITY AND VALIDITY ASSUMED NON-EXISTENT, otherwise press ‘HOME’ on the GPS.

  22. QUESTIONS AND ANSWERS • I am asking questions. • Questions point to what I do not know; I would be happy to listen to papers and discussions covering them. • There may be more important questions you have addressed • Papers and discussions will show us • the extent to which we have used assessment reports • whether we should package our feedback differently • research done but not harnessed • other issues on the use of information from assessment

  23. INTERNAL ASSESSMENT AND LEARNING IMPROVEMENT • We may have papers documenting how teachers use internal assessment to improve on their teaching strategies in order to improve learning. • How do the teachers analyse learner responses to sieve instructional improvement points? • How is this practice enforced at school level? • Could teachers be supported better on this?

  24. EXTERNAL EXAMINATION REPORTS TO SUPPORT LEARNING • Do the reports reach schools? • managers • heads of departments • classroom teachers • Are the reports understood? • How is the information used to improve teaching and learning? • Is the tail wagging the dog: teaching to the test? • Are the reports found adequate, or is improvement needed? (±)

  25. EXAMINATION USE (CONT) • How is disaggregated data being used for instructional purposes: gender, region, etc.? • Observed: 15 - 9 + 5 - 3 computed as 15 - (9 + 5) - 3 (8 versus -2), the kind of learner error item-level analysis can surface • Anything more from the massive data? • Use of ICT to provide item-level information: costs vs value? • Do teachers sieve their instructional deficiencies from the examination results? • Are there effective instructional strategies out there?

  26. SURVEY STUDIES (exams +) • Do sampled learners do their best – low stakes? • Do reports reach all, including those not sampled? • Are the results used for instructional improvement? • Reforms stimulated by these studies.

  27. ACCOUNTABILITY AND LEARNING IMPROVEMENT • Accountability is about value for money. • Praise/reward if performance is good; otherwise punish • Teacher in the frontline. • Can she explain poor performance? • School manager oversees the teacher? • Should learner performance be part of performance contracting? • At what level should we draw the accountability line?

  28. ACCOUNTABILITY FOR LEARNING IMPROVEMENT II • Are assessment reports enough for making accountability decisions? • Instances of application of accountability to drive the quality of education • What benefits/challenges have been encountered in applying accountability? • Does accountability have an impact on school-based scores? – Any survival tactics? • Are there ways of applying accountability with minimal adverse effects?

  29. CHICKEN OR EGG FOR ACCOUNTABILITY • Make the teacher happy before accountability or vice versa: • housing • Medicals • scholastic provisions • salary

  30. TEACHER READINESS • How well was the teacher trained? • What in-service training does the teacher get? • Corrective inspection

  31. EQUITY AND INCLUSIVITY • Do accommodations equalise task demands for all learners: • Blind • Deaf • Physical disability • Do the accommodated tests measure the same constructs? • Do we cater for the specially gifted in our assessments?

  32. INCLUSIVENESS EXPANDED • Communities: • Do we make allowance for opportunity to learn? • Comparability within and between boards

  33. CONCLUDING REMARKS • Assessment is a powerful tool. • It is the eye of the educator. • The challenge is to act on what we see in order to guide education towards quality.
