
PISA Programme for International Student Assessment


Presentation Transcript


  1. PISA Programme for International Student Assessment PISA team Department of Education – Ghent University – Belgium Beijing – July 24-25, 2009 http://allserv.ugent.be/~mvalcke/CV/CVMVA.htm Martin.Valcke@ugent.be (Partly based on Schleicher, A., 2006)

  2. Structure • Starting grounds • Objectives • Samples – Population • Quality benchmarks • Framework • Typical assessment approach • National Centres

  3. PISA: starting grounds • OECD: Organisation for Economic Co-operation and Development • Basis of PISA: original work of the OECD on statistics and indicators about education • late 1980s • voluntary contributions • Member engagement through networks

  4. PISA: starting grounds • Network on educational outcomes • Proposal to study educational outcomes • Formally started with 11 members in 1996 • Expanded in 1997 • New name: PISA • Members bear costs and risks • Number of countries grows with every PISA edition: 2000, 2003, 2006, 2009, …

  5. PISA 2000 OECD countries: Australia, Austria, Belgium, Canada, Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy, Japan, Korea, Liechtenstein, Luxembourg, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Spain, Sweden, Switzerland, United Kingdom, United States Partner countries (non-OECD): Albania, Argentina, Brazil, Bulgaria, Chile, Hong Kong-China, Indonesia, Israel, Latvia, Former Yugoslav Republic of Macedonia, Peru, Romania, Russia, Thailand

  6. PISA 2003 OECD countries: Australia, Austria, Belgium, Canada, Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy, Japan, Korea, Liechtenstein, Luxembourg, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland, Turkey, United Kingdom, United States Partner countries (non-OECD): Brazil, Hong Kong-China, Indonesia, Latvia, Macao-China, Russia, Serbia & Montenegro, Thailand, Tunisia, Uruguay

  7. PISA 2006 OECD countries: Australia, Austria, Belgium, Canada, Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy, Japan, Korea, Luxembourg, Mexico, Netherlands, New Zealand, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland, Turkey, United Kingdom, United States Partner countries (non-OECD): Argentina, Azerbaijan, Brazil, Bulgaria, Chile, Colombia, Croatia, Estonia, Hong Kong-China, Indonesia, Israel, Jordan, Kyrgyzstan, Latvia, Liechtenstein, Lithuania, Macao-China, Qatar, Montenegro, Serbia, Romania, Russia, Slovenia, Taipei, Thailand, Tunisia, Uruguay

  8. PISA country participation

  9. PISA: Objectives • A three-yearly global educational assessment • What did they learn? • Performance of 15-year-olds • Key subject areas and a range of educational outcomes • Additionally: • student attitudes to learning, self-efficacy beliefs, and learning strategies • contextual data from students, schools, parents and systems: policy levers (see further)

  10. PISA: Objectives • Comparing performance within and between countries • Cross-cultural study • Central concept: LITERACY • Mathematical literacy • Scientific literacy • Reading literacy

  11. PISA: Objectives Example: scientific literacy is defined in terms of an individual's: • Scientific knowledge and use of that knowledge to… … identify scientific issues, … explain scientific phenomena, and … draw evidence-based conclusions about science-related issues • Understanding of the characteristic features of science as a form of human knowledge and enquiry • Awareness of how science and technology shape our material, intellectual and cultural environments • Willingness to engage with science-related issues

  12. PISA: Population versus samples • Population of 15-year-old pupils • National samples • Representative samples of 3,500 to 50,000 pupils • Most federal countries: regional samples, e.g. Flanders versus Wallonia within Belgium • PISA covers roughly 90% of the world economy

  13. Belgium: federal state (10,511,382 inhabitants) • Flanders: Dutch-speaking community • Wallonia: French-speaking community • German-speaking community

  14. PISA sampling requirements • Population: all 15-year-olds in school • excludes 15-year-olds out of school • includes 15-year-olds in special education institutions • may exclude up to 5% of the 15-year-olds in school • difficult to reach (e.g. remote schools) • non-participation • a few countries failed to meet the sampling requirements in 2003 • NZ (5.1%), Denmark (5.3%), UK (5.4%), Canada (6.8%), Spain (7.3%), US (7.3%) • Sample • minimum of 150 schools per country • two random samples: schools and replacement schools • if a school declines, a replacement school is invited (see the sketch below) • requirements set by countries
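
The replacement logic described in this slide can be sketched in a few lines. This is a simplified, hypothetical illustration (the function names and counts are invented, and the real PISA school sampling uses systematic probability-proportional-to-size selection rather than a plain shuffle); it only shows the idea of drawing a main sample plus a parallel list of replacement schools and substituting a replacement when a sampled school declines.

```python
import random

# Hypothetical sketch of the two-sample idea: a main sample of schools and a
# pool of replacement schools; declining schools are swapped for replacements.
# Not the official PISA sampling procedure.
def draw_school_samples(schools, n_schools=150, seed=2009):
    """Return (main_sample, replacement_pool) of school identifiers."""
    rng = random.Random(seed)
    shuffled = list(schools)
    rng.shuffle(shuffled)
    return shuffled[:n_schools], shuffled[n_schools:2 * n_schools]

def resolve_participation(main, replacements, declines):
    """Replace each declining school with the next unused replacement."""
    final, pool = [], list(replacements)
    for school in main:
        if school in declines and pool:
            final.append(pool.pop(0))   # invite a replacement school instead
        else:
            final.append(school)
    return final

if __name__ == "__main__":
    all_schools = [f"school_{i:04d}" for i in range(1200)]
    main, backup = draw_school_samples(all_schools)
    final = resolve_participation(main, backup, declines={main[0], main[5]})
    print(len(final), "schools in the final sample")
```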

  15. PISA: Networks • International expertise network building on the participating countries… • Instruments ~ input of > 40 countries (see next presentation) • Cross-national and cross-cultural validity • Analysis of results • International, national, regional analyses and reports • Country reviews • Consortium of research institutions: ACER, CITO, ETS, NIER, WESTAT • Coordinated by the OECD and international organisations

  16. PISA: Objectives • Focus on performance in subject areas: • Languages: Reading literacy • Using, interpreting and reflecting on written material. • Mathematics: Mathematical literacy • Recognising problems that can be solved mathematically, representing them mathematically, solving them. • Sciences: Scientific literacy • Identifying scientific questions, recognising what counts as scientific evidence, using evidence to draw conclusions about the natural world.

  17. PISA Objectives: cycles

  18. PISA: Objectives • Focus on performance in additional domains beyond the core subject areas: • 2000: Problem Solving • 2003: ICT literacy • 2006: Attitudes towards science • 2009: ERA

  19. PISA key quality benchmarks • Overall performance of education systems • Equity in the distribution of learning opportunities • Measured by the impact students’ and schools’ socio-economic background has on performance… … not merely by the distribution of learning outcomes • Consistency of performance standards across schools • Gender differences • Foundations for lifelong learning

  20. PISA

  21. PISA framework • Domain 1: Outputs and outcomes (impact of learning) • Level A (individual learner): Quality and distribution of knowledge & skills

  22. PISA framework (diagram): student background variables → mediating variables → literacy

  23. PISA framework (diagram): complex interplay of variables: student background variables, mediating variables and literacy

  24. PISA framework • Domain 1: Outputs and outcomes (impact of learning) • Level A (individual learner): Quality and distribution of knowledge & skills • Level B (instructional settings): Quality of instructional delivery • Level C (schools, other institutions): Output and performance of institutions • Level D (country or system): Social & economic outcomes of education

  25. [Chart] Student performance (low to high) plotted against the PISA index of social background (disadvantage to advantage)
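
One way to read such a chart, in line with the equity benchmark on slide 19, is through the slope of the social gradient: how many score points performance changes per unit of the background index. The sketch below fits that line with ordinary least squares on made-up numbers; it is an illustration only, not the PISA estimation procedure (which works with plausible values and replicate weights).

```python
# Illustrative only: estimate the "social gradient" as the OLS slope of
# performance on the social background index, using fabricated numbers.
# Real PISA analyses use plausible values and replicate weights.
def social_gradient(index, scores):
    n = len(index)
    mean_x = sum(index) / n
    mean_y = sum(scores) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(index, scores))
    var = sum((x - mean_x) ** 2 for x in index)
    slope = cov / var                       # score points per index unit
    intercept = mean_y - slope * mean_x
    return slope, intercept

index  = [-1.5, -0.8, -0.2, 0.0, 0.4, 1.1, 1.7]   # made-up index values
scores = [430, 455, 480, 500, 510, 545, 560]      # made-up score points
slope, intercept = social_gradient(index, scores)
print(f"gradient: {slope:.1f} score points per unit of the index")
```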

  26. PISA framework • Domain 1: Outputs and outcomes (impact of learning); Domain 2: Policy levers (shape educational outcomes) • Level A (individual learner): Quality and distribution of knowledge & skills / Individual attitudes, engagement and behaviour • Level B (instructional settings): Quality of instructional delivery / Teaching, learning practices and classroom climate • Level C (schools, other institutions): Output and performance of institutions / The school learning environment • Level D (country or system): Social & economic outcomes of education / Structures, resource allocation and policies

  27. PISA framework • Domain 1: Outputs and outcomes (impact of learning); Domain 2: Policy levers (shape educational outcomes); Domain 3: Antecedents (contextualise or constrain policy) • Level A (individual learner): Quality and distribution of knowledge & skills / Individual attitudes, engagement and behaviour / Socio-economic background of learners • Level B (instructional settings): Quality of instructional delivery / Teaching, learning practices and classroom climate / Student learning, teacher working conditions • Level C (schools, other institutions): Output and performance of institutions / The school learning environment / Community and school characteristics • Level D (country or system): Social & economic outcomes of education / Structures, resource allocation and policies / National education, social & economic context
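
Read as a matrix, the framework crosses the four levels with the three domains. The snippet below simply re-expresses that matrix as a nested lookup table so the structure is easy to query; the cell texts come from the slide, while the dictionary itself is only an illustrative representation, not anything defined by PISA.

```python
# The 4 x 3 framework matrix from the slide above as a nested lookup table.
# Cell texts are taken from the slide; the data structure is illustrative only.
FRAMEWORK = {
    "Level A (individual learner)": {
        "Domain 1 (outputs and outcomes)": "Quality and distribution of knowledge & skills",
        "Domain 2 (policy levers)": "Individual attitudes, engagement and behaviour",
        "Domain 3 (antecedents)": "Socio-economic background of learners",
    },
    "Level B (instructional settings)": {
        "Domain 1 (outputs and outcomes)": "Quality of instructional delivery",
        "Domain 2 (policy levers)": "Teaching, learning practices and classroom climate",
        "Domain 3 (antecedents)": "Student learning, teacher working conditions",
    },
    "Level C (schools, other institutions)": {
        "Domain 1 (outputs and outcomes)": "Output and performance of institutions",
        "Domain 2 (policy levers)": "The school learning environment",
        "Domain 3 (antecedents)": "Community and school characteristics",
    },
    "Level D (country or system)": {
        "Domain 1 (outputs and outcomes)": "Social & economic outcomes of education",
        "Domain 2 (policy levers)": "Structures, resource allocation and policies",
        "Domain 3 (antecedents)": "National education, social & economic context",
    },
}

# Example lookup: which policy levers matter at system level?
print(FRAMEWORK["Level D (country or system)"]["Domain 2 (policy levers)"])
```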

  28. Typical PISA assessment • Information collection • From students • 3½ hours of main domain assessment • 1 hour in relation to the other subdomains • 2 hours of paper-and-pencil tasks (each student answers a subset of all questions) • ½ hour for a questionnaire on background, learning habits, learning environment, engagement and motivation • From school principals • Questionnaire (school demography, learning environment quality) • Indirect assessment of classroom variables (teacher, class) BEWARE!! Only adequate if grade-based sampling has been applied
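
The note that each student works through only a subset of all questions refers to a rotated booklet design: the item pool is split into clusters and each printed booklet combines a few of them, so the full pool is covered across students without any single student sitting the whole test. The sketch below shows the general idea with hypothetical cluster labels; it is not the actual PISA booklet plan.

```python
# Hypothetical illustration of a rotated booklet design: split the item pool
# into clusters and let each booklet carry a few consecutive clusters
# (cyclically), so every cluster appears in several booklets but no booklet
# holds them all. Cluster labels and counts are invented for this sketch.
def rotate_booklets(clusters, clusters_per_booklet=4):
    n = len(clusters)
    return [[clusters[(start + offset) % n] for offset in range(clusters_per_booklet)]
            for start in range(n)]

clusters = ([f"M{i}" for i in range(1, 8)]        # hypothetical maths clusters
            + [f"R{i}" for i in range(1, 4)]      # hypothetical reading clusters
            + [f"S{i}" for i in range(1, 4)])     # hypothetical science clusters
for number, booklet in enumerate(rotate_booklets(clusters), start=1):
    print(f"Booklet {number:2d}: {' + '.join(booklet)}")
```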

  29. PISA National Centre • Linking with the international consortium • Implementation of the framework at the national level • Reporting to the consortium • Representation during international meetings

  30. PISA International • Strong prescriptive framework • Framework, timing, procedures, tools • National data gathering • International processing of data: priority in the international release of results • National (regional) processing of data: next priority level • Secondary analysis of data: data available

  31. New developments • Towards electronic assessment • 2009: first full-scale trial • ERA: Electronic Reading Assessment

  32. PISA Programme for International Student Assessment PISA team Department of Education – Ghent University – Belgium Beijing – July 24-25, 2009 http://allserv.ugent.be/~mvalcke/CV/CVMVA.htm Martin.Valcke@ugent.be (Partly based on Schleicher, A., 2006)
