
Literacy for the 21st Century: Problem solving in a technology-rich environment


Presentation Transcript


  1. Literacy for the 21st Century: Problem solving in a technology-rich environment
  24 September 2013

  2. Agenda

  3. PIAAC: the assessment of problem solving in technology-rich environments William Thorn, OECD william.thorn@oecd.org

  4. My presentation
  • Provide an overview of the assessment of problem solving in technology-rich environments in PIAAC, and of other information the survey provides that is relevant to the digital age
  • Background to the survey
  • Features of the computer-based assessment
  • Features of the assessment of problem solving
  • Digital reading
  • Information on the use of ICTs

  5. Origins of PIAAC
  • Work on PIAAC began in the early 2000s
  • Updating measures to increase relevance to the digital world
  • Expansion of the range of skills about which information is collected (e.g. ‘generic’ skills)
  • Interest in the ‘demand’ for skills in addition to supply
  • Measurement of ‘human capital’ rather than ‘literacy’

  6. Objectives
  • Design of PIAAC finalised in 2007
  • Broad objectives:
    • Provide high-quality, comparable information on the level and distribution of key information processing skills in the adult population
    • Show the relationship of these skills to individual and social ‘outcomes’
    • Better understand the processes through which skills are gained, maintained and lost over the lifecycle

  7. Design features: content
  • Direct assessment of key information processing skills:
    • Literacy (including reading components), numeracy, problem solving in technology-rich environments (PS-TRE)
    • Linked to IALS and ALL in the domains of literacy and numeracy
  • Information on the use of literacy, numeracy and problem solving at work and elsewhere
  • Information on the use of a range of other generic skills at work:
    • Interaction, organisation (of self and others), learning and physical skills
  • Information on antecedents and outcomes

  8. Design features
  • Target population: 16-65 year-olds resident in the national territory
  • Sample: probability sample representative of the target population
  • Household survey
  • Computer delivery:
    • Background questionnaire (BQ): computer-assisted personal interviewing (CAPI)
    • Assessment: computer-based assessment (CBA)

  9. Participation
  • Round 1 (2008-2013): 24 countries
    • Australia, Austria, Belgium (Flanders), Canada, Czech Republic, Denmark, Estonia, Finland, France, Germany, Ireland, Italy, Japan, Korea, Netherlands, Norway, Poland, Slovak Republic, Spain, Sweden, UK (England, Northern Ireland), US, Cyprus**, Russian Federation
  • Round 2 (2012-2015): 10 countries
    • Chile, Greece, Indonesia, Israel, Lithuania, New Zealand, Singapore, Slovenia, Turkey

  10. ** Footnote concerning Cyprus

  11. What is assessed?
  • Literacy
    • Reading components
  • Numeracy
  • Problem solving in technology-rich environments

  12. A computer-based assessment
  • Background questionnaire delivered in CAPI mode
  • Assessment taken by most people on a laptop using a purpose-built application
  • Paper-and-pencil test available for respondents with no or low computer skills (literacy and numeracy only)

  13. The ICT core
  • Tested the basic skills necessary to complete the assessment on computer:
    • Use of a mouse
    • Scrolling
    • Drag and drop
    • Highlighting

  14. Problem solving in TRE
  • Defined as “using digital technology, communication tools and networks to acquire and evaluate information, communicate with others and perform practical tasks”
  • The first cycle of the Survey of Adult Skills focuses on “the abilities to solve problems for personal, work and civic purposes by setting up appropriate goals and plans, and accessing and making use of information through computers and computer networks”
  • Three dimensions define the construct:
    • Content (technologies and tasks)
    • Cognitive strategies (what one does to solve the problem)
    • Context (the situation in which the problem arises)

  15. Content
  • Technology:
    • Hardware devices
    • Software applications
    • Commands and functions
    • Representations (e.g. text, graphics, video)
  • Tasks:
    • Intrinsic complexity
    • Explicitness of the problem statement

  16. Strategies and Context
  • Cognitive strategies:
    • Set goals and monitor progress
    • Plan
    • Acquire and evaluate information
    • Use information
  • Contexts:
    • Work-related
    • Personal
    • Society and community

  17. Digital reading
  • The reading assessment covers the reading of both print and digital texts
  • Digital texts:
    • Screen-based displays
    • Hyperlinks
    • Specific navigation features (scrolling, clicking)
    • Unbounded information space

  18. Information on use of ICTs
  • Variety, frequency and complexity
  • ICT use (at work and elsewhere):
    • Ever used a computer or other digital device
    • Currently use a computer
    • Use email
    • Use the internet in order to better understand issues related to your work
    • Conduct transactions on the internet, for example buying or selling products or services, or banking
    • Use spreadsheet software, for example Excel
    • Use a word processor, for example Word
    • Use a programming language to program or write computer code
    • Participate in real-time discussions on the internet, for example online conferences or chat groups
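The variety and frequency items listed above lend themselves to simple derived indices. As a minimal sketch only, assuming a hypothetical 1-5 frequency coding and invented item names (these are not the survey's actual variables or derived scales), an ICT-use index could be computed as an average over the items a respondent answered:

from statistics import mean

# Hypothetical frequency coding: 1 = never ... 5 = every day.
# Item names are invented for illustration; they are not PIAAC variable names.
ICT_ITEMS = ["use_email", "use_internet_work", "internet_transactions",
             "use_spreadsheet", "use_word_processor", "write_code",
             "realtime_discussions"]

def ict_use_index(respondent: dict) -> float | None:
    """Average frequency (1-5) over the ICT items the respondent answered."""
    answered = [respondent[item] for item in ICT_ITEMS
                if respondent.get(item) is not None]
    return mean(answered) if answered else None

example = {"use_email": 5, "use_internet_work": 4, "use_spreadsheet": 3,
           "use_word_processor": 4, "write_code": 1}
print(ict_use_index(example))  # 3.4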

  19. Information on use of ICTs
  • Match/mismatch of ICT skills:
    • Level of computer use needed to perform the job
    • Whether the respondent possesses sufficient computer skills to do the job well
    • Whether a lack of computer skills has affected chances of promotion or a pay rise

  20. Reporting
  • Results presented on a 500-point scale
  • Test items and test-takers located on the same scale:
    • Difficulty of items
    • Proficiency of persons
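Locating items and test-takers on one scale is the hallmark of item response modelling. As an illustration only (PIAAC's actual scaling is more elaborate and uses plausible values), a simple Rasch-type logistic model captures the idea: the probability of answering an item correctly depends on the gap between the person's proficiency and the item's difficulty, both expressed on the same scale. The scaling constant and the score values below are assumptions for illustration.

import math

def p_correct(proficiency: float, difficulty: float, scale: float) -> float:
    """Rasch-type model: the probability of a correct response depends only
    on the gap between person proficiency and item difficulty."""
    return 1.0 / (1.0 + math.exp(-scale * (proficiency - difficulty)))

SCALE = 1.0 / 30.0  # assumed: roughly 30 score points per logit

print(round(p_correct(300, 300, SCALE), 2))  # 0.50: person matches item difficulty
print(round(p_correct(340, 300, SCALE), 2))  # 0.79: more proficient person
print(round(p_correct(260, 300, SCALE), 2))  # 0.21: less proficient person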

  21. Proficiency levels
  • To help interpret results, the scale is divided into proficiency levels
  • Descriptors developed to summarise the underlying characteristics of items at each level in terms of the relevant assessment framework
  • Descriptive, not normative
  • 4 proficiency levels in problem solving

  22. Proficiency levels

  23. Level 1 At this level, tasks typically require the use of widely available and familiar technology applications, such as e-mail software or a web browser. There is little or no navigation required to access the information or commands required to solve the problem. The problem may be solved regardless of the respondent’s awareness and use of specific tools and functions (e.g. a sort function). The tasks involve few steps and a minimal number of operators. At the cognitive level, the respondent can readily infer the goal from the task statement; problem resolution requires the respondent to apply explicit criteria; and there are few monitoring demands (e.g. the respondent does not have to check whether he or she has used the appropriate procedure or made progress towards the solution). Identifying content and operators can be done through simple match. Only simple forms of reasoning, such as assigning items to categories, are required; there is no need to contrast or integrate information.

  24. Level 3 • At this level, tasks typically require the use of both generic and more specific technology applications. Some navigation across pages and applications is required to solve the problem. The use of tools (e.g. a sort function) is required to make progress towards the solution. The task may involve multiple steps and operators. The goal of the problem may have to be defined by the respondent, and the criteria to be met may or may not be explicit. There are typically high monitoring demands. Unexpected outcomes and impasses are likely to occur. The task may require evaluating the relevance and reliability of information in order to discard distractors. Integration and inferential reasoning may be needed to a large extent.
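The level descriptors above correspond to bands of the 500-point scale. As a sketch of how reported scores map to levels, using illustrative placeholder cut scores (they are not taken from this presentation; the official PIAAC documentation gives the actual thresholds):

# Cut scores below are illustrative placeholders, not official PIAAC values.
CUTS = [(341, "Level 3"), (291, "Level 2"), (241, "Level 1")]

def ps_tre_level(score: float) -> str:
    """Return the proficiency level band for a score on the 500-point scale."""
    for cut, label in CUTS:
        if score >= cut:
            return label
    return "Below Level 1"

for s in (225, 260, 310, 355):
    print(s, ps_tre_level(s))
# 225 Below Level 1 / 260 Level 1 / 310 Level 2 / 355 Level 3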

  25. A comprehensive database
  The data are relevant to a range of questions regarding the capacity of adults to manage information in digital environments:
  • The digital divide
    • What proportion of the population has no or very low computer skills? Who are they?
  • How proficient are adults at managing information in digital environments?
    • Differences between countries
    • Distribution in the population (by age, gender, education, etc.)
    • What are the characteristics of poor performers?
    • What kinds of tasks can the generality of the population successfully complete?
  • Literacy and the digital world
    • A literacy divide as much as a digital divide?
  • Computer use and the labour market
    • What skills are required, what skills do people have, and what are the returns to digital skills?
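Questions such as these can be explored once the public use data set is released (see the following slide). A minimal, hypothetical sketch in Python/pandas: the file name and column names are invented, not actual PIAAC variable names, and a proper analysis would use the survey's sampling weights and plausible values rather than raw counts.

import pandas as pd

# Assumed CSV extract of the public use file; "country" and "pstre_level"
# are invented column names for illustration.
df = pd.read_csv("piaac_puf_extract.csv")

shares = (
    df.groupby("country")["pstre_level"]
      .value_counts(normalize=True)   # unweighted shares, for illustration only
      .rename("share")
      .reset_index()
)
print(shares.head())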

  26. Output
  • October 2013:
    • International report
    • Public use data set
    • Data Explorer
    • Data Analyser
  • 2014-2015:
    • Series of thematic reports

  27. The first international report
  • Skills Outlook will contain six chapters:
    • Context: skills and trends in technology, the labour market and society
    • Cross-country comparisons of the level and distribution of adult skills
    • The distribution of proficiency among various socio-demographic groups in different countries
    • The skill proficiency of workers and the use of their skills in the workplace
    • Developing and sustaining information processing skills
    • The link between information processing skills and outcomes
  • Reader's companion:
    • Overview of what is measured and how the survey was implemented

  28. Thematic Reports
  • Six reports over 2014-2015
  • These will include a report on managing information in technology-rich environments that explores the data I have described today

  29. Information accessible at: http://www.oecd.org/site/piaac/

  30. Thank you william.thorn@oecd.org

  31. Building Healthy Adult Skills in Ireland Inez Bailey, Director, National Adult Literacy Agency

  32. Building Healthy Adult Skills in Ireland Using PIAAC to achieve a shift in understanding from ...

  33. National Digital Strategy: ‘Doing More with Digital’
  Across Europe, only 1 job in 10 will not need a digital skill by 2015.
  • The initial phase will focus on:
    1. Stimulating the indigenous economy by helping small Irish businesses to expand online
    2. Supporting the preparation of the next generation for future jobs
    3. Making sure everyone in society benefits from digital

  34. Health of the Nation’s Skills
  • PIAAC – a health check on adult skills in Ireland:
    • Current skill levels
    • Skill loss
    • Skills mismatch
  • It will inform three key groups, alongside other data and information:
    • Policy makers
    • Providers and practitioners
    • People

  35. Informing Policy Makers

  36. Informing Providers and Practitioners

  37. Informing People

  38. Questions and Answers

  39. Further information
  NALA
  Sandford Lodge, Sandford Close, Ranelagh, Dublin 6
  Tel: (01) 412 7900
  Website: www.nala.ie
  Distance learning website: www.writeon.ie
  http://facebook.com/nalaireland
  http://twitter.com/nalaireland
  http://www.youtube.com/user/nationaladultliterac
