TVET Australia

Presentation Transcript


  1. TVET Australia Assessment, validation and moderation A PowerPoint presentation developed by the NQC to support information sessions on assessment, validation and moderation

  2. Contact NQC Secretariat TVET Australia Level 22/ 390 St Kilda Road Melbourne Vic 3004 Telephone: +61 3 9832 8100 Email: nqc.secretariat@tvetaustralia.com.au Web: www.nqc.tvetaustralia.com.au

  3. Disclaimer This work has been produced on behalf of the National Quality Council with funding provided through the Australian Government Department of Education, Employment and Workplace Relations and state and territory governments. The views expressed in this work are not necessarily those of the Australian Government or state and territory governments.

  4. Acknowledgement This presentation was designed to support the interactive information sessions that formed part of the NQC’s communication and dissemination strategy: NQC products: validation and communication. Reports and materials which focus on validation and moderation may be downloaded from the NQC website at http://www.nqc.tvetaustralia.com.au/nqc_publications This work was produced for the National Quality Council by Andrea Bateman, Quorum QA Australia Pty Ltd, and Chloe Dyson, Quorum QA Australia Pty Ltd

  5. Quality of assessment

  6. Setting the Scene • Concerns about the quality of assessments and comparability of standards across the VET sector • OECD (2008) Reviews of VET (Australia) • NQC (2008) Industry Expectations of VET • Service Skills SA (2010) VETiS Project

  7. Today’s workshop • Assessment • Developing Assessment Tools • Competency Mapping • Simulated Assessment • Engaging industry • Assessment Quality Management Framework • Validation and moderation • System considerations • Diverse settings

  8. NQC resources • Guide for Developing Assessment Tools • Assessment Facts Sheets • Simulated Assessment • Making Assessment Decisions • Peer Assessment and Feedback • Quality Assuring Assessment Tools • Assessor Partnerships • Systematic validation • Assessor Guide: Validation and Moderation

  9. Session 1: What is assessment? • Purposeful process of systematically gathering, interpreting, recording and communicating to stakeholders information on student performance.

  10. Assessment Purposes • Evaluative • designed to provide information to evaluate institutions and curriculum/standards – primary purpose is accountability • Diagnostic • Produce information about the candidate’s learning • Formative • Produce evidence concerning how and where improvements in learning and competency acquisition are required • Summative • Used to certify or recognise candidate achievement or potential

  11. Assessment Purposes Assessment for learning occurs when teachers use inferences about student progress to inform their teaching (formative) Assessment as learning occurs when students reflect on and monitor their progress to inform their future learning goals (formative) Assessment of learning occurs when teachers use evidence of student learning to make judgements on student achievement against goals and standards (summative) http://www.education.vic.gov.au/studentlearning/assessment/preptoyear10/default.htm

  12. Session 2: Developing Assessment Tools NQC Products • Guide for Developing Assessment Tools • Assessment Facts Sheets • Simulated Assessment • Making Assessment Decisions • Peer Assessment and Feedback • Quality Assuring Assessment Tools • Assessor Guide: Validation and Moderation http://www.nqc.tvetaustralia.com.au/nqc_publications

  13. Impact Changes to definitions within the NQC publications • AQTF 2010 User Guide documentation; and the • Training Package Development Handbook. Terms affected: Reliability, Validity, Assessment tool, Validation, Moderation

  14. Key Stages – developing assessment tools • Identify and describe the purposes for the assessment • Identify the assessment information that can be used as evidence of competence/learning • Identify a range of possible methods that might be used to collect assessment information • Define the contexts for interpreting assessment information in ways that are meaningful for both assessor and candidate • Determine the decision making rules • Define procedures for coding and recording assessment information • Identify stakeholders in the assessment and define their reporting needs.

  15. Essential Characteristics - Assessment Tool An assessment tool includes the following components: • The context and conditions for the assessment • The tasks to be administered to the candidate • An outline of the evidence to be gathered from the candidate • The evidence criteria used to judge the quality of performance (i.e., the assessment decision making rules); as well as • The administration, recording and reporting requirements.

  16. Ideal Characteristics • The context • Competency mapping • The information to be provided to the candidate • The evidence to be collected from the candidate • Decision making rules • Range and conditions • Materials/resources required • Assessor intervention • Reasonable adjustments • Validity evidence • Reliability evidence • Recording requirements • Reporting Requirements

  17. Competency Mapping • The components of the Unit(s) of Competency that the tool should cover should be described. This could be as simple as a mapping exercise between the components within a task (eg each structured interview question) and components within a Unit or cluster of Units of Competency. The mapping will help determine the sufficiency of the evidence to be collected as well as the content validity. Advice regarding competency mapping can be found in the NQC Assessor Guide: Validation and Moderation

  18. Competency Mapping

  19. Competency Mapping: Steps in the process • Step 1: Unpack the unit of competency to identify its critical components. • Step 2: For each assessment method, list the tasks to be performed by the candidate. • Step 3: For each assessment method, map the critical components of the unit to each assessment task. Refer to NQC Assessor Guide: Validation and Moderation
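The mapping step above is, in effect, a coverage check: every critical component of the unit should be traceable to at least one assessment task. A minimal sketch of that check, where the component and task names are hypothetical and not drawn from any particular Training Package:

```python
# Hypothetical critical components of a unit of competency
# (e.g. performance criteria and required knowledge items).
unit_components = {"PC1.1", "PC1.2", "PC2.1", "RK1", "RK2"}

# Map each assessment task to the components it gathers evidence for
# (Step 3 of the process described above).
mapping = {
    "structured_interview_q1": {"RK1", "RK2"},
    "practical_demonstration": {"PC1.1", "PC2.1"},
    "portfolio_review": {"PC1.2"},
}

# Components covered by at least one task.
covered = set().union(*mapping.values())

# Any component with no evidence source is a sufficiency gap.
gaps = unit_components - covered
print("Uncovered components:", gaps or "none")
```

A non-empty `gaps` set flags components for which the tool collects no evidence, which is the sufficiency and content-validity question the mapping exercise is designed to answer.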

  20. Level of specificity in mapping – Risk Assessment Risk can be determined by consideration of: • Safety (eg potential danger to clients from an incorrect judgement) • Purpose and use of the outcomes (eg selection purposes) • Human capacity (eg level of expertise and experience of the assessors) • Contextual (eg changes in technology, workplace processes, legislation, licensing requirements and/or training packages)

  21. Decision Making Rules • The rules to be used to: • Check the quality of the evidence (i.e. the rules of evidence) • Judge how well the candidate performed on the task according to the standard expected • Synthesise evidence from multiple sources to make an overall judgement Additional advice – refer to Fact Sheet 2

  22. Reasonable Adjustments • This section of the assessment tool should describe the guidelines for making reasonable adjustments to the way in which evidence of performance is gathered without altering the expected performance standards (as outlined in the decision making rules).

  23. Simulated assessment • For the purposes of assessment, a simulated workplace is one in which all of the required skills, with respect to the provision of paid services to an employer or the public, can be demonstrated as though the business were actually operating. • In order to be valid and reliable, the simulation must closely resemble what occurs in a real work environment. • The simulated workplace should involve a range of activities that reflect real work experience. It should allow the performance of all of the required skills and demonstration of the required knowledge. Ref: AQTF definition (refer to Activity Handout), Assessment Fact Sheet 1

  24. Activity 1: Engaging Industry • In your groups discuss what input employers (you might wish to specify a vocational area) could provide to develop valid assessment tools and processes. • For the following scenarios, note down 2/3 questions you could ask employers and how the responses will inform the development or review of assessment tools and/or processes. • Relevant NQC support materials: • Industry Enterprise & RTO Partnership • Assessment Fact Sheets: Assessor Partnerships • Assessor Guide: Validation and Moderation

  25. Activity 2: Self Assessment • In groups of 3, review the assessment tool using the self assessment checklist from the NQC (2009) Implementation Guide (Template A.1, p. 45). • Identify any gaps in the tool. • Discuss the pros and cons of including such additional information within the tool.

  26. Tool Review • Has clear, documented evidence of the procedures for collecting, synthesising, judging and recording outcomes (i.e., to help improve the consistency of assessments across assessors [inter-rater reliability]). • Has evidence of content validity (i.e., whether the assessment task(s), as a whole, represent the full range of knowledge and skills specified within the Unit(s) of Competency). • Reflects work-based contexts, specific enterprise language and job-tasks and meets industry requirements (i.e., face validity). • Adheres to the literacy and numeracy requirements of the Unit(s) of Competency (construct validity). • Has been designed to assess a variety of evidence over time and contexts (predictive validity). • Has been designed to minimise the influence of extraneous factors (i.e., factors that are not related to the unit of competency) on candidate performance (construct validity).

  27. Tool Review • Has clear decision making rules to ensure consistency of judgements across assessors (inter-rater reliability) as well as consistency of judgements within an assessor (intra-rater reliability). • Has clear instructions on how to synthesise multiple sources of evidence to make an overall judgement of performance (inter-rater reliability). • Has evidence that the principles of fairness and flexibility have been adhered to. • Has been designed to produce sufficient, current and authentic evidence. • Is appropriate in terms of the level of difficulty of the task(s) to be performed in relation to the skills and knowledge specified within the relevant Unit(s) of Competency. • Has outlined appropriate reasonable adjustments that could be made to the gathering of assessment evidence for specific individuals and/or groups. • Has adhered to the relevant organisation's assessment policy.

  28. Quality Checks • Panel • Pilot • Trial Refer to Fact Sheet 4, Quality assuring assessment tools

  29. Session 3: Assessment Quality Management NQC Products • Code of Professional Practice: Validation & Moderation • Implementation Guide: Validation and Moderation • Assessment Facts Sheets • Quality Assuring Assessment Tools • Systematic Validation • Assessor Partnerships • Assessor Guide: Validation and Moderation http://www.nqc.tvetaustralia.com.au/nqc_publications

  30. Validation • Validation is a quality review process. It involves checking that the assessment tool produced valid, reliable, sufficient, current and authentic evidence to enable reasonable judgements to be made as to whether the requirements of the relevant aspects of the Training Package or accredited course had been met. It includes reviewing and making recommendations for future improvements to the assessment tool, process and/or outcomes. NQC Implementation Guide: Validation and Moderation 2009

  31. Outcomes of validation Recommendations for future improvements • Context and conditions for the assessment • Task/s to be administered to the candidates • Administration instructions • Criteria used for judging the quality of performance (e.g. the decision making rules, evidence requirements etc) • Guidelines for making reasonable adjustments to the way in which the evidence of performance was gathered to ensure that the expected standard of performance specified within the Unit(s) of Competency has not been altered • Recording and reporting requirements.

  32. Moderation • Moderation is the process of bringing assessment judgements and standards into alignment. It is a process that ensures the same standards are applied to all assessment results within the same Unit(s) of Competency. It is an active process in the sense that adjustments to assessor judgements are made to overcome differences in the difficulty of the tool and/or the severity of judgements. NQC Implementation Guide: Validation and Moderation 2009

  33. Outcomes of moderation • Recommendations for future improvement and adjustments to assessor judgements (if required) • Recommendations for improvement to the assessment tools • Adjusting the results of a specific cohort of candidates prior to the finalisation of results • Requesting copies of final candidate assessment results in accordance with recommended actions.

  34. Validation vs Moderation

  35. Types of Approaches - Statistical • Limited to moderation • Yet to be pursued at the national level in VET • Requires some form of common assessment task at the national level • Adjusts the level and spread of RTO-based assessments to match the level and spread of the same candidates' scores on a common assessment task • Maintains RTO-based rank ordering but brings the distribution of scores across groups of candidates into alignment Strength • Strongest form of quality control Weakness • Lacks face validity, may have limited content validity
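The level-and-spread adjustment described above amounts to a linear (mean and standard deviation) rescaling against the common task. A minimal sketch, assuming simple numeric scores; the function name and sample data are illustrative, not taken from any NQC material:

```python
from statistics import mean, stdev

def moderate_scores(rto_scores, common_scores):
    """Linearly rescale RTO-based scores so their mean and spread match
    the same candidates' scores on the common assessment task.
    Rank ordering within the RTO is preserved."""
    m_rto, s_rto = mean(rto_scores), stdev(rto_scores)
    m_common, s_common = mean(common_scores), stdev(common_scores)
    return [m_common + (x - m_rto) * s_common / s_rto for x in rto_scores]

# Example: an RTO that marks leniently (high mean, narrow spread)
# is pulled into line with its candidates' common-task results.
rto = [82, 85, 88, 90]
common = [60, 65, 75, 80]  # same candidates, common assessment task
print(moderate_scores(rto, common))
```

Because the transformation is monotonic, candidates keep their RTO-based rank order; only the distribution (level and spread) is brought into alignment, which is exactly the strength and the face-validity weakness noted on the slide.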

  36. Types of Approaches - External • Types • Site Visit Versus • Central Agency Strengths • Offer authoritative interpretations of standards • Improve consistency of standards across locations by identifying local bias and/or misconceptions (if any) • Educative Weakness • Expensive • Less control than statistical

  37. Types of Approaches – Assessor Partnerships • Validation only • Informal, self-managed, collegial • Small group of assessors • May involve: • Sharing, discussing and/or reviewing one another’s tools and/or judgements • Benefit • Low costs, personally empowering, non-threatening • May be easily organised • Weakness • Potential to reinforce misconceptions and mistakes Ref: Implementation Guide, Assessment Fact Sheet 5

  38. Types of Approaches - Consensus • Typically involves assessors reviewing their own and colleagues' assessment tools and judgements as a group • Can occur within and/or across organisations • Strength • Professional development, networking, promotes collegiality and sharing • Weakness • Less quality control than external and statistical approaches as they can also be influenced by local values and expectations • Requires a culture of sharing

  39. Systematic Validation (consensus)

  40. System considerations • What is the most appropriate approach to validation?

  41. Assessment Quality Management

  42. Quality management in diverse settings • Identified barriers: • Structural (i.e., the organisational and resource aspects) – financial, variations of definitions across key documents • Process (i.e., the practices and activities that take place) – rolling enrolments, partnering arrangements, workloads • Personal (i.e., the attitudes, assessment literacy and expectations of the key players) • Strategies deployed by RTOs Refer to Handout – Quality management processes in diverse settings.

  43. Activity 3: Assessment Quality Management • Activity 4: Assessment Quality Management

  44. Chloe Dyson Director Education Consultant Quorum QA Australia Pty Ltd Email: chloed@alphalink.com.au Phone: 0408124825 Andrea Bateman Director Education Consultant Quorum QA Australia Pty Ltd Email: andrea@batemangiles.com.au Phone: 0418 585 754 Principal author Associate Professor Shelley Gillis Deputy Director Work-based Education Research Centre Victoria University Email: shelley.gillis@vu.edu.au Phone: 0432 756 638
