
Community Briefing Event - 24th June 2008


Presentation Transcript


  1. Community Briefing Event - 24th June 2008
  Assessment Demonstrators: Overview
  Myles Danson, e-Learning Programme Manager
  Joint Information Systems Committee
  Supporting education and research

  2. Context
  Personalised learning, Lifelong learning, Learning & Teaching Practice, Technology & Standards, Learning Resources and Activities, E-Assessment, Technology Enhanced Learning Environments, E-Administration, E-Portfolios, Strategy & Policy, Work-based learning, Widening participation
  JISC Circular 08/08: Projects in the areas of curriculum delivery, assessment and course advertising

  3. Demonstrator Project - Assessment Tools: Administrative Summary
  • Let via a JISC Circular: “HE institutions funded via HEFCE and HEFCW, and by FE institutions in England that teach HE to more than 400 FTEs”
  • Up to 4 projects, each up to a maximum of £45,000 (inc VAT, expenses). Each project has an additional £15,000 available for the toolkit provider(s) to support the project.
  • Limited to institutions/organisations that did not lead the original toolkit/demonstrator projects on which the proposed demonstrator project focuses.
  • Work to be completed by end of March 2009
  • “risk-managed setting”
  • The deadline for receipt of proposals is no later than 12.00 noon on 1 August 2008.

  4. Demonstrator Project - Assessment Tools: Aim
  • “composite applications using existing components from existing JISC Toolkits and elsewhere”
  • Validate the toolkits through application in an institutional setting outside the one that developed the original software.
  • At least 2 toolkits/components, not necessarily components from previous JISC projects
  • Leverage existing assessment content resources
  • Broaden the uptake and usage of QTI2 and support the developer community (a minimal QTI2 item is sketched below).
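For readers new to the format named in the aim above, the following is a minimal, illustrative sketch of what a QTI 2.1 multiple-choice item looks like, wrapped in a short Python script that only checks the markup is well-formed. The identifier, prompt and choices are invented and do not come from the circular; toolkits such as AQURATE author items of this shape and ASDEL delivers them (see slide 5).

```python
# A minimal, hand-written QTI 2.1 multiple-choice item, held as a string so it
# can be checked for well-formedness before being handed to any service.
# Identifier, prompt and choices are invented for illustration only.
import xml.etree.ElementTree as ET

QTI_NS = "http://www.imsglobal.org/xsd/imsqti_v2p1"

ITEM_XML = f"""<assessmentItem xmlns="{QTI_NS}"
    identifier="demo-choice-001" title="Demo choice item"
    adaptive="false" timeDependent="false">
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse><value>A</value></correctResponse>
  </responseDeclaration>
  <outcomeDeclaration identifier="SCORE" cardinality="single" baseType="float"/>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="true" maxChoices="1">
      <prompt>Which specification do the JISC assessment toolkits target?</prompt>
      <simpleChoice identifier="A">IMS QTI 2.x</simpleChoice>
      <simpleChoice identifier="B">SCORM 2004</simpleChoice>
    </choiceInteraction>
  </itemBody>
  <responseProcessing
    template="http://www.imsglobal.org/question/qti_v2p1/rptemplates/match_correct"/>
</assessmentItem>"""

# Parse to confirm the markup is well-formed before passing it to a toolkit.
item = ET.fromstring(ITEM_XML)
print(item.get("identifier"), item.get("title"))
```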

  5. Demonstrator Project - Assessment Tools: Previous Work
  Projects may wish to take account of the existing toolkit work (a sketch of composing two such services follows this slide), such as:
  • ASDEL (QTI2 delivery service)
  • AQURATE (QTI2 authoring service)
  • Minibix (QTI2 item bank service)
  • MCQFM (QTI2 authoring service)
  • XMARKS (mark and assessment exchange service)
  • PeerPigeon (peer review resource submission and distribution service)
  • SPAID (QTI2 packaging service)
  • PyAssess (QTI2 response processing and migration)
  Previous demonstrator work, such as:
  • Serving Maths (mathematical assessment delivery demonstrator)
  • ASSIS (assessment simple sequencing integration services)
  • ASAP (automated programming system)
  • CATS (automated assessment construction)
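The toolkits above are, or can be exposed as, web services, and a demonstrator strings several of them together into a composite application. The sketch below shows the general shape of such a composition only: the base URLs, paths and JSON fields are entirely hypothetical and are not the real Minibix or ASDEL interfaces.

```python
# Hypothetical composition of an item-bank service and a delivery service.
# The base URLs, paths and payloads below are invented for illustration;
# they are NOT the actual Minibix or ASDEL APIs.
import json
import urllib.request

ITEM_BANK_URL = "http://itembank.example.ac.uk/api"   # hypothetical item-bank service
DELIVERY_URL = "http://delivery.example.ac.uk/api"    # hypothetical delivery service

def fetch_item(item_id: str) -> bytes:
    """Retrieve a packaged QTI2 item from the item bank."""
    with urllib.request.urlopen(f"{ITEM_BANK_URL}/items/{item_id}") as resp:
        return resp.read()

def schedule_delivery(item_package: bytes, cohort: str) -> str:
    """Hand the item package to the delivery service and return a session id."""
    req = urllib.request.Request(
        f"{DELIVERY_URL}/sessions",
        data=item_package,
        headers={"Content-Type": "application/zip", "X-Cohort": cohort},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["session_id"]

if __name__ == "__main__":
    package = fetch_item("demo-choice-001")
    print("Delivery session:", schedule_delivery(package, cohort="BIO101-2008"))
```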

  6. Demonstrator Project - Assessment Tools: Previous Work (2)

  7. Demonstrator Project - Assessment Tools: Previous Work (3)
  Projects may wish to take account of content services, such as:
  • UKCDR
  • HEA Physical Sciences
  • JORUM
  • Question Mark Perception SWAP
  • Respondus Test Bank Network
  • The Open University OpenLearn Initiative
  • COLEG/COLA resources
  This is not a definitive list of related work. Projects may build on resources not listed here.

  8. Demonstrator Project - Assessment Tools: Possible Applications
  Possible applications could include:
  • Workflow applications for item production and banking processes such as quality assurance, editorial work, etc.
  • Orchestration of test creation and assembly applications
  • Loading/converting of existing item bank content into a QTI2 item bank (a hypothetical conversion sketch follows this slide)
  • Wider assessment applications (eg online surveys)
  • Adapting components/groups of components to alternative requirements, for example formative assessment delivery applications
  • Multi-standard player application (eg SCORM 2004, QTI 1.x, QTI 2.x)
  • Integration/connection of existing components into other information systems, eg learning environments, administration and student record systems
  • Integration with e-portfolio services
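One of the applications above is loading or converting existing item bank content into a QTI2 item bank. As an illustration of the general idea only, the sketch below turns records from an invented in-house format into minimal QTI 2.1 items; a real conversion (for example from QTI 1.x) would also need to carry across response processing, feedback and metadata.

```python
# Sketch: turning records from a simple legacy item bank (here just dicts with a
# question, choices and a key) into minimal QTI 2.1 assessmentItem documents.
# The legacy format is invented for illustration.
import xml.etree.ElementTree as ET

QTI_NS = "http://www.imsglobal.org/xsd/imsqti_v2p1"
MATCH_CORRECT = "http://www.imsglobal.org/question/qti_v2p1/rptemplates/match_correct"

def to_qti2(record: dict) -> bytes:
    """Build a QTI 2.1 choice item from a legacy record."""
    item = ET.Element("assessmentItem", {
        "xmlns": QTI_NS, "identifier": record["id"], "title": record["title"],
        "adaptive": "false", "timeDependent": "false",
    })
    decl = ET.SubElement(item, "responseDeclaration", {
        "identifier": "RESPONSE", "cardinality": "single", "baseType": "identifier"})
    correct = ET.SubElement(decl, "correctResponse")
    ET.SubElement(correct, "value").text = record["key"]
    body = ET.SubElement(item, "itemBody")
    inter = ET.SubElement(body, "choiceInteraction", {
        "responseIdentifier": "RESPONSE", "shuffle": "true", "maxChoices": "1"})
    ET.SubElement(inter, "prompt").text = record["question"]
    for ident, text in record["choices"].items():
        ET.SubElement(inter, "simpleChoice", {"identifier": ident}).text = text
    ET.SubElement(item, "responseProcessing", {"template": MATCH_CORRECT})
    return ET.tostring(item, encoding="utf-8")

legacy = {"id": "legacy-42", "title": "Capital cities", "key": "A",
          "question": "Which city is the capital of Wales?",
          "choices": {"A": "Cardiff", "B": "Swansea"}}
print(to_qti2(legacy).decode("utf-8"))
```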

  9. Demonstrator Project - Assessment Tools: Requirements
  Projects will need to:
  • Identify a number of JISC toolkits/demonstrators, institutional systems and content assets, and develop composite applications for e-assessment through exploitation of web services and, where necessary, the creation of web service APIs in existing applications (a minimal wrapper sketch follows this slide).
  • Deploy an instance of the demonstrator application or service and investigate its performance with full consideration of all risks.
  • Engage with the emerging QTI v2 developer and user communities (and other communities as appropriate) and contribute to its ongoing development.
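Where an existing component has no web service API, the first requirement above allows one to be created. The sketch below wraps a stand-in scoring function in a small HTTP endpoint using only the Python standard library; the function, path and payload are invented for illustration, and a real deployment would add authentication, error handling and logging.

```python
# Sketch of exposing an existing local component as a small web-service API so a
# composite application can call it over HTTP. The wrapped function and the URL
# path are hypothetical placeholders, not part of any real toolkit.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def score_response(item_id: str, candidate_response: str) -> float:
    """Stand-in for an existing response-processing component (e.g. a local library call)."""
    return 1.0 if candidate_response == "A" else 0.0   # placeholder logic

class ScoringHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/score":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        result = {"score": score_response(payload["item_id"], payload["response"])}
        body = json.dumps(result).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ScoringHandler).serve_forever()
```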

  10. Demonstrator Project - Assessment Tools: Proposal Evaluation Criteria
  • Relevance and transferability (30%)
  • A demonstration that the methodology will lead to the aims, objectives and the deliverables being met (30%)
  • A credible, realistic and achievable plan of work (30%), including:
    - Staff with appropriate experience and sufficient time to deliver the project;
    - Appropriate plans for project management, including management of partners if the proposal is made by a consortium;
    - The analysis and management of risks to successful completion of the study;
    - Track record of undertaking comparable work;
    - Experience of developing within the JISC e-Framework
  • Value for money (10%)

  11. Assessment Demonstrators: Overview
  Myles Danson, e-Learning Programme Manager
  QUESTIONS
  Contact: Myles Danson, JISC Programme Manager, m.danson@jisc.ac.uk, 07796 336319
  The full text of the circular is available at: http://www.jisc.ac.uk/fundingopportunities/funding_calls/2008/06/circular808.aspx

  12. Key Interfaces and Core System Components (restated as a data-structure sketch below)
  Item Authoring System:
  • Item writer (WYSIWYG and/or player)
  • Workflow production (WYSIWYG or + player)
  • Pre-release QA and review
  • Item export/import
  Item and Test Banking System:
  • Item & test performance management
  • Test assembly & rules management
  • Bank lifecycle management
  • Item export/import
  Test Delivery System:
  • Item & test player
  • Test administration
  • Candidate administration
  • Results view and export
  Key interfaces between the systems:
  • Published items
  • Items released for editing (not always necessary)
  • Dynamic and static test content
  • Item response & test usage data
  • Integration with learning platform
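The slide above describes three cooperating systems and the artefacts exchanged between them. Purely as a reading aid, the sketch below restates that outline as Python data structures; every class, field and method name is invented here and none of it is specified by the circular.

```python
# A reading aid only: the three core systems from the slide, restated as simple
# Python classes with the exchanged artefacts as method signatures. All names
# are invented for illustration.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class QTIItem:
    identifier: str
    xml: bytes                      # the QTI2 assessmentItem document

@dataclass
class ItemAuthoringSystem:
    """Item writer, workflow production, pre-release QA and review, export/import."""
    drafts: Dict[str, QTIItem] = field(default_factory=dict)

    def publish(self, item_id: str) -> QTIItem:
        return self.drafts[item_id]          # published item handed to the bank

@dataclass
class ItemAndTestBankingSystem:
    """Performance mgmt, test assembly and rules, bank lifecycle, export/import."""
    bank: Dict[str, QTIItem] = field(default_factory=dict)

    def accept(self, item: QTIItem) -> None:
        self.bank[item.identifier] = item

    def assemble_test(self, item_ids: List[str]) -> List[QTIItem]:
        return [self.bank[i] for i in item_ids]   # test content for the delivery system

@dataclass
class TestDeliverySystem:
    """Item and test player, test/candidate administration, results view and export."""
    results: List[dict] = field(default_factory=list)

    def deliver(self, test: List[QTIItem], candidate: str) -> None:
        # responses and usage data would flow back to the banking system
        self.results.append({"candidate": candidate, "items": [i.identifier for i in test]})
```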
