
The Impact of Evaluation - supporting change in VET institutions by action research




Presentation Transcript


  1. The Impact of Evaluation - supporting change in VET institutions by action research Ludger Deitmer, University of Bremen Institute of Technology and Education

  2. The Programme • Programme Administrator (ITB) • 21 Pilot Projects: 4 Research Studies, 11 Single Partner Projects, 8 Multi Partner Projects, 2 Research Projects • 14 states (Länder) • 100 VET schools • 14,000 students • 500 VET teachers • 18 VET research institutes • Duration: 01.10.1998 to 30.09.2003 • Financed by the Ministry of Education and Research (bmb+f) and the participating Federal States (Bundesländer) • Funds approx. 13 million Euro

  3. What did the programme do? • Change of perspective: from subject-organised curricula and teaching towards work-process-related curricula • Building project networks: several schools work on the same topic and in different states; this should improve the cross-Länder transfer and dissemination of best practice into different regions • Teachers work on the whole context of learning processes in the VET system: occupational analysis in industry, didactical design of (complex) learning situations, curriculum design, implementation into the institutional structures, the evaluation of learning processes, the use of multimedia learning arrangements

  4. Some Implications for Modellversuche • Modellversuche rely on a good working partnership among internal and external actors and on strong leadership • Our experience is that project partners pursue different orientations, without making these clear! • Some examples of malfunctioning elements: unclear goal understanding, poor use and distribution of resources, project management not working, infrequent communication, ... • Missing diagnosis of the performance of a Modellversuch • Need: interactive and user-centred evaluation as an integral part of the innovation process

  5. Measures undertaken by action research (AR) in the view of the pilot actors (teachers involved)

  6. Portfolio of evaluation methods • R&D innovation in HRD or VET consists of inputs (human resources, money, equipment etc.), transaction processes (the innovation process as such) and outputs (products, patents etc.) • Most evaluation methods refer to input or output (quantitative or quantifiable data): external evaluators doing summative evaluations • A focus on the innovation process itself is missing • Formative evaluation supports the innovation process and the actors involved

  7. Using a reflexive and dialogical approach in the evaluation sessions • A moderated and guided (questionnaire-based) group discussion among the participants in a project network took place (self-evaluation) • Those involved in the project evaluate their own work and/or the (interim) results of their project under the direction of the moderator • Participants from these networks were asked to do an individual and collective weighting of criteria: How important is the criterion for you (in %)? • Following this, the participants were asked to judge the criteria: How far have these criteria been achieved by the improvements made in the project network? • Different weightings and judgements of the criteria by the project partners are used to start a discussion on consensus or dissent.
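The weighting-and-judging step on this slide can be sketched in a few lines of code. This is a minimal illustration, not the project's actual instrument: the criterion names and percentage figures below are invented, and the spread of weightings is used here as one simple trigger for the consensus/dissent discussion the slide mentions.

```python
def weight_and_judge(weightings, judgements):
    """For each criterion, average the individual weightings (in %)
    and achievement judgements, and report the spread of the
    weightings as a possible trigger for a dissent discussion."""
    results = {}
    for criterion, w in weightings.items():
        j = judgements[criterion]
        results[criterion] = {
            "mean_weight": sum(w) / len(w),
            "mean_score": sum(j) / len(j),
            "weight_spread": max(w) - min(w),  # large spread -> dissent
        }
    return results

# Three hypothetical participants weight two criteria (each person's
# weightings sum to 100 %) and judge their achievement (0-100 %).
weightings = {"A: work process orientation": [60, 40, 50],
              "B: self-directed learning":   [40, 60, 50]}
judgements = {"A: work process orientation": [70, 80, 60],
              "B: self-directed learning":   [50, 40, 55]}

for name, r in weight_and_judge(weightings, judgements).items():
    print(name, r)
```

A moderator could then open the group discussion with the criteria showing the largest `weight_spread`, since those are where the partners' orientations diverge most.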

  8. Three-step approach • Evaluation workshop (group discussion) with individual and collective weighting and scoring of the main and sub-criteria (self-evaluation as the core element) • Concise summary of the workshop by analysing strengths and weaknesses, preparing quantitative & qualitative data • Feedback on results & working out the prospects for PPP, Modellversuche and OD projects: recommendations for follow-up processes

  9. Construction of questionnaire

  10. 8 main criteria for innovation objectives (A, B, C, D) and innovation effects (E, F, G, H) • Objectives: Work process orientation and the relationship between learning and working (A); Self-directed and self-organised learning (B); Vocational and shaping competence (C); Holistic forms of learning (D) • Effects: Internal transfer (E); External transfer (F); Improving the scientific knowledge base and educational planning (G); New teaching practice and professionalisation (H)

  11. Measuring single Modellversuch performance by innovation spider

  12. Main methodological contributions of the evaluation approach to action research • provision of data on Modellversuch performance in relation to important success factors • analysing the data by synthesising the outcomes into an evaluation spider • the spider's shape allows the description of specific strengths and weaknesses • comparing spiders • relating the empirical examples to the existing research on network structures • generating ideal-typical functions and problems of networks
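The synthesis into an "evaluation spider" described above can be illustrated with a small sketch: each of the eight criteria (A–H from slide 10) gets its own axis, evenly spaced around a circle, and the achieved score determines how far the point sits from the centre, so the shape makes strengths and weaknesses visible. The scores and the 70 %/50 % thresholds below are invented for illustration only.

```python
import math

# Hypothetical achievement scores (in %) for the eight criteria A-H.
scores = {"A": 75, "B": 60, "C": 80, "D": 55,
          "E": 40, "F": 35, "G": 65, "H": 70}

def spider_points(scores, radius=1.0):
    """Place each criterion on its own axis around a circle; the
    distance from the centre is the achieved score (0-100 %)."""
    n = len(scores)
    points = []
    for i, (criterion, score) in enumerate(scores.items()):
        angle = 2 * math.pi * i / n
        r = radius * score / 100
        points.append((criterion, r * math.cos(angle), r * math.sin(angle)))
    return points

# Reading the shape: criteria that stretch the spider outward are
# strengths, criteria that pull it inward are weaknesses.
strengths = [c for c, s in scores.items() if s >= 70]
weaknesses = [c for c, s in scores.items() if s < 50]
print("strengths:", strengths)
print("weaknesses:", weaknesses)
```

Plotting the returned points (or feeding the scores to any polar/radar chart tool) reproduces the spider shape; comparing the polygons of two Modellversuche then gives the visual comparison the slide refers to.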

  13. Factors of the evaluation design: adequateness (1, 2 and 4) and quality (3, 5 and 6) • (1) Fit of the criteria-definition process • (2) Values of the evaluation • (3) Completeness of the evaluation • (4) Right actors • (5) Timing and time • (6) Competence of the evaluation actors

  14. Enablers and constraints for change via self-evaluation • Organisational context: goals, motivations, structure, fields of responsibility • Transferring projects to a wider field • Communication and co-operation structures in the projects and the organisations • Resources: time, personnel, finance

  15. Relation of Evaluation & Learning [diagram]: adequateness and quality standards shape the evaluation methods, which produce learning effects (internal and external transfer) in the VET projects

  16. What was helpful during pilot project realisation? [Bar chart, scale 0–25 %] Categories: co-operation in the pilot team; co-operation with other schools, other pilots, industry; co-operation with researchers; further training; evaluation; more resources

  17. School-internal co-operation after the pilot project has been finalised (teachers only) [Bar chart: frequency in % of little, not continuous, and intensive co-operation across settings: within the subject, in the school department, among all school teachers, with school management, externally, and with others; n between 8 and 80 per setting]

  18. Interfaces: the project actors vs. the actors who work in the organisation in which the innovation is going to be implemented

  19. The problem of interfaces!

  20. Conclusion: Evaluations can trigger change, but a set of factors has to be met: • The autonomy of participants has to be secured • Different expectations and interests among partners have to be made transparent • The evaluation has to be spread out to the interfaces • Internal organisational barriers and malfunctioning elements have to be made explicit • Pilot projects have to be combined with organisational development throughout the organisation • A quality management process has to be established within the school organisation • Formative evaluation can help to ensure that QM and innovation processes are seen in context

  21. Thank you for your interest; more information about BLK Modellversuche at http://www.itb.uni-bremen.de/projekte/blk/programmtraeger.htm
