
From Evaluation to Research Description of a continuum in the field of Global Education



  1. From Evaluation to Research: Description of a continuum in the field of Global Education

  2. Overview • Evaluation and Research • From Evaluation to Research – a continuum • Models of Evaluation in the field of Global Education and their relevance for Research • The analysis of learning effects as a challenge for Evaluation and Research

  3. Evaluation and Research

  4. Research “A scientist, whether theorist or experimenter, puts forward statements, or systems of statements, and tests them step by step. In the field of the empirical sciences, more particularly, he constructs hypotheses, or systems of theories, and tests them against experience by observation and experiment.” (Popper, 1966: 3) Research … • is a systematic activity (it deals with a “system of statements” or a “system of knowledge”); • is directed towards reality (empiricism); • works with the help of hypotheses, theories and other means; and • arrives at general statements (“systems of theories” are established and tested, or “causal relationships and regularities” are identified)

  5. Research Standards: 1) The results need to be provable. 2) The results need to be intersubjectively verifiable.

  6. Evaluation Evaluation … • is a systematic activity; • is directed towards reality (empiricism); • works with the help of pre-determined quality standards; • arrives at project-related statements (assessment of strengths and weaknesses of a given project reality) Standards: • Utility • Feasibility • Propriety • Accuracy

  7. Evaluation and Research

  8. From Evaluation to Research

  9. From Evaluation to Research

  10. Models of Evaluation and their relevance for Research

  11. Models of Evaluation We can distinguish three models… • Concept Evaluation • Performance Evaluation • Impact Evaluation

  12. Concept Evaluation Examples: • Evaluating the quality of an educational programme’s concept … • Evaluating the quality of teaching material … …with normatively defined quality standards/criteria Possible aspects of concept evaluations: • Objectives • Content • Educational Methodology ...

  13. Concept Evaluation Possible questions: • Concerning the objectives: Do the educational objectives of an activity relate adequately to the target group? Are these objectives expressed in an understandable way? • Concerning the content: Does the content relate appropriately to the objectives / the target group? Does the chosen content and its didactic format comply with the quality criteria of Global Education?

  14. Performance Evaluation Examples: • Evaluating the implementation of a Global Education project in a school • Evaluating the organisational structures of an institution Possible aspects of performance evaluations: • Project management • Transparency • Educational Methodology ....

  15. Performance Evaluation Possible questions: • Concerning the project management: Does the allocation of responsibilities correspond to the working conditions? • Concerning the educational methodology: Is the learning material used properly / adequately for the particular target group?

  16. Impact Evaluation Examples: • Evaluating the widespread impact of a certain material • Evaluating the impact of an activity / project on a particular target group Possible aspects of impact evaluations: • Spread • Satisfaction • Learning effects …

  17. Impact Evaluation Possible questions: • Concerning satisfaction: In which aspects are the stakeholders / target groups content with the performance / conduct of the project? • Concerning the learning effects: How do the target groups assess their own attainment of knowledge? What real learning effects can be recognised after a certain project / activity?

  18. Concept Evaluation and its relevance for Research Concept Evaluation: comparison of a concept with quality standards Research: comparison of concept A with quality standards comparison of concept B with quality standards comparison of concept C with quality standards To what extent do the practical concepts comply with the theoretical discourse or the normative quality standards/criteria in the field of GE/DE? Sociological interest in the correspondence of norms within a practical field

  19. Performance Evaluation and its relevance for Research Performance Evaluation: How is the project being carried out? Has the performance been aim-oriented? Research: Performance Evaluation A Performance Evaluation B Performance Evaluation C Interest from the perspective of theories on organisational development and theories of project management Organisational Development Project Management

  20. Impact Evaluation and its relevance for Research • In the field of GE/DE, three different types of impact can be distinguished: • the widespread impact of a project or activity, e.g. the number of people reached, the amount of material distributed, the number and type of media coverage etc.; • the impact on the institutional level, e.g. the strengthening of a field of activity or the extension of GE/DE projects within an NGO; • the impact on the level of the target groups; here, a further distinction can be made between effects on the level of satisfaction statements, self-reported learning, and real learning effects.

  21. Impact Evaluation and its relevance for Research Impact Evaluation: Widespread impact Impact on institutional level Self-identified/reported learning (Cognitive) Learning effects Research: Impact Evaluation A Impact Evaluation B Impact Evaluation C Interest from the perspective of communication and media research Interest from the perspective of research into teaching and learning distribution/acceptance self-reflective competencies real learning effects

  22. The analysis of learning effects as a challenge for Evaluation and Research

  23. Analysis of learning effects as a challenge [Diagram: model of the factors shaping learning effects: individual preconditions; input (e.g. quality of material); teacher/trainer/educator; mediators (e.g. motivation); learning activities (e.g. active use of time); impact; context]

  24. Analysis of learning effects as a challenge • Learning is very complex. • There is no linearity between teaching and learning. • Learning (esp. a change of human behaviour) isn't always a logical response to a certain activity. • Learning has different dimensions: - Knowledge - Competencies concerning values and judgement - Competencies/strategies concerning communication and action

  25. Analysis of learning effects as a challenge

  26. Analysis of learning effects as a challenge Analysing the acquisition of knowledge:

  27. Analysis of learning effects as a challenge Analysing the acquisition of competencies concerning values and judgement: presenting decision and action alternatives in short stories on relevant topics, which the participants have to weigh against each other and justify their choice Focus: To what extent does an activity create the necessary cognitive preconditions / qualifications (!) for the intended change in values and judgement to become most probable?

  28. Analysis of learning effects as a challenge Analysing the acquisition of competencies/strategies concerning communication and action: scenario-based challenge situations for which future solutions have to be found Focus: examination of certain prevailing conditions, e.g. - the capability / ability to predict future developments - the capability / ability to set individual goals - the capability / ability to shape change processes

  29. Conclusion We need … … to gain experience with the synopsis of evaluations that are carried out according to similar criteria. … to complement evaluation with research by singling out small areas that can be dealt with according to the standards of research. To do so it is necessary … … to find more adequate methods for the different evaluation targets … that evaluators and researchers get in contact with each other.

  30. Thank you very much!
