
Roles and Responsibilities for Evaluation in Foreign Language Programs

This article explores the roles and responsibilities involved in evaluating foreign language programs. It discusses the various stakeholders and their responsibilities, as well as different perspectives and approaches to evaluation. The article also highlights the importance of considering evidence, motivations, and uses of evaluation in language programs. Overall, it provides insights into the changing landscape of language program evaluation.


Presentation Transcript


  1. Roles and Responsibilities for Evaluation in Foreign Language Programs John M. Norris & Yukiko Watanabe University of Hawai‘i at Mānoa

  2. What are the roles & responsibilities in language program evaluation? Responsibilities map onto roles as follows: Request evaluation → Initiator; Implement evaluation → Facilitator; Collect information → Methodologist; Provide information → Evaluand; Use information → Audience; Respond to evaluation → Stakeholder. Who plays these roles and assumes these responsibilities in language programs?
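Purely as an illustration (not part of the original slides), this responsibility-to-role pairing can be treated as a small lookup table, for instance to check that every responsibility in an evaluation plan has someone assigned to it. The helper function and the example assignments below are invented:

```python
# The slide's responsibility-to-role pairing as a lookup table.
RESPONSIBILITY_TO_ROLE = {
    "request evaluation": "initiator",
    "implement evaluation": "facilitator",
    "collect information": "methodologist",
    "provide information": "evaluand",
    "use information": "audience",
    "respond to evaluation": "stakeholder",
}

def uncovered_responsibilities(assignments):
    """Return responsibilities whose role no one in `assignments` has taken on."""
    covered_roles = set(assignments.values())
    return [resp for resp, role in RESPONSIBILITY_TO_ROLE.items()
            if role not in covered_roles]

# Hypothetical program: the chair initiates, teachers are the evaluand.
# Prints the four responsibilities still lacking a person (implement,
# collect, use, respond).
print(uncovered_responsibilities({"program chair": "initiator",
                                  "teachers": "evaluand"}))
```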

  3. Who plays a role in language programs? Funders; Professional Organizations; Policy makers; Administrators; Staff; Teachers; Learners; Professional Evaluators; Parents; Academics, Researchers; Society – The ‘Public’; Commercial Publishers.

  4. Received view of language program evaluation Scriven (1997): Evaluation is… “the process of determining the merit, worth, and value of things” Who makes the determination? When does it happen? And where? On the basis of what kinds of evidence? Who gathers the evidence? Which criteria determine value? Then what happens? And why?

  5. Who evaluates language programs? The JIJOE (Jet-in, Jet-out Expert): external evaluators hired by funders (e.g., ODA) and reporting to the funders. Terminate or continue? Cost-effectiveness? The evaluation proceeds from pre-set assumptions about the program, with a priori criteria & methods set by the funders & external evaluators. The external evaluator spends a few days on site, collects the necessary information via language tests, makes judgments & leaves!

  6. Who evaluates language programs? Funders; Professional Organizations; Policy makers; Administrators; Staff; Teachers; Learners; Professional Evaluators; Parents; Academics, Researchers; Society – The ‘Public’; Commercial Publishers.

  7. Who evaluates language programs? Accountability evaluation: policy makers and testers evaluate, with their perspectives and values IMPOSED on the program. What counts as “evidence”? Only measurable outcomes. Motivations for evaluating? Politics and public scrutiny. Uses of evaluation? Forced compliance with educational policy.

  8. Who evaluates language programs? Funders; Professional Organizations; Policy makers; Administrators; Staff; Teachers; Learners; Professional Evaluators; Parents; Academics, Researchers; Society – The ‘Public’; Commercial Publishers.

  9. Who evaluates language programs? Managerial evaluation: administrators evaluate (directors, chairs, principals), driven by bureaucratic reporting demands (accreditation). Bean-counting, monitoring, and observation are emphasized; the exercise is one of organizational control, and financial considerations are paramount. Findings are mostly reported but seldom used, and occasionally pejorative (e.g., personnel actions).

  10. Who evaluates language programs? Funders; Professional Organizations; Policy makers; Administrators; Staff; Teachers; Learners; Professional Evaluators; Parents; Academics, Researchers; Society – The ‘Public’; Commercial Publishers.

  11. Received view of language program evaluation Characteristics: few decision-makers; external impetus; generic focus on worth; measurement-driven; decontextualized; product-oriented; summative/judgmental; focus on what and how. Problems: Accuracy of findings? Relevance of findings? Acceptance/resistance? Participation? Impact on L2 learning? Use of findings? Learning from process? Under these familiar approaches, evaluation gets done efficiently, but it generally meets only program-external bureaucratic or political needs; evaluation is done to programs, not with programs.

  12. Changing landscapes in language program evaluation

  13. Revised views of language program evaluation Patton (1997): “Program evaluation is undertaken to inform decisions, clarify options, identify improvements, and provide information about programs and policies within contextual boundaries of time, place, values, and politics.” (p. 24) Focus is on particular programs and what is meaningful to them…

  14. Revised views of language program evaluation “Evaluation is the gathering of information about any of the variety of elements that constitute educational programs, for a variety of purposes that include primarily understanding, demonstrating, improving, and judging program value; evaluation brings evidence to bear on the problems of programs, but the nature of that evidence is not restricted to one particular methodology.” (Norris, 2006, p. 579) Key facets highlighted on the slide: process; frame and focus; impetus/use; prioritization.

  15. Revised views of language program evaluation UTILITY: The Utility Standards are intended to ensure that an evaluation will serve the practical information needs of intended users. (Joint Committee on Standards for Educational Evaluation, 1994) Focus is on who and why…

  16. Revised views of language program evaluation Major purposes for evaluation: Summative purpose: to judge merit, worth, value, and effectiveness of programs (judge, transform, fund, hold accountable). Formative purpose: to develop and improve programs, via an iterative process of reflection and innovation (develop, improve). Knowledge construction purpose: to generate theory and to share perceptions & understandings about programs (generate knowledge, test theory). Empowerment purpose: to educate, empower, and encourage ownership among program constituents via participation (empower, educate, illuminate, advocate).

  17. Revised views of language program evaluation U.S. college FL survey respondents desired increased use of evaluation for: understanding & improving program outcomes; understanding & improving program functions; improving FL education on the whole; understanding & improving the worth of the program; raising awareness about FL programs.

  18. Revised views of language program evaluation NCATE - TESOL and ACTFL standards for language teacher development programs: Teachers understand and can use a variety of methodologies for gathering information about learners Teachers use information for improving teaching and learning Japan Evaluation Society, Certification of School Evaluators Evaluators understand design and methodology options Evaluators implement evaluations in schools and organizations Evaluators report and utilize results

  19. Revised views of language program evaluation “If evaluation in English Language Teaching is to be effective, we will see a stronger integration of evaluation within practice, as part of an individual’s professionalism, and an increase in collaborative activity where teachers (and other relevant participants) are actively engaged in the monitoring process.” (Rea-Dickins, 1994, p. 84)

  20. Revised views of language program evaluation Characteristics: multiple stakeholders; variety of users; variety of uses; multiple methods; context-specific; process + product; formative + summative. Implications: represent stakeholders; internal genesis for evaluation; ensure participation; uses determine method; support L2 learning; maximize utility. Who does what when? With traditional views of evaluation, it was easy (though not particularly helpful) to be evaluated: someone else took responsibility. With revised views, if evaluation is to play a different and useful role in language programs, we all will have to take responsibility for doing and using evaluation.

  21. Who evaluates language programs? Funders; Professional Organizations; Policy makers; Administrators; Staff; Teachers; Learners; Professional Evaluators; Parents; Academics, Researchers; Society – The ‘Public’; Commercial Publishers.

  22. Use-driven & participatory language program evaluation

  23. Backwards buildup: Use-driven evaluation What kinds of decisions or actions will be taken on the basis of evaluation findings? Who is implied? The program chair, curriculum coordinators, teachers, and program administrators, with the evaluator facilitating. What do they do? Anticipate and discuss the contextual constraints on taking an action (additional resources? time? expertise?) and strategize how to overcome those constraints; prioritize the action items; decide who is responsible for what task by when; strategize tasks each user group can act on. The evaluator’s facilitation: engage user groups in decision making; assure that planned action is manageable and concrete. Additional considerations: Do you know enough about intended uses by specific intended users to be able to monitor their actual uses of findings? Do the action items also include…

  24. Backwards buildup: Use-driven evaluation What kinds of decisions or actions will be taken on the basis of evaluation findings? Language program example. Who is implied? The English program chair, English curriculum coordinators, full-time English teachers, the principal, HS admission officers, and the evaluation consultant. What did they do? Curriculum & instruction: iterative meetings to create goal statements and plan for curricular alignment (stated learning outcomes for each grade) & improvement; unexpected uses included a resource-sharing system, a revised student handbook, and a pedagogical task bank. Process learning: built knowledge about the curriculum; a culture of professional teacher community emerged through the project; built a sense of accountability & responsibility towards the public and students. Publicity: articulated goals in the publicity materials. Lack of access to resources was followed through with needed resources (e.g., example syllabi & tasks).

  25. Backwards buildup: Use-driven evaluation How will findings be reported? Who is implied? The evaluator and program chair, with curriculum coordinators, teachers, and program administrators; audiences also include accreditors, students, and PTAs. What do they do? Solicit the purpose of reporting from the PIUs (primary intended users): consensus building? an information session? Find out whether there are any reporting demands from distinct users; negotiate the timing of reporting (maximally useful timing for the PIUs); negotiate a reporting format aligned with the reporting purpose. Additional considerations: How do you ensure the PIUs’ sense of ownership of the report? Do not surprise the PIUs with the report’s format and content. Strategies: create a mock report; communicate preliminary results; involve PIUs in analysis & interpretation.

  26. Backwards buildup: Use-driven evaluation How will findings be reported? Language program example. Who is implied (and in the audience)? The English program chair, program administrators, curriculum coordinators, English teachers, other curricular-area teachers, and the evaluation consultant. Additional consideration: how the audience can get the most out of the report. What did the consultant do? Sent preliminary tables & charts before the data-interpretation workshop; conducted a workshop on how to analyze and interpret the results; provided other data sources (document analysis) for triangulation; created a short summary of the results and reported it at the teachers’ meeting. What did the PIUs do? Provided justification for the resources they anticipated needing in taking action and in planning and implementing the next evaluation cycle.

  27. Backwards buildup: Use-driven evaluation How are data analyzed and interpreted? Who is implied? The evaluator (facilitating), program chair, curriculum coordinators, and teachers. What do they do? PIUs & evaluators decide: (a) who should be involved in data analysis; (b) how to best organize & analyze data; (c) how to maximize trustworthiness of analysis; (d) how interpretation will be checked; (e) whether any triangulation of sources and perspectives is necessary; (f) the basis for judgment (any pre-set criteria?); (g) how to exercise care in extrapolating about program implications; (h) who gets to draw implications & make recommendations. Additional considerations: To what extent does your program have the expertise to thoroughly understand the data? Any strategies for enhancing that understanding via the data analysis and interpretation process? What if the data collection methods were not completed rigorously? Are the data likely to be debated? How might tensions be resolved?

  28. Backwards buildup: Use-driven evaluation How are data analyzed and interpreted? Language program example. Who was implied? The evaluation consultant, program chair, curriculum coordinators, and English teachers. What did they do? Decision making for data organization, analysis, and interpretation. Data organizing & analysis: Who? Only the evaluator. How? Visually understandable charts and tables. Trustworthiness? No coding, just raw qualitative data. Integrated capacity building: conducted a workshop on how to analyze and interpret the results. Interpretation & recommendations: Who? All English teachers. Triangulation? Use of multiple data sources. Additional consideration: how to build the PIUs’ capacity to interpret data in the limited amount of time available.
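As a hypothetical sketch of the kind of “visually understandable chart or table” described above, the snippet below tallies Likert-scale survey items into a one-line summary per item. The item names and scores are invented for illustration and do not reproduce the program’s actual instrument:

```python
from collections import Counter

# Invented survey data: item name -> list of 1-5 Likert responses.
responses = {
    "goals_are_clear":    [4, 5, 3, 4, 2, 5, 4],
    "curriculum_aligned": [2, 3, 2, 4, 3, 2, 3],
}

for item, scores in responses.items():
    counts = Counter(scores)                       # tally each response value
    mean = sum(scores) / len(scores)               # simple item mean
    dist = " ".join(f"{s}:{counts.get(s, 0)}" for s in range(1, 6))
    print(f"{item:<20} mean={mean:.1f}  {dist}")   # one readable line per item
```

A table like this is something non-specialist PIUs can read in a workshop without statistical training, which matches the capacity-building emphasis on this slide.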

  29. Backwards buildup: Use-driven evaluation What evidence is needed to answer questions? What indicators make sense? Who is implied? The evaluator (facilitating), program chair, curriculum coordinators, and teachers. What do they do? PIUs & evaluators negotiate: Indicators: discuss information (evidence) that is necessary to answer the evaluation questions and that is meaningful to the PIUs. Methods: (a) select feasible methods that will provide trustworthy information about the target indicators, considering the technical, time, and resource demands imposed by different methods; (b) provide clear and reasonable justifications for choosing each methodology; (c) identify data sources and key informants; (d) decide how to recruit the key informants; (e) determine the best time to collect data. Additional considerations: To what extent was your determination of indicators limited by feasibility of collection, or by existing notions of program ‘measures’? How do you ensure that the data will be collected accurately?

  30. Backwards buildup: Use-driven evaluation What evidence is needed to answer questions? What indicators make sense? Language program example. Who is implied? The evaluation consultant, program chair, curriculum coordinators, and English teachers. What did they do? Defined indicators: Who? All teachers, facilitated by the evaluator. What? Identified multiple indicators and sources. How was it solicited? A meeting led by the program chair and the evaluator, followed up with emails. Chose methods: current-student and teacher surveys plus document analysis were judged the most feasible methods by the teachers and the evaluator; due to feasibility, alumni were not included. Created a survey instrument: Who? The evaluator, chair, and coordinators. How? A survey instrument informed by previous research on high school student needs, drafted by the evaluator based on teacher-identified constructs, and reviewed by the program chair & coordinators; coordinators also consulted the English teachers.

  31. Backwards buildup: Use-driven evaluation Which questions need to be answered? Who is implied? The evaluator (facilitating), program chair, curriculum coordinators, and teachers. What do they do? Carefully examine the immediate issues surrounding a specific program (a program within a program?); decide who to solicit from; create a list of tentative evaluation questions to be answered; create a general overarching question, then specify its components to create sub-questions; prioritize the questions in relation to feasibility, urgency, and importance: which question should be answered first, and why? Additional considerations: Why did you prioritize these particular evaluation uses and questions? Will there be agreement from other stakeholders, and how do you know? Are there resource constraints that will limit what you can feasibly undertake in the current evaluation project?
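The prioritization step lends itself to a back-of-the-envelope worked example. The sketch below (mine, not the presenters’) scores candidate evaluation questions on feasibility, urgency, and importance and ranks them; the questions, the scores, and the equal weighting are all assumptions that a real PIU group would negotiate:

```python
# Invented candidate questions with 1-5 ratings on the slide's three criteria.
questions = [
    ("Are course goals aligned across grade levels?",
     {"feasibility": 4, "urgency": 5, "importance": 5}),
    ("Do alumni use English in their careers?",
     {"feasibility": 1, "urgency": 2, "importance": 4}),
    ("Do current materials match stated outcomes?",
     {"feasibility": 5, "urgency": 3, "importance": 4}),
]

def priority(scores):
    # Equal weights across criteria; weighting itself is a negotiable choice.
    return sum(scores.values()) / len(scores)

# Print questions from highest to lowest priority score.
for question, scores in sorted(questions, key=lambda q: priority(q[1]), reverse=True):
    print(f"{priority(scores):.1f}  {question}")
```

Even a rough ranking like this makes the “which question first, and why?” discussion concrete, and the scores give stakeholders something specific to disagree with.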

  32. Backwards buildup: Use-driven evaluation Which questions need to be answered? Language program example. Who is implied? The evaluation consultant (facilitating), program chair, curriculum coordinators, and English teachers. What did they do? Held separate meetings with the school principal and the available part-time teachers; solicited the issues surrounding instruction and curriculum at an English faculty meeting; negotiated priorities for evaluation, discussing need, timeliness, relevance, likely impact/use, and capacity.

  33. Backwards buildup: Use-driven evaluation What is the purpose and use of evaluation? Who is implied? The evaluator (facilitating), program chair, curriculum coordinators, and teachers. What do they do? Identify any internal or external mandate to evaluate, and any impetus, problem, or issues in the program; examine what motivated the initiator to bring in and take on an evaluation project; find out to what extent evaluation is already occurring, and for what purpose. Additional considerations: Are there any conflicting interests among stakeholders? If so, how are you going to come to an agreement?

  34. Backwards buildup: Use-driven evaluation What is the purpose and use of evaluation? Language program example. Who is implied? The evaluation consultant, program chair, curriculum coordinators, English teachers, the principal, and HS admission officers. What did they do? The consultant showcased examples of evaluation case studies with different evaluation purposes. The group discussed and unanimously agreed that the pressing issue was the unclear goals and non-alignment of the curriculum among grade levels and between the high school and the affiliated university’s English curriculum: the textbook was the curriculum, and the stated goal was very vague (“Students will achieve the English ability that the Japanese society expects”; what does this mean?). There was pressure from the affiliated university’s admission committee, and a strong English program meant school publicity. Purpose: program improvement & articulation. Use: set goals and map learning outcomes.

  35. Backwards buildup: Use-driven evaluation Who initiates the evaluation? Who else participates in designing the evaluation? Who is implied? The initiator(?), evaluator, program chair, curriculum coordinators, and teachers. What do they do? Identify the primary intended users of the evaluation. Together with the initiator(s), map stakeholders and decide which stakeholders should be represented on an evaluation committee, and why: What role do they play in the program? How are they affected by the evaluation? Who among them will actually use evaluation findings? Determine the primary intended users, considering their ability to use evaluation, power relationships, availability, understanding, and trust. Confirm the PIUs’ commitment to all phases of the evaluation, within available time and resource constraints. Additional consideration: How are you going to get buy-in from your PIUs?
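As a loose illustration of the screening criteria just listed (ability to use evaluation findings, availability, trust), the sketch below filters a mapped stakeholder list down to candidate primary intended users. The fields, names, and yes/no simplification are all invented for the example; in practice these judgments are negotiated, not boolean:

```python
from dataclasses import dataclass

@dataclass
class Stakeholder:
    name: str
    uses_findings: bool  # will they actually use evaluation findings?
    available: bool      # can they commit to all phases of the evaluation?
    trusted: bool        # do other stakeholders trust them?

# Hypothetical stakeholder map for a school language program.
stakeholders = [
    Stakeholder("program chair", True, True, True),
    Stakeholder("curriculum coordinator", True, True, True),
    Stakeholder("part-time teacher", True, False, True),
]

# Candidate PIUs: stakeholders meeting all three screening criteria.
pius = [s.name for s in stakeholders
        if s.uses_findings and s.available and s.trusted]
print(pius)  # ['program chair', 'curriculum coordinator']
```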

  36. Backwards buildup: Use-driven evaluation Who initiates the evaluation? Who else participates in designing the evaluation? Language program example. Who is implied? The initiator (the evaluation consultant), program chair, curriculum coordinators, and full-time English teachers. What did they do? The evaluation consultant contacted the program chair and the school principal, then negotiated how much commitment the primary intended users (program chair, curriculum coordinators, and full-time teachers) could make.

  37. Applying use-driven & participatory thinking to your language program evaluation

  38. Application activity: Individual In your language program, who are the key stakeholders with a vested interest in evaluation? What might any of these stakeholders want to see from an evaluation of your program? How might they want to use evaluation data (what would they do with it)? At this point in time, which of these stakeholders might actually initiate an evaluation, and why would they do so? SEE PAGE 6 OF THE BOOKLET.

  39. Application activity: Pair/group  Compare what you found with someone else in the room.  Are there similarities/differences in the stakeholders you identified and their uses for evaluation? • Can you articulate why a particular evaluation focus by a particular stakeholder group might need to be prioritized over others?

  40. Taking responsibility for language program evaluation

  41. Implications for diverse stakeholders Funders: provide resources necessary for participation in evaluation. Policy makers: avoid legislation by measurement only; consider the impact of evaluation policy. Professional Organizations: advocate for evaluation that benefits language education. Parents: contribute opinions to school-based evaluation; call for useful evaluation. Professional Evaluators: facilitate participation, use, and learning in evaluation. Academics, Researchers: investigate evaluation practices that ensure and enhance L2 education. Society – The ‘Public’: monitor the impact of evaluation on L2 learning and use; learn from evaluation. Commercial Publishers: incorporate evaluation of materials & learning into instructional guides. (Administrators, staff, teachers, and learners are taken up on the next slide.)

  42. Implications for diverse stakeholders Administrators, staff, teachers, and learners: Understand how evaluations are being used, whether they are internally or externally generated, and ensure that other program-insiders understand as well. Seek out ways of participating in evaluation, such that key concerns and ideas from within/across the program are made salient. Initiate evaluation for the purposes of understanding and improving program effectiveness, but also for demonstrating program value to other stakeholders.

  43. Program evaluation concerns: lack of time and overburdened teachers; available instruments and procedures; use, usefulness, and follow-through on evaluation; comparability of evaluation data; willingness of teachers to participate; fear of misuse by external forces; institutional support, funding, and help; understanding, knowledge, and expertise.

  44. A few immediate strategies  Evaluate the evaluations that are already going on. Build on, improve on existing practice.  Start small—we do not need to evaluate all things all the time. Focus on a priority challenge at hand.  Lead by example. One positive use of evaluation often facilitates learning, change, and the creation of professional space.

  45. Find out more: http://www.nflrc.hawaii.edu/evaluation
