
Higher Engineering Education Alliance Program (HEEAP)




Presentation Transcript


  1. Higher Engineering Education Alliance Program (HEEAP): Formative Evaluation Findings and Recommendations. Randolph Flay, Director, Office of Program Development, U.S. Agency for International Development Vietnam

  2. Agenda 1. What is HEEAP? 2. Why do we evaluate? 3. Methodology 4. Selected Key Findings & Recommendations 5. Questions and Comments

  3. WHAT IS HEEAP? • First of three Global Development Alliances (GDAs) in higher education from USAID/Vietnam • Aims to transform Vietnam’s engineering education from “passive, theory-based to active, project-based instruction” & produce “work-ready” graduates for the country’s booming high-tech sector • Implemented since 2010 by Arizona State University (ASU) with financial contributions from USAID, Intel, and a number of other private-sector partners • In 2012, expanded to include the Vocational and University Leadership and Innovation Institute (VULII), or HEEAP 2.0, which aims to develop modern institutional strategic planning capacity and strengthen institutional research, evaluation, management principles, financial planning, assessment, and quality assurance

  4. Why do we evaluate? USAID Evaluation Policy 2011 • To ensure accountability by measuring project effectiveness, relevance, and efficiency • To inform and improve project and strategy design and implementation • To inform resource allocation decisions and future programming

  5. Why Evaluate HEEAP? To assess … • success in advancing instruction and curriculum, and improving undergraduate learning outcomes & institutional support for reform • how results are seen & measured by program funders, and whether indicators reflect project performance • contributions of the private-sector partnership to program results

  6. Framework for Methodology: Donald Kirkpatrick’s Four Levels of Evaluation: Reaction, Learning, Behavior, and Results

  7. Methodology • Evidence-based & data-driven findings • Quantitative & qualitative information gathered • Instruments developed to collect data: • Key informants (institutions, ASU/HEEAP, Intel, GVN) • Focus groups: HEEAP & non-HEEAP faculty • Focus groups: HEEAP & non-HEEAP students • Classroom observations: HEEAP & non-HEEAP • Online survey of 123 faculty trained at ASU • Lab visits • Document review • Visited 3 of 8 target institutions (HUST in Hanoi, HCMUT in HCM City, and a vocational college, HVCT, in HCM City)

  8. Profile of Surveyed Population • Key informants: 25 • Faculty focus groups: 6 • Student focus groups: 8 • Returned trained faculty (survey): 92 of 123 (75%) • Little bias in sampling by characteristics, location, gender & cohort • High return rate • Quantitative & qualitative data • Data collected from all 8 target institutions
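(The reported 75% return rate follows directly from the counts above. The short sketch below is not part of the original presentation; it simply illustrates that arithmetic.)

```python
# Illustrative only: check the survey return rate reported on the slide above.
surveyed = 123   # faculty trained at ASU who received the online survey
returned = 92    # completed responses

return_rate = returned / surveyed
print(f"Return rate: {return_rate:.1%}")  # prints "Return rate: 74.8%", reported as ~75%
```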

  9. Selected Findings & Recommendations I. Project Impact Assumption: Increased use of Active Learning will improve student outcomes. Finding 1: Considerable positive impact, especially on the individuals trained. Evidence: • higher student motivation • students focus on learning vs. grading, show increased soft skills & take more initiative • faculty using many Active Learning methods • curricula & syllabi revised to align more closely with ABET “Student performance has definitely gotten better. In the past students [were] passive… As they see more lecturers change, less [so].” - A faculty lecturer

  10. Selected Findings & Recommendations I. Project Impact • lab equipment & software improved • between 87% & 99% of respondents rated HEEAP training useful or highly useful • women found all training components more useful than men did “Before HEEAP, we talked, they listened. Now we know how to effectively transfer knowledge to students.” - Faculty lecturer/HEEAP participant

  11. Selected Findings & Recommendations Recommendation 1 To sustain positive impact and results from expansion, HEEAP 2.0 & VULII can … • seek to move from individual impact to institutional change • increase use of Vietnamese experts in Vietnam to accelerate local capacity building & leverage results • use Intel’s systems approach for educational change in Vietnam

  12. Selected Findings & Recommendations Finding 2: Implementing English & Active Learning as joint objectives limits results. Evidence: • Vocational faculty at ASU were trained through interpreters • English as a selection criterion affects outcomes • Returned faculty show a preference for Advanced Programs • Most courses in Vietnam are still taught in Vietnamese

  13. Selected Findings & Recommendations Recommendation 2 Separate the two objectives into: • Objective One: Graduates have English & are more work-ready • Ideas for implementation: • Establish an English for Engineers program • Build on existing on-campus English programs • Select faculty participants for ASU 6 months in advance to give faculty time to improve their English

  14. Selected Findings & Recommendations • Objective Two: Student learning increased through improved teaching • Ideas for implementation: • Use Vietnamese as the HEEAP training language in Vietnam to improve teaching & leverage change • Design training that alternates between learning and applying • Design effective mentoring in Vietnamese

  15. Selected Findings & Recommendations II. Monitoring and Evaluation Finding 3: HEEAP goals align well with partner/stakeholder objectives; however, the M&E system measured different results. Evidence: • Stakeholders view HEEAP positively & express satisfaction • M&E indicators measure inputs more than results and are inadequately linked to decision making & improvement • No baseline established in 2010 • Uneven follow-up on faculty projects • Inadequate achievement & impact reporting to donors & government

  16. Selected Findings & Recommendations Recommendation 3 • Increase information-sharing and collaborative planning on project results with stakeholders • Conduct M&E more frequently & effectively in Vietnam with local expertise • Drive decision-making with measurable results linked to indicators

  17. Selected Findings & Recommendations III. Global Development Alliance / Public-Private Partnership Finding 4: The private-sector partnership works well. Evidence: • Increased commitments & project expansion • Sustainability and scale are increased significantly • Lead private-sector partner (Intel) provided credibility with other industry partners and USAID • As of 2013, partners’ contributions reached around $40 million • Helped MOET, MOLISA, and target institutions with their strategies to link education with industry demand • The GDA’s flexible mechanism enhanced the likelihood of greater impact. HEEAP significantly accelerates the application of active learning and teaching nationwide

  18. Selected Findings & Recommendations Recommendation 4 • Improve communication and coordination among all stakeholders • Practice transparency among all partners • Strengthen the HEEAP Vietnam office with an appropriate mix of positions, roles & responsibilities for scale-up

  19. How are USAID and ASU using the findings? • Share findings with partners and stakeholders • Identify program actions needed, with timelines and responsibilities • Review and adjust the project work plan where appropriate • Adopt and apply relevant recommendations for VULII (HEEAP 2.0) implementation

  20. How universities can use & support the recommendations • Increase institutional commitment & effective support to returned faculty • Increase dialogue & active communication within and between universities • Plan for English proficiency separately from active learning • Select only top emerging leaders from in-country training for U.S. training; set clear criteria for choosing participants • Address systemic policy issues by supporting change management – “policy to encourage change” • Involve QA departments with more depth and consistency to strengthen project M&E

  21. Defining Capacity Building Capacity building is … • institutional, rather than individual • aimed at changing performance rather than at the acquisition of knowledge, skills, and abilities (KSAs)

  22. How can we better scale up impact at the institutional level? Q&A

  23. USAID/NSF Partnerships for Enhanced Engagement in Science (PEER) • PEER Science provides support for researchers on a wide range of development-related topics, including food security, climate change, water, biodiversity, disaster mitigation, renewable energy, and others. • Applicants to PEER Science must be working in collaboration with a U.S. partner who holds an active award from the National Science Foundation (NSF). • http://www.nationalacademies.org/peer/ • Application deadline: December 16, 2013
