
Presentation Transcript


  1. Program development process at Queen’s University to demonstrate graduate attributes Brian Frank Director (Program Development) Faculty of Engineering and Applied Science Queen's University

  2. Focus “The institution must demonstrate that the graduates of a program possess the attributes under the following headings... There must be processes in place that demonstrate that program outcomes are being assessed in the context of these attributes, and that the results are applied to the further development of the program.”

  3. CEAB Instructions Describe the processes that are being or are planned to be used. This must include: • a set of indicators that describe specific abilities expected of students to demonstrate each attribute • where attributes are developed and assessed within the program… • how the indicators were or will be assessed. This could be based on assessment tools that include, but are not limited to, reports, oral presentations, … • evaluation of the data collected including analysis of student performance relative to program expectations • discussion of how the results will be used to further develop the program • a description of the ongoing process used by the program to assess and develop the program as described in (a)-(e) above Engineering Graduate Attribute Development (EGAD) Project

  4. Approach • Short term objectives (2010-2011): • Set up a comprehensive process limited to a small number of courses to help programs understand the process • Use data to help faculty see value in outcomes assessment for program improvement • Long term: • Comprehensive assessment of all attributes throughout programs • Evaluate validity of data • Students take responsibility for demonstrating some attributes

  5. Queen's University timeline • Summer 2009: Working groups of faculty, students, and topical experts created specific program-wide indicators (next slide, and in Appendix 3.1A) • Summer 2009: Set up a learning management system (Moodle) to manage assessments • Sept 2009-April 2010: Piloted assessment in first year • Sept 2010-April 2011: Piloted assessment in first year, faculty-wide second year, and fourth year (common across programs) • April-July 2011: Student surveys and focus groups, curriculum mapping, data analysis. Curriculum planning happening throughout

  6. Why initial emphasis on first year? • First year is faculty-delivered and core to all students • Provides an opportunity to pilot a process • Helps disseminate outcomes assessment procedures to other instructors • Long term: the assessment process continues in the first-year program to inform development

  7. Aside: Idealistic course development process (cycle: Identify course objectives and content → Create specific outcomes for each class → Map to experiences (lectures, projects, labs, etc.) → Identify appropriate tools to assess (reports, simulation, tests, ...) → Create and execute a plan → Deliver, grade, seek feedback → Analyze and evaluate data → Overall improvement, with student input throughout) Engineering Graduate Attribute Development (EGAD) Project

  8. Program-wide assessment process flow: Create a Program Improvement Plan (cycle: Defining purpose and outcomes → Program mapping → Identifying and collecting data → Analysis and interpretation → Program & course improvement, with stakeholder input throughout) Engineering Graduate Attribute Development (EGAD) Project

  9. Human capital • Director, Program Development to manage the process • A faculty member from each program • Other experts as appropriate (economics, information management, etc.) Currently separate from the faculty-wide curriculum development committee

  10. Resources/time commitment • Creating assessment criteria: 7 committees of approximately 5 people who each met about 4 times • Mapping criteria to a course and creating rubrics for assessment: ~ 10 hours • Large scale curricular changes: ~10 person committee, most of whom had 1 course relief bought out by dean • Coordination (resource gathering, planning, curricular planning): ~30% of a position

  11. Academic and curricular structure: Dean; Faculty-wide curriculum committee; Associate Dean (Academic); Dean’s Retreat Curriculum Review Committee (DRCRC); Director (Program Development); Graduate attribute assessment committee; NSERC Design Chair; DuPont Canada Chair in Engineering Education

  12. What are indicators? Example attribute: Lifelong learning, "an ability to identify and address their own educational needs in a changing world in ways sufficient to maintain their competence and to allow them to contribute to the advancement of knowledge." Can this be directly measured? Would multiple assessors be consistent? How meaningful would the assessment be? Probably not, so more specific, measurable indicators are needed. This allows the program to decide what is important. Engineering Graduate Attribute Development (EGAD) Project

  13. Indicators: examples. Graduate attribute: Lifelong learning, "an ability to identify and address their own educational needs in a changing world in ways sufficient to maintain their competence and to allow them to contribute to the advancement of knowledge." Indicators (the student): • Critically evaluates information for authority, currency, and objectivity when referencing literature • Identifies gaps in knowledge and develops a plan to address them • Uses information ethically and legally to accomplish a specific purpose • Describes the types of literature of their field and how it is produced. Engineering Graduate Attribute Development (EGAD) Project

  14. Establishing indicators • A well-written indicator includes: • what students will do (content area) • the level of complexity at which they will do it (level of expectation: “describes”, “compares”, “applies”, “creates”, etc.) • the conditions under which the learning will be demonstrated (context). Example: “Critically evaluates information for authority, currency, and objectivity in reports” (level of expectation: “evaluates”; context: reports). Engineering Graduate Attribute Development (EGAD) Project

  15. Assessment criteria [Table: graduate attributes with linkage to OCAV UDLEs, organized by levels and categories] Engineering Graduate Attribute Development (EGAD) Project

  16. Rubric example • Creating defined levels (“scales”) of expectations reduces variability between graders and makes expectations clear to students [Rubric image: levels include “threshold” and “target”]
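The idea of defined rubric levels can be sketched as a small data structure. The four-level scale and the descriptors below are hypothetical illustrations, not the actual Queen's rubric:

```python
# Hypothetical 4-level rubric for one indicator; every grader scores
# against the same descriptors, which is what reduces variability.
RUBRIC = {
    1: "Below expectations: claims unsupported by evidence",
    2: "Marginal: evidence cited but not evaluated",
    3: "Meets expectations (threshold): evidence evaluated for authority",
    4: "Exceeds expectations (target): evidence critically weighed and synthesized",
}

def meets_threshold(level: int, threshold: int = 3) -> bool:
    """True if a student's rubric level is at or above the threshold."""
    return level >= threshold

print(meets_threshold(2))  # False
print(meets_threshold(4))  # True
```

Publishing the descriptor text to students is what makes the expectations clear in advance.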

  17. Sample first-year indicators for problem analysis and design Engineering Graduate Attribute Development (EGAD) Project

  18. Sample fourth-year indicators for problem analysis and design Engineering Graduate Attribute Development (EGAD) Project

  19. Program-wide assessment process flow: Create a Program Improvement Plan (cycle: Defining purpose and outcomes → Program mapping → Identifying and collecting data → Analysis and interpretation → Program & course improvement, with stakeholder input throughout) Engineering Graduate Attribute Development (EGAD) Project

  20. Curriculum mapping

  21. Student surveys and focus groups • Provide student input on: • implementing attribute assessment in the program • perceptions of where attributes are developed within the program, as a complement to curriculum mapping via faculty survey • perceptions of importance within the program

  22. Questions • What do you think are priorities within the program? • What courses contribute to development of attribute {}? • Which attributes are difficult to demonstrate? • How would you recommend that attributes be developed?

  23. Self-reported demonstration at program entry • Top five graduate attributes where students reported a rating of 2 or 3 (yes, or to a great degree) out of 3: • Individual and Team Work: 88.73% • Communication Skills: 78.17% • Professionalism: 69.02% • Problem Analysis: 61.26% • Investigation: 60.56% Potential for students to perceive little value in learning activities directed toward developing these attributes
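The "rating of 2 or 3 out of 3" percentages could be computed from raw survey responses along these lines; the ratings below are made-up illustrations, not the actual survey data:

```python
# Hypothetical per-student ratings on a 0-3 scale for two attributes.
ratings = {
    "Individual and Team Work": [3, 2, 3, 1, 3, 2, 3, 2],
    "Communication Skills": [2, 1, 3, 2, 0, 3, 2, 3],
}

def pct_high(scores: list[int]) -> float:
    """Percent of respondents rating the attribute 2 or 3 out of 3."""
    return 100.0 * sum(s >= 2 for s in scores) / len(scores)

# Report attributes from highest to lowest self-reported demonstration.
for attr, scores in sorted(ratings.items(), key=lambda kv: -pct_high(kv[1])):
    print(f"{attr}: {pct_high(scores):.2f}%")
```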

  24. First-year program supports: attributes in students’ top five responses • Individual and Team Work*: 94.97% • Knowledge Base in Engineering: 93.53% • Problem Analysis*: 93.53% • Professionalism*: 85.58% • Investigation*: 82.48% • Design: 80.58% • Impact of Engineering on Society: 80.58% *Identified as a strength coming in to the program

  25. First-year program supports: bottom three responses • Ethics and Equity: 64.03% • Economics and Project Management: 69.56% • Lifelong Learning: 73.19% These three are a significant focus in APSC-100, embedded in various activities.

  26. Attributes perceived to be program priorities

  27. Graduating students: low priority attributes in program

  28. Focus group suggestions • Communicate graduate attributes and draw attention back to them • What is “lifelong learning”? • Professionalism and ethics and equity should be focused on in upper years

  29. Program-wide assessment process flow: Create a Program Improvement Plan (cycle: Defining purpose and outcomes → Program mapping → Identifying and collecting data → Analysis and interpretation → Program & course improvement, with stakeholder input throughout) Engineering Graduate Attribute Development (EGAD) Project

  30. Assessment in 2010-2011

  31. Analyze and evaluate… • Histogram of results by level (did or did not meet expectations) • Histogram of results by student (how many indicators did each student fall below expectations on?) • Trend over time • Triangulation: examination of correlation between results on multiple assessments of the same indicator (e.g., project data with exam results)
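The analyses listed above can be sketched in code. Everything here is hypothetical: the indicator names, the 4-point rubric scale with level 1 meaning below expectations, and the scores themselves:

```python
from collections import Counter

# Hypothetical rubric scores: student -> {indicator: level on a 1-4 scale}.
scores = {
    "s1": {"problem_analysis": 3, "design": 2, "communication": 1},
    "s2": {"problem_analysis": 4, "design": 3, "communication": 3},
    "s3": {"problem_analysis": 2, "design": 1, "communication": 2},
}

# Histogram of results by level for one indicator.
by_level = Counter(s["design"] for s in scores.values())

# Results by student: how many indicators each student fell below expectations on.
below = {name: sum(level == 1 for level in s.values()) for name, s in scores.items()}

def pearson(x: list[float], y: list[float]) -> float:
    """Pearson correlation, used to triangulate two assessments of one indicator."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

project = [3, 4, 2]  # rubric scores on a project (hypothetical)
exam = [2, 4, 1]     # exam questions targeting the same indicator (hypothetical)
print(pearson(project, exam))  # high correlation suggests the assessments agree
```

A trend over time is the same histogram computed per term and compared across years.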

  32. First year: second year of pilot

  33. [Chart: results over time]

  34. Knowledge base: Mathematics. The calculus instructor asked questions on the exam that specifically targeted three indicators for “Knowledge”: • “Create mathematical descriptions or expressions to model a real-world problem” • “Select and describe appropriate tools to solve mathematical problems that arise from modeling a real-world problem” • “Use solutions to mathematical problems to inform the real-world problem that gave rise to them”

  35. Indicator 1: • The student can create and/or select mathematical descriptions or expressions for simple real-world problems involving rates of change and processes of accumulation (overlaps problem analysis) Context: calculating the intersection of two trajectories
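As an illustration of the kind of trajectory-intersection problem this indicator describes (a hypothetical example, not the actual exam question): object A moves at constant speed while object B accelerates from rest, and students model when their positions coincide:

```python
# Hypothetical modelling problem: A travels at constant speed v_a (m/s),
# B starts from rest with constant acceleration a_b (m/s^2). Setting the
# positions equal, v_a * t = 0.5 * a_b * t**2, the nonzero root is:
def intersection_time(v_a: float, a_b: float) -> float:
    """Time at which the two trajectories intersect (t > 0)."""
    return 2.0 * v_a / a_b

t = intersection_time(v_a=10.0, a_b=4.0)
print(t)  # 5.0: at t = 5 s both objects have travelled 50 m
```

The indicator is about the modelling step (writing the two position expressions), not the algebra that follows.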

  36. Indicator 2: Students can select and describe appropriate tools to solve the mathematical problems that arise from this analysis Context: differentiation similar to high school curriculum

  37. Indicator 2: • Students can select and describe appropriate tools to solve the mathematical problems that arise from this analysis Context: implicit differentiation, inverse trigonometric functions

  38. Program-wide assessment process flow: Create a Program Improvement Plan (cycle: Defining purpose and outcomes → Program mapping → Identifying and collecting data → Analysis and interpretation → Program & course improvement, with stakeholder input throughout) Engineering Graduate Attribute Development (EGAD) Project

  39. All first year indicators over time

  40. # Students falling below expectations in first year

  41. Graduating year

  42. Graduating year • Starting point: histograms • Very few students falling below threshold level in capstone courses for most indicators

  43. Area for improvement in graduating year: technical literature

  44. Data evaluation • Across multiple capstone courses, students scored lower on indicators involving: • Evaluating validity of results • Evaluating techniques and tools • Evaluating effectiveness of results • Evaluating information • Pattern: evaluation

  45. Curriculum Mapping: CurriKit • Curriculum mapping software developed by the University of Guelph • Provides information to identify: • the courses that develop each graduate attribute • what assessment is done, and when • which instructional approaches are used
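A curriculum map of the sort CurriKit produces can be represented as a simple course-to-attributes mapping. The course codes (other than APSC-100, mentioned earlier) and the attribute assignments below are hypothetical:

```python
# Hypothetical curriculum map: course code -> graduate attributes it develops.
curriculum_map = {
    "APSC-100": {"Design", "Problem Analysis", "Lifelong Learning"},
    "APSC-111": {"Knowledge Base", "Problem Analysis"},
    "APSC-200": {"Design", "Communication"},
}

def courses_developing(attribute: str) -> list[str]:
    """Return the courses mapped to a given graduate attribute."""
    return sorted(c for c, attrs in curriculum_map.items() if attribute in attrs)

print(courses_developing("Design"))            # ['APSC-100', 'APSC-200']
print(courses_developing("Problem Analysis"))  # ['APSC-100', 'APSC-111']
```

Extending each entry with assessment type and instructional approach would support the other two queries on the slide.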

  46. Program-wide assessment process flow: Create a Program Improvement Plan (cycle: Defining purpose and outcomes → Program mapping → Identifying and collecting data → Analysis and interpretation → Program & course improvement, with stakeholder input throughout) Engineering Graduate Attribute Development (EGAD) Project
