
2004 Second Annual Summer Assessment Seminar





Presentation Transcript


  1. 2004 Second Annual Summer Assessment Seminar Outcomes Assessment of Student Learning: Highlighting Assessment Approaches at Texas Tech

  2. Outcomes Assessment of Student Learning Presenters: Matt Baker & Chad Davis – Agricultural Education and Communication Jon Bredeson- Electrical & Computer Engineering David Driskill & Glenn Hill – Architecture Phil Marshall – Political Science

  3. Outcomes Assessment of Student Learning Purposes of Seminar: • Provide an overview of outcomes assessment of student learning • Present approaches to outcomes assessment from academic programs at Texas Tech

  4. Outcomes Assessment of Student Learning Outcomes Assessment defined: “. . . The systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development.” Palomba & Banta, Assessment Essentials (1999)

  5. Outcomes Assessment of Student Learning • Overarching Goal: Achieving excellence in undergraduate and graduate programs • Assessment: Demonstrating that programs have achieved their intended purposes Key to effective student outcomes assessment is faculty leadership at the departmental level

  6. Outcomes Assessment of Student Learning What do they already know and what skills do they already possess? • Entrance exams and incoming surveys What have they learned (what do they know, do, and value)? • Comprehensive exams • Capstone courses/experiences • Course-embedded assessments • Performance-based mastery tests • Portfolios from internships and student teaching • Certification tests • Satisfaction surveys and focus groups • Post-graduation surveys of graduates & employers

  7. Outcomes Assessment in Agricultural Education Matt Baker & Chad Davis

  8. Anticipated Outcomes • We want our program to enhance certain disciplinary competencies • We want our program to enhance graduates’ ability to think critically and creatively • We want our program to improve graduates’ quality of life

  9. Self-Assessment of Knowledge • Teaching Strategies • Application • Foundations • International

  10. Self-Assessment of Ability • Communication • Idea Generation and Reasoning • Attentiveness and Quantitative Abilities • Perceptual, Spatial, & Memory Abilities

  11. Self-Assessment of Skills • Content • Process • Social • Resource Management • Complex Problem Solving • Systems Skills • Technical Skills

  12. Critical Thinking (Richard Paul) • Critical thinkers use a set of intellectual standards • Intellectual standards guide the thinking process • Think about thinking for the purpose of improving the thought process

  13. California Critical Thinking Disposition Inventory (CCTDI) • Result of a Delphi study of critical thinking experts • Seven constructs are measured • analyticity, self-confidence, inquisitiveness, maturity, open-mindedness, systematicity, and truth-seeking • the total score is calculated by adding the construct scores
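
Because the slide reduces CCTDI total scoring to simple addition across the seven constructs, a minimal sketch of that arithmetic may be useful. The construct names come from the slide above; the 10-60 per-construct range noted in the comment is an assumption drawn from the published instrument and should be verified against the CCTDI scoring manual.

    # Minimal sketch of the CCTDI total-score arithmetic: the total is
    # the sum of the seven construct scores. The 10-60 range per
    # construct (70-420 total) is an assumption; check the CCTDI manual.

    CCTDI_CONSTRUCTS = (
        "analyticity", "self_confidence", "inquisitiveness", "maturity",
        "open_mindedness", "systematicity", "truth_seeking",
    )

    def cctdi_total(scores: dict[str, int]) -> int:
        """Total CCTDI score: the sum of the seven construct scores."""
        missing = set(CCTDI_CONSTRUCTS) - scores.keys()
        if missing:
            raise ValueError(f"missing construct scores: {sorted(missing)}")
        return sum(scores[c] for c in CCTDI_CONSTRUCTS)

    # Example: a respondent scoring 42 on every construct totals 294.
    print(cctdi_total({c: 42 for c in CCTDI_CONSTRUCTS}))  # 294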

  14. CCTDI Constructs • Analyticity - reasoning based on facts • Self-confidence - secure in intellectual judgement • Inquisitiveness - intellectual curiosity • Maturity - awareness that problems are complex

  15. CCTDI Constructs • Open-mindedness - tolerance of diverse views • Systematicity - organized, diligent inquiry • Truth-seeking - knowing the truth is more important than winning the argument

  16. Typology of Creativity • Expressive • Productive • Inventive • Innovative • Emergenative (Taylor, 1959)

  17. Torrance Tests of Creative Thinking (TTCT) For this three-part timed test, subjects are asked to construct a picture, complete a series of incomplete drawings, and complete drawings from sets of parallel lines.

  18. TTCT Constructs • Fluency - the ability to produce a large number of figural images • Originality - unusualness or rarity of response • Elaboration - ability to develop, embroider, embellish, carry out, or otherwise elaborate ideas.

  19. TTCT Constructs • Abstractness - the ability to produce good titles and to capture the essence of information involved • Resistance to Closure - the ability to keep a figure open and delay closure long enough to make the mental leap that makes original ideas possible

  20. Quality of Life • Quality of Life Profile (QOLP) • A generic measure of well-being • Developed and validated by a multidisciplinary research team at the University of Toronto • Addresses the relationship between the individual and the environment • Assumes QOL is a judgement • Focuses on three fundamental areas of life common to all human beings

  21. QOL - Domains An individual's quality of life comprises three domains: Being, Belonging, and Becoming.

  22. QOL - Being Being Domain (who the person is as an individual) • Physical Being - body & health • Psychological Being - thoughts & feelings • Spiritual Being - beliefs & values

  23. QOL - Belonging Belonging Domain (how environments and others fit with the person) • Physical Belonging - where you live & spend time • Social Belonging - the people around you • Community Belonging - access to things

  24. QOL - Becoming Becoming Domain (what the person does to achieve hopes, goals, and aspirations) • Practical Becoming - daily things • Leisure Becoming - things for enjoyment • Growth Becoming - improving & changing

  25. QOL Profile An individual's quality of life across all three domains and their nine subdomains: Being (Physical, Psychological, Spiritual), Belonging (Physical, Social, Community), and Becoming (Practical, Leisure, Growth).
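
The domain/subdomain structure on slides 21-25 lends itself to a simple data model. The sketch below encodes the three domains and nine subdomains; the equal-weight averaging used to roll subdomain scores up into domain scores is an illustrative assumption, not the QOLP's published scoring rule, which combines importance and satisfaction ratings.

    # Sketch of the QOL Profile structure: three domains, each with
    # three subdomains. Equal-weight averaging is an illustrative
    # assumption; the published QOLP scoring is more involved.
    from statistics import mean

    QOL_DOMAINS = {
        "Being":     ["Physical", "Psychological", "Spiritual"],
        "Belonging": ["Physical", "Social", "Community"],
        "Becoming":  ["Practical", "Leisure", "Growth"],
    }

    def qol_profile(subdomain_scores: dict[tuple[str, str], float]) -> dict[str, float]:
        """Roll nine (domain, subdomain) scores up to three domain scores."""
        return {
            domain: mean(subdomain_scores[(domain, sub)] for sub in subs)
            for domain, subs in QOL_DOMAINS.items()
        }

    scores = {(d, s): 3.5 for d, subs in QOL_DOMAINS.items() for s in subs}
    print(qol_profile(scores))  # {'Being': 3.5, 'Belonging': 3.5, 'Becoming': 3.5}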

  26. Outcome Assessments Jon Bredeson, Chair, Electrical and Computer Engineering

  27. ABET • Accreditation Board for Engineering and Technology • Accreditation is crucial for engineering programs • Criterion 2: Program Educational Objectives • Detailed, published educational objectives • A process by which the objectives are determined and periodically evaluated

  28. ABET • A curriculum and processes that prepare students for the achievement of these objectives • A system of ongoing evaluation that demonstrates achievement of these objectives and uses the results to improve the effectiveness of the program

  29. ABET • Criterion 3: Program Outcomes and Assessment • What students are expected to know or be able to do at graduation from the program • Engineering programs must demonstrate that their graduates have attained outcomes (a) through (k) • Each program must have an assessment process with documented results

  30. ABET • Evidence must be given that results are applied to the further development and improvement of the program • The assessment process must demonstrate that the outcomes of the program are being measured • The role of the Institute of Electrical and Electronics Engineers (IEEE)

  31. ECE at Texas Tech • Senior exit interviews: an initial form followed by an individual interview with the Chair • All courses are assessed through forms completed by the faculty and by all students • Institutional Research data are used • The ABET & Curriculum Committees interpret the data and make recommendations

  32. ECE at Texas Tech • First EC 2000 visit at Texas Tech in 1999 • A report is due for all programs this coming year • Next visit: Fall 2005, for Electrical Engineering and Computer Engineering

  33. Assessment of Student Outcomes College of Architecture David A. Driskill, AIA Glenn E. Hill, AIA

  34. Guide to Student Performance Criteria, 1998 (revised 9/2003) The National Architectural Accrediting Board: “The program must ensure that all its graduates possess the skills and knowledge defined by the performance criteria set out below, which constitute the minimum requirements for meeting the demands of an internship leading to registration for practice.”

  35. Three Levels of Accomplishment • “Awareness: familiarity with specific information, including facts, definitions, concepts, rules, methods, processes or settings. Students can correctly recall information without necessarily being able to paraphrase or summarize it.” • “Understanding: assimilation and comprehension of information. Students can correctly paraphrase or summarize information without necessarily being able to relate it to other material or see its fullest implications.” • “Ability: skill in relating specific information to the accomplishment of tasks. Students can correctly select the information that is appropriate to a situation and apply it to the solution of specific problems.”

  36. Demonstration of Accomplishment • Awareness is demonstrated with evidence that the material is covered in lectures and readings. • Understanding is demonstrated with evidence that the material is tested. • Ability is demonstrated with evidence from studio and course projects and writings.

  37. Graphic Matrix Required course numbers / Student Outcomes 1 through 37
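
A course-by-outcome matrix like this is straightforward to represent and query in code. The sketch below, using hypothetical course numbers, records which required courses claim which of the 37 performance criteria and flags any criteria left uncovered, which is the gap the graphic matrix exists to expose.

    # Sketch of the slide-37 graphic matrix: rows are required courses,
    # columns are student performance criteria 1-37. Course numbers and
    # their criteria sets below are made up for illustration.

    N_CRITERIA = 37

    coverage = {
        "ARCH 1411": {1, 2, 5},        # hypothetical course-to-criteria mapping
        "ARCH 2501": {3, 4, 5, 12},
        "ARCH 4601": {20, 28, 37},
    }

    def uncovered_criteria(matrix: dict[str, set[int]]) -> set[int]:
        """Criteria no required course covers - curriculum gaps to fix."""
        covered = set().union(*matrix.values())
        return set(range(1, N_CRITERIA + 1)) - covered

    print(sorted(uncovered_criteria(coverage)))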

  38.-41. Team Room & Criteria Folders (four image slides)

  42. Ongoing Assessments • Internal Reviews at the end of each semester • Learning Walls throughout the Building

  43. Use of Questionnaires Phil Marshall Political Science (Brian Cannon – Earl Survey Research Lab)

  44. Several steps are important to assessment projects • Project planning and questionnaire design • Data collection and management • Data analysis and reporting

  45. Project planning and questionnaire design • Determining the population of interest and assembling the sample. For Political Science, undergraduate alumni were of interest; e-mail addresses were obtained from the Development Officer in the Arts & Sciences Office of the Dean. • Writing questions that address topics of interest. In most cases, someone has already done something similar to what you are doing; this can be used as a starting point. • Refining the questions to ensure that the data you get will be pertinent to your situation. Keep it simple.

  46. Data collection and management Earl Survey Research Laboratory (www.ttu.edu/~esrl) • Deciding on a method for obtaining data. For Political Science, a web-based survey was selected: quick, efficient, and requiring minimal labor. • Collecting the data. An e-mail was sent to the sample asking each person to participate in a brief online survey. A follow-up e-mail was sent a week later to remind those who had not yet responded. • Data were stored at the ESRL.
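
The follow-up step described above amounts to set subtraction: remind everyone in the sample who has not yet responded. A minimal sketch, with invented addresses and a stand-in send function (a real implementation would go through the survey platform or mail system):

    # Sketch of the one-week follow-up: contact only non-respondents.
    # Addresses and send_reminder are placeholders for illustration.

    sample = {"a@ttu.edu", "b@ttu.edu", "c@ttu.edu"}   # invited alumni (hypothetical)
    responded = {"b@ttu.edu"}                          # completed the web survey

    def send_reminder(address: str) -> None:
        # Placeholder: stands in for the actual e-mail delivery step.
        print(f"reminder sent to {address}")

    for address in sorted(sample - responded):
        send_reminder(address)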

  47. Data analysis and reporting • Results were compiled by the ESRL and presented in summary form. • The results are descriptive and provide a baseline for future assessment efforts. • The basic set of questions used in the Political Science survey has also been used by other departments at TTU, with additional questions added or modified for a customized survey. On the following slides, sample survey questions are presented.

  48. Sample Survey Question 1. To what extent is your TTU undergraduate major related to your current occupation? • Very closely related • Somewhat related • Not closely related • Not related at all
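
The kind of descriptive summary slide 47 mentions is, at its simplest, a tally of each response option into counts and percentages. A minimal sketch using the four options of the sample question above; the response data are invented for illustration.

    # Tally the sample question's four response options into counts and
    # percentages. The responses list is invented for illustration.
    from collections import Counter

    OPTIONS = ["Very closely related", "Somewhat related",
               "Not closely related", "Not related at all"]

    responses = ["Somewhat related", "Very closely related",
                 "Somewhat related", "Not related at all"]

    counts = Counter(responses)
    total = len(responses)
    for option in OPTIONS:
        n = counts.get(option, 0)
        print(f"{option:22s} {n:3d} ({100 * n / total:.0f}%)")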
