
How Do We Know If Students Are Learning?*




  1. How Do We Know If Students Are Learning?* Maria Aristigueta and Kim Bodine University of Delaware NASPAA Meeting, October 2003 PLEASE DO NOT CITE OR DISTRIBUTE WITHOUT PERMISSION OF THE AUTHORS!

  2. Characteristics of Effective Assessment in Higher Education (Angelo, 1993) • Focuses on the processes as well as on the products of instruction. • Assesses what we teach – and what we expect students to learn. • Actively involves both teachers and students. • Uses multiple and varied measures. • Provides information for improving learning.

  3. Effective Assessment (cont.) • Is carried out at various key points. • Provides useful, timely feedback to those being assessed and those most affected. • Is an intrinsically educational activity – one that reinforces and furthers the teaching and learning goals on which it focuses.

  4. The Assessment Process (Huba & Freed, 2000) • Formulate statements of intended learning outcomes. • Develop or select assessment measures. • Create experiences leading to outcomes. • Discuss and use assessment results to improve learning.

  5. Purposes for Assessments may Differ • Formative (for improvement): learning measures; decentralized; most student centered; direct measures; soundest evidence; least public. • Summative (for accountability): performance measures; centralized; least student centered; indirect measures; weakest evidence; most public.

  6. How do we ‘know’ students are learning? • We all learn by constructing knowledge. • It is important to build connections between a learner’s prior knowledge and experience and new information or skills. • It is important to make connections between discipline-based knowledge and general skills. • Learning can occur in the cognitive (knowledge), affective (feelings and emotions), or psychomotor (physical skills) domains.

  7. Guiding Questions • What is the purpose? What do you hope to learn from this assessment? • What will be assessed? (learning objectives) • Who will be assessed? • How will it be assessed? Time frame for assessment? • In what setting will the assessment be conducted? • How will the results be analyzed? • How will the results be used? How will the results be helpful to you and your students? • To whom will the results be communicated? • How will the results be communicated? Examples include website, newsletter, meeting.

  8. The Levels of Assessment

  9. UD’s MPA Program Mission The mission of the University of Delaware’s Master of Public Administration program is to provide diverse, talented graduate students with specific competencies for leadership and management, including the knowledge, skills and values essential to accountable and effective practice. The MPA program contributes directly to solutions to public challenges of our times through research and public service projects that involve students in experiential learning. The program also seeks to develop relationships with practitioners, fostering a professional focus and approach to public administration and non-profit management and furthering the values of the field.

  10. Goals
  1. Emphasize the values of the profession in coursework, publications, and professional activities.
  2. Continue to enhance the excellence and diversity of our student body through recruiting efforts at colleges and universities in the region.
  3. Maintain and continue to enhance a set of core courses that require students to master essential knowledge, skills, and values.
  4. Maintain and continue to enhance a set of areas of specialization that include courses requiring students to master the essential knowledge, skills, and values of the field. Academic areas of specialization will be based upon active research, public service, and professional achievements of the faculty.
  5. Provide students with experiential learning throughout the research and service centers of the College, especially as research assistants in the Institute for Public Administration, Center for Community Research and Service, Health Policy Research Group, and Center for Applied Demography and Survey Research.
  6. Develop and maintain a nationally recognized internship program, integrated in and supported by the Institute for Public Administration.
  7. Maintain and establish relationships with government and non-profit organizations that contribute to the mission of the program.
  8. Encourage faculty and students to conduct applied research and public service, and to communicate the results of this research to both the practitioner and academic communities.

  11. Outputs and Outcomes Defined (Hatry, 1999) • Outputs are products and services delivered by the program. • Outcomes are events, actions, or behaviors that occur outside the program and in ways that the program is attempting to affect.

  12. Outcome Chart Development
  **Make sure there is a clear mission and clear goals to support the mission for the program.
  1. Chart Development
  a. Develop draft outcome chart (develop additional surveys to address outcomes).
  b. Share draft outcome chart with faculty and ask for faculty comments.
  c. Review draft outcome chart and make appropriate changes.
  d. Pre-test survey instrument; review by select group of stakeholders and make appropriate changes.
  e. Administer surveys.
  f. Analyze data and prepare summary.
  2. Program Planning
  a. Provide draft report to faculty for comments.
  b. Discuss survey data and take appropriate action at MPA meeting.
  3. Overview of Assessments
  a. Exit surveys.
  b. Alumni focus groups, surveys.

  13. Outcome Chart

  14. Data Collection Methods Used by MPA Program • Direct measures: research assistantships; course papers / exams; analytical papers; presentations in the field; case studies; supervisor evaluations. • Indirect measures: paper / pencil surveys; database information; existing reports; information from graduates; information from employers.

  15. References • Angelo, T. & Cross, K. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco: Jossey-Bass. • Bauer, K. & Bauer, G. (2002). A General Education Institute session, July, University of Delaware. • Hatry, H. (1999). Performance measurement: Getting results. Washington, DC: The Urban Institute Press. • Huba, M. & Freed, J. (2000). Learner-centered assessment on college campuses. Needham Heights, MA: Allyn & Bacon. • *Note: much of this material was adapted from the University of Delaware’s Guidelines and Methods for Assessing Student Learning (Bauer) and the authors.
