
TracDat Update revised September, 2009






Presentation Transcript


  1. TracDat Update, revised September 2009 Office of Academic Affairs Robbie Teahen, Kim Wilber

  2. One Approach to Outcomes http://www.youtube.com/watch?v=DRBW8eJGTVs

  3. Purpose of Assessment Simply: To advance the quality of student learning through careful elaboration of intended learning, meaningful measures of students’ learning achievements, and systematic collection of data that informs instructional and other improvements – at the level of courses, programs, colleges, and institutions. Assessment involves going beyond the evaluation of individual student performance (Teahen, 2008).

  4. The Learning Design Cycle • Determine learning needs • Analyze learner needs • Curriculum development • Assess performance • Specify learning outcomes • Plan learning activities

  5. Session Outcomes As a result of participating in today’s session, you will: • Log into the system • Change your password • Describe key terms, such as “reporting” or “assessment” units and “learning outcomes” • Create an appropriate learning outcome • Enter an assessment plan (e.g., program or course) • Enter assessment results • Locate implementation plans/timelines • Produce a curriculum map • Create an ad hoc report • Be prepared to continue to enter target outcomes/goals, assessment plans, and your results on an ongoing basis

  6. Goals of the Assessment Tracking Implementation • Create a repository for conserving assessment information • Stimulate cross-discipline sharing of learning and institutional effectiveness outcomes • Focus unit-level efforts on specifying and monitoring assessment outcomes • Encourage use of assessment results to inform course, program, and institution level learning enhancements • Streamline reporting requirements for academic units • Contribute to a data-informed, data-driven culture

  7. Topics to be Included Today • User roles • Security Access • Changing the password • Entering data – outcomes, plans, and results • Attaching documents • Linking assessment unit outcomes to reporting unit standards or plans • Producing standard and custom reports • Resources

  8. Roles and Security • Role Options: • User • Reports-Only • Program-level administrator • The primary difference is that the program-level administrator is responsible for the Assessment Unit Tab. (See next slide) • Ferris Administrator (Kim Wilber) • Access and Security levels and permissions: • Form (see Academic Affairs assessment website) • Approvals by Department Head and Dean • Submit requests to Maureen Milzarski

  9. Addressing Common Concerns • Comparing instructor performance . . . • Outcomes and measures are expected to be unique to each program, so it will not be possible to compare “performance” across programs or disciplines • Instructor Evaluation • Instructors will continue to be evaluated by existing methods. If they want to use this information to demonstrate their effectiveness, that is their option. • Determining What Information is Reported • Inclusion is decided at the appropriate level: program personnel decide what to enter. Most reporting will be at the program level, and course-level reporting is expected to evolve over time. • This Too Shall Pass . . . • Assessment of student learning has been required since 1995, and expectations have heightened. • I Don’t Have Time (It’s Too Much Work) . . . • Assessment is integral to the work of a professional educator. It is part of the cycle of instructional design and the continuous-improvement approach of “plan, do, check, act.”

  10. Assessment Unit Information • Show/Tell (using the Demo Site) • Components included • Unit Name • Mission • Sites Offered • Accreditation Entity, Date of next accreditation visit • Certification and Licensing • Online status and plans • Advisory Board? • Next Academic Program Review Date

  11. Assessment Plan Information • Show/Tell/Do (using Demo or Sandbox) • Outcome name • Outcome statement • Assessment Method • Criterion for Success • Related Goals • Additional assessment methods

  12. Outcome Checklist Learning Outcomes: • describe one of the major skills that is an intended outcome for a course or program • represent a skill that a competent individual would use outside the context of the course • begin with an action verb describing what the learner will be able to do upon completion of this course/program • are measurable and observable • require use of skill, knowledge, or attitude/value – at a level of application or above on Bloom’s taxonomy • present a clear, concise, and precise statement describing the action • specify a single performance/outcome, not a combination • describe learner performance, not the instructor's activities, learning plans, or instructional strategies

  13. Sample Program-Level Outcome and Measure(s) • Outcome: Technology Use • Outcome Statement: Learners will demonstrate their use of common functions associated with software relevant to the discipline (e.g. MS Office, SPSS, CAD, etc.) • Measure(s): • Capstone project assignments will incorporate the utilization of common software applications associated with the field. Rubrics will be provided for each that address learner performance in technology use. • Exams in the second-year major course will incorporate timed tests utilizing identified software to produce documents appropriate to meet external performance requirements. Standards of the profession will be utilized to assess learners’ performance. • Throughout the program, individual course requirements will incorporate and report on technology-use performance by students, as appropriate

  14. Components to Enter in Plans • Outcome Name • Outcome • Measures • Outcome Type • Status (active or not) • Asterisks represent required fields – over time, more will be required

  15. Don’t Forget! SAVE CHANGES (button at bottom)

  16. “Means of Assessment” • Relating to a particular outcome, specify: • Method Category • Offer your suggestions • Method (Description) • Criterion • What will success look like for this program or course? • Schedule • When will assessment method(s) be implemented? Frequency? • Multiple Measures • Especially at the program level, multiple measures should be used.

  17. Sample Program Results and Action Plan Learners will demonstrate their use of common functions associated with software relevant to the discipline (e.g., MS Office, SPSS, CAD, etc.) Measure(s): • Capstone project assignments will incorporate the utilization of common software applications associated with the field. Rubrics will be provided for each that address learner performance in technology use. Results: • Review of 32 capstone projects for students in the X program during the spring of 2008 revealed that 95% of the learners were able to perform all specified functions within MS Word and PowerPoint, but just 62% could demonstrate their abilities to perform specified functions within Access. Further, AutoCAD design capabilities were rated at an average level of 3 on a scale of 1 to 5, with 10% of the soon-to-graduate students not meeting minimum standards for the profession. • Review of 18 capstone projects in spring 2009 . . . . Action Plan: • Faculty within the major will meet in August 2008 to examine the curriculum, determine where and how Access and CAD are introduced and reinforced, and develop supplemental units to assist students to more adequately achieve intended outcomes. Faculty meetings will address this performance concern and monitor curricular changes and end-of-year performance of students in spring 2009. Results from spring 2009 will be reported and a determination made about whether additional curricular reform is required. • During the fall of 2009, faculty will incorporate more practice assignments in each software-related course and utilize ITAP students to support instructors in labs where enrollment exceeds 24 students.

  18. Linking Documents and Goals • Create folders and Attach documents. . . • Examples: • Rubrics • Standards • Assignments • Comprehensive analyses as backup to summary results • Link to standards or criteria • Such as accreditation standards • Program outcomes • General education outcomes • Industry standards, such as Microsoft Certification

  19. Reporting • A reporting unit is a group of two or more assessment units for which individuals may want to produce reports. Examples include: • College of Business • Department of the Built Environment • General Education: Global Consciousness • English and Writing-Intensive Courses • Your Liaison should let us know what Reporting Units you need and which assessment units should be linked to each.

  20. Report Types • Standard (refer to list) • Ad Hoc (Custom Reports)

  21. Current Status of Timeline • All Unit information should be entered • All program outcomes and assessment plans should be entered • Results for at least one program outcome should be entered • Course assessment plans for multi-section and general education courses should be entered soon (original plan was spring 2009, but there were delays in getting courses loaded) • At a minimum, enter outcomes as available for course-level assessments

  22. User Will See:

  23. Admin will see . . .

  24. Curriculum Mapping • Courses must be entered before you can produce the curriculum map. • Purposes include: • Identify gaps • Identify unnecessary redundancies • Identify appropriate progression across the curriculum (i.e., introduction precedes reinforcement and assessment) • Identify actionable improvements when evaluating program-level outcomes
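The checks this slide describes can be pictured as operations on a simple table of outcomes versus courses. The sketch below is purely illustrative (TracDat produces curriculum maps itself); the course names, outcome names, and I/R/A coding convention (Introduce, Reinforce, Assess) are hypothetical examples used only to show what a gap or progression problem looks like:

```python
# Illustrative sketch only -- TracDat generates curriculum maps internally.
# Models the slide's three checks: gaps, progression (an outcome should be
# Introduced before it is Reinforced or Assessed as students move through
# the curriculum), and outcomes that are never assessed.
# All course and outcome names below are hypothetical.

def progression_issues(curriculum_map):
    """Return (outcome, problem) pairs for gaps and ordering problems.

    curriculum_map maps each outcome to (course, level) pairs listed in
    the order students take the courses; level is "I", "R", or "A".
    """
    issues = []
    for outcome, course_levels in curriculum_map.items():
        levels = [level for _, level in course_levels]
        if not levels:
            issues.append((outcome, "gap: not addressed by any course"))
        elif levels[0] != "I":
            issues.append((outcome, "progression: first coverage is not an introduction"))
        elif "A" not in levels:
            issues.append((outcome, "gap: never assessed"))
    return issues

# Hypothetical program map, courses in curricular order.
example_map = {
    "Technology Use": [("TECH 101", "I"), ("TECH 201", "R"), ("TECH 490", "A")],
    "Written Communication": [("ENGL 250", "R"), ("ENGL 350", "A")],  # never introduced
}

for outcome, problem in progression_issues(example_map):
    print(f"{outcome}: {problem}")
```

Running this flags "Written Communication" because its first coverage is a reinforcement rather than an introduction, which is exactly the kind of actionable finding the slide says a curriculum map should surface.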

  25. Role of Liaison. . . • Primary communicator of needs to Academic Affairs (AA) office • Responsibility for maintaining currency of Assessment Unit page • Provide assistance to users within College • Participate in occasional meetings of AA office regarding process improvements • Assist in unit-level reporting as required or appropriate • See list on website

  26. By December 2009 • 90% of Ferris courses will be in TracDat with clear, measurable outcomes • 80% of Ferris courses will have effective assessment methods with criteria for success • The courses in 75% of programs will be integrated into a curriculum map linked to program outcomes • All faculty will be engaged in active assessment at the course level to enhance student learning. Find the entire assessment plan here: http://www.ferris.edu/htmls/administration/academicaffairs/assessment/plan0809.htm

  27. Regarding the Learning Paradigm: "The result of this paradigm shift is a college where faculty are the designers of powerful learning environments, where curriculum design is based on an analysis of what a student needs to know to function in a complex world rather than on what the teacher knows how to teach, where the college is judged, not on the quality of the entering class, but on the quality of aggregate learning growth possessed by its graduates, where compartmentalized departments are replaced by cross-disciplinary cooperatives, and where every employee has a role to play and a contribution to make in maintaining a learner-centered environment (p. 5).” -Bill Flynn, Palomar College, 1998
