Sam Houston State University’s Online Assessment Tracking Database (OAT Db)

  • Institutional Research & Assessment (IRA) Office

    • Jeff Roberts, Assessment Coordinator

    • Rita Caso, Ph.D., Director

  • Information Technology (IT) – Web Development Group

    • Colt Ramsden, Lead Programmer

  • Office of the Vice President for Academic Affairs

    • Mitchell Muehsam, Ph.D., Associate Vice President & SACS Liaison


Presented at the TAIR 2008 Conference, 2/5-6/08

What is the OAT Db?

OAT Db stands for “Online Assessment Tracking Database,” a 24/7 web-based application for collecting, managing, storing and viewing academic and non-academic assessment information from all university units.


History of the OAT Db

  • Fall ‘05

    • New SHSU Institutional Research & Assessment (IRA) Director begins drafting designs for web-accessible assessment planning and documentation tools.

  • Winter ‘05-’06

    • Anticipating 2009 SACS reaffirmation, Academic Affairs Assoc. V.P. (SACS Liaison) endorses investigation of commercial, online assessment applications.

    • Interim Action: With Advisory Committee approval, IRA Director develops & distributes standardized template for prompting, organizing, documenting & collecting yearly outcomes assessments from all units, using Blackboard.


History of the OAT Db

  • Spring ‘06

    • SHSU IT Web Development (Web Dev) Supervisor reviews and evaluates WEAVEonline with IRA Director

      • Data maintained off campus

      • Specific feature and function limitations (at that time)

      • Start-up fee & yearly fee

    • SHSU IT Web Dev Team recommends local development of an online assessment-tracking database application and projects an optimistic timeline

    • With support and input from the SACS Liaison/Assoc. VP for Academic Affairs & the IRA Director, the SHSU Web Dev Team begins to develop the OAT Db (the interim assessment template on Blackboard provides a guide)


History of the OAT Db

  • Fall-Winter ‘06-’07

    • Revisions of the OAT Db interface by the Web Dev Team, with intensive input from the SACS Liaison/Assoc. VP for Academic Affairs and the IRA Director

    • IRA conducts focus-group testing of Beta OAT Db

  • March ‘07

    • Group training sessions for every major university unit, by IRA

      • Session 1: Introduction & overview of assessment bare-essentials and OAT DB (12-40 people per event)

      • Session 2 & 3: Hands-on computer lab training on use of OAT Db and essentials of assessment planning, implementation & documentation (8-15 people per event)

    • The OAT Db is opened for use by the entire campus community

      • IRA provides continuing individual and small-group hands-on training, coaching, and review, on demand and as needed.

      • Web Dev responds to technical problems on demand, through IRA liaisons


Why Locally Developed OAT Db?

  • Easy, assured access to data over years

    • All the data entered is stored in our databases, rather than elsewhere

    • No yearly fees

  • Application Support and Improvements

    • Local Development team can provide better support

    • Local Development team can continue to custom-modify and improve the application over time as needed


What does OAT Db collect?

  • Key Elements of each unit’s assessment process

    • Goals (for each unit)

    • Outcome Objectives (Learning & Performance)

    • Indicators

    • Criteria for satisfying Objectives

    • Findings & Conclusions (Results of Assessment)

    • Actions (based upon Findings & Conclusions)
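
The presentation does not show how these elements are stored, but the chain above maps naturally onto a small relational schema. The sketch below is illustrative only: table and column names are invented, and SQLite is used simply because any PDO-supported database would work the same way.

    <?php
    // Illustrative sketch only; the actual OAT Db schema is not shown in the presentation.
    // Each element references the element it elaborates on; a junction table lets an
    // Objective relate to one or more Goals, as the objective-centered rule later requires.
    $pdo = new PDO('sqlite::memory:');            // any PDO driver would do
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $pdo->exec('CREATE TABLE goals          (id INTEGER PRIMARY KEY, unit_id INTEGER, period TEXT, description TEXT)');
    $pdo->exec('CREATE TABLE objectives     (id INTEGER PRIMARY KEY, unit_id INTEGER, period TEXT, description TEXT)');
    $pdo->exec('CREATE TABLE goal_objective (goal_id INTEGER, objective_id INTEGER)');
    $pdo->exec('CREATE TABLE indicators     (id INTEGER PRIMARY KEY, objective_id INTEGER, description TEXT)');
    $pdo->exec('CREATE TABLE criteria       (id INTEGER PRIMARY KEY, indicator_id INTEGER, description TEXT)');
    $pdo->exec('CREATE TABLE findings       (id INTEGER PRIMARY KEY, criterion_id INTEGER, description TEXT)');
    $pdo->exec('CREATE TABLE actions        (id INTEGER PRIMARY KEY, objective_id INTEGER, description TEXT)');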


OAT Db Elements

  • Goals

    • The unit’s broadly stated intentions, aspirations, or ambitions.

  • Objectives

    • Unit’s specifically stated, desired outcomes -- related to one or more of the entered goals.

      • All objectives must be measurable.

  • Indicators

    • Specific, observable, and measurable evidence of whether or not an objective was achieved or satisfied; related to a particular objective.


OAT Db Elements

  • Criteria

    • Specific, predetermined targets, standards, or benchmarks for a particular Indicator that must be met in order to indicate success

      • Determined prior to the collection of data.

  • Findings

    • Related to a specific Criterion, these are the results or conclusions derived from the assessment process.

  • Actions

    • Specific actions taken in response to the Findings, in relation to a single Objective.


What does it do?

  • Prompts input & editing of useful assessment information

  • Organizes & relates information (across assessment Elements & across Levels of related units)

  • Stores (across years)

  • Retrieves and reports (across years)

  • Encourages learning & sharing assessment procedures across units

  • “Rolls” previous reporting period’s Goals, Objectives, Indicators & Criteria into new reporting period, on demand
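
As a rough illustration of the "roll" feature in the last bullet, the sketch below copies one unit's Goals from one reporting period to the next; Objectives, Indicators, and Criteria would follow the same pattern. It uses the hypothetical tables sketched earlier, not the actual OAT Db code.

    <?php
    // Hypothetical sketch: copy a unit's Goals into a new reporting period on demand.
    // Uses the invented schema from the earlier sketch; Findings and Actions are not
    // copied, since they belong to the period in which the assessment was carried out.
    function rollForwardGoals(PDO $pdo, int $unitId, string $fromPeriod, string $toPeriod): void
    {
        $select = $pdo->prepare('SELECT description FROM goals WHERE unit_id = :unit AND period = :from');
        $insert = $pdo->prepare('INSERT INTO goals (unit_id, period, description) VALUES (:unit, :to, :text)');

        $select->execute([':unit' => $unitId, ':from' => $fromPeriod]);
        foreach ($select->fetchAll(PDO::FETCH_COLUMN) as $goalText) {
            $insert->execute([':unit' => $unitId, ':to' => $toPeriod, ':text' => $goalText]);
        }
    }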


Who Uses OAT DB?

  • Everyone with an SHSU computer account I.D. has viewing access to all information in the OAT Db

  • Units designate an authorized person to input and edit information for the unit

  • Every academic & non-academic unit documents its assessment information

  • IRA can provide targeted assessment coaching to units through OAT DB

  • External reviewers and guests use special I.D.s for OAT DB viewing access


How is OAT DB Used?

  • SACS reaffirmation process

  • Professional and disciplinary accreditation processes

  • Regular internal planning, accountability-tracking & reporting, e.g.,

    • Provides indicators, targets & outcomes for President’s Performance Indicator Reports

    • Informs University Strategic Planning

  • Documenting periodic and continuous self-assessment for unit improvement & justification of change

  • Assessment and evaluation of research & intervention projects (internally or externally funded)


Technical Aspects of the OAT Db

  • Written in PHP scripting language

  • Written using PDO in PHP, which is an abstraction layer for a variety of different relational databases

  • Currently working with MIMER DB

      • Can work with a variety of relational DBs such as Oracle, SQL Server, MySQL, etc.
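
The benefit of PDO is that the database driver is selected by the connection string (DSN) while the query code stays the same. A minimal sketch follows; the DSN, credentials, and table names are placeholders, not SHSU's actual configuration.

    <?php
    // Minimal PDO sketch: only the DSN changes when the backend database changes.
    // DSN, credentials, and table/column names here are placeholders.
    $dsn = 'mysql:host=localhost;dbname=oatdb';   // e.g. 'oci:dbname=...' for Oracle, or an
                                                  // ODBC/Mimer DSN, depending on installed drivers
    $pdo = new PDO($dsn, 'username', 'password');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    // Prepared statements are written the same way regardless of the driver.
    $stmt = $pdo->prepare('SELECT id, description FROM objectives WHERE unit_id = :unit');
    $stmt->execute([':unit' => 42]);
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        echo $row['description'], PHP_EOL;
    }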


Technical Aspects of the OAT Db

  • Minor integration into University’s “SamWeb”

    • Program is designed so that it can work without it

  • Plug-in design for all actions (e.g. "Add Indicator")

    • Additional "plug-ins" can be developed and linked to within the OAT Db with little effort

  • Built-in ability to store assessment data for multiple institutions

    • Terminology customizable for each institution
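
The presentation does not show how the plug-in mechanism is implemented, so the sketch below is only one way such a design might look: each action (such as "Add Indicator") is registered under a name and dispatched by that name, so adding a plug-in means adding one entry. All names here are invented.

    <?php
    // Hypothetical action registry, in the spirit of the slide's "Add Indicator" example.
    // The real OAT Db plug-in mechanism is not shown in the presentation.
    $actions = [];

    // Registering a plug-in is a single entry mapping an action name to a callable.
    $actions['add_indicator'] = function (array $params): string {
        // ...validate $params and insert the indicator row here...
        return 'Indicator added to objective ' . (int) $params['objective_id'];
    };

    // Dispatch by name, so new plug-ins need no changes to the dispatcher itself.
    function dispatch(array $actions, string $name, array $params): string
    {
        return isset($actions[$name]) ? $actions[$name]($params) : "Unknown action: $name";
    }

    echo dispatch($actions, 'add_indicator', ['objective_id' => 7]), PHP_EOL;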


Introduction to OAT Db (screen)

(The next 4 slides will explain this screen)


Levels in the OAT Db

  • Levels define the “familial” standing of units in relation to each other

    • e.g., a College is Parent to its Departments.

    • Department is Parent to all of its Degree Programs, which are siblings to each other

    • Degree Programs are Children of the Department and Grandchildren of the College
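
One common way to represent these family relationships is a self-referencing units table in which each row points at its Parent: Children are the rows that point at a given unit, and Grandchildren are the Children of those rows. The sketch below is illustrative only, with an invented table structure and sample units.

    <?php
    // Illustrative sketch of a self-referencing units table for the Parent/Child levels.
    // The table structure and sample rows are invented, not the actual OAT Db schema.
    $pdo = new PDO('sqlite::memory:');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    $pdo->exec('CREATE TABLE units (id INTEGER PRIMARY KEY, parent_id INTEGER, name TEXT)');
    $pdo->exec("INSERT INTO units (id, parent_id, name) VALUES
                (1, NULL, 'College'),
                (2, 1,    'Department'),
                (3, 2,    'Degree Program')");

    // Children of a unit are simply the rows whose parent_id points at it.
    $children = $pdo->prepare('SELECT name FROM units WHERE parent_id = :id');
    $children->execute([':id' => 1]);
    print_r($children->fetchAll(PDO::FETCH_COLUMN));   // the College's Child: Department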


Levels in the OAT Db

  • Users enter unit information in their own specific levels, but Parent and Grandparent units may edit their Children’s and Grandchildren’s information

    • Users see links to their Parent and Child levels on their main OAT Db page

    • Goals of Child-Level units should reflect the goals of their Parent, or Grandparents

    • Objectives of Child-Level units should reflect their own or their Parent, or Grandparents’ goals
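
The editing rule above (a Parent or Grandparent may edit its descendants) can be checked by walking up the parent links of the target unit until the user's own unit is found or the top level is reached. This is a sketch against the hypothetical units table from the previous sketch, not the OAT Db's actual authorization code.

    <?php
    // Hypothetical check: may a user authorized for $userUnitId edit $targetUnitId?
    // Walks the parent_id chain of the invented units table sketched above.
    function canEdit(PDO $pdo, int $userUnitId, int $targetUnitId): bool
    {
        $lookup  = $pdo->prepare('SELECT parent_id FROM units WHERE id = :id');
        $current = $targetUnitId;
        while ($current !== null) {
            if ($current === $userUnitId) {
                return true;        // target is the user's own unit or one of its descendants
            }
            $lookup->execute([':id' => $current]);
            $parent  = $lookup->fetchColumn();
            $current = ($parent === false || $parent === null) ? null : (int) $parent;
        }
        return false;               // reached the top level without finding the user's unit
    }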


Accessing OAT DB (screen)


Entering Information Into OAT Db

  • 24/7 Access from link on SHSU main web page with SHSU’s universal login ID

  • Users authorized to input, revise & delete unit information are recognized by their SHSU login ID.

  • Parent-level users authorized to edit Child-level programs are recognized by their SHSU login ID


Starting page for OAT Db unit input (screen)


Inputting a new Objective (screen)


Entering Information Into the OAT Db

  • Input unit assessment information into each Element of the OAT Db by clicking active links and using drop-boxes

  • Each Element is linked with previous elements

  • Supporting documents can be uploaded and attached to Indicators, Criteria, Findings, and Actions.

    • e.g., Sample Surveys; Reports; Writing Samples; Student Test Samples; Scoring Rubrics; Sample Portfolios; Sample Videos or Sound Recordings; Meeting Minutes; etc.

  • At any time, authorized users can edit, modify, or delete any information they have entered.
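
The slides do not show the upload code; purely as a hedged illustration, a handler for attaching a supporting document to a Finding might look roughly like the sketch below. The form field, storage path, and attachments table are all invented.

    <?php
    // Hypothetical upload handler: store a supporting document and record it against a Finding.
    // The field name, directory, database path, and attachments table are invented.
    if (isset($_FILES['attachment']) && $_FILES['attachment']['error'] === UPLOAD_ERR_OK) {
        $safeName    = basename($_FILES['attachment']['name']);
        $destination = '/var/oatdb/uploads/' . uniqid('', true) . '_' . $safeName;

        if (move_uploaded_file($_FILES['attachment']['tmp_name'], $destination)) {
            $pdo  = new PDO('sqlite:/var/oatdb/oatdb.sqlite');   // placeholder connection
            $stmt = $pdo->prepare('INSERT INTO attachments (finding_id, original_name, stored_path)
                                   VALUES (:finding, :name, :path)');
            $stmt->execute([
                ':finding' => (int) $_POST['finding_id'],
                ':name'    => $safeName,
                ':path'    => $destination,
            ]);
        }
    }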


Objective-centered assessment documentation

  • Objective-centered approach to assessment tracking

    • Every Objective must be related to one or more Goals, and must have at least one associated Indicator, Criterion, Finding, and Action.
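
A rule like this is straightforward to report on. The sketch below lists Objectives that are not yet linked to a Goal or that are still missing an Indicator or an Action (Criteria and Findings could be tested the same way); it assumes an open PDO connection ($pdo) and the hypothetical table names sketched earlier, not the real OAT Db schema.

    <?php
    // Illustrative completeness report against the invented schema sketched earlier.
    // Assumes $pdo is an open PDO connection to a database using that schema.
    $incomplete = $pdo->query("
        SELECT o.id, o.description
          FROM objectives o
         WHERE NOT EXISTS (SELECT 1 FROM goal_objective g WHERE g.objective_id = o.id)
            OR NOT EXISTS (SELECT 1 FROM indicators i     WHERE i.objective_id = o.id)
            OR NOT EXISTS (SELECT 1 FROM actions a        WHERE a.objective_id = o.id)
    ")->fetchAll(PDO::FETCH_ASSOC);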


Learning Outcome Objectives vs. Performance Outcome Objectives

  • The OAT Db captures assessment information for both Learning Outcome Objectives and Performance Outcome Objectives.

    • Assessing either type of outcome objective is about observing and measuring the desired impact.


Learning Outcome

  • A desired behavior, knowledge, or attitude that someone will be able to demonstrate as a result of activities intended to promote learning

    • Most often associated with intentional instructional experiences offered by academic programs

    • Not exclusive to academics

    • Non-academic programs have learning outcome objectives regarding staff development, client or community development, and advisement


Performance Outcome

  • A particular achievement, or level of attainment in operations that an office, department or program expects to accomplish

    • Most often associated with efficiency or productivity levels by which administrative and support services seek to improve processes, products and services.

    • The underlying purpose is to improve infrastructure or operations that help make it possible for the university mission to succeed.


Common pitfalls for OAT Db users

  • Insufficient familiarity with outcome assessment

    • Trouble distinguishing between Goals and outcome Objectives

    • Difficulty specifying Indicators for outcome Objectives

    • Difficulty pre-specifying Criteria

  • Insufficient understanding of the need for outcome Indicator measures to be:

    • Consistent

    • Replicable, beyond the judgment of a single instructor

    • Recognizable to professional peers

  • Confusion about use of class-embedded student assessments as measurements & indicators of learning outcome Objectives


OAT Db Work-in-Progress

  • Coming soon

    • Automated reports and searches

  • To be scheduled

    • Customizable reports and searches

  • Under consideration

    • Additional input and storage fields for:

      • Introductory unit descriptions

      • Descriptions of HOW learning and performance outcome Objectives will be achieved


Hurdles

  • Remaining unfamiliarity with good assessment processes and good practices in assessment documentation among SHSU units

  • Residual resistance to using OAT DB

    • Some negative technical experiences with early OAT Db

    • Some infrequent technical problems with current OAT Db

    • Some resistance to standardized documentation of assessment information

  • Competition for local development resources makes additional OAT Db progress slower than desired


Conclusions

  • Increased awareness of assessment’s role in quality management and improvement efforts

  • More involvement in assessment and its documentation across all university units

  • Information about the assessment activities of university units is much more accessible

  • Assessment processes and their documentation are much more consistent

  • OAT Db users are more knowledgeable about the goals, objectives and assessment efforts of other units in the university


Contact Information

  • Mitchell Muehsam, Associate Vice President, Academic Affairs

    • [email protected]

    • 936 294 1031

  • M. Rita Caso, PhD, Director, Institutional Research & Assessment

    • [email protected]

    • 936 294 3618

  • Jeff L Roberts, Assessment Coordinator, Institutional Research & Assessment

    • [email protected]

    • 936 294 4321

  • Colt T Ramsden, Analyst, SHSU IT/Web Development

    • [email protected]

    • 936 294 4488
