
CBU Online Survey System



Presentation Transcript


  1. CBU Online Survey System MASEC March 31, 2006 John Ventura

  2. Teacher Evaluations

  3. Project Description • Current methods at CBU • Paper survey • Count results by hand or scantron • I.T.S. worker creates a web page • These methods take considerable time and effort

  4. Project Description • Easy-to-use web-based evaluation tools: • Create, edit and administer surveys from a web site • Export and import information to and from a personal computer • Manage users and groups, with usernames and passwords • Analyze results with different types of tables and graphs, and compare information for a survey over several terms
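The export/import feature above could be as simple as a CSV round trip. A minimal sketch in Python, assuming responses are stored as question/rating pairs (the field names and file format here are illustrative assumptions, not details from the actual CBU system):

```python
import csv

def export_results(responses, path):
    """Write survey responses to a CSV file suitable for download to a PC.

    `responses` is a list of dicts like {"question": ..., "rating": ...}
    (an assumed internal format).
    """
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["question", "rating"])
        writer.writeheader()
        writer.writerows(responses)

def import_results(path):
    """Read responses back from a CSV file produced by export_results."""
    with open(path, newline="") as f:
        return [{"question": row["question"], "rating": int(row["rating"])}
                for row in csv.DictReader(f)]
```

CSV keeps the data editable in ordinary spreadsheet software on the user's personal computer, which matches the slide's goal.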

  5. ABET • Accreditation Board for Engineering and Technology (ABET) • Web-based Evaluation Process • Database for developing trends

  6. Project Goals • Analyze available products on the market. • Design a software system to create and manipulate data in a survey. • Implement software system suitable for the Christian Brothers University environment.

  7. Important Points • Levels of user access • Department Head and Administration • Faculty and Staff • Students • Alumni and Guests
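The access levels above amount to a role-to-permission mapping. A minimal sketch in Python; the permission names and groupings are illustrative assumptions, not taken from the actual system:

```python
# Hypothetical permission sets per access level, following the slide's
# four tiers (department head/administration, faculty/staff, students,
# alumni/guests). Action names are assumptions for illustration.
PERMISSIONS = {
    "department_head": {"create_survey", "edit_survey", "view_results",
                        "manage_users", "take_survey"},
    "faculty":         {"create_survey", "edit_survey", "view_results",
                        "take_survey"},
    "student":         {"take_survey"},
    "guest":           {"take_survey"},
}

def can(role, action):
    """Return True if the given access level permits the action."""
    return action in PERMISSIONS.get(role, set())
```

Centralizing the mapping in one table makes it easy to add a level (e.g. alumni) without touching the pages that call `can()`.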

  8. Important Points • Security • Access based on user login and password similar to CBU Web mail • Import a text file with student usernames • Create students one at a time • User identification used only for login and not to indicate which surveys have been taken
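Importing the text file of student usernames mentioned above could look like the following sketch. The one-username-per-line format (with blank lines and `#` comments skipped) is an assumption; the slide only says a text file is imported:

```python
def load_usernames(path):
    """Read student usernames from a text file, one per line.

    Blank lines and lines starting with '#' are skipped. This format
    is an assumed convention, not documented by the slide.
    """
    users = []
    with open(path) as f:
        for line in f:
            name = line.strip()
            if name and not name.startswith("#"):
                users.append(name)
    return users
```

Bulk import of this kind complements the slide's "create students one at a time" option for occasional additions.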

  9. Additional Functionality • User login verification • Import / export data • Adding, editing and deleting response sets • Help pages and tutorials

  10. Research Method • Establish an ECE Curriculum Committee • Expand the responsibilities of the ECE Advisory Board • Develop Web-based instruments to measure the achievement of program objectives of graduates and learning outcomes of students • Provide a model that integrates input from constituents, assessment committees, and the results of measuring the achievement of students and graduates into an evaluation process

  11. Procedures • Employ surveys/questionnaires to measure the achievement of program objectives using Web-based technologies • Establish an ECE Curriculum Committee that contains members from local engineering societies and organizations, the student chapter of IEEE, and faculty • Expand the responsibilities of the ECE Advisory Board to include the examination and validation of the evaluation processes • Provide constituents with instruments that measure the level of achievement of program objectives • Provide results of surveys/questionnaires to the ECE Advisory Board and ECE Curriculum Committee, enabling them to validate the survey instruments and the model for evaluation

  12. ECE Curriculum Committee • A member of the executive board of the Memphis Chapter of IEEE • A member of the executive board of the Memphis Chapter of the Tennessee Society of Professional Engineers (TSPE) • A member of the Organizing Committee of the Memphis Area Engineering and Sciences Conference • Chair of the IEEE Student Chapter at CBU • ECE faculty • Dean of Engineering • Chair of the Master of Engineering Management • A faculty member from the School of Sciences

  13. Rating Scale

  14. Grading Criteria for ABET Learning Outcomes for Industry Survey
  • A+: >= 80% of responses in categories 3 and 4; >= 50% rated as 3
  • A: >= 80% of responses in categories 3 and 4; >= 37.5% rated as 4
  • A–: >= 80% of responses in categories 3 and 4; < 37.5% rated as 4
  • B+: 60 to < 80% in categories 3 and 4; >= 37.5% rated as 4
  • B: 60 to < 80% in categories 3 and 4; >= 25% rated as 4
  • B–: 60 to < 80% in categories 3 and 4; < 25% rated as 4
  • C+: highest frequency of ratings in categories 2 and 3 but <= 60% in categories 2 and 3; number of (3+4) > number of (1+2)
  • C: 60 to < 80% in categories 2 and 3
  • C–: highest frequency of ratings in categories 2 and 3 but <= 60% in categories 2 and 3; number of (1+2) > number of (3+4)
  • D+: < 90% to >= 70% in categories 1 and 2; < 25% in category 1
  • D: < 90% to >= 70% in categories 1 and 2; >= 25% to < 37.5% in category 1
  • D–: < 90% to >= 70% in categories 1 and 2; >= 37.5% in category 1
  • F: >= 90% in categories 1 and 2
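These rules are mechanical enough to code directly. A minimal Python sketch, assuming responses are tallied as counts per rating category (1–4) and that the rules are applied top-down in slide order, since the slide does not state a precedence for overlapping ranges:

```python
def letter_grade(counts):
    """Map rating counts {1: n1, 2: n2, 3: n3, 4: n4} to a letter grade.

    A direct transcription of the slide's criteria, checked top-down.
    Returns None for count distributions the slide does not cover.
    """
    total = sum(counts.values())
    pct = lambda *cats: 100.0 * sum(counts.get(c, 0) for c in cats) / total
    p34, p23, p12 = pct(3, 4), pct(2, 3), pct(1, 2)
    p1, p3, p4 = pct(1), pct(3), pct(4)
    n34 = counts.get(3, 0) + counts.get(4, 0)
    n12 = counts.get(1, 0) + counts.get(2, 0)
    # True if the most frequent rating category is 2 or 3
    mode_in_23 = max(counts, key=lambda c: counts.get(c, 0)) in (2, 3)

    if p34 >= 80:
        if p3 >= 50:    return "A+"   # the slide says "rated as 3" here
        if p4 >= 37.5:  return "A"
        return "A-"
    if 60 <= p34 < 80:
        if p4 >= 37.5:  return "B+"
        if p4 >= 25:    return "B"
        return "B-"
    if mode_in_23 and p23 <= 60 and n34 > n12:  return "C+"
    if 60 <= p23 < 80:                          return "C"
    if mode_in_23 and p23 <= 60 and n12 > n34:  return "C-"
    if 70 <= p12 < 90:
        if p1 < 25:     return "D+"
        if p1 < 37.5:   return "D"
        return "D-"
    if p12 >= 90:       return "F"
    return None
```

For example, 5 responses of 3 and 5 of 4 gives 100% in categories 3 and 4 with 50% rated as 3, so the A+ rule fires first.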

  15. ECE Advisory Board and ECE Curriculum Committee • Validate instruments that contain the expectations of constituents, especially graduates and employers. • Assess the quality and performance of the evaluation process. • Assess the methods used for gathering information in a Web-based environment.

  16. Evaluation • ECE Advisory Board evaluates the level of achievement of program objectives and evaluation processes • ECE Curriculum Committee evaluates survey instruments and evaluation processes • ECE Curriculum Committee evaluates the level of achievement of learning outcomes

  17. Formative Evaluation • Formative evaluation processes should be undertaken due to the unfavorable factors inherent in information obtained from student self-assessment • Summative processes based on self-assessment are questionable • Improvements in programs should be based on trends developed from data obtained over several semesters rather than on a summative assessment of a single cycle of an evaluation process

  18. Online Survey System Demonstration

  19. Any Questions?
