
North Carolina Computer Skills Tests



Presentation Transcript


  1. North Carolina Computer Skills Tests CTE Summer Conference Koury Convention Center Greensboro July 27, 2005 Presenter: Randy Craven Information Technology Manager, Technical Outreach for Public Schools, NC State University

  2. Session Purpose This session will provide an overview of the North Carolina Computer Skills Tests to be used during 2005-06, including: • 1992 & 1998 Curriculum Tests • 2004 Curriculum Online Test • 2004 Curriculum Alternate Assessment

  3. CAUTION This session contains factual information about an actively developing process. Statements made represent current plans for the near future; end results may differ.

  4. The Requirement • In May 1991, the North Carolina State Board of Education (SBE) adopted a policy that required all students, beginning with the class of 2000, to demonstrate computer proficiency in order to receive a high school diploma (Feature C of the Quality Assurance Program). In October 1995, the SBE modified the requirement, making it effective beginning with the graduating class of 2001.

  5. Curricula - Tests • 1992 Curriculum Tests for students who entered grade 8 from 1996-97 through 1999-2000 • 1998 Curriculum Tests for students who entered grade 8 from 2000-01 through 2004-05 • 2004 Curriculum Tests for students who entered grade 8 from 2005-06 and beyond

  6. One Requirement • 1992 & 1998 Curricula • Passing two separate tests is needed to fulfill the computer proficiency requirement for students entering 8th grade during school years 1996-97 to 2004-05 • Multiple-Choice • Performance • 2004 Curriculum • Passing only one test is required

  7. What - Tests
  • 1992 Curriculum Multiple-Choice – Scored Locally
  • 1992 Curriculum Performance – Scored Centrally
  • 1992 Portfolio – Scored Locally
  • 1998 Curriculum Multiple-Choice – Scored Locally
  • 1998 Curriculum Performance – Scored Centrally
  • 1998 Portfolio – Scored Locally
  • 2004 Curriculum Online Combination – Scored Centrally
  • 2004 Curriculum Alternate Assessment Field Test – Scored Centrally
  • 2004 Curriculum Alternate Assessment – Scored Locally

  8. When - Schedule (Now – 2005-06 School Year)
  • 2004 Curriculum Alternate Field Test – ASAP at the beginning of the school year for 9th graders
  • 1992 & 1998 Curriculum Tests – Early October – Mid November
  • 2004 Curriculum Online Primary Window – Mid October – Mid January
  • 2004 Curriculum Alternate – Early November – Mid January
  • 1992 & 1998 Curriculum Tests – Early February – Mid March
  • 2004 Curriculum Standards Analysis – Mid January – Mid February
  • 2004 Curriculum Standards Set by SBE
  • 2004 Curriculum Results Reported Early March
  • 2004 Curriculum Online & Alternate Secondary Window – Late March – Mid June

  9. Grades Affected • Grade 8 • Online Test • Alternate Assessment • Grade 9 • Alternate Assessment Field Test • Grades 9 – 12 (for those not yet meeting the standard) • 1998 Curriculum Tests • 1998 Curriculum Portfolio • Students who entered grade 8 between 1996-97 through 1999-2000 (seeking diploma – not yet meeting the standard) • 1992 & 1998 Curriculum Tests • 1992 & 1998 Curriculum Portfolio

  10. Administration Times*
  • 1992 Curriculum: Multiple-Choice – 105 minutes; Performance – 90 minutes
  • 1998 Curriculum: Multiple-Choice – 110 minutes; Performance – 133 minutes
  • 2004 Curriculum: One Online Test – estimated 120 minutes; One Alternate – time not currently known
  *Note: Administration times do not include distribution of materials; printing and organizing of student printouts [performance]; or packaging, shipping, and other logistical activities.

  11. More About… • 1992 & 1998 Curricula Tests • 2004 Curriculum Online Test • 2004 Curriculum Alternate Test • NCDesk • NCRegistration • 2004 Curriculum Test Development History • Other Items

  12. 1992 & 1998 Test Details • Pencil-Paper-Computer-Printer processes • 2 Separate Tests with different standards to be passed • Multiple-Choice Scored Locally • Performance Scored Centrally • Must be given once per year to students who have not yet met the standard

  13. Old Paper Trail • Printing of test materials • Test booklets • Answer documents • Header sheets and shipping lists • Excess Ordering (~100% overage) • Printing of student work for performance • Handling of materials

  14. Test Files Support • Currently supporting 23 different software packages for the performance test • Distribution of test files to the field • Equity of administration is questionable • The Online Test delivers required files to students as the test is in progress

  15. Scoring Performance • Currently, performance test booklets are hand scored by a scoring contractor at a central location • Time • Fall administration – approximately 2 months • Spring administration – approximately 1 month • Costs • High • Scorers, staffing, space, supplies

  16. Scoring Performance (cont.) • Scores are reliable and valid • Inter-rater reliability is high [93%+] • Reliability monitoring, qualified scorers But… • Potential for human error still exists • Student work does not always provide evidence that the student used correct methods to accomplish the task

  17. 1992 & 1998 Test Details • End of Section • 2004 Curriculum Online Test • 2004 Curriculum Alternate Test • NCDesk • NCRegistration • 2004 Curriculum Test Development History • Other Items

  18. 2004 Online Test Details • Tests the adopted 2004 curriculum • Uses NCDesk as the client interface • Delivered through the Internet from central testing server

  19. Goals of the Online Test • Merge two tests into one… • Reduce the administration, testing, scoring, and logistical time required… • Provide universal delivery to increase equity for a “standardized” test… • Eliminate printing costs and reduce paper waste… • Decrease the handling of so much paper… • Maximize reliability and validity of scores… • Return scores more efficiently and quickly… • Keep pace with the changing face of technology, testing, and scoring…

  20. The Online Test • 72 items total • 4 sections • 18 items per section • Items not delivered randomly • Not divided into strand specific sections • Not divided into performance and multiple-choice sections

  21. The Online Test • Different forms automatically delivered to each student • Performance vs. Multiple-Choice • 50% performance based • 50% multiple-choice based • Embedded field test items for future test development

  22. NCDesk Client • NCDesk Integrated Java Applications Suite • Delivery of multiple-choice items • Delivery of performance items within testing environment • Includes: • editor/word processing application • database application • spreadsheet application • e-mail composer application • window management application

  23. Online Test Environment • Secure • Encrypted • Save function disabled in NCDesk during test • Self-contained • No “surfing” of Internet within NCDesk • No “cut, copy, paste” functionality outside of test • Data Warehousing • Student responses stored on server when moving between sections and questions • Allows for recovery of test and data if workstations crash or other technical problems are encountered
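The data-warehousing idea above — student responses are checkpointed to the central server as the student moves between items, so a crashed workstation can recover — can be sketched as follows. This is a minimal illustration; the class and method names are hypothetical stand-ins, not the actual NCDesk implementation.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: responses are saved server-side as the student
// navigates, allowing recovery if the workstation crashes mid-test.
public class ResponseStore {
    // Stands in for the central testing server's persistent store.
    private final Map<String, String> saved = new HashMap<>();

    // Called whenever the student moves away from an item.
    public void checkpoint(String studentId, int section, int item, String response) {
        saved.put(studentId + ":" + section + ":" + item, response);
    }

    // Called after a crash to recover the last saved answer, if any.
    public String recover(String studentId, int section, int item) {
        return saved.getOrDefault(studentId + ":" + section + ":" + item, "");
    }
}
```

Because every navigation event checkpoints the response, the worst case after a crash is the loss of work on the single item currently open.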

  24. Online Test Screen • Restore, flag control and position text • Item stem • Foils [multiple-choice] or application [performance] • Separate scroll bars for each • Navigation buttons • Pause button

  25. Online Test Navigation • Section number identified on each item page [e.g., Section 1] • Item number within section identified on each item page [e.g., Question 2 of 18]

  26. Online Test Controls • Restore • Clears the item of changes and restores it to its original format • Flag • Identifies the item on the navigation bar at section end with a red question mark, indicating that the student may need to revisit it prior to exiting the section • Student can still exit a section if items are flagged

  27. Online Test Navigation • Students can move forward and backward within a section • Returning to section is not permitted once section is completed • Navigational buttons [PREV (previous), NEXT] • Linear movement backwards or forwards within section

  28. Online Test Functions • End Section • Links to section summary page • Pause • Pauses the test at the immediate location for recovery without exiting the test environment • 15-minute timeout – the test administrator will have to log the student back into the test if it is paused longer than 15 minutes
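The 15-minute pause rule can be sketched as a simple timer check: a pause within the limit resumes directly, while a longer one requires an administrator login. Names here are illustrative, not actual NCDesk code.

```java
// Hypothetical sketch of the pause timeout: sessions paused longer than
// 15 minutes are locked until a test administrator logs the student back in.
public class PauseTimer {
    static final long TIMEOUT_MS = 15 * 60 * 1000; // 15-minute limit

    private long pausedAt = -1;

    public void pause(long nowMs) {
        pausedAt = nowMs;
    }

    // True if the student can resume directly; false means the pause has
    // timed out and an administrator login is required.
    public boolean canResume(long nowMs) {
        return pausedAt >= 0 && (nowMs - pausedAt) <= TIMEOUT_MS;
    }
}
```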

  29. Online Test Navigation • Section Summary • Navigation bar at end of section • Non-linear movement to any item in section • Displays which items have been flagged • Displays which items have been answered • Moving to next section displays warning
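The section summary described above — showing which of a section's items are answered and which are flagged before the student commits to moving on — could be modeled as below. This is a sketch only; the class is a hypothetical stand-in for the real summary page.

```java
// Hypothetical model of the section-summary page: tracks, per item,
// whether an answer was recorded and whether the student flagged it.
public class SectionSummary {
    private final boolean[] answered;
    private final boolean[] flagged;

    public SectionSummary(int items) {
        answered = new boolean[items];
        flagged = new boolean[items];
    }

    public void answer(int item) { answered[item] = true; }

    public void flag(int item) { flagged[item] = true; }

    // Count of items still unanswered; a nonzero value would trigger the
    // warning shown when the student tries to move to the next section.
    public int unanswered() {
        int n = 0;
        for (boolean a : answered) if (!a) n++;
        return n;
    }

    public boolean isFlagged(int item) { return flagged[item]; }
}
```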

  30. Online Test Details End • End Section • 1992 & 1998 Curricula Tests • 2004 Curriculum Alternate Test • NCDesk • NCRegistration • 2004 Curriculum Test Development History • Other Items

  31. Online Test Development History • Feasibility Study/Trial Fall 2003 • Feasibility Study/Trial Fall 2004 • Spring 2005 Field Test

  32. Feasibility Studies • Conducted research into the feasibility of delivering an Internet-based test environment • Fall 2003 • Fall 2004 • Conducted research into performance of local and central technology during delivery • Conducted research into overall performance of the test environment and applications • Conducted research into item performance within the test environment and applications • Received feedback and implemented debugging, redevelopment, or new development

  33. Feasibility Studies • Fall 2003 • Volunteer sites • Adults only • 1,926 starts: 1,351 finishes • 62 LEAs represented, 193 schools • Fall 2004 • At minimum, 10 locally chosen students per school containing eighth-grade students • 5,620 starts: 4,783 finishes

  34. Field Testing • Spring 2005 • Continued research into overall performance of the test environment and applications • Conducted research into item performance • Data used to construct operational form(s) • Received feedback and implemented debugging or redevelopment where needed • Note: changes can be implemented only where they do not affect item performance

  35. Spring 2005 Field Test • Sampled population of schools and students • Window: April 11 – June 15 • 8,510 students chosen for sample • 6,361 starts: 6,198 finishes • Alternate Assessment also field tested • Window: May 9 – June 15 • 2,000 students chosen for sample

  36. Online Test Development History End • End Section • 1992 & 1998 Curricula Tests • 2004 Curriculum Online Test • 2004 Curriculum Alternate Test • NCDesk • NCRegistration • Other Items

  37. Computer Skills Alternate Assessment • Results of feasibility study and Federal mandates required development of an alternate assessment instrument for two distinct populations: • Students with special needs who could not access the online test using available accommodations • Students who could not access the online test as a result of technical/technology limitations [i.e., unable to meet minimum requirements for bandwidth, computer, etc.]

  38. Computer Skills Alternate Assessment • Field tested Spring 2005 • ~2,000 students sampled • 1,338 total documents processed • 1,323 contained multiple-choice data • 765 contained performance data • Different delivery from the online test, but equal rigor of standard [item difficulty level, thinking skills, etc.] • One test consisting of two distinct sections

  39. Computer Skills Alternate Assessment
  • Multiple-Choice Section: 36 traditional items
  • Performance Section: 27 total items
  • 26 performance-based, administrator-rated [yes or no] items
  • 1 administrator-rated [yes or no] item evaluating student proficiency with the computer over the course of time
  • Computer-based, individualized administration
  • Students use supplied files and local applications [e.g., word processing, database] to complete the tasks required by the items
  • Files provided in text format for conversion into local applications [PDFs provided to serve as blueprints]

  40. Computer Skills Alternate Assessment • Item performance, results, and feedback are being analyzed at this time • Highly likely to be field tested again with 9th graders in early Fall 2005 • Test files will be provided for state-supported platforms/packages in future administrations

  41. Computer Skills Alternate Assessment End • End Section • 1992 & 1998 Curricula Tests • 2004 Curriculum Online Test • NCDesk • NCRegistration • 2004 Curriculum Test Development History • Other Items

  42. NCDesk • Center of the Universe
  • Test – access the test at the log-in page [school code, user name, password]
  • Test Simulation – practice activity that simulates the real test environment
  • Verify Connection – runs a test to verify that a secure connection to the test server is established
  • Documentation – links to the website for information, updates, etc.
  • Applications – access to all applications integrated in the test environment, for use and familiarization

  43. Technical
  • NCDesk is a locally installed Java client application
  • Client computers must have the Java runtime installed
  • A quality Internet connection is required for accessing the test environment
  • An Internet connection is not required for NCDesk applications when used for learning and practice
  • NCDesk communicates with a central server for testing [not hosted locally]
  • An auto-update system checks for the current NCDesk version
  • Sufficient RAM recommended
  • A CPU of good clock speed and recent vintage recommended
  • A minimum amount of drive space available
  • Sufficient bandwidth required during testing
  • Best resource for technical recommendations: http://ncdesk.ncsu.edu/ncdesk/technote.asp
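The auto-update check mentioned above could work by comparing the locally installed version string against the version the central server reports, prompting an update when the client is behind. The comparison below is a hedged sketch; the class name and version format are assumptions, not the actual NCDesk mechanism.

```java
// Hypothetical sketch of an auto-update version check: compares dotted
// version strings numerically, segment by segment.
public class VersionCheck {
    // Returns true when the installed version is older than the server's.
    public static boolean updateNeeded(String local, String server) {
        String[] a = local.split("\\.");
        String[] b = server.split("\\.");
        int n = Math.max(a.length, b.length);
        for (int i = 0; i < n; i++) {
            int x = i < a.length ? Integer.parseInt(a[i]) : 0; // missing segments count as 0
            int y = i < b.length ? Integer.parseInt(b[i]) : 0;
            if (x != y) return x < y;
        }
        return false; // versions are identical
    }
}
```

A numeric segment-by-segment comparison avoids the classic pitfall of plain string comparison, where "1.10" would incorrectly sort before "1.9".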

  44. Technical Notes - Proposed Client Computer Requirements Special Note: Client computer systems running the minimum 128 MB of RAM need to reduce the number of background applications running when trying to use NCDesk. Background applications consume memory resources, which can become critically low when other applications are running. These include hidden applications, system inits (Macintosh), and system tray applications (Windows). The following proposed client computer requirements are posted with the assumption that currently active background applications are kept to a minimum.

  45. NCRegistration A test registration system (NCRegistration) is being developed to work in conjunction with any of the online testing programs being developed/offered to public schools in North Carolina. School systems and schools will use the system to indicate students who are eligible for testing within a testing window and schedule test administrations within predefined testing sessions for groups of students.

  46. NCRegistration Functions
  • Administrative – user access rights functions. Users are state-level administrators, regional-level administrators, local district test coordinators, school test coordinators, test administrators, and possibly teachers.
  • Bulk Registration – allows bulk file uploads of student records to register large groups of students for a testing window.
  • Single Registration – allows registration of a single student for a testing window.
  • Test Session Scheduling – indicates the number of students at a school per test administration.
  • Student Information Questions (SIQ) – additional data collection process.
  • Reports – test results.
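The bulk-registration function above could be sketched as parsing a delimited upload of student records and attaching each to a testing window. The field layout (student ID, name, window) and all names here are assumptions for illustration, not the real NCRegistration file specification.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a bulk-registration upload parser: one student
// record per line, in the assumed format "studentId,name,testingWindow".
public class BulkRegistration {
    public static class Registration {
        public final String studentId;
        public final String name;
        public final String window;

        public Registration(String studentId, String name, String window) {
            this.studentId = studentId;
            this.name = name;
            this.window = window;
        }
    }

    // Parses the uploaded file contents, skipping blank lines.
    public static List<Registration> parse(String upload) {
        List<Registration> out = new ArrayList<>();
        for (String line : upload.split("\n")) {
            if (line.trim().isEmpty()) continue;
            String[] f = line.split(",");
            out.add(new Registration(f[0].trim(), f[1].trim(), f[2].trim()));
        }
        return out;
    }
}
```

A real upload would of course need validation and error reporting for malformed rows; the point here is only the shape of a bulk registration step.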

  47. Other Items • Current Activities • Accessibility Issues • Helpdesk • Mobile Labs • Future Plans • Necessities for Success • Web Site Reference • Contact Information

  48. Current Activities • Analysis of field test data • Item performance, results, feedback from field • Development and implementation of scoring parameters for items • Analysis of technical issues arising during field testing • Ongoing debugging and development of technology and test environment • Creation of operational form(s) based on analysis of field test data • Development of new items [item writing] for embedding in the future • Additions and testing of NCDesk & NCRegistration

  49. Accessibility Issues • Definite accessibility issues with online testing! • Standard accommodations are still available • Choice of large or regular font size for NCDesk • Keyboard and mouse actions functional • Currently developing the ability to integrate and support assistive technology [e.g., screen readers such as JAWS] • Exploring multiple options for accessibility [zoom functions, etc.] • Implementation of additional assistive technology is likely to be an extended process

  50. Helpdesk • Activated for feasibility studies/trials, field testing, and will be available for operational administration • Assistance provided prior to, during, and after testing • Addresses NCRegistration, NCDesk, Computer Skills Alternate Assessment, and any other issues involved in delivery and implementation of online test • http://cskills.ncsu.edu/ncdesk/helpdesk.asp • Email cskills@ncsu.edu to request assistance
