
North Carolina Online Test of Computer Skills



Presentation Transcript


  1. North Carolina Online Test of Computer Skills UPDATE SESSION PASI Summer 2005 Presenter: Scott Ragsdale

  2. Session Purpose This session will update the audience on recent developments with the North Carolina Online Test of Computer Skills and the North Carolina Computer Skills Alternate Assessment as the State prepares for operational implementation during the 2005-2006 school year.

  3. The Five Ages • The Age of Anxiety • The world is coming to an end! • The Age of Reason • Why is the world coming to an end? • The Age of Knowledge • Facts you should know about the end of the world • The Age of Wisdom • Only the world as we know it is coming to an end • The Age of Enlighten “up”  • Take a breath, smile, and enjoy the fruits of a brave new world

  4. The Age of Anxiety Anxiety is the thin stream of fear trickling through the mind. If encouraged, it cuts a channel into which all other thoughts are drained. Arthur Somers Roche It has been said that our anxiety does not empty tomorrow of its sorrow, but only empties today of its strength. Charles Spurgeon

  5. The End is Near! • 2005-2006 school year! That is this year! • This is not feasible! LEAs and schools have neither the technological capacity nor the technology staff to administer a web-based test! • Students do not know what they are doing! Taking the computer skills test online will only confuse and frustrate them! • Why move to a web-based delivery? Why change what we already know works? Are we intentionally trying to make our lives painful? • What about accessibility for ALL students?

  6. The Age of Reason The past is our definition. We may strive, with good reason, to escape it, or to escape what is bad in it, but we will escape it only by adding something better to it. Wendell Berry Conscience is our magnetic compass; reason our chart. Joseph Cook

  7. Why Change? • Two Separate Tests • Currently two separate tests are required for a student to fulfill the computer proficiency requirement • Student must pass each to demonstrate proficiency • Multiple-Choice • Performance In reality, the two tests are one requirement!

  8. Why Change? • Time • Current administration times* • 1992 curriculum • Multiple Choice – 105 minutes • Performance – 90 minutes • 1998 curriculum • Multiple Choice – 110 minutes • Performance – 133 minutes *Note: Administration times do not include distribution of material, printing and organizing of student printouts [performance], packaging, shipping, and other logistical activities Time is precious! There never seems to be enough!

  9. Why Change? • Test Files • Currently supporting 26+ different software packages for performance test • Distribution of test files to field • Question of equity And more packages keep coming out each year!

  10. Why Change? • Paper • Printing of test materials • Test booklets • Answer documents • Header sheets and shipping lists • Excess Ordering • Printing of student work [performance] • Handling of materials Roughly 6% of last year's deforestation of the Amazon rain forest!

  11. Why Change? • Hand Scoring Currently performance test booklets are hand scored by a scoring contractor at a central location • Time • Fall administration – approximately 2 months • Spring administration – approximately 1 month • Costs • High • Scorers, staffing, spacing, supplies

  12. Why Change? • Hand scoring continued • Reliability • Scores are reliable and valid • Inter-rater reliability is high [93% +] • Reliability monitoring, qualified scorers But… • Potential for human error still exists • Student work does not always provide evidence that student used correct methods to accomplish task Why dig a hole with your hands when you have access to a shovel?

  13. Why Now? • New Curriculum • New curriculum adopted in February 2004 • Implemented starting with 2004-2005 school year • New curriculum = new instrument for measurement • Technology changing at an astonishing rate • Traditional paper tests will be archaic before long • Traditional hand scoring is outdated If you live in the past, the best you can hope for tomorrow is to look back at today!

  14. Why Not? • Merge two tests into one… • Reduce the administration, testing, scoring, and logistical time required… • Provide a universal delivery to increase equity for a “standardized” test… • Eliminate costs of printing paper and reduce paper waste… • Decrease frustration of handling so much paper… • Maximize reliability and validity of scores… • Theoretically return scores more efficiently and in a more timely manner… • Keep pace with the changing face of technology, testing, and scoring…

  15. The Age of Knowledge If facts are the seeds that later produce knowledge and wisdom, then the emotions and the impressions of the senses are the fertile soil in which the seeds must grow. Rachel Carson The first step towards knowledge is to know that we are ignorant. Richard Cecil

  16. The Test • One test • Combines multiple-choice and performance items • 72 items total • 4 sections: 18 items per section • Items are not delivered randomly, but they are also not divided into sections by specific strands/objectives [e.g., a database section] • A database performance item may be followed by a multimedia multiple-choice item, in turn followed by a word processing performance item, etc.

  17. The Test • Measures 2004 curriculum • Approximately 41 objectives tested • Performance items vs. Multiple-Choice items • 50% performance based • 50% multiple-choice based • Embedded field test items for future test development

  18. The Test • Web-based delivery • NCDesk Integrated Java Applications Suite • Delivery of performance items within testing environment • Includes: • text editor/word processing application • database application • spreadsheet application • e-mail composer application – a mimic (simulated) application • window management application
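
The overall shape of the test described on slides 16 through 18 can be pictured with a small data model. The sketch below is illustrative only: the class and field names are hypothetical and are not taken from NCDesk; it simply encodes the published structure (4 sections of 18 items, performance and multiple-choice items interleaved, embedded field-test items).

```java
// Illustrative data model only -- names are hypothetical, not NCDesk's actual code.
import java.util.ArrayList;
import java.util.List;

enum ItemType { MULTIPLE_CHOICE, PERFORMANCE }

class TestItem {
    final String objective;   // 2004 curriculum objective the item measures
    final ItemType type;
    final boolean fieldTest;  // embedded field-test item, used for future development

    TestItem(String objective, ItemType type, boolean fieldTest) {
        this.objective = objective;
        this.type = type;
        this.fieldTest = fieldTest;
    }
}

class TestSection {
    static final int ITEMS_PER_SECTION = 18;
    final List<TestItem> items = new ArrayList<>();
}

class TestForm {
    static final int SECTION_COUNT = 4;   // 4 sections x 18 items = 72 items total
    final List<TestSection> sections = new ArrayList<>();

    /** Sanity check: a section mixes item formats rather than grouping by strand or type. */
    boolean isMixedFormat(TestSection section) {
        boolean hasMultipleChoice = false, hasPerformance = false;
        for (TestItem item : section.items) {
            if (item.type == ItemType.MULTIPLE_CHOICE) hasMultipleChoice = true;
            if (item.type == ItemType.PERFORMANCE) hasPerformance = true;
        }
        return hasMultipleChoice && hasPerformance;
    }
}
```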

  19. The Test Environment • Secure • Encrypted • Save function disabled in Java applications • Self-contained • Browser parameters set • No “surfing” of Internet within environment • No “cut, copy, paste” functionality outside of environment • Data Warehousing • Student responses stored on server when moving between sections and questions • Allows for recovery of test and data if workstations crash or other technical problems are encountered
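
To make the data-warehousing idea concrete, here is a minimal sketch of a client-side checkpoint call that posts the student's current responses to the central server at each navigation event, so the test can be recovered if a workstation crashes. The endpoint URL, parameter names, and payload format are assumptions for illustration; they are not NCDesk's actual protocol.

```java
// Minimal sketch of the "data warehousing" idea on slide 19: push the student's
// current responses to the central test server at each navigation event.
// The URL, field names, and session token are hypothetical.
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class ResponseCheckpoint {

    /** Save the serialized responses for one item on the server; returns true on success. */
    public static boolean checkpoint(String sessionToken, int section, int question,
                                     String serializedResponse) {
        try {
            URL url = new URL("https://testserver.example.org/checkpoint"); // hypothetical endpoint
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

            String body = "token=" + sessionToken
                        + "&section=" + section
                        + "&question=" + question
                        + "&response=" + java.net.URLEncoder.encode(serializedResponse, "UTF-8");

            try (OutputStream out = conn.getOutputStream()) {
                out.write(body.getBytes(StandardCharsets.UTF_8));
            }
            return conn.getResponseCode() == 200;   // server stored the checkpoint
        } catch (Exception e) {
            // A real client would queue a retry; responses must reach the server
            // for the recovery feature described on slide 19 to work.
            return false;
        }
    }
}
```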

  20. The Test Environment • Screen Division • Frame contains functions and identifying information • Two parallel sub-windows • Item stem • Foils [multiple-choice] or application [performance] • Separate scroll bars for each

  21. The Test Environment • Functions • Navigation • Navigation between sections is not permitted • Returning to a section is not permitted once it has been completed • Navigation within sections is permitted • Navigational buttons [PREV (previous), NEXT] • Linear movement backwards or forwards within section • Navigation bar at end of section [section summary] • Non-linear movement to any item in section

  22. The Test Environment • Functions • Restore • Clears the item of changes and restores it to its original format • Flag • Marks the item with a red question mark on the navigation bar at the section end, indicating that the student may need to re-visit it prior to exiting the section • Student can still exit section if items are flagged

  23. The Test Environment • Functions • End Section • Links to section summary page • Pause • Pauses the test at the current location so it can be recovered without exiting the test environment

  24. The Test Environment • Section number identified on each item page [e.g., Section 1] • Item number within section identified on each item page [e.g., Question 2 of 18] • Section summary • Summarizes how many items in the section were answered and/or flagged • Navigation bar also identifies items answered and flagged • Continue button moves to next section • Displays warning message that once the section is exited, return is not permitted • Reconfirms intent to exit section
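
The navigation rules on slides 21 through 24 amount to a small state machine: free linear and non-linear movement inside the current section, flags as reminders only, and a permanent lock on any section the student has exited. The sketch below models those rules; all class and method names are hypothetical and are not drawn from the NCDesk code base.

```java
// Sketch of the navigation rules on slides 21-24; names are illustrative only.
import java.util.HashSet;
import java.util.Set;

public class SectionNavigator {
    public static final int ITEMS_PER_SECTION = 18;

    private int currentSection = 1;            // sections 1..4
    private int currentQuestion = 1;           // questions 1..18 within the section
    private final Set<Integer> flagged = new HashSet<>();            // flagged question numbers
    private final Set<Integer> completedSections = new HashSet<>();  // sections already exited

    public void next()     { if (currentQuestion < ITEMS_PER_SECTION) currentQuestion++; }
    public void previous() { if (currentQuestion > 1) currentQuestion--; }

    /** Non-linear jump from the section-summary navigation bar. */
    public void jumpTo(int question) {
        if (question >= 1 && question <= ITEMS_PER_SECTION) currentQuestion = question;
    }

    /** Flag marks the item with a red question mark on the summary bar. */
    public void toggleFlag() {
        if (!flagged.remove(currentQuestion)) flagged.add(currentQuestion);
    }

    /** Exiting is allowed even with flags, but requires confirmation, and the
        section can never be re-entered afterwards. */
    public boolean exitSection(boolean studentConfirmed) {
        if (!studentConfirmed) return false;
        completedSections.add(currentSection);
        if (currentSection < 4) {
            currentSection++;
            currentQuestion = 1;
            flagged.clear();
        }
        return true;
    }

    public boolean canEnter(int section) { return !completedSections.contains(section); }
}
```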

  25. NCDesk • Center of the Universe • Test • Access the test at the log-in page [School code, User name, Password] • Test Simulation • Practice activity to simulate the real test environment • Verify Connection • Runs a check to verify whether a secure connection to the test server is established • Documentation • Links to website for information, updates, etc. • Applications • Access to all applications integrated in the test environment for use and familiarization
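
A "Verify Connection" style check can be as simple as opening an HTTPS connection to the test server and confirming that a response comes back before testing starts. The following sketch shows one way to do that; the server address and timeout values are placeholders, not the actual NCDesk configuration.

```java
// Rough sketch of a connection-verification check; URL and timeouts are hypothetical.
import java.net.HttpURLConnection;
import java.net.URL;

public class VerifyConnection {
    public static boolean secureConnectionAvailable() {
        try {
            URL url = new URL("https://testserver.example.org/ping"); // hypothetical address
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setConnectTimeout(10_000);   // 10-second connect timeout
            conn.setReadTimeout(10_000);
            conn.setRequestMethod("HEAD");
            int status = conn.getResponseCode();
            return status >= 200 && status < 400;   // server reachable over TLS
        } catch (Exception e) {
            return false;   // no secure connection -- do not start the test
        }
    }

    public static void main(String[] args) {
        System.out.println(secureConnectionAvailable()
                ? "Secure connection to test server established."
                : "Could not reach test server; contact your test coordinator.");
    }
}
```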

  26. NCRegistration • Administrative - User access rights functions. Users are state level administrators, regional level administrators, local district test coordinators, school test coordinators, test administrators, and possibly teachers. • Bulk Registration - Function to allow bulk file uploads of student records to register large groups of students to a testing window. • Single Registration - Function to allow registration of a single student to a testing window. • Test Session Scheduling - Indicating numbers of students at a school per test administration. • Student Information Questions (SIQ) - Additional data collection process. • Reports
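
For a sense of what bulk registration involves, the sketch below reads an upload file of student records and collects one record per row. The comma-separated column layout (student ID, name, school code, test window) is assumed for illustration and is not the actual NCRegistration file specification.

```java
// Sketch of bulk registration: parse an upload file of student records.
// The column layout is an assumption, not the real NCRegistration format.
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.List;

public class BulkRegistration {

    static class StudentRecord {
        String studentId, name, schoolCode, testWindow;
    }

    public static List<StudentRecord> readUpload(String path) throws Exception {
        List<StudentRecord> records = new ArrayList<>();
        try (BufferedReader in = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = in.readLine()) != null) {
                if (line.trim().isEmpty()) continue;   // skip blank lines
                String[] fields = line.split(",");
                if (fields.length < 4) continue;       // skip malformed rows
                StudentRecord r = new StudentRecord();
                r.studentId  = fields[0].trim();
                r.name       = fields[1].trim();
                r.schoolCode = fields[2].trim();
                r.testWindow = fields[3].trim();
                records.add(r);
            }
        }
        return records;
    }
}
```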

  27. Technical • NCDesk is a locally installed client Java application • Client computers must have the Java runtime installed • Quality Internet connection required for accessing the test environment • Internet connection not required for NCDesk applications when used for learning and practice • NCDesk communicates with a central server for testing [not hosted locally] • Auto-update system checks for the current NCDesk version • Sufficient RAM recommended • CPU of good clock speed and recent vintage recommended • Minimum amount of drive space available • Sufficient amount of bandwidth required during testing • Best resource for technical recommendations • http://ncdesk.ncsu.edu/ncdesk/technote.asp

  28. Technical Notes - Proposed Client Computer Requirements Special Note: Client computer systems running the minimum 128 MB of RAM need to reduce the number of background applications running when trying to use NCDesk. Background applications consume memory resources, which can become critically low when other applications are running. These types of applications include hidden applications, system inits (Macintosh), and system tray applications (Windows). The following proposed client computer requirements are posted with the assumption that currently active background applications are at a minimum.
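
A local "preflight" check in the spirit of slides 27 and 28 might report the installed Java runtime version and the memory available to the JVM, warning when memory is near the proposed 128 MB floor. The sketch below does that; note that Runtime.maxMemory() reports the JVM heap limit, which is only a rough proxy for total system RAM, and the threshold is illustrative, not authoritative (the technical-notes page above remains the definitive resource).

```java
// Sketch of a local preflight check; the 128 MB threshold is illustrative only.
public class ClientPreflight {
    private static final long MIN_RAM_BYTES = 128L * 1024 * 1024;   // proposed minimum

    public static void main(String[] args) {
        String javaVersion = System.getProperty("java.version");
        long maxHeap = Runtime.getRuntime().maxMemory();   // heap ceiling for this JVM

        System.out.println("Java runtime version : " + javaVersion);
        System.out.println("Maximum JVM heap     : " + (maxHeap / (1024 * 1024)) + " MB");

        if (maxHeap < MIN_RAM_BYTES) {
            System.out.println("Warning: memory is near the minimum. Close background");
            System.out.println("applications (system tray items, hidden apps) before testing.");
        } else {
            System.out.println("Memory check passed.");
        }
    }
}
```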

  29. Stages of Development • Feasibility Studies/Trials • Conduct research into feasibility of delivering web-based test environment • Conduct research into performance of local and central technology during delivery • Conduct research into overall performance of test environment and applications • Conduct research into item performance within test environment and applications • Receive feedback and implement debugging, redevelopment, or new development

  30. Stages of Development • Field Testing • Conduct further research into overall performance of test environment and applications • Conduct research into item performance • Use data and items to construct operational form(s) • Receive feedback and implement debugging or redevelopment where needed. Note: changes can be implemented only where they do not affect item performance

  31. Stages of Development • Operational • Form(s) built using viable items from field testing • Based on item performance, feedback, analysis, IRT (Item Response Theory), psychometric review, etc. • Implementation “locks” forms and items for future • Standards/Proficiency indicators established • Delivery as instrument for determining proficiency

  32. The Past • Fall 2003 • Feasibility Study/Trial • Volunteer sites • Adults only • 1,926 starts: 1,351 finishes • 62 LEAs represented, 193 schools • Ongoing debugging and development of technology, test environment, and items • Fall 2004 • Feasibility Study/Trial • At minimum, 10 locally chosen students per school containing eighth-grade students • 5,620 starts: 4,783 finishes • Ongoing development and debugging of technology, test environment, and items

  33. The Past • Spring 2005 • Field Test • Sampled population of schools and students • Window: April 11 – June 15 • 8,510 students chosen for sample • 6,361 starts: 6,198 finishes • Alternate Assessment also field tested • Window: May 9 – June 15 • 2,000 students chosen for sample • Ongoing debugging and development of technology, test environment, and items

  34. The Present • Analysis of field test data • Item performance, results, feedback from field • Development and implementation of scoring parameters for items • Analysis of technical issues arising during field testing • Ongoing debugging and development of technology and test environment • Creation of operational form(s) based on analysis of field test data • Development of new items [item writing] for embedding in the future

  35. The Future • Operational implementation starting with the 2005-2006 school year • Students entering eighth grade • Testing Window • Daily Administration Blocks • Standard setting • Determining proficiency • Ongoing development of items [item writing] and technology • Ongoing evaluation and monitoring of technical and infrastructure issues at both the central and local levels

  36. Computer Skills Alternate Assessment • Why? • Results of feasibility study and Federal mandates required development of an alternate assessment instrument for two distinct populations: • Students with special needs who could not access the online test using available accommodations • Students who could not access the online test as a result of technical/technology limitations [i.e., unable to meet minimum requirements for bandwidth, memory, etc.]

  37. Computer Skills Alternate Assessment • Field tested Spring 2005 • Different delivery from online test, but equal rigor of standard [item difficulty level, thinking skills, etc.] • One test consisting of two distinct sections

  38. Computer Skills Alternate Assessment • Multiple-Choice Section • 36 items • Traditional • Performance Section • 27 total items • 26 performance-based, administrator-rated [yes or no] items • 1 administrator-rated [yes or no] item evaluating student proficiency with a computer over the course of time • Computer-based • Individualized administration • Use supplied files and local applications [e.g., word processing, database] to complete tasks required by items • Files provided in text format for conversion into local applications [PDFs provided to serve as blueprints]

  39. Computer Skills Alternate Assessment • Item performance, results, and feedback being analyzed at this time • Possibility [probability] of being field tested again in Fall 2005 • Some issues with local administrators and technology staff having to convert text files into local applications • Probability that necessary test files will be provided for State supported platforms/packages in future administrations

  40. Accessibility Issues • Definite accessibility issues with online testing! • Standard accommodations are still available • Choice of large or regular font size for NCDesk • Keyboard and mouse actions functional • Currently developing the ability to integrate and support assistive technology [i.e., screen readers] • Exploring multiple options for accessibility [zoom functions, etc.] • Implementation of additional assistive technology likely an extended process

  41. Support • Helpdesk • Activated for feasibility studies/trials, field testing, and will be available for operational administration • Assistance provided prior to, during, and after testing • Addresses NCRegistration, NCDesk, Computer Skills Alternate Assessment, and any other issues involved in delivery and implementation of online test • http://cskills.ncsu.edu/ncdesk/helpdesk.asp • Mobile Labs • Available for schools/systems unable to test because of technical limitations • By request only [actual process for requests still in development] • Availability issues and division of time

  42. The Age of Wisdom It is no longer enough to be smart — all the technological tools in the world add meaning and value only if they enhance our core values, the deepest part of our heart. Acquiring knowledge is no guarantee of practical, useful application. Wisdom implies a mature integration of appropriate knowledge, a seasoned ability to filter the inessential from the essential. Doc Childre and Deborah Rozman Learn wisdom from the ways of a seedling. A seedling which is never hardened off through stressful situations will never become a strong productive plant. Stephen Sigmund

  43. The Meaning? • Online testing is the future • Most states are in the process of either implementing or maintaining an online testing program • North Carolina is moving forward with online testing…this is only the beginning • Students are far more positive about online testing than administrators/teachers/staff • Trends suggest students are more comfortable and engaged with online testing…support from them has been overwhelming • Technology concerns are warranted, but… • Implementation of technology will become seamless over time as traditional options for testing are exhausted • Systems/schools have been successful in implementing this test

  44. Basic Necessities for Success • Dissemination and sharing of information • Local, State, National, International • Use resources available and act as a resource • Communication • Question • Online testing is a new world, so do not be afraid to question things or offer your opinion • Support • There will be some growing pains, but never waver in your support • Support at all levels, between all divisions and people, is absolutely required • Learn • Do not be complacent in your knowledge; always seek more • Familiarity = Understanding = Less frustration, stress, and anxiety for all

  45. The Age of Enlighten “up” This I conceive to be the chemical function of humor: to change the character of our thought. Lin Yutang Humor is perhaps a sense of intellectual perspective: an awareness that some things are really important, others not; and that the two kinds are most oddly jumbled in everyday affairs. Christopher Morley

  46. A Brighter Future • Yes, anxiety is expected, normal, and okay • Yes, change can be painful • Yes, this is a serious matter not to be taken lightly But… • Never lose perspective on what is important • Think about why we are really doing this and who it ultimately benefits • Understand that we are all in this together, for better or worse, so let us all be friends, not enemies And lastly… • Smile and be confident in knowing that, though pioneers, we are moving in the right direction

  47. Websites • http://cskills.ncsu.edu/nccs • Link to home page of the North Carolina Online Test of Computer Skills • http://ncdesk.ncsu.edu/ncdesk • Direct link to the home page for the NCDesk application suite • http://www.ncpublicschools.org/curriculum/computerskills • Link to Computer/Technology Skills Standard Course of Study on the North Carolina Department of Public Instruction website • http://community.learnnc.org/dpi/tech • Link to Computer/Technology Skills page for Curriculum and School Reform on the North Carolina Department of Public Instruction website • http://www.ncpublicschools.org/accountability/testing/computerskills • Link to computer skills testing information on the North Carolina Department of Public Instruction website • http://tps.dpi.state.nc.us/ • Link to Technology Implementation & Planning Services page on the North Carolina Department of Public Instruction website • http://www.ncpublicschools.org/techservices • Link to Technology Services page on the North Carolina Department of Public Instruction website

  48. Contact Information Scott Ragsdale Project Manager, North Carolina Computer Skills Assessments scott_ragsdale@ncsu.edu Randy Craven Technical Manager randy_craven@ncsu.edu Jim Kroening Program Manager, Performance Assessments jkroening@dpi.state.nc.us

  49. Final Thought The important thing in life is not where we are, but in which direction we are moving. Oliver Wendell Holmes
