
Report of the Marginal Marks Working Group

DRAFT. David Flater, National Institute of Standards and Technology, http://vote.nist.gov



Presentation Transcript


  1. DRAFT: Report of the Marginal Marks Working Group. David Flater, National Institute of Standards and Technology. http://vote.nist.gov

  2. Testing Scanning Accuracy - Jones
The TGDC recognizes that voters' ability to mark a paper ballot as they intend is critical to their success in voting accurately. The TGDC also recognizes that:
- The usability of a paper ballot is determined by the form of the ballot, the instructions, the available marking devices, and the voter's prior expectations from use of other similar paper forms.
- HAVA leaves the determination of what marks constitute a vote, and of how voters are instructed, to the states.
The TGDC has concluded that systems must include documentation of the marks recommended for use with that system, and of how the system will respond to common marking devices and typical marks voters may make. The TGDC requests that NIST investigate the development of a standard benchmark set of ballot markings, representative of the types of marks real voters make on each common type of ballot, which will allow the inclusion of a test requirement to check compliance with the above documentation requirement.

  3. Charter
Investigate the development of a standard reference set of ballot markings representative of the types of marks that voters make on each common type of optical scan / marksense ballot.
Goal: Enable VSTLs and acceptance testers to test and document the responses of scanners.
Non-goal: Define what is a valid vote.

  4. Preliminary work
- Mascher, Mascher & Jones, "Ballot Marks from the Humboldt County 2008 General Election [DRAFT]," May 2010.
- David Flater, informal review of ballots retained from previous human factors work at NIST.
Both data sources had active write-in campaigns (real or simulated).

  5. Coincidental findings
The most frequent anomaly was misuse of write-in lines: voters wrote in a name but did not fill the target; "emphasis write-ins"; "Write In" interpreted as an instruction to the voter; redundancy similar to writing a check.
Corrections that won't work.
Most voters seem to think that the ballot is going to be read by a human: check marks, X's, bleed-through, smudges.

  6. Examples
- Cross-out correction
- Unmarked write-ins
- Miscellaneous checks, X's, lines through targets

  7. Main objective
"Development of Standard Reference Set of Ballot Markings," https://www.fbo.gov/, solicitation number SB1341-10-RQ-0458.
Data collection:
- Hart ballots (square targets) from Clark County, WA
- ES&S ballots (oval targets) from Spokane County, WA
- Sequoia ballots (arrows) from Snohomish County, WA
Analysis and classification of marks. Identification of a test set of marks. Design and validation of standard reference marks.
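The point of a standard reference set is to exercise how scanners respond to marks of varying quality. As a purely hypothetical illustration (no actual vendor algorithm is described in the source; the fill-fraction heuristic, the darkness cutoff, and both thresholds below are invented for this sketch), a test harness might score a scanned target region like this:

```python
# Hypothetical sketch of scoring a ballot target by fill fraction.
# All thresholds are illustrative assumptions, not any real system's values.

def fill_fraction(target_pixels):
    """Fraction of pixels in the target region that are dark.

    target_pixels: iterable of grayscale values, 0 (black) to 255 (white).
    """
    pixels = list(target_pixels)
    dark = sum(1 for p in pixels if p < 128)  # assumed darkness cutoff
    return dark / len(pixels)

def classify(target_pixels, vote_threshold=0.25, margin_threshold=0.05):
    """Label a target 'vote', 'marginal', or 'blank' (assumed cutoffs)."""
    f = fill_fraction(target_pixels)
    if f >= vote_threshold:
        return "vote"
    if f >= margin_threshold:
        return "marginal"  # e.g. a check mark or X that only grazes the oval
    return "blank"

# A fully darkened target vs. a faint stray smudge:
print(classify([0] * 100))             # prints "vote"
print(classify([0] * 3 + [255] * 97))  # 3% dark: prints "blank"
```

A reference mark set would pin down exactly which physical marks fall into the "marginal" band, so that testers can document how each scanner actually classifies them.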
