Update on the UOCAVA Working Group

Presentation Transcript


  1. Update on the UOCAVA Working Group Andrew Regenscheid Mathematician, Computer Security Division, ITL http://vote.nist.gov

  2. Overview
  The TGDC UOCAVA working group has three outstanding task items:
  • High-level guidelines for UOCAVA voting systems
  • A narrative risk analysis of the current UOCAVA voting process and the demonstration project system
  • Low-level guidelines for the demonstration project system

  3. Meeting Objectives
  This session of today’s meeting has three objectives:
  • Decide how to proceed on the high-level guidelines, including decisions on:
    • Intended scope and purpose
    • Auditability/verifiability guidelines
    • Usability/accessibility guidelines
    • Resolution of FVAP’s comments
  • Decide on a course of action for conducting a risk analysis on the current UOCAVA voting process
  • Discuss the process/timeline for approaching the demonstration project guidelines

  4. High-Level Guidelines
  EAC/NIST/FVAP UOCAVA Roadmap:
  “EAC and the TGDC, with technical support from NIST, and input from FVAP, will identify high-level, non-testable guidelines for remote electronic absentee voting systems. This effort will focus on the desirable characteristics of such systems and serve as a needs analysis for future pilots and research; and for the purposes of driving industry to implement solutions.”

  5. High-Level Guidelines
  • Purpose
    • Fulfill the charge from the UOCAVA Roadmap
    • Interpreted the UOCAVA Roadmap language as asking for aspirational, high-level guidelines intended to identify goals for future UOCAVA voting systems
    • Intent is that these high-level guidelines would form the basis for the development of low-level guidelines for the demonstration project and future UOCAVA voting systems
  • Scope
    • Includes both demonstration project systems and future systems
    • Guidelines are intended to be all-encompassing, covering roughly the same scope as future low-level guidelines

  6. High-Level Guidelines
  • Goal was to identify a small number (~25) of high-level guidelines that cover all important topics
  • Build consensus around high-level concepts, and flesh out details in low-level guidelines in the future
  • Emphasis on aspirational goals; we recognized some may not be achievable today

  7. High-Level Guidelines: Topics
  The current high-level guidelines draft includes:
  • Voting functions
  • Auditability
  • Quality assurance and configuration management
  • Reliability and availability
  • Usability and accessibility
  • Security
  • Interoperability

  8. High-Level Guidelines: Process
  • NIST staff initially drafted high-level guidelines in sections using:
    • Earlier drafts of high-level guidelines
    • Council of Europe’s Legal, Operational and Technical Standards for E-Voting
    • Research done to support VVSG development
    • Existing relevant standards
  • UOCAVA and U&A working group members reviewed and edited the guidelines
  • Properties of the current UOCAVA voting system were taken into consideration, but did not limit the guidelines

  9. Voting Functions
  • Primary, basic guidelines expected from any voting system, e.g.:
    • One cast ballot counted per voter (hlg-2, 3)
    • Accurate and reproducible vote counts (hlg-4)
    • Supply voters with the correct ballot style (hlg-5)
  • Some were derived from the CoE e-voting standard

  10. Auditability
  • Primary guideline: “The UOCAVA voting system shall create and preserve evidence to enable auditors to verify that it has operated correctly in an election, and to identify the cause if it has not.”
  • Two controversial proposed guidelines:
    • “The audit system shall provide the ability to compare records and verify the correct operation of the UOCAVA voting system and the accuracy of the result, in an effort to detect fraud, to prove that all counted votes are authentic and that all authentic votes have been counted as cast.”
    • “The UOCAVA voting system shall make it possible for voters to check whether their vote was cast and recorded as they intended, and shall make it possible for observers to check whether all cast votes have been counted and tallied correctly.”

  11. Quality Assurance and Configuration Management
  • System must be “fit for use”
  • System must be developed, monitored, and maintained in accordance with applicable best practices for quality assurance
  • Documented, tested, and stable configuration
  • Guidelines based on research done to support the VVSG 2.0 draft

  12. Reliability and Availability
  • Definition of critical failure: any functional failure whose occurrence jeopardizes the validity of the election or casts doubt on the credibility of the election result
  • Probability of critical failures and overall system availability must be fit for intended use (hlg-1, 3)
  • Assure reliability of the system through application of best reliability engineering practices and standard reliability analysis procedures (sketched below)
  • Based on CoE guidelines and supporting VVSG 2.0 research
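
  As an illustration of the kind of standard reliability arithmetic those practices imply, the sketch below computes steady-state availability from MTBF/MTTR and the chance of at least one critical failure over a voting period. Every figure in it is invented for the example; none comes from the guidelines themselves.

```python
import math

# Illustrative reliability arithmetic only; the MTBF/MTTR figures below are
# invented for the example, not drawn from any UOCAVA requirement.
mtbf_hours = 2000.0   # assumed mean time between critical failures
mttr_hours = 4.0      # assumed mean time to restore service

# Steady-state availability under the standard MTBF/MTTR model.
availability = mtbf_hours / (mtbf_hours + mttr_hours)

# Probability of at least one critical failure during a 45-day (1080-hour)
# voting period, assuming failures arrive as a Poisson process.
election_hours = 45 * 24
p_critical_failure = 1.0 - math.exp(-election_hours / mtbf_hours)

print(f"availability: {availability:.4%}")
print(f"P(>=1 critical failure during election): {p_critical_failure:.2%}")
```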

  13. Security
  • Security guidelines were developed accepting the risks of the current mail-based system, e.g.:
    • Low-level compromises of ballot secrecy are accepted (hlg-2)
    • Some low-level fraud is accepted; the goal is to prevent an undetectable change in the outcome of the election (hlg-3)
  • Some new issues unique to electronic systems:
    • Strong user authentication for voters, administrators, and officials (hlg-1)
    • Systems must be free of vulnerabilities that allow remote attacks (hlg-4)
    • Prevent malicious software on terminals from impacting election integrity (hlg-5)
    • Recommended use of penetration testing (hlg-6)

  14. User-Centered Development (hlg-1)
  • Develop with best practices in user-centered design and user testing
  • Incorporate these principles throughout the system development cycle and as part of certification
  • Evaluate system usability and accessibility via user testing with representative test participants
  • Include usability evaluation of procedures and documentation for system administration

  15. Accessibility (hlg-2, 5, 7)
  • Make the system accessible to voters with disabilities
    • Built-in access features
    • Interoperability with personal assistive technology (PAT)
    • PAT should be supplemental, not necessary, to ensure system accessibility
  • Maintain privacy and independence throughout all phases of the voting process
    • Ballot marking, verification, and casting
    • Voter has the same accessibility accommodations throughout
  • Comply with legal mandates

  16. Best Design Practices (hlg-3, 4, 6)
  • Follow human factors design best practices, for both system and ballot design, where possible
    • EAC’s report “Effective Designs for the Administration of Federal Elections”
    • American Institute of Graphic Arts (AIGA) report “Top 10 Election Design Guidelines”
  • Adhere to current standards and guidelines
    • VVSG
    • World Wide Web Consortium (W3C) Web Accessibility Initiative (WAI), specifically the Web Content Accessibility Guidelines (WCAG 2.0) and WAI for Accessible Rich Internet Applications (WAI-ARIA)

  17. More on Ballot Design
  • FVAP expressed some concern over including ballot design in the high-level guidelines
  • To clarify:
    • High-level guidelines are not intended to supersede State laws
    • Election officials control the formatting of ballot content
    • High-level guidelines are intended to address only those ballot design features controlled by the UOCAVA system, for example, navigation and user interface controls
    • The UOCAVA system should support implementation of good ballot design

  18. More on Accessibility
  • FVAP requested that the high-level guidelines focus on the demonstration project, which would limit the scope of accessibility
    • Suggested that only Section 508 be referenced
  • Implications of this are unclear:
    • Section 508 does require accessible design and some PAT interoperability
    • A Section 508 “Refresh” is on the horizon
    • How much of W3C’s WAI guidelines should be implemented in the demonstration project?
    • Will we learn enough about accessibility from the demonstration project to inform future work?

  19. Discussion/Questions
  Open issues:
  • Intended scope and purpose
  • Auditability/verifiability guidelines
  • Usability/accessibility guidelines
  • Resolution of FVAP’s comments
  Next topic: Risk Analysis

  20. Risk Analysis
  • TGDC Resolution #02-11 directs the UOCAVA Working Group to “prepare a narrative risk assessment comparing the current UOCAVA voting process to electronic absentee voting systems used in a demonstration project with military voters.”
  • Currently, the demonstration project system is not defined
  • First step: analyzing risks in the current UOCAVA voting process

  21. Risk Analysis: Transactional Failures
  • The current UOCAVA process has a number of transactional failure points between voter registration and ballot canvassing:
    • Voter registration failures
    • Ballot delivery failures
    • Ballot marking errors
    • Ballot return failures
  • These failures are observable and measurable
  • An analysis of these failures can lead us to an overall failure rate of the current process (see the sketch below)
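
  A minimal sketch of how measured per-stage failure rates could roll up into an overall failure rate for the process: a transaction succeeds only if every stage succeeds. The stage names follow the bullets above; the rates themselves are placeholders, not working group measurements.

```python
# Hypothetical per-stage failure rates for the current UOCAVA process; the
# numbers are placeholders, not measured values from the working group.
stage_failure_rates = {
    "voter_registration": 0.05,
    "ballot_delivery": 0.10,
    "ballot_marking": 0.02,
    "ballot_return": 0.08,
}

# A ballot is counted only if every stage succeeds, so the overall failure
# rate is one minus the product of the per-stage success rates.
overall_success = 1.0
for rate in stage_failure_rates.values():
    overall_success *= (1.0 - rate)

print(f"overall failure rate: {1.0 - overall_success:.2%}")
```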

  22. Risk Analysis: Identifying Risks
  • Transactional failures are only one type of risk
  • The UOCAVA working group can analyze one or more representative current UOCAVA voting processes to identify other potential risks:
    • What is the potential vulnerability?
    • Who is in a position to exploit it?
    • What is the impact of a successful exploit?
    • What is the probability of a successful exploit?
  • Challenge #1: Impacts are not always easily quantifiable in comparable units. What is the value of a vote?
  • Challenge #2: Probabilities for malicious attacks are notoriously difficult to estimate (one coarse workaround is sketched below)
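
  One common way to work around both challenges is to rank risks on coarse ordinal scales instead of precise numbers. The sketch below takes that approach; every vulnerability, actor, and high/medium/low rating in it is an invented example, not a working group finding.

```python
# Toy qualitative risk register; the vulnerabilities, actors, and
# high/medium/low ratings are illustrative assumptions only.
SCALE = {"low": 1, "medium": 2, "high": 3}

risks = [
    # (vulnerability, who can exploit it, impact, probability)
    ("ballot lost in transit", "postal system failure", "medium", "high"),
    ("mail ballot tampering", "mail handler", "high", "low"),
    ("coerced marking", "person present with voter", "high", "medium"),
]

# Rank risks by a simple impact x probability score. Coarse ordinal ratings
# sidestep the two challenges above (incommensurable impacts, unknowable
# attack probabilities) at the cost of precision.
for vuln, actor, impact, prob in sorted(
        risks, key=lambda r: SCALE[r[2]] * SCALE[r[3]], reverse=True):
    score = SCALE[impact] * SCALE[prob]
    print(f"{score}: {vuln} (actor: {actor})")
```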

  23. Risk Analysis: Comparing Risks
  • It will be important to compare and balance risks between different types of systems, as well as different types of risks within a given system
  • We can create quantifiable comparisons of impact
    • Example: comparing the impact of lost ballots and tampered ballots on the outcome of the election (see the sketch below)
    • Collaboration with the NIST Statistical Engineering Division
    • Explore use of the EAC Election Operations Assessment Tool
  • Qualitative comparisons will be done in other areas, such as the risks of malicious attacks
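
  The example bullet above reduces to simple arithmetic: a lost ballot can shrink the winner's margin by at most one vote, while a ballot flipped by tampering shifts it by two. A sketch with made-up counts:

```python
# Illustrative comparison of lost vs. tampered ballots against an election
# margin; all counts are invented for the example.
margin = 400            # winner's lead in votes
lost_ballots = 300      # each lost ballot can shrink the margin by at most 1
tampered_ballots = 250  # each flipped ballot shifts the margin by 2

# Worst case: every lost ballot was a vote for the leader, and every
# tampered ballot was flipped from the leader to the runner-up.
worst_case_margin = margin - lost_ballots * 1 - tampered_ballots * 2

print(f"worst-case residual margin: {worst_case_margin}")
print("outcome in doubt" if worst_case_margin <= 0 else "outcome stands")
```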

  24. Discussion/Questions
  Feedback on the risk analysis path forward
  Next topic: Demonstration Project Guidelines

  25. UOCAVA Demonstration Project
  • Work is building toward the implementation of a remote voting demonstration project for military voters
  • EAC has tasked the TGDC with developing guidelines for the demonstration project system
  • TGDC Resolution #02-11 stated the TGDC’s acceptance of this task, and directed the TGDC to develop guidelines for a demonstration project with simplifying assumptions:
    • Military voters only
    • Use of the Common Access Card (CAC) for authentication
    • Use of professionally administered machines

  26. Mitigated Risks
  The simplifying assumptions mitigate some risks identified in NISTIR 7551, A Threat Analysis on UOCAVA Voting Systems:
  • Use of the CAC mitigates authentication-related risks, including voter impersonation and phishing attacks
  • Digitally signed ballots using the CAC could mitigate some malicious attacks on servers (see the sketch below)
  • Use of professionally administered machines mitigates the risk of malicious software on voting terminals impacting ballot secrecy or integrity
  • Use of a military network could help to mitigate some remote attacks on servers
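
  The signed-ballot idea can be illustrated with a short sketch. A real CAC keeps its private key on the card and signs through DoD middleware (e.g., a PKCS#11 interface), so the software RSA key below is purely a stand-in to keep the example self-contained; the server-side verification step is the point.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Stand-in for the voter's CAC key pair; a real deployment would never hold
# the private key in software.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

ballot = b"contest-1:candidate-A;contest-2:candidate-C"

# Voter side: sign the encoded ballot with the (stand-in) CAC key.
signature = private_key.sign(ballot, padding.PKCS1v15(), hashes.SHA256())

# Server side: verify the signature before accepting the ballot, so a
# compromised server cannot silently substitute ballot contents.
try:
    public_key.verify(signature, ballot, padding.PKCS1v15(), hashes.SHA256())
    print("signature valid: ballot accepted")
except InvalidSignature:
    print("signature invalid: ballot rejected")
```

  A real design would also have to reconcile signatures with ballot secrecy, for instance by signing an encrypted ballot; this sketch ignores that detail.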

  27. Other Risks
  Other risks may need to be mitigated by other means, pending the results of the risk analysis:
  • Network-based attacks may not be mitigated by the architecture
  • Internet voting systems inherit many of the same potential risks as electronic polling place systems

  28. Demonstration Project Prerequisites
  • Several items need to be completed prior to development of the demonstration project guidelines
    • TGDC is tasked with the high-level guidelines and risk analysis
  • TGDC/NIST also need:
    • A concept of operations for the demonstration system
    • The expected high-level system architecture
    • A clearly defined scope for the demonstration project system
      • How extensive will this project be? One-time only?
      • What functions must be provided?
      • Who decides appropriate tradeoffs and accepts risks?

  29. Demonstration Project: Timeline
  • Current work: complete near-term deliverables (i.e., the high-level guidelines and risk analysis) intended to inform low-level guidelines development
  • Demonstration project guidelines are expected to take 24 months to develop, vet through a public comment period, and approve in the TGDC and EAC:
    • 12-month development process
    • 6-month vetting process
    • 6-month revision process
  • For a 2016 demonstration project, guidelines would be needed by mid-2014

  30. Discussion/Questions
