
Unified Accessibility Evaluation Methodology Jonathan Avila

Presentation Transcript


  1. Unified Accessibility Evaluation Methodology Jonathan Avila

  2. About SSB BART Group • Unmatched Experience • Accessibility Focus • Implementation-Oriented Solutions • Solutions That Reduce Legal Risk • Organizational Stability and Continuity • Knowledge That Is Up-to-Date, All the Time • Published and Peer-Reviewed Auditing Methodology • Fourteen hundred organizations (1,445) • Fifteen hundred individual accessibility best practices (1,595) • Twenty-two core technology platforms (22) • Fifty-five thousand audits (55,930) • One hundred fifty million accessibility violations (152,351,725) • Three hundred sixty-six thousand human-validated accessibility violations (366,096)

  3. Introduction Agenda • Introduction • Scope • Test Set Capture • Explore and Sample Modules/Use Cases • Audit • Report • Analysis • Metrics/Scoring • Resources

  4. Introduction Goals • Purpose: Inform stakeholders on a site's level of conformance to an accessibility standard(s) • Reasons: Document level of conformance (business drivers); third-party evaluation; remediate issues (before or during development); benchmark or measure progress • Reliable Methodology: Performed by experienced accessibility consultants; documented, repeatable, scalable; covers functional and technical requirements

  5. Introduction Phases • Scope: Determine evaluation methods, platforms, technology, standards, and site or app • Test Set Capture: Explore and sample pages, screens (modules), and use cases • Audit: Technical and functional testing • Report: Analysis, documentation, metrics/scoring

  6. Scope

  7. Scope Site and Accessibility Standards Define the Site to be Tested • Desktop site • Mobile site • Public-facing portion of site • e-Commerce Site • Complete group of pages that make up a site Define Accessibility Standards • WCAG 2 Level A and AA • Section 508 • CVAA • EN 301-549 • Agency or organization specific standards

  8. Scope Assistive Technology, Browsers, and Evaluation Methods User Agent and Assistive Technology (AT) Versions • Browser/user agent • AT types and versions Internal/External Site – dictates baseline differences • Limit baseline for internal use – controlled environment • Expanded baseline for public site Choose Evaluation Methods/Test Process • Generally includes test steps and failures • Internal checklists, best practices, or testing procedures • WCAG sufficient techniques and failures • US Federal baseline test process for Section 508 compliance
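
The scope and baseline choices above lend themselves to a small machine-readable record. The sketch below is purely illustrative: the interface and field names are hypothetical, not part of AMP or any standard named here, and the example values only show the kind of data a Scope phase might capture.

```typescript
// Hypothetical scope/baseline record for an evaluation (illustrative only).
interface AssistiveTechPairing {
  browser: string;          // user agent, e.g. "Firefox"
  assistiveTech: string;    // AT name, e.g. "JAWS"
  platform: "desktop" | "mobile";
}

interface EvaluationScope {
  site: string;                         // site or app under test
  audience: "internal" | "public";      // drives how wide the baseline is
  standards: string[];                  // e.g. WCAG 2 Level A/AA, Section 508
  baseline: AssistiveTechPairing[];     // browser + AT combinations to test with
  evaluationMethod: string;             // test process with steps and failure conditions
}

// Example: a public-facing site gets an expanded baseline.
const scope: EvaluationScope = {
  site: "https://www.example.com",
  audience: "public",
  standards: ["WCAG 2.0 Level A", "WCAG 2.0 Level AA"],
  baseline: [
    { browser: "Firefox", assistiveTech: "NVDA", platform: "desktop" },
    { browser: "Internet Explorer 11", assistiveTech: "JAWS", platform: "desktop" },
    { browser: "Safari (iOS)", assistiveTech: "VoiceOver", platform: "mobile" },
  ],
  evaluationMethod: "WCAG sufficient techniques and failures",
};

console.log(`Testing ${scope.site} against ${scope.standards.join(", ")}`);
```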

  9. Test Set Capture

  10. Test Set Capture Streamline the Process Goal Reduce the amount of duplicate content to test • Identify a representative sample • Identify repeated portions of pages Definitions Page – a complete web page, document page, or software screen Module – a page or a piece of repeated page content that can be tested once across a page or pages • E.g. a dialog box, a navigation bar, a menu structure Note: For conformance to WCAG and other standards you must ensure that full pages are covered

  11. Test Set Capture Choose Sample Size Choose representative sample of pages/modules Sample size dictated by: • Size of website • Portal vs. micro-site • Complexity of site • Web Application • Dynamic site/Static site • Interactivity • Consistency of site (disparate authors, technologies, etc.)

  12. Test Set Capture What Should the Sample Include? Common pages such as those linked from headers/footers Contact, support, or accessibility policy pages Pages with functionality essential to the site (core tasks) Samples of each type of page and of different technologies (Flash, Silverlight, PDF, SVG, etc.) Pages in a complete process (e.g. steps to make a purchase) High traffic/touch point/risk pages For hardware (e.g. a scanner) – surfaces, displays, operable controls Note: May need assistance from the development team/other stakeholders

  13. Test Set Capture Example Types of Pages/Modules Forms Tables Menus, fly-outs, carousels Multimedia Interactions Dialog boxes Templates Error messages Different states Responsive pages

  14. Test Set Capture Other Pages - Documentation and Support Identify and Sample Documentation and Support Online help, PDF, user guides, and support documentation List of keystrokes or accessibility features Accessibility Policy Preference and settings screens Contact Information for support

  15. Test Set Capture Random Sample Include a random sample Draft WCAG conformance evaluation methodology recommendation • Randomly choose 10% of pages (no fewer than 5) Testing the random sample and comparing its results against the representative sample results determines whether the representative sample was valid After testing, if the sample was invalid, add the random pages to the representative sample
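
A minimal sketch of that sampling rule in TypeScript, assuming the full page inventory is already available as a list of URLs; the function name and the example inventory are illustrative, not part of WCAG-EM or AMP.

```typescript
// Draw a random sample of pages: 10% of the inventory, but never fewer than 5
// (capped at the inventory size), following the draft WCAG-EM recommendation.
function randomSample(pages: string[], ratio = 0.1, minimum = 5): string[] {
  const target = Math.min(pages.length, Math.max(minimum, Math.ceil(pages.length * ratio)));
  const shuffled = [...pages];
  // Fisher-Yates shuffle, then take the first `target` entries.
  for (let i = shuffled.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
  }
  return shuffled.slice(0, target);
}

// Example: an inventory of 120 pages yields a random sample of 12.
const inventory = Array.from({ length: 120 }, (_, i) => `https://www.example.com/page-${i}`);
console.log(randomSample(inventory).length); // 12
```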

  16. Test Set Capture Capture and Enter Modules (Toolbar) Stage Modules (if needed) • Add data, adjust settings, create cases with error conditions Expose/select the appropriate content/elements for capture. Use the Node feature – Inspect in Toolbar Capture the module (e.g. in the Firefox toolbar for AMP) • The toolbar captures • Page name/title • Path to page (navigation stream) • URL • Screenshot • DOM
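
For readers without access to the AMP toolbar, the same artifacts (title, URL, screenshot, DOM) can be gathered with a headless browser. The sketch below uses Puppeteer as a stand-in, which is not a tool mentioned in this presentation; the URL and the CapturedModule shape are illustrative assumptions.

```typescript
// Capture the same artifacts the AMP toolbar collects (title, URL, screenshot, DOM),
// here sketched with Puppeteer instead of the Firefox toolbar.
import puppeteer from "puppeteer";

interface CapturedModule {
  name: string;        // page name/title
  url: string;
  screenshot: string;  // base64-encoded full-page PNG
  dom: string;         // serialized DOM at capture time
}

async function captureModule(url: string): Promise<CapturedModule> {
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: "networkidle0" });
    // Staging steps (entering data, triggering error states) would go here.
    return {
      name: await page.title(),
      url: page.url(),
      screenshot: (await page.screenshot({ fullPage: true, encoding: "base64" })) as string,
      dom: await page.content(),
    };
  } finally {
    await browser.close();
  }
}

captureModule("https://www.example.com").then((m) => console.log(m.name, m.dom.length));
```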

  17. Test Set Capture Capture and Enter Modules (Manual) Screenshot – Solid Capture, SnagIt, or Firefox depending on the technology platform Collect DOM (web only) from a toolbar such as IE Developer Bar, Firebug, Adobe Edge Inspect, Safari Developer Tools, etc. Record page title, URL (web only), and/or path

  18. Test Set Capture Use Case Identification – If Applicable What is a use case? • A core task within a system that users with disabilities will test with the baseline assistive technologies • Test users document issues found while performing the steps and score these based on a set matrix • The score gives clients a rank of how accessible the application is The lists of modules and use cases are somewhat independent of each other, but the most important or complex features of the system should be reflected in both lists Use cases are defined for each assistive technology and platform

  19. Test Set Capture Use Case Identification – If Applicable (cont.) How to select use cases The client may have specific use cases/essential functionality The use cases should be picked to show a representative cross section of the product functionality Select actual tasks that are required to navigate to a particular section of the system. For example: the login page. • Define use cases based on user types/roles Each use case is formally scripted to define the sequence of steps that the user performs to complete that task
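
One way to make that formal scripting concrete is a typed record per use case. The interface and the login example below are hypothetical, shown only to illustrate the fields a scripted use case typically carries (role, AT, platform, ordered steps, expected outcome).

```typescript
// Hypothetical shape of a formally scripted use case (names are illustrative).
interface UseCase {
  id: string;
  title: string;                 // e.g. "Log in and view account balance"
  userRole: string;              // user type/role the case is defined for
  assistiveTech: string;         // AT this script will be run with
  platform: string;              // e.g. "Windows/Firefox", "iOS/Safari"
  steps: string[];               // the exact sequence of user actions
  expectedOutcome: string;
}

const login: UseCase = {
  id: "UC-01",
  title: "Log in to the account portal",
  userRole: "Registered customer",
  assistiveTech: "Screen reader",
  platform: "Windows/Firefox",
  steps: [
    "Navigate to the login page",
    "Enter username and password",
    "Activate the Log In button",
    "Confirm the account dashboard is announced",
  ],
  expectedOutcome: "User reaches the dashboard and can identify the first heading",
};

console.log(`${login.id}: ${login.steps.length} scripted steps`);
```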

  20. Test Set Capture Use Case Example

  21. Test Set Capture Confirm Test Set with Stakeholders Confirm test set with stakeholders Add modules, use cases as needed Remove items that may no longer be essential, etc. Confirm system has not changed/will not change during testing

  22. Audit

  23. Audit Overview Audit each page/module Use the Evaluation Methodology • The methodology may indicate a pass, fail, or not applicable result • All standards/success criteria that are applicable must be tested • Enter results as the audit is performed to save time • Follow the granularity from the scope phase Only test repeated content once (some exceptions) • Common pages • Repeated controls • Templated items such as headers and footers
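
As a rough illustration of entering results while auditing, the sketch below models a pass/fail/not-applicable outcome per module and criterion. The data shapes are assumptions for illustration, not AMP's actual data model.

```typescript
// Illustrative structure for recording audit results per module and criterion.
type Outcome = "pass" | "fail" | "not-applicable";

interface AuditResult {
  module: string;      // page or module identifier from the test set
  criterion: string;   // e.g. "WCAG 2.4.7 Focus Visible"
  outcome: Outcome;
  notes?: string;
}

const results: AuditResult[] = [];

// Record each result as the audit is performed, rather than batching at the end.
function record(module: string, criterion: string, outcome: Outcome, notes?: string): void {
  results.push({ module, criterion, outcome, notes });
}

record("Checkout dialog", "WCAG 2.4.7 Focus Visible", "fail", "No visible focus indicator on the Close button");
record("Checkout dialog", "WCAG 1.2.2 Captions (Prerecorded)", "not-applicable");
console.log(results);
```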

  24. Audit WCAG Evaluation Methodology Example Test for known failure techniques Check sufficient techniques for the technology used Check for accessibility supported techniques Be sure to test all steps in a process Check for alternatives Check for other WCAG conformance requirements • Non-interference, etc. Check random sample and compare with representative sample

  25. Audit Types of Audit Testing Automatic (Web, PDF) A headless browser or toolbar captures and tests web pages directly from the browser into the platform (e.g. AMP), diagnosing issues directly against the Document Object Model (DOM) of the page Guided Automatic (Web – AMP specific) When a module is generated using the AMP toolbar, the testing engine will find likely candidates for accessibility violations; however, these must be confirmed by the tester Manual Tests that require a human in order to be performed accurately
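
A hedged example of the automatic category: a headless browser loads the page and an automated rules engine evaluates the live DOM. The sketch substitutes axe-core (via @axe-core/puppeteer) for AMP's engine, since AMP's internals are not described here; the URL is a placeholder.

```typescript
// Automatic testing sketch: a headless browser loads the page and an automated
// rules engine checks the live DOM. axe-core is used here as a stand-in engine.
import puppeteer from "puppeteer";
import { AxePuppeteer } from "@axe-core/puppeteer";

async function automaticAudit(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: "networkidle0" });
    const results = await new AxePuppeteer(page).analyze();
    // Automated results still need human review to weed out false positives.
    for (const violation of results.violations) {
      console.log(`${violation.id} (${violation.impact}): ${violation.nodes.length} instance(s)`);
    }
  } finally {
    await browser.close();
  }
}

automaticAudit("https://www.example.com");
```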

  26. Audit Types of Testing – Automatic Automatic testing is the cheapest and most common Catches easy-to-detect issues Covers only a small fraction of legal requirements • ~25% coverage May require the auditor to remove false positives

  27. Audit Manual Code inspection/DOM inspection via Toolbar Toolbar preview modes/ Favlets (e.g. AMP toolbar for IE) Visual inspection, etc. Assistive Technology for AT supported methods Browser settings Tools vary depending on platform • Adobe Edge Inspect for DOM and screenshots • Safari Developer tools on iOS • Mobile tools limited

  28. Audit Technical Testing Tools • Contrast checker • Object Inspector/Accessible Event Watcher, AccChecker, aDesigner, etc. (MSAA, UIA) • Java Ferret (Java) • PDDomViewer (PDF) • Accessibility Inspector in Xcode (iOS) • Lint via Eclipse (Android) • Favlets (web) • Keyboard • High contrast • Zoom • Assistive Technology (be careful when using AT for technical results, as results may be skewed) • Screen reader • Screen magnification • Voice recognition
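
As an example of what a contrast checker computes, the sketch below implements the WCAG 2.0 relative luminance and contrast ratio formulas; the sRGB channel conversion and the 4.5:1 threshold for normal text come from the WCAG 2.0 definitions, while the function names are just illustrative.

```typescript
// Contrast checker sketch implementing the WCAG 2.0 contrast ratio formula.
function channel(c: number): number {
  // Linearize an 8-bit sRGB channel value per the WCAG relative luminance definition.
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function relativeLuminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

function contrastRatio(fg: [number, number, number], bg: [number, number, number]): number {
  const [lighter, darker] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: #767676 text on a white background just passes 4.5:1 for normal text.
const ratio = contrastRatio([0x76, 0x76, 0x76], [0xff, 0xff, 0xff]);
console.log(ratio.toFixed(2), ratio >= 4.5 ? "passes AA (normal text)" : "fails AA");
```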

  29. Audit Guidance for Recording Violations Go through each checkpoint/best practice and determine whether a failure condition is met • e.g. the AMP Toolbar provides a direct integration with AMP and allows users to test content directly within Firefox For other issues found – determine the best-fit best practice/checkpoint Provide a description of the violation Include user impact May include how to fix the issue Pattern violations • The same issue that appears on multiple modules
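
A possible shape for such a violation record is sketched below; the field names and the alt-text example are hypothetical and only mirror the guidance above (best-practice mapping, description, user impact, optional fix, pattern flag).

```typescript
// Illustrative shape of a recorded violation (not AMP's actual data model).
interface Violation {
  bestPractice: string;       // the checkpoint/best practice the issue maps to
  modules: string[];          // where it occurs; multiple entries flag a pattern violation
  description: string;        // what is wrong, in concrete terms
  userImpact: string;         // who is affected and how
  recommendation?: string;    // suggested fix, if one is offered
}

const missingAltText: Violation = {
  bestPractice: "Provide text equivalents for images (WCAG 1.1.1)",
  modules: ["Home page", "Product listing", "Checkout"],
  description: "Product thumbnails are rendered as <img> elements with no alt attribute.",
  userImpact: "Screen reader users hear only the file name and cannot identify the product.",
  recommendation: "Add alt text describing the product, or alt=\"\" for purely decorative images.",
};

const isPattern = missingAltText.modules.length > 1; // same issue across multiple modules
console.log(isPattern ? "Pattern violation" : "Single-module violation");
```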

  30. Audit Violation Entry Example from AMP of violation entry screen Enter a description, notes, recommendations, or other identifying information

  31. Audit Use Case Testing Complete each use case • Purpose is to establish the level of accessibility and Assistive Technology support of IT system with baseline assistive technologies Typically tested by users with disabilities Assistive Technology testing • Categories may include • Screen reader • Screen magnifier • Speech recognition

  32. Reporting

  33. Reporting Entry Enter results if not already entered in the audit phase Complete entry of documentation • Who tested • Site version • Browsers used • AT used, if users with disabilities tested • Document scope • Document different technology platforms • Document how the representative sample was selected • Document standards tested against • Document tools used

  34. Reporting Analysis, Scoring, and Statements Remove false positives by validating automated test results Interpret results: the testing team will cross-validate the manual, use case, and automated testing results and synthesize them into a single compliance data set • Generate a score (AMP does this automatically) • Determine the prioritization of violations to be remediated, with solutions for developers The report may also include screenshots, samples, and other documentation captured in the test set capture phase/step Create an evaluation/conformance/compliance statement
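
AMP generates the score automatically, so the sketch below is only a toy illustration of the idea: combine pass/fail outcomes into a percentage and sort failures by an assumed severity field for remediation planning. None of the names reflect AMP's actual scoring model.

```typescript
// Toy scoring pass over a combined compliance data set (illustrative only).
type Outcome = "pass" | "fail" | "not-applicable";

interface CombinedResult {
  criterion: string;
  outcome: Outcome;
  severity: "high" | "medium" | "low";   // assumed prioritization field
}

function complianceScore(results: CombinedResult[]): number {
  const tested = results.filter((r) => r.outcome !== "not-applicable");
  const passed = tested.filter((r) => r.outcome === "pass").length;
  return tested.length === 0 ? 100 : Math.round((passed / tested.length) * 100);
}

const combined: CombinedResult[] = [
  { criterion: "WCAG 1.1.1", outcome: "fail", severity: "high" },
  { criterion: "WCAG 1.4.3", outcome: "pass", severity: "medium" },
  { criterion: "WCAG 2.4.7", outcome: "fail", severity: "medium" },
  { criterion: "WCAG 1.2.2", outcome: "not-applicable", severity: "low" },
];

console.log(`Score: ${complianceScore(combined)}%`); // Score: 33%

// Prioritize remediation: highest-severity failures first.
const order = { high: 0, medium: 1, low: 2 };
const remediationQueue = combined
  .filter((r) => r.outcome === "fail")
  .sort((a, b) => order[a.severity] - order[b.severity]);
console.log(remediationQueue.map((r) => r.criterion)); // ["WCAG 1.1.1", "WCAG 2.4.7"]
```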

  35. Resources Website Accessibility Conformance Evaluation Methodology (WCAG-EM) 1.0 • http://www.w3.org/TR/WCAG-EM/ W3C/WAI Web Accessibility Evaluation and Testing Activities • http://www.w3.org/WAI/ER/2011/eval/ HTML Accessibility API • http://www.w3.org/TR/html-aapi/ Windows SDK, which includes the Inspect tool • http://www.microsoft.com/download/en/details.aspx?displaylang=en&id=8279 Java Access Bridge for Windows OS – bundled with the JRE, but JAB 2.0.1 must be downloaded for Java Ferret

  36. Resources Guidance on Applying WCAG 2.0 to Non-Web Information and Communications Technologies • http://www.w3.org/TR/wcag2ict/ WAI-ARIA Overview • http://www.w3.org/WAI/intro/aria.php

  37. Questions?

  38. Thank You Follow Us Twitter @SSBBARTGroup LinkedIn www.linkedin.com/company/ssb-bart-group Facebook www.facebook.com/ssbbartgroup Blog www.ssbbartgroup.com/blog Contact Us Jonathan Avila Chief Accessibility Officer jon.avila@ssbbartgroup.com SSB Contact Information info@ssbbartgroup.com (800) 889-9659
