
Web Accessibility Testing for AIR



Presentation Transcript


  1. Web Accessibility Testing for AIR. AIR Austin, February 2012.
     Jim Allan, http://tsbvi.edu, jimallan@tsbvi.edu, TSBVI, Austin, Texas.
     Jim Thatcher, http://jimthatcher.com, jim@jimthatcher.com, Accessibility Consulting, Austin, Texas.

  2. Resources
  • Resources and slides at:
     • Resources list: http://jimthatcher.com/testing
     • PowerPoint: testingForAIR.ppt
  • Most important for us and for testing:
     • Web Accessibility Toolbar: http://tinyurl.com/wattoolbar
     • Jim’s Favelets: http://jimthatcher.com/favelets

  3. Testing for Web Accessibility
  • Web accessibility: Web resources are accessible if people with disabilities can use them as effectively as non-disabled people (UT Accessibility Institute).
  • So testing means user testing with disabled and non-disabled people?
  • Instead, there are “standards and guidelines” against which we can test.

  4. Testability – an example
  • Contrast – 1999 (WCAG 1)
     • Ensure that foreground and background color combinations provide sufficient contrast when viewed by someone having color deficits or when viewed on a black and white screen.
  • Contrast – 2008 (WCAG 2)
     • Text and images of text have a contrast ratio of at least 4.5:1 … 3:1 for larger text (18pt, or 14pt bold) (Level AA).
  • A simple tool to measure contrast ratio: the Colour Contrast Analyser Tool.
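  For reference, the ratio the tool reports is defined by WCAG 2 in terms of relative luminance L (0 for black, 1 for white), so its output can be sanity-checked by hand:
      contrast ratio = (L_lighter + 0.05) / (L_darker + 0.05)
      black on white: (1.0 + 0.05) / (0.0 + 0.05) = 21:1  (passes)
      4.5:1 is the Level AA minimum for normal text; 3:1 for large text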

  5. The Colour Contrast Analyser Tool

  6. Testing Tools
  • Colour Contrast Analyser: http://tinyurl.com/colourcontrast
  • Web Accessibility Toolbar: http://tinyurl.com/wattoolbar
  • Jim T’s Favelets: http://jimthatcher.com/favelets/
  • Toolbar for Firefox: http://tinyurl.com/cptoolbar
  • WAVE: http://wave.webaim.org
  • FireEyes: http://tinyurl.com/jfireeyes
  • Lists of tools:
     • http://www.w3.org/WAI/ER/tools/complete
     • http://www.colostate.edu/Dept/ATRC/tools.htm
     • http://www.webaim.org/articles/tools/

  7. Computer tests vs. Human Review
  • Testable by a computer vs. requiring human judgment
  • Only about 25% of accessibility errors can be detected by computers.
  • Many sites claim to be compliant/accessible simply because a testing tool reports no errors.
  • Testing for all of the Section 508 standards is discussed at http://jimthatcher.com/testing1.htm

  8. Computer tests (about 12 of them)
  • (3) Missing alt on <img>, <area>, or <input> with type="image"
  • (3) Empty alt on <img> inside <a> without text; the same for <button> or <input> with type="image"
  • (1) Form control with no title or label (or an empty one)
  • (2) No title on frame or iframe
  • (1) Server-side image map (image link with ismap)
  • (1) Missing lang attribute for the page
  • (1) No HTML headings (h1, h2, …) on the page
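  A minimal sketch (file names are illustrative) of the kind of machine-detectable failures listed above:
      <img src="logo.png">                                    <!-- missing alt -->
      <a href="home.html"><img src="home.gif" alt=""></a>     <!-- empty alt on the only content of a link -->
      <input type="text" name="email">                        <!-- no label or title -->
      <iframe src="news.html"></iframe>                       <!-- no title attribute -->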

  9. The Issues from the AIR Judging Form
  • Judging form: http://www.knowbility.org/v/air-dev-resources/AIR-Austin/34/
  • Core Accessibility: 240 points
  • (Basic Accessibility: 70 points)
  • Advanced Accessibility: 120 points

  10. Core Accessibility – Inactive images
  • 1. Inactive images (20 points). Use the alt attribute to provide a text alternative for non-linked (inactive) images. If the image is decorative or redundant, use alt=""; if the image conveys information, the alt-text should convey the same information. … If the image consists of text, the alt-text should be the same text. Each image without appropriate alt-text is an error.
  • Testing – Use WAT (Images → Show Images) or, a little simpler, Jim’s Favelets (Formatting images and Larger images).
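  A short sketch of what the reviewer is looking for (file names and text are illustrative):
      <img src="divider.gif" alt="">                             <!-- decorative: empty alt -->
      <img src="hours.png" alt="Open Monday to Friday, 9am-5pm"> <!-- informative: alt conveys the same information -->
      <img src="sale-banner.png" alt="Spring sale - 20% off">    <!-- image of text: alt repeats the text -->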

  11. Core Accessibility – Active images
  • 2. Active images (20 points). Use the alt attribute to provide a text alternative for every active image, including image links, image map hot spots, and input elements of type image. The text alternative must convey the function of the active element (for an image link, the target of the link). If the image consists of text, the alt-text should be the same text. Each active image without appropriate alt-text is an error.
  • Testing – Use WAT (Images → Show Images; mouse over an image to see whether it is active) or the Favelets (Active images).
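  For example (illustrative markup), the alt on an active image names the function or destination, not the picture:
      <a href="search.html"><img src="magnifier.png" alt="Search"></a>
      <input type="image" src="go.gif" alt="Submit order">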

  12. Core Accessibility – Links
  • 3. Hypertext Links (20 points). Use descriptive link text to ensure that each link makes sense when it is read out of context. Make sure that links with the same text on the same page go to the same place. Each anchor with inadequate link text is an error.
  • Testing – Use WAT (Doc Info → List Links) or inspection. Watch out for “More”, “Click Here”, etc.
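  Out of context, a screen reader’s links list shows only the link text, so (hypothetical example):
      <!-- Poor: meaningless on its own -->
      <a href="report2012.pdf">Click here</a> to read the annual report.
      <!-- Better: the link text itself describes the target -->
      Read the <a href="report2012.pdf">2012 annual report (PDF)</a>.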

  13. Core Accessibility – Semantic markup
  • 4. Correct Markup/Parsing (20 points). Use semantic markup (block quotes, headings, lists, etc.) to properly represent the structure of the document. … Each instance of a structural tag used for formatting, or of content that should use semantic markup or that is not structured to specifications, is an error.
  • Testing – Use WAT (Structure → Headings), (Structure → List Items), and (Structure → Q/Blockquote).
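  A common case the review catches (sketch): a list faked with line breaks instead of real list markup:
      <!-- Formatting only: not exposed as a list to assistive technology -->
      <p>- Apples<br>- Oranges<br>- Pears</p>
      <!-- Semantic markup: announced as a list of three items -->
      <ul>
        <li>Apples</li>
        <li>Oranges</li>
        <li>Pears</li>
      </ul>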

  14. Core Accessibility – Skip links
  • 5. Skip links (10 points). Provide at least 1 and no more than 3 links at the top of the page which are either visible all the time or become visible on focus. These should jump to the main content or main navigation sections of the page. They are intended for keyboard users, so be sure to test them with the keyboard. Each page that does not meet these requirements is an error.
  • Testing – Use the keyboard to test, simply:
     • Tab to the skip link
     • Press Enter to follow it
     • Tab again; focus should be at the first link after the target of the skip
  • Or Jim’s Favelet – Skip links
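  A minimal skip-link sketch (id and link text are illustrative); the link is the first focusable item on the page and targets the main content:
      <a href="#main">Skip to main content</a>
      ...
      <div id="main">
        <h1>Page title</h1>
      </div>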

  15. Core Accessibility – Headings
  • 6. Headings for navigation (10 points). There should be a heading at the top of each major section of a page; use only one h1, and heading levels should not skip (h2 then h4). Each page that does not meet these requirements is an error.
  • Testing – Use WAT (Structure → Headings) and (Structure → Headings Structure), or Jim’s Favelet, Headings.
  • Watch out for skipping levels (like h2 to h4) – don’t do it.
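  The heading outline the tools display should nest without gaps, for example (indentation added only to show the outline):
      <h1>Store name</h1>
        <h2>Products</h2>
          <h3>Laptops</h3>
          <h3>Tablets</h3>
        <h2>Contact us</h2>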

  16. Core Accessibility – Landmarks
  • 7. Landmark roles (10 points). Use ARIA landmark roles to label key sections that don’t easily accommodate headings, like role="main", role="contentinfo", role="navigation" (use aria-label if there is more than one navigation area), and role="search". Don’t overdo it; time is taken announcing these. Each page that does not meet these requirements is an error.
  • Testing – Use JAWS (; or Ctrl+Ins+; to list)
  • Watch for correct labeling:
     • aria-labelledby refers to the id of an on-screen element
     • aria-label takes the label text directly
  • Or Jim’s favelet, aria, to display ARIA markup
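  A sketch of landmark roles with labels (the label text is illustrative); two navigation areas are distinguished with aria-label:
      <div role="navigation" aria-label="Main menu"> ... </div>
      <div role="navigation" aria-label="Footer links"> ... </div>
      <div role="main"> ... </div>
      <div role="search"> ... </div>
      <div role="contentinfo"> ... </div>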

  17. Core Accessibility – Info in presentation
  • 8. Information in presentation layer (20 points). Ensure that information conveyed through presentation (font, color) is also available in text. Any instance of information only available through presentation is an error.
  • Testing – Inspection!
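  A typical case (illustrative): a required form field marked by color alone versus color plus text:
      <!-- Color only: the information is lost to anyone who cannot perceive it -->
      <label style="color: red">Email</label>
      <!-- Color plus text: the same information is available in text -->
      <label style="color: red">Email (required)</label>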

  18. Core Accessibility – Contrast
  • 9. Contrast (20 points). The visual presentation of text and images of text has a contrast ratio of at least 4.5:1, except for the following: large text (14pt bold, or 18pt or larger) must have a contrast ratio of at least 3:1. Each instance that does not follow these guidelines is an error.
  • Testing – Use the Colour Contrast Analyser tool: WAT (Colour → Contrast Analyser application)

  19. Core Accessibility – Keyboard
  • 10. Keyboard (20 points). All functionality must be accessible with the keyboard. Keyboard access should not require the use of mouse keys, the JAWS cursor, or anything similar. If any operation or control requires the mouse and a keyboard equivalent is not readily available, the score is 0.
  • Testing – Use the keyboard to be sure everything is available without a mouse. Combine with testing for #14 – focus indication.

  20. Core Accessibility – Language
  • 11. Language (20 points). Identify the natural language of each page. Any failure is a score of 0.
  • Testing – Use WAT (Doc Info → Show Lang Attributes)
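  The check is simply whether the root element declares the page language ("en" here is just an example):
      <html lang="en">
        ...
      </html>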

  21. Core Accessibility – Forms
  • 12. Forms (20 points). Label all form controls. Use the label element when the on-screen text prompt is adequate. Use the title attribute when an on-screen text prompt is not adequate or is dispersed. Use fieldset/legend to group radio buttons and check boxes. Use aria-required for required fields. Use ARIA markup for error handling, including aria-invalid on fields which have errors, and use alerts or alert dialogs for announcing the errors. Each form control with inadequate labeling is an error.
  • Testing – Use WAT (Structure → Fieldset/Labels) and/or Jim’s Forms favelet and ARIA favelet
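  A compact sketch of the techniques this criterion names (ids, names, and text are illustrative):
      <label for="email">Email</label>
      <input type="text" id="email" name="email" aria-required="true">

      <input type="text" name="q" title="Search terms">   <!-- no visible prompt, so title supplies the label -->

      <fieldset>
        <legend>Shipping method</legend>
        <input type="radio" id="std" name="ship"> <label for="std">Standard</label>
        <input type="radio" id="exp" name="ship"> <label for="exp">Express</label>
      </fieldset>

      <!-- After a failed submit, flag the field and announce the error -->
      <input type="text" id="zip" name="zip" aria-invalid="true">
      <div role="alert">Please enter a 5-digit ZIP code.</div>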

  22. Core Accessibility – Data Tables
  • 13. Data Tables (10 points). For tabular data, use the caption element and/or the summary attribute. Use the th element, or use either th or td elements with the scope attribute, to unambiguously identify all row and column headers. For complex data tables, associate data cells with the appropriate headers using either headers/id or scope. Each instance of missing or incorrect table markup is an error.
  • Testing – Use WAT (Tables → Show Data Tables) and/or Jim’s Data tables favelet
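  A simple data table marked up as the criterion describes (content is illustrative):
      <table>
        <caption>Office hours</caption>
        <tr>
          <th scope="col">Day</th>
          <th scope="col">Open</th>
          <th scope="col">Close</th>
        </tr>
        <tr>
          <th scope="row">Monday</th>
          <td>9:00</td>
          <td>17:00</td>
        </tr>
      </table>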

  23. Core Accessibility – Focus Indication
  • 14. Focus Indication (20 points). There must be a clear visual indication (highlight, dotted rectangle, change in color) when an object receives keyboard focus. Each instance of an object with no focus indication is an error.
  • Testing – Use the keyboard and tab through every active object on each page.
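  Focus indication usually comes from the browser’s default outline; the common failure is CSS that removes it. A sketch (selector and color are illustrative):
      <style>
        /* Problem: outline suppressed with nothing in its place */
        a { outline: none; }
        /* Better: keep or replace the focus indicator */
        a:focus { outline: 2px solid #0066cc; }
      </style>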

  24. Advanced Accessibility – Flash
  • 32. Flash (20 points). If Flash content is non-essential to the meaning of the page, then assistive technology must be able to either tab through or bypass the Flash object. If the content conveys information or responds to user input, assistive technology must be able to access the information and functionality. Each instance of missing information or function is an error.
  • Testing – Inspection: use the keyboard and test with JAWS

  25. Advanced Accessibility – Video
  • 33. Video (20 points). For video with a soundtrack, provide synchronized captions. Provide an HTML text description for video without sound. In addition, provide synchronized audio description if the video cannot be understood from the soundtrack alone. If not synchronized, audio description must be provided as text with the video link. Each instance of inadequate accommodation is an error.
  • Testing – Inspection.

  26. Advanced Accessibility – Audio
  • 34. Audio (10 points). Provide HTML text transcripts for audio files. Speakers must be identified. Each place where the text transcript substantially differs from the audio is an error.
  • Testing – Inspection.

  27. Advanced Accessibility – Style Sheets
  • 35. Alternative Style Sheets (10 points). Provide at least two style sheets in addition to the default that can be conveniently selected by users, for example, to change font size, contrast, color schemes, printing, or displaying on small devices. Any page where the layout fails with any alternate style sheet, or where the selection process fails, is an error.
  • Testing – Inspection.
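  One way to offer selectable styles (file names and titles are illustrative) is the alternate stylesheet mechanism, which some browsers expose under View → Page Style and which a site can also wire to its own switcher:
      <link rel="stylesheet" href="default.css" title="Default">
      <link rel="alternate stylesheet" href="large-text.css" title="Large text">
      <link rel="alternate stylesheet" href="high-contrast.css" title="High contrast">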

  28. Advanced Accessibility – Scripting
  • 36. Client-Side Scripting (20 points). All scripted functionality is available from the keyboard, or there is a readily available keyboard-accessible alternative. Scripted controls must identify themselves (name, role, state, value, and properties) to assistive technology. Make sure that screen readers are aware of important content exposed with scripting. Each instance violating these requirements is an error. No points are awarded if scripting is just for switching style sheets, or if scripting is peripheral to the purpose of the site.
  • Testing – Inspection, keyboard, JAWS.
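  For example (a sketch, not the judging form’s own code), a scripted toggle built from a div needs keyboard focus plus a name, role, and state for assistive technology:
      <div id="mute"
           role="button"
           tabindex="0"
           aria-pressed="false">
        Mute notifications
      </div>
      <!-- The script must update aria-pressed when toggled and respond to Enter/Space, not just click -->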

  29. Advanced Accessibility – ARIA Widgets
  • 37. ARIA Widgets (20 points). Use ARIA Best Practices to code special widgets, including tab panels, accordion menus, and modal windows.
  • Testing – Jim’s ARIA favelet, inspection, keyboard, JAWS.
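  A skeleton of the tab panel pattern from the ARIA authoring practices (ids and labels are illustrative); arrow-key movement between tabs is supplied by script:
      <div role="tablist" aria-label="Account settings">
        <button role="tab" id="tab-profile" aria-selected="true" aria-controls="panel-profile">Profile</button>
        <button role="tab" id="tab-security" aria-selected="false" aria-controls="panel-security" tabindex="-1">Security</button>
      </div>
      <div role="tabpanel" id="panel-profile" aria-labelledby="tab-profile"> ... </div>
      <div role="tabpanel" id="panel-security" aria-labelledby="tab-security" hidden> ... </div>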

  30. Advanced Accessibility – Fluid Layout
  • 38. Fluid Layout (20 points). Using browser magnification in at least two browsers, increase the magnification level to 200%. If content reflows so that no left-right scroll bar is needed to access content, and all content is available: 20 points. If the screen does not reflow: 0 points.
  • Testing – Inspection.
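  A minimal fluid-layout sketch (class name is illustrative): widths in percentages or ems rather than fixed pixels, so content can reflow at 200% magnification:
      <style>
        .content { max-width: 60em; width: 90%; margin: 0 auto; }
        .content img { max-width: 100%; height: auto; }
      </style>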
