Topic: April 2017 – Criterion

Limitations of HTML Validation Scans

Many Section 508 and WCAG accessibility vendors that make most of their revenue licensing automated website accessibility testing tools have recently begun posting articles claiming their HTML validation tools are the answer to Section 508 and WCAG compliance testing. The sales pitch may sound promising, but the results are not.

Any reputable website accessibility vendor understands that the only way to validate compliance with Section 508 and WCAG guidelines is to conduct extensive manual accessibility testing, carried out by experienced accessibility professionals and expert end-users with disabilities using a variety of assistive technologies.

Quality HTML validators have their place in every web developer’s toolbox, but they are certainly not the answer to website accessibility testing given that they capture about 20% of Section 508 / WCAG guideline failures.

Criterion is retained every week by companies who had previously engaged other accessibility vendors that promised them Section 508 and WCAG compliance simply by using their automated testing tools. Because these tools failed to ensure Section 508 or WCAG compliance as promised, these companies now face issues that either threaten their federal contracts or have resulted in threats of litigation.

Let’s take a look at some of the more common sales pitches our new clients report hearing from other vendors about automated website accessibility testing tools and HTML validators:

Common Sales Pitches regarding HTML validators
Sales Pitch: Automated testing is faster.

Reality: Automated testing is not faster when the results of the scan don’t ensure Section 508 and WCAG compliance.

Typically, it takes less than 10 business days for a team of Criterion’s accessibility testers to document all Section 508 and WCAG failures within a website.

Additionally, due to the extensive documentation created through our manual testing, the average duration from the start of a website accessibility initiative to Section 508 and WCAG certification is only four months.

Sales Pitch: Automated testing is cheaper.

Reality: Automated testing tools are not cheaper than manual testing. Many vendors license their HTML validation tools for 2 to 3 times the cost of manually testing ALL pages within even the largest and most complicated dynamic websites.

Having Criterion manually test ALL pages on a website, utilizing several accessibility experts and end-users with disabilities, generally costs 25% to 50% LESS than licensing the most robust automated testing tools, which merely skim for accessibility failures and leave a website that is not compliant with WCAG and Section 508.

Sales Pitch: Automated testing is 100% accurate.

Reality: Automated testing is only as good as the algorithms behind the scan. Unfortunately, many HTML validators report false positives and have major compatibility issues with certain browsers.

All “free” HTML validators should be avoided; they are ineffective for achieving true accessibility.

Sales Pitch: Automated testing exposes all Section 508 and WCAG violations on a page.

Reality: Automated testing exposes less than 17% of relevant Section 508 and WCAG compliance issues.

Sales Pitch: Automated testing is all you need to make your site accessible to screen reader users.

Reality: Automated testing is just a small piece of any comprehensive website accessibility testing protocol. In fact, automated testing typically misses key elements that allow screen reader users to move fluently through a website.

Successful website accessibility initiatives are dependent on:

  • Web developer training
  • Dedicated subject matter expertise
  • Expert end-users with disabilities testing
  • Extensive independent third-party documentation and more

Remember, if you need to know that your website is Section 508 and WCAG compliant, it’s common sense to have the website tested by people who actually have disabilities.

Sales Pitch: Automated testing tools combined with project management features allow for easier tracking and remediation of Section 508 and WCAG failures.

Reality: Website accessibility is not something above and beyond good website development. It is good website development!

The majority of Criterion’s clients are major corporations who have already invested considerable money, resources, and training in their existing IT project management systems.

There is no reason for corporations to purchase a separate project management system that only tracks website accessibility issues and repairs. To do so reinforces the inaccurate perception that accessibility is an extra step in day-to-day website project management, design, development, and maintenance.

Automated testing simply dissects code to identify inconsistencies with W3C standards, and this is not the same as accessibility testing. Most objects and content on a page require human evaluation and the application of reasoning skills as the content is compared against the more complicated guidelines set forth under Section 508 and WCAG. Remember, context is critically important when testing website content, and there is simply no algorithm capable of performing this type of evaluation.

Simply put, automated testing tools and HTML validators do not replace the need for manual testing by accessibility professionals and expert end-users with disabilities.
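
To illustrate (a minimal sketch in Python, not any vendor's actual tool; the markup and the check are hypothetical), consider the kind of test an automated scanner can run for alternative text: it can confirm that an alt attribute exists, but it cannot judge whether the text means anything to the person hearing it.

    from html.parser import HTMLParser

    # Markup that satisfies a presence-only alt check but fails human review:
    # the alt text is just a filename, which tells a screen reader user nothing
    # about where the link goes.
    SAMPLE = '<a href="/checkout"><img src="btn_1024.png" alt="btn_1024.png"></a>'

    class AltPresenceCheck(HTMLParser):
        """Flags <img> tags with no alt attribute -- all a simple scan can verify."""
        def __init__(self):
            super().__init__()
            self.failures = []

        def handle_starttag(self, tag, attrs):
            if tag == "img" and "alt" not in dict(attrs):
                self.failures.append(self.getpos())

    checker = AltPresenceCheck()
    checker.feed(SAMPLE)
    print("Automated result:", "PASS" if not checker.failures else "FAIL")
    # Prints PASS, yet the link's purpose is still unintelligible to a screen
    # reader user. Judging whether alt text is meaningful requires a person.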

Below is a partial list of WCAG and Section 508 accessibility guidelines that automated testing is incapable of evaluating:

  • WCAG 1.1 Text Alternatives / Section 508 (a):
    Automated testing cannot:
    • Test if Alt Text is meaningful and properly describes the image to the user.
    • Determine whether an image should or should not carry alt text, for example decorative images versus images that serve as links or buttons.
  • WCAG 1.2 Time-based Media / Section 508 (b):
    Automated testing cannot:
    • Check embedded videos for captions and proper audio description tracks.
  • WCAG 1.3 Adaptable / Section 508 (c), (d), (g):
    Automated testing cannot:
    • Check for proper context. For example, if multiple “Select” buttons exist on a page of drug prescriptions, it cannot confirm that each button is programmatically associated with the prescription it selects.
    • Detect information and relationships that are conveyed visually but are not available programmatically to screen reader users.
    • Verify meaningful sequencing of information on your pages.
  • WCAG 1.4 Distinguishable / Section 508 (c), (l):
    Automated testing cannot:
    • Determine proper color contrast of text that is part of an image (see the contrast-ratio sketch after this list).
    • Detect instructions that rely on sensory characteristics; “Select the green button” is a violation that automated tools will not detect.
    • Determine structural issues that appear when text is resized to 200%.
    • Determine if the site uses color alone to indicate error fields in your forms.
  • WCAG 2.1 Keyboard Accessible / Section 508 1194.21 (a):
    Automated testing cannot:
    • Verify that menus are expandable and accessible via the keyboard.
    • Verify that proper tab order corresponds with visual elements for screen reader users who rely on keyboard accessible controls.
  • WCAG 2.2 Enough Time / Section 508 (p):
    Automated testing cannot:
    • Determine if the user is given an option to extend a login session when they have been idle.
    • Determine if a carousel does not have a stop or pause control.
  • WCAG 2.3 Seizures / Section 508 (j):
    Automated testing cannot:
    • Determine if visual page content flashes more than three times in any one-second period.
  • WCAG 2.4 Navigable / Section 508 (d), (i), (o):
    Automated testing cannot:
    • Determine whether tab order follows visual reading order.
    • Determine if focusable elements have sufficiently visible focus borders.
    • Check for context, such as multiple landmarks with the same name that need additional labeling so screen reader users can tell them apart.
    • Determine if a Skip Navigation link actually goes to the correct part of the page.
    • Determine if an action is triggered on focus that should only occur when the user actively selects the element, not merely focuses on it.
  • WCAG 3.2 Predictable:
    Automated testing cannot:
    • Check dynamic content triggered by an On Input event, such as a radio button selection that displays additional form fields the screen reader needs to announce.
    • Identify inconsistent navigation throughout your site.
  • WCAG 3.3 Input Assistance / Section 508 (n):
    Automated testing cannot:
    • Tell you whether error messages are properly announced by screen readers.
    • Determine if form errors are identified to the user.
    • Determine if proper labels or instructions are provided for form inputs, or if the user is informed of required fields.
  • WCAG 4.1 Compatible / Section 508 (n):
    Automated testing cannot:
    • Determine if a series of tabs should be coded with role=tab or other techniques to properly announce their role, state, and value (see the tab markup sketch after this list).
    • Determine if proper conventions are used for controls, such as expand and collapse items that are not coded to announce their role, state, or value correctly.
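
To make the color contrast item above concrete (see the WCAG 1.4 bullet), here is a sketch in Python of the WCAG 2.0 contrast-ratio formula that automated tools can apply to colors declared in CSS or markup. The formula needs explicit foreground and background color values; when text is rendered inside an image, those values are not present in the code, which is why no scan can evaluate it.

    # WCAG 2.0 contrast ratio, computable only when both colors are declared in code.
    def relative_luminance(rgb):
        """Relative luminance per WCAG 2.0; rgb given as 0-255 integers."""
        def linearize(c):
            c = c / 255
            return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
        r, g, b = (linearize(c) for c in rgb)
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    def contrast_ratio(foreground, background):
        lighter, darker = sorted(
            (relative_luminance(foreground), relative_luminance(background)),
            reverse=True)
        return (lighter + 0.05) / (darker + 0.05)

    # Example: mid-gray (#777777) text on a white background.
    ratio = contrast_ratio((119, 119, 119), (255, 255, 255))
    print(f"{ratio:.2f}:1 ->", "passes" if ratio >= 4.5 else "fails",
          "the WCAG AA minimum of 4.5:1 for normal-size text")

The same arithmetic is useless against a banner image with text baked into it: the pixel colors of the rendered glyphs are not declared anywhere a scanner can read them.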
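
For the role=tab item above, here is a hedged illustration (the markup is hypothetical, not drawn from any audited site): the widget below is perfectly valid HTML, so an HTML validator finds nothing to report, and a scan can at most list which role attributes are present. It cannot know that these divs behave as tabs and therefore should announce a role, state, and value.

    from html.parser import HTMLParser

    # Hypothetical markup: visually this is a set of tabs, but nothing in the
    # code says so -- and it remains valid HTML.
    WIDGET = """
    <div class="tabs">
      <div class="tab active">Overview</div>
      <div class="tab">Pricing</div>
    </div>
    """

    class RoleCollector(HTMLParser):
        """Lists declared role attributes -- the most a syntax-level scan can do."""
        def __init__(self):
            super().__init__()
            self.roles = []

        def handle_starttag(self, tag, attrs):
            role = dict(attrs).get("role")
            if role:
                self.roles.append(role)

    collector = RoleCollector()
    collector.feed(WIDGET)
    print("Roles declared:", collector.roles or "none")
    # Output: "Roles declared: none". The scan cannot say whether that is a
    # failure; only a person who sees these items behaving as tabs knows they
    # should carry role="tablist"/role="tab" with an aria-selected state.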

