  The Problem with Automated Accessibility Testing Tools

     

    An automated accessibility tool is a piece of software which can test a web page, or even an entire website, for accessibility. Automated accessibility tools are useful because they can save you a huge amount of time. Don't want to check images for alt text on each and every page on your website? Run the site through an automated tester and it'll do it all for you!

    Automated accessibility testing tools have been around for a long time and have historically been a useful way of checking websites for accessibility. Bobby, one of the first and most well-known automated accessibility testing tools, is now almost 10 years old, and although it is no longer freely available, plenty of other free tools such as WebXact (http://webxact.watchfire.com/) and Wave (http://wave.webaim.org/index.jsp) do exist.

    But are these tools a little too good to be true? Can you test a website for accessibility so easily? Unfortunately the answer is a resounding no. There are a number of underlying problems associated with using just automated tools to test for accessibility:

    Literal interpretation of guidelines

    Any automated accessibility testing tool, being a piece of software, doesn't have very much in the way of common sense. It will interpret each and every accessibility guideline literally, without giving any thought to what else is on the page.

    The definition of the word guideline, according to Dictionary.com, is "a rule or principle that provides guidance to appropriate behaviour". A guideline simply offers guidance as to what best practice is - it shouldn't just be applied without regard to other factors.

    Can't check any content issues

    The way that content is structured, both on the page and across the website, is a massive part of accessibility. A website may be perfectly coded and conform to the highest coding standards, but if its content is poorly structured the site will still prove difficult, or even impossible, for some web users with special needs to use. No automated tool can check content issues such as:

    - Front-loading content so that each paragraph begins with the conclusion

    - Ensuring content has been broken down into manageable chunks with descriptive sub-headings

    - Using lists wherever appropriate

    - Ensuring that plain and simple language is used
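
    To illustrate (with invented content, purely as an example), the kind of structure these points describe might be marked up as below - and no automated tool can judge whether the sub-heading is descriptive, the paragraph front-loaded, or the language plain:

        <h2>Opening hours: closed on public holidays</h2>
        <p>The library is closed on all public holidays. On other days it
        opens at the following times:</p>
        <ul>
          <li>Monday to Friday: 9am to 8pm</li>
          <li>Saturday and Sunday: 10am to 4pm</li>
        </ul>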

    Can't check many coding issues

    Similarly, many coding issues require human judgement and can't be reliably checked by an automated tool, including:

    - Ensuring that text is real text and isn't embedded within images

    - Making sure that the site functions without the use of JavaScript or Flash

    - Providing equivalent text links if using server-side image maps

    - Ensuring that the structure within the HTML reflects the visual appearance (e.g. headings are labelled as headings within the HTML code)
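
    As a simple illustration of the first and last of these points (the markup here is invented for the example), an automated tool can't tell that the first snippet below uses a styled paragraph where a real heading belongs, or that the last conveys its text only through an image:

        <!-- Looks like a heading visually, but isn't one in the HTML -->
        <p class="big-bold-text">Contact us</p>

        <!-- Marked up as a real heading -->
        <h2>Contact us</h2>

        <!-- Text embedded in an image rather than provided as real text -->
        <img src="special-offer.gif" alt="Special offer">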

    Outdated guidelines are used

    Automated accessibility testing tools generally test against the W3C accessibility guidelines (WCAG 1.0), which by now are over five years old. As such, a number of these guidelines are outdated and no longer apply. In fact, some of them are now thought to hinder accessibility rather than help, so it's best to ignore these guidelines entirely.

    For example, an automated accessibility testing tool will probably insist that form items contain default place-holding text. It may also insist that adjacent links be separated by non-link text. Neither of these guidelines is relevant anymore, and implementing them could make accessibility worse rather than better.
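
    Following those two outdated guidelines to the letter would produce markup along these lines (an invented example):

        <!-- Default place-holding text in a form field -->
        <input type="text" name="search" value="Enter search terms here">

        <!-- Adjacent links separated by non-link characters -->
        <a href="/news/">News</a> | <a href="/about/">About</a> | <a href="/contact/">Contact</a>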

    Most guidelines aren't properly checked

    Automated accessibility tools can check for a number of guidelines and can tell you when a guideline isn't being adhered to. However, when the tool claims that a guideline is being fulfilled, that claim may in fact be false.

    For example, if all images contain alt text then the software will report a pass for this guideline. But what if the alt text isn't descriptive of its image? What if alt text is crammed full of nonsensical keywords for search engines? How can an automated accessibility tool possibly know this?
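
    To illustrate with an invented example, an automated tool would report a pass for both of the images below, even though only the first alt text is of any use to someone who can't see the image:

        <img src="team.jpg" alt="The five members of the support team at their desks">
        <img src="team.jpg" alt="cheap flights hotels holidays best prices">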

    Warnings may be misinterpreted

    The reports generated by automated accessibility tools provide warnings as well as errors. These warnings cover guidelines that the tool can't check automatically but which may indicate errors. Often they don't, and in many cases they're not even relevant to the page in question. However, some people reading a report may try to get rid of these warning messages by making changes to their site. By doing so, they may be implementing guidelines that needn't be implemented and inadvertently lowering the website's accessibility.

    Conclusion

    Automated accessibility testing tools can be useful as they can save a large amount of time in performing some very basic checks for accessibility. However, they must be used with caution and cannot be relied on as a stand-alone guide for accessibility checking. Indeed, some expert accessibility knowledge should always be applied in evaluating a site's accessibility, perhaps in conjunction with the fantastic web accessibility toolbar (http://www.nils.org.au/ais/web/resources/toolbar/) to help dramatically speed up manual checks.

     


