Short note on ARIA support


It is easy to get the idea that ARIA fixes everything for accessibility, but the reality is that ARIA serves a very specific purpose, for a very specific audience.

ARIA is only supported in browsers and screen readers. Actually, a tiny amount of ARIA is also supported in recent versions of Dragon, but not enough to make it a credible solution for speech recognition users.

Browser support for ARIA depends on the accessibility Application Programming Interface (API) being used. Fortunately, there is good support for ARIA in all accessibility APIs. The Core Accessibility API Mappings (AAM) specification describes how each ARIA role and state should be implemented in the accessibility APIs.
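
For illustration, here is a minimal sketch of the kind of markup those mappings cover; the toggle control and its label are invented for this example, not taken from the article or the specification.

    <!-- A minimal sketch: a custom toggle control using ARIA. Core-AAM
         describes how role="button" should be exposed as a button object in
         each platform accessibility API, and how the aria-pressed state should
         be exposed as that object's pressed/toggled state; the exact object
         and property names differ from API to API. -->
    <div role="button" tabindex="0" aria-pressed="false">
      Mute notifications
    </div>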

Of all the available Assistive Technologies (AT), only screen readers (Jaws, Narrator, NVDA, TalkBack, and VoiceOver) properly support ARIA. So to all intents and purposes, ARIA is a screen reader only solution. There is more information about the relationship between the accessibility API, browser, and AT in "What does accessibility supported mean?"

Documentation of screen reader support for ARIA is scarce. There is an effort to document Jaws support for ARIA (and other web standards), and PowerMapper publishes an ARIA screen reader compatibility table, but there is little else. This gap in our collective knowledge is a bigger problem than you might think.

ARIA is not tested for screen reader support before the ARIA specification is published. The W3C requires that every feature in a specification, in this case every role, state and property, has at least two working implementations. The purpose of this requirement is to make sure that features are viable for use in the wild.

The catch is that ARIA is tested for support in accessibility APIs, but not in screen readers. In other words, it is possible for a feature to be included in the ARIA specification because it has accessibility API support, but for it to have little or no practical support in screen readers. The definition and term roles are two such examples.
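
For example, markup using those roles (the wording below is invented purely for this sketch) is valid and is mapped by the accessibility APIs, yet it may produce little or no distinct output in current screen readers:

    <!-- A minimal sketch of the definition and term roles mentioned above.
         Both are mapped in the accessibility APIs, but screen reader support
         for them is inconsistent, so native <dl>, <dt>, and <dd> markup is
         usually the safer choice. -->
    <p>
      <span role="term">ARIA</span>:
      <span role="definition">Accessible Rich Internet Applications</span>
    </p>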

So test your ARIA implementation with screen readers before you deploy your code. Remember, the experience doesn’t have to be identical in every screen reader, but it does have to be usable in most.
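
As a rough illustration of the kind of thing to check, the disclosure sketch below is an invented example: run it through at least a couple of screen readers and confirm that the button's name, role, and expanded or collapsed state are all announced, even if the exact wording differs between them.

    <!-- A minimal sketch of a disclosure widget to verify with screen readers.
         A script (not shown) would toggle aria-expanded between "true" and
         "false" and show or hide the associated region when the button is
         activated. -->
    <button type="button" aria-expanded="false" aria-controls="more-details">
      More details
    </button>
    <div id="more-details" hidden>
      Additional content goes here.
    </div>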


Comments

Piedanna fabrice says:

AT needs better APIs for testers. It's clearly difficult to switch from one AT to another (and from one operating system to another) as an auditor, and for developers it's a utopia. Jaws is far too expensive, NVDA is not user friendly for sighted users, and VoiceOver is exclusive and expensive. There is no way to export headings, links, areas, or well-known ARIA patterns, and to see the website visually the way a non-sighted user experiences it. Reviewing these things out of context would probably help developers see the cognitive load for AT users too. If you see the whole content you have to be experienced and empathetic to think like AT users; even as an auditor it's sometimes hard not to have a biased opinion when you have the global context.

Aaron Di Blasi says:

A really useful frame of reference for the current state of support. Thank you. I think the defining sentence is, “So to all intents and purposes, ARIA is a screen reader only solution.” While I don’t personally think that’s all it could be, I agree that the bottleneck in testing is switching from AT to AT and OS to OS. It’s not impossible, but it can be exhausting. These same problems used to plague regular web development too, until services like BrowserStack.com and CrossBrowserTesting.com came along. I just think that an online tool of this nature for AT would almost certainly speed up the pace of change. Again, great article.

Joanne lastort says:

You can use the speech viewer with NVDA to make testing a little easier as a sighted person:

To enable the speech viewer, check the “Speech Viewer” menu item under Tools in the NVDA menu. Uncheck the menu item to disable it.

The speech viewer window contains a check box labeled “Show speech viewer on startup”. If this is checked, the speech viewer will open when NVDA is started. The speech viewer window will always attempt to re-open with the same dimensions and location as when it was closed.

While the speech viewer is enabled, it constantly updates to show you the most current text being spoken. However, if you click or focus inside the viewer, NVDA will temporarily stop updating the text, so that you are able to easily select or copy the existing content.

To toggle the speech viewer from anywhere, please assign a custom gesture using the Input Gestures dialog.