Prototyping for Accessibility and Screen Readers

Hello everyone!
Does anyone have advice on how to create a prototype that we could test with users who rely on assistive technology, such as VoiceOver, NVDA, or JAWS?

We would like to start including PwD (people with disabilities) in our user/usability testing, but I cannot figure out how to accomplish even the simplest things, like creating a form where the labels actually have a relationship to the text fields they precede.

I have already read https://www.axure.com/blog/approachable-guide-prototyping-accessibility-axure-rp, but it doesn’t really help at the execution level.

Any advice would be very welcome!


If you are trying to conduct a usability test with participants who are blind or have limited/impaired vision, it is possible with Axure, but you might be better off using “real” HTML with ARIA support so that their own screen reader of choice actually works as they expect.
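For the form-labeling problem specifically, the HTML side is straightforward once you control the markup. A minimal sketch (the `id`/`name` values and the hint text are placeholders) of a label that screen readers will announce together with its field:

```html
<!-- Associating a visible label with its text field: -->
<!-- the label's "for" must match the input's "id". -->
<label for="email">Email address</label>
<input type="text" id="email" name="email"
       aria-describedby="email-hint">
<p id="email-hint">We only use this to send your receipt.</p>
```

With that association in place, VoiceOver, NVDA, and JAWS announce the label (and the hint, via `aria-describedby`) when the field receives focus, which is exactly the label-to-field relationship that is hard to express in a drag-and-drop prototype.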

A few years ago I was able to test some prototypes of custom accessible keypads for controlling embedded devices (think of an ATM or copier with a GUI touchscreen that otherwise wasn’t vision-accessible). There were buttons for Right, Left, and Select, which mapped to Tab, Shift-Tab, and Enter to step through screen elements and click them. In this context it was obvious to test participants that they couldn’t use their own screen readers or mobile devices to control the machine, and they were happy with anything that got close. However, when testing the help pages, which were a prototype of a website they accessed on their phones, we ran into all sorts of problems when their screen readers and accessibility features didn’t work quite right with our Axure prototype. (I’ve also learned that PwD participants tend to be more opinionated, picky, and less tolerant of bugs and limitations. A curse for the prototyper but a real blessing for the usability tester and design team.)

I wish I could post an .rp file, but it’s all proprietary content, as well as very complicated. I had a fair number of “non-selectable” widgets as well as selectable widgets in various dynamic panel states, so simply arranging the order of widgets in the Outline pane didn’t work for “tab order”. My basic method for tracking the order of widgets was to include a number in each widget name, such as “Title_01”, “SubTitle_02”, “Paragraph_03”, and “ctaButton_04”.

Then I had three hidden widgets I used as “page-local variables” (but you could also use global variables): “itemCurrent”, “itemPrev”, and “itemNext”. When a widget received focus, it would set text on “itemCurrent” to [[This.name]], on “itemPrev” to [[p.name]] (where p was a local variable pointing to the “previous” widget), and on “itemNext” to [[n.name]] (where n pointed to the next widget that should receive focus). So each widget only had to know which widget was on either side of it. If it was the last widget on the page, it set “itemNext” to the first widget to “wrap around” the list. In this way, at any time I could know which widget was focused and “where to go” if Tab or Shift-Tab (or the Right-arrow / Left-arrow buttons) was pressed. I used the OnPageKeyDown event to capture key presses and set focus on the appropriate widget.
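The bookkeeping above is easier to see as plain code. Here is a minimal JavaScript sketch of the same wrap-around prev/current/next scheme; `makeTabOrder` and the widget names are illustrative, standing in for the hidden “itemCurrent”/“itemPrev”/“itemNext” widgets, and the neighbors are computed from one ordered list rather than stored on each widget:

```javascript
// Sketch of the wrap-around tab-order bookkeeping described above.
// Widget names carry an order suffix (e.g. "Title_01") just as in the
// prototype; `state` plays the role of the three hidden tracker widgets.
function makeTabOrder(widgetNames) {
  const state = { itemCurrent: null, itemPrev: null, itemNext: null };

  const focus = (name) => {
    const i = widgetNames.indexOf(name);
    const n = widgetNames.length;
    state.itemCurrent = name;
    state.itemPrev = widgetNames[(i - 1 + n) % n]; // wraps to the last widget
    state.itemNext = widgetNames[(i + 1) % n];     // wraps to the first widget
    return state;
  };

  return {
    focus,                               // OnFocus: record current/prev/next
    next: () => focus(state.itemNext),   // Tab / Right-arrow button
    prev: () => focus(state.itemPrev),   // Shift-Tab / Left-arrow button
  };
}

// Usage:
const order = makeTabOrder(["Title_01", "SubTitle_02", "Paragraph_03", "ctaButton_04"]);
order.focus("Title_01");
order.next(); // itemCurrent becomes "SubTitle_02"
```

In Axure terms, `focus()` is what each widget’s OnFocus event does, and `next()`/`prev()` is what OnPageKeyDown does with the Tab and Shift-Tab cases.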

Additionally, when a widget received focus, its OnFocus event set the text on a hidden Text Field widget, “readMe”. In that widget’s OnTextChange event I had a JavaScript call to responsiveVoice, a nice text-to-speech web service, and that text would be read aloud, giving me a custom screen-reader ability.
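A minimal sketch of that read-aloud hook, assuming the responsiveVoice library is loaded on the page. `readAloud` is a hypothetical helper, and the `tts` parameter exists only so the logic can run outside a browser; in the prototype you would pass the global `responsiveVoice` object:

```javascript
// Sketch: called from the "readMe" widget's OnTextChange event.
// `tts` stands in for the global responsiveVoice object, whose
// speak(text, voiceName) method reads a string aloud, and whose
// cancel() method stops any utterance still playing.
function readAloud(text, tts) {
  if (!text) {
    return false;                         // nothing to announce
  }
  tts.cancel();                           // interrupt the previous utterance
  tts.speak(text, "UK English Female");   // voice name is an assumption
  return true;
}
```

Cancelling before speaking matters here: without it, tabbing quickly through widgets queues up a backlog of announcements instead of tracking the current focus.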

OK, I went ahead and made a little demo of this approach. Let me know if you’re interested in the text-to-speech output and I’ll add in the JavaScript for that.

Accessibility Tab-Order Demo.rp (120.7 KB)

Here is a thread with more info about using responsiveVoice:
