Can TalkBack Read Me A Story?


Someone asked me the other day how accessibility services behave if you use a single TextView with multiple spans, instead of multiple TextViews. I didn’t know, so I tried it out.

Let’s start with three regular TextViews:

https://gist.github.com/ataulm/c846a6520f11b3d9a3604b804f7ebf95

We style each one differently so the sections are visually distinct, then bind text to the views:

https://gist.github.com/ataulm/f877e6e0c98516548b1f4cbcb2cd5103
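To make it concrete, here’s roughly what the binding looks like in Kotlin. The view IDs and layout name are made up for illustration; the gist above has the actual code.

```kotlin
import android.os.Bundle
import android.widget.TextView
import androidx.appcompat.app.AppCompatActivity

class StoryActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_story)

        // Each line of the story goes into its own TextView;
        // the visual styles are declared in the layout XML.
        findViewById<TextView>(R.id.title).text = "A Short Story"
        findViewById<TextView>(R.id.body).text = "This is a short story"
        findViewById<TextView>(R.id.ending).text = "fin."
    }
}
```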

An app showing three lines: “A Short Story”, “This is a short story” and “fin.”, each with a different visual style

With TalkBack enabled, how will it sound? The gif below shows that it treats each TextView independently; a swipe-right “next” gesture is required to navigate to the next line, where the text is then read aloud.

Gif showing TalkBack traversal between each line

Let’s try the same thing but with one TextView, using spans to style the text.

https://gist.github.com/ataulm/19bbbf0a838b746c6a40d0bb0314a053
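Here’s a rough Kotlin sketch of building that single piece of text with spans. The specific spans and sizes are my guesses rather than what’s in the gist above:

```kotlin
import android.graphics.Typeface
import android.text.SpannableStringBuilder
import android.text.Spanned
import android.text.style.RelativeSizeSpan
import android.text.style.StyleSpan
import android.widget.TextView

fun bindStory(storyView: TextView) {
    val story = SpannableStringBuilder()

    // "A Short Story": larger and bold, like a title.
    story.append("A Short Story\n")
    story.setSpan(RelativeSizeSpan(1.5f), 0, story.length, Spanned.SPAN_EXCLUSIVE_EXCLUSIVE)
    story.setSpan(StyleSpan(Typeface.BOLD), 0, story.length, Spanned.SPAN_EXCLUSIVE_EXCLUSIVE)

    // "This is a short story": left unstyled.
    story.append("This is a short story\n")

    // "fin.": italic.
    val endingStart = story.length
    story.append("fin.")
    story.setSpan(StyleSpan(Typeface.ITALIC), endingStart, story.length, Spanned.SPAN_EXCLUSIVE_EXCLUSIVE)

    storyView.text = story
}
```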

TalkBack will read the text aloud, pausing between lines (since line breaks separate them), with no need to swipe between each one.

If the user wants, they can change the granularity at which the text is read aloud: instead of reading the entire view (the default), it can read lines, words or even characters, requiring a “next” gesture to move to the next unit.

The screen is selected and captions for TalkBack are displayed. Each of the three lines has been read aloud, with a pause between each

Depending on the span though, TalkBack will add additional auditory cues!

(This article was more interesting before TalkBack 6.0 was released…)

In earlier versions of TalkBack, the pitch would be adjusted to reflect the style span, and an earcon would be added (an earcon, a portmanteau of “ear” and “icon”, is a short sound that plays to help the user understand the context of what’s being read):

https://gist.github.com/ataulm/36268a62b6a358dd8bd212c2f68bde83

How cool is that! I guess it wasn’t very useful, or people didn’t notice, because TalkBack 6.0 dropped this functionality. Not for everything, mind. If the text contains any clickable spans (e.g. URLSpan), these are still handled as a special case, and you can hear a faint pop just before the link text is read aloud:

https://gist.github.com/ataulm/fdb37a00dddf249699ec6027466a0081
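For reference, adding a clickable span looks something like this in Kotlin. The URL and the linked words are invented for illustration, not taken from the gist above:

```kotlin
import android.text.SpannableString
import android.text.Spanned
import android.text.method.LinkMovementMethod
import android.text.style.URLSpan
import android.widget.TextView

fun bindStoryWithLink(storyView: TextView) {
    val text = "This is a short story"
    val spannable = SpannableString(text)

    // Turn the words "short story" into a link.
    val linkStart = text.indexOf("short story")
    spannable.setSpan(
        URLSpan("https://example.com/short-story"),
        linkStart,
        linkStart + "short story".length,
        Spanned.SPAN_EXCLUSIVE_EXCLUSIVE
    )

    storyView.text = spannable
    // Needed so sighted users can tap the link; TalkBack users will
    // reach it through the local context menu instead.
    storyView.movementMethod = LinkMovementMethod.getInstance()
}
```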

And how are these clickable spans accessed by TalkBack users? We already saw above that the entire TextView is selected, so they can’t just tap on the links directly.

As far as I understand, the only way for TalkBack users to access these links is through the local context menu for the TextView; the “Links” option contains the hyperlinks, which the user can then select from a list dialog.

Gif shows similar text, but with two hyperlinks indicated by different color text and underline. The local context menu is used to navigate to the two links individually.

Accessing the local context menu might not be possible for all users, and even if it is, what a pain to have to go through all those steps just to follow a link! Does that mean you shouldn’t use spans if you want to make an app accessible for TalkBack users?

No way, you can do what you like. We can provide an alternative UX to make it more straightforward by detecting the presence of a spoken-feedback accessibility service like TalkBack, or even a keyboard/switch-access device (I haven’t yet tested how these clickable spans behave in those cases).
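Here’s one way you might detect a spoken-feedback service with AccessibilityManager (a rough sketch, not tied to any code in this post):

```kotlin
import android.accessibilityservice.AccessibilityServiceInfo
import android.content.Context
import android.view.accessibility.AccessibilityManager

// Returns true if at least one spoken-feedback service (like TalkBack)
// is currently enabled, so the app can swap in a more direct UX for links.
fun spokenFeedbackEnabled(context: Context): Boolean {
    val accessibilityManager =
        context.getSystemService(Context.ACCESSIBILITY_SERVICE) as AccessibilityManager
    if (!accessibilityManager.isEnabled) return false

    val spokenFeedbackServices = accessibilityManager.getEnabledAccessibilityServiceList(
        AccessibilityServiceInfo.FEEDBACK_SPOKEN
    )
    return spokenFeedbackServices.isNotEmpty()
}
```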

For this particular case, I’d recommend the same pattern we use when developing single-action views: hide inline actions (like “play” buttons, or in this case hyperlinks) and instead offer a single on-tap action, which opens a dialog listing all available actions for that view.

This is the approach taken in apps like Google+, and the technique is described here.
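As a rough sketch of that pattern (the action names and the openUrl() helper are hypothetical):

```kotlin
import android.app.AlertDialog
import android.view.View

// Hide the inline links and offer them (plus any other actions)
// in a dialog that opens on a single tap of the whole view.
fun makeSingleAction(storyView: View, openUrl: (String) -> Unit) {
    val actions = listOf(
        "Open \"short story\" link" to { openUrl("https://example.com/short-story") },
        "Open \"fin.\" link" to { openUrl("https://example.com/fin") }
    )

    storyView.setOnClickListener {
        AlertDialog.Builder(storyView.context)
            .setTitle("Actions")
            .setItems(actions.map { it.first }.toTypedArray()) { _, which ->
                actions[which].second.invoke()
            }
            .show()
    }
}
```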

Let me know if you have questions!