What you will learn

This tutorial explains how our SDK supports accessibility for your visually-impaired users.

How does accessibility work on Android?

Android provides a tool called TalkBack.

This is an accessibility service that helps visually-impaired users interact with their smartphones. It uses screen reading, vibration, and spoken feedback to let them know what’s on their screen.

How do you turn on TalkBack on your device?

You can activate TalkBack through the following steps:

  • Open your phone’s Settings, and access the Accessibility feature
  • Find the TalkBack service and switch it on (on Samsung devices, it may appear inside the View/Vision menu)
  • In the TalkBack settings, make sure you activate the Speak when screen is off option
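
For development devices, the steps above can also be scripted over adb. This is a sketch based on the standard Android secure settings keys; the TalkBack service component name is an assumption and may vary by device or TalkBack version:

```shell
# Enable the TalkBack accessibility service from the command line.
# The component name below is the common one for Google's TalkBack;
# verify it on your device with: adb shell settings get secure enabled_accessibility_services
adb shell settings put secure enabled_accessibility_services \
    com.google.android.marvin.talkback/com.google.android.marvin.talkback.TalkBackService
adb shell settings put secure accessibility_enabled 1
```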

How does our SDK manage TalkBack reading?

If you are using Auto beacon notifications, our SDK manages TalkBack reading through the setTicker() method of the NotificationCompat.Builder class.
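
As a minimal sketch of what this looks like in a custom-built notification (the channel id, icon, strings, and NOTIFICATION_ID below are placeholder assumptions, not values from our SDK):

```java
import androidx.core.app.NotificationCompat;
import androidx.core.app.NotificationManagerCompat;

// Build a notification that remains TalkBack-friendly.
NotificationCompat.Builder builder =
        new NotificationCompat.Builder(context, "beacon_channel")
                .setSmallIcon(R.drawable.ic_notification)
                .setContentTitle("Welcome to our store")
                .setContentText("Tap to see today's offers")
                // setTicker() supplies the text TalkBack announces
                // when the notification appears.
                .setTicker("Welcome to our store. Tap to see today's offers.");

NotificationManagerCompat.from(context).notify(NOTIFICATION_ID, builder.build());
```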

If you are creating custom-built notifications, don’t forget to call this method so that TalkBack works for your visually-impaired users.

Text-to-Speech content

Notifications and Alerts

As you may have learned from the Adtag data model tutorial, you can define the specific text that will be announced for your notifications and alerts, directly in ADTAG, by using the TextToSpeech field.

Welcome Notifications

Since Welcome Notifications are managed only in the SDK, you must use the correct constructor to pass the textToSpeech parameter.

BeaconWelcomeNotification(TYPE, title, description, textToSpeech, thumbnailId, pictureId, minDisplayTime)


For more information on the parameters of the constructor, do not hesitate to consult the Welcome Notification documentation.
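
A sketch of a call to this constructor follows; the title, description, and Text-to-Speech strings are illustrative assumptions, and TYPE, thumbnailId, pictureId, and minDisplayTime stand for the values described in the Welcome Notification documentation:

```java
// Hypothetical example values; consult the Welcome Notification
// documentation for the exact meaning of each parameter.
BeaconWelcomeNotification welcomeNotification = new BeaconWelcomeNotification(
        TYPE,                                         // notification type constant
        "Welcome!",                                   // title
        "Discover our new offers",                    // description
        "Welcome! Discover our new offers in store.", // textToSpeech, read by TalkBack
        thumbnailId,
        pictureId,
        minDisplayTime);
```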

Text-to-Speech reading by the SDK

For alerts, you have to build the Alert UI yourself. Make sure that it complies with the Android accessibility guidelines.

We recommend using the setContentDescription() method, available on any Android View, to expose the ADTAG alert Text-to-Speech content.

For example:
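
A minimal sketch, assuming an alert object with a hypothetical getTextToSpeech() accessor for the string defined in ADTAG (the view id and accessor name are illustrative, not part of our SDK's API):

```java
import android.view.View;

// Attach the ADTAG Text-to-Speech content to the alert view so that
// TalkBack reads it when the view gains accessibility focus.
View alertView = findViewById(R.id.alert_container); // placeholder view id
String textToSpeech = alert.getTextToSpeech();       // hypothetical accessor
alertView.setContentDescription(textToSpeech);
```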



Once you have defined the notification Text-to-Speech content in ADTAG, our SDK automatically configures the notification based on it.