Our iOS SDK includes basic support for VoiceOver, the screen reader that comes with every iPhone and iPad. Now, users with visual impairments have the same access as sighted users to the information in map markers and callout views.
An invisible map
VoiceOver is integrated deeply into the operating system, and support for it is pervasive across Apple’s first-party applications. Visually impaired users expect third-party applications to make their entire interfaces accessible as well, and Apple even promotes such applications on the App Store.
It’s easy to overlook the importance of making a map fully accessible to screen readers. After all, an image inside your application is best narrated as a single interface element; in general, it would be tedious to treat each region of the image as its own interface element. Some leading navigation applications on iOS take this approach to accessibility. For example, much of the Google Maps interface can be heard in this sound recording, but there’s something missing:
In a video of the same scenario, notice how the focus rectangle skips past the map’s content – the user location marker and search result pin – as though the map doesn’t really exist:
A map view isn’t an ordinary image: like a table view, it contains meaningful data. To take full advantage of your application, a visually impaired user needs to be able to navigate among the markers on the map and hear their names read aloud.
To be sure, points of interest are surfaced in tabular form in Google Maps’ search results. But if you’re developing your own application using the Google Maps SDK, you have to do extra work to build an alternative textual UI. Otherwise, the data represented by these map markers is completely inaccessible to VoiceOver users.
Our iOS SDK integrates directly with VoiceOver: for the most common use cases, you write your interface once and it works intuitively with the screen reader turned on, with no additional effort on your part. VoiceOver lets the user navigate among the map’s controls, including the compass and the user location annotation, and reads aloud the name of each pin on the map. For all your users, the map behaves as expected without any impact on performance.
If you want more control over what VoiceOver users hear, you can implement the standard UIAccessibility methods on your annotation views or use feature querying to read aloud elements of the base map, such as street names or points of interest.
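As a sketch of that first option, here is roughly what setting the standard UIAccessibility properties on an annotation view might look like. The `isAccessibilityElement`, `accessibilityLabel`, `accessibilityValue`, `accessibilityHint`, and `accessibilityTraits` members are standard UIKit API; the `MGLAnnotationView` subclass, the `CoffeeShopAnnotationView` name, and the `configureAccessibility` helper are illustrative assumptions, not part of any shipped interface:

```swift
import UIKit
import Mapbox  // assumed module name for the SDK

// Hypothetical annotation view for a point of interest. VoiceOver treats it
// as a single focusable element and speaks the label, value, and hint.
class CoffeeShopAnnotationView: MGLAnnotationView {
    func configureAccessibility(name: String, isOpen: Bool) {
        // Expose this view to VoiceOver as one element.
        isAccessibilityElement = true
        // Spoken first when the element gains focus.
        accessibilityLabel = name
        // Supplementary state, spoken after the label.
        accessibilityValue = isOpen ? "Open" : "Closed"
        // Tells the user what a double-tap will do.
        accessibilityHint = "Double tap to show details"
        // Announced as a button, since tapping activates the callout.
        accessibilityTraits = .button
    }
}
```

The same properties can be set on any `UIView`, so the approach carries over to callout views and custom map controls as well.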