Reputation: 21
Is it possible to build assistive applications for the iPhone that work in the same manner that VoiceOver does (using the UIAccessibility API)? To clarify, we would like to build a screen reader in the same vein as VoiceOver. Or is VoiceOver the only assistive technology that is allowed to work on an iOS device?
Upvotes: 2
Views: 379
Reputation: 14498
VoiceOver is currently the only assistive technology app on iOS, and I suspect that Apple will keep it that way. There are a number of benefits to having the screenreader be part and parcel of the whole package rather than allowing 3rd party apps, including:
A screenreader, by definition, needs to be able to access the UI and contents of other apps, which raises a whole host of security and privacy issues. While there are ways to mitigate this - e.g. Android requires assistive technologies to be specifically granted permission in a control panel - why even go there if it's not needed?
Some of the things that VoiceOver does - like intercepting touch - likely need special system support, and again, that's not something you generally want to allow any app to do. A screenreader is, in a sense, a special case of app. It's far easier to manage the cases where a screenreader needs special support from the OS if it's all in-house than if that support has to be extended to 3rd parties via some API, which would then need to be secured against misuse (see the point above), documented, and supported in future OS releases.
Having one screenreader means there's just one app to test against for accessibility, which hugely simplifies life for developers. On iOS, test with VoiceOver and you're done. By contrast, on Windows you may have to test against JAWS, NVDA, and perhaps Window-Eyes too. Some of these apps do things that others don't, so your app may need to work around one or the other.
Having the screenreader be part of the package also means that it works with new features right from an OS release: Apple can guarantee that new iOS features are accessible from day 1. To do this with 3rd party accessibility software, they'd have to let 3rd parties in on new OS features before release, which is unlikely for a company as secretive as Apple.
Upvotes: 1
Reputation: 70693
Yes, you could build your own screen reader technology into your own app.
You would have to include your own speech synthesis library, such as CMU FLite, which may not sound as good as VoiceOver, and subclass or add categories to all of your app's UI and text objects that you wanted to support your private assistive behavior.
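As a rough sketch of the subclassing approach, the snippet below uses AVSpeechSynthesizer from Apple's AVFoundation framework (available since iOS 7) in place of a bundled library like FLite; the class name is hypothetical. The button speaks its own accessibility label when touched, giving self-voicing behavior inside the app without VoiceOver running:

```objectivec
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

// Hypothetical UIButton subclass that speaks its accessibilityLabel
// (or its title, as a fallback) whenever the user touches it.
@interface SelfVoicingButton : UIButton
@property (nonatomic, strong) AVSpeechSynthesizer *synthesizer;
@end

@implementation SelfVoicingButton

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    if (!self.synthesizer) {
        self.synthesizer = [[AVSpeechSynthesizer alloc] init];
    }
    NSString *label = self.accessibilityLabel
        ?: [self titleForState:UIControlStateNormal];
    if (label) {
        AVSpeechUtterance *utterance =
            [AVSpeechUtterance speechUtteranceWithString:label];
        [self.synthesizer speakUtterance:utterance];
    }
}

@end
```

You would repeat this pattern (via subclasses or categories) for each UI class you want to self-voice, which is exactly the per-app effort the answer describes.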
There are a small number of talking apps in the iOS App store that do some limited custom voice assistance within some of the app's views, without VoiceOver having to be on. (Advertisement: my Talking Tuner is one example.)
Your assistive tech would only work within your own app; it would not be able to interact with the device's physical buttons or with other apps the way Siri and VoiceOver can.
Upvotes: 1
Reputation: 515
Unfortunately, VoiceOver is currently the only assistive technology allowed. Working with VoiceOver is pretty easy, though; all you have to do is add these lines of code for each item you want the user to be able to identify:
[myView setIsAccessibilityElement:YES];
[myView setAccessibilityTraits:UIAccessibilityTraitImage];
[myView setAccessibilityLabel:NSLocalizedString(@"Image of dog", nil)];
Upvotes: 0