Apple Previews Accessibility Features Coming Later This Year
No product from any single company will be perfect for every consumer, and no two users are exactly alike. Some users need additional accommodations, whether for physical, cognitive, or visual needs. To help make its ecosystem usable by everyone, Apple has put tremendous effort into accessibility.
Accessibility is not an afterthought for Apple; it is built into every operating system the company ships, and it is not a set of features added once and never improved upon. To celebrate Global Accessibility Awareness Day, which is May 18th, Apple has previewed a set of new accessibility features coming to its operating systems later this year. The features are:
- Point and Speak in Detection Mode in Magnifier
- Assistive Access
- Live Speech and Personal Voice
Let us look at each of these in turn, starting with Point and Speak in Detection Mode in Magnifier.
Point and Speak in Detection Mode in Magnifier
Users with vision problems have long been able to benefit from a feature called Magnifier, which, as the name suggests, magnifies whatever the iPhone's camera is pointed at. Enlarging what is shown is only a first step, though. Later this year, Magnifier will also be able to detect text and speak it aloud to the user, making it far easier to understand the text on signs, appliance buttons, and a myriad of other items.
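Apple has not said how Point and Speak is built under the hood, but the basic idea, recognizing text in the camera feed and reading it aloud, can be sketched with public frameworks. The snippet below is a minimal, hypothetical illustration using Vision for text recognition and AVFoundation for speech; it is not Apple's implementation, and the TextSpeaker type is my own invention.

```swift
import Vision
import AVFoundation
import CoreGraphics

// Hypothetical sketch: recognize text in a camera frame and speak it aloud.
// This only illustrates the general "detect text, then speak it" idea; the
// real Point and Speak feature does more (for example, tracking where the
// user is pointing), which is not shown here.
final class TextSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    func recognizeAndSpeak(in image: CGImage) {
        let request = VNRecognizeTextRequest { [weak self] request, error in
            guard error == nil,
                  let observations = request.results as? [VNRecognizedTextObservation] else { return }

            // Join the best candidate string from each detected text region.
            let text = observations
                .compactMap { $0.topCandidates(1).first?.string }
                .joined(separator: ". ")
            guard !text.isEmpty else { return }

            // Speak the recognized text out loud.
            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
            self?.synthesizer.speak(utterance)
        }
        request.recognitionLevel = .accurate

        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        try? handler.perform([request])
    }
}
```

In practice you would feed camera frames into something like this continuously and decide which detected text to read based on what the user is pointing at.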
Assistive Access
Many people around the world have cognitive disabilities, whether temporary or permanent, and for them using an iPhone or iPad can be quite difficult, particularly when interfaces are dense and touch targets are small and hard to hit precisely. For those who can benefit from it, Assistive Access can be quite useful.
From Apple's press release:
Assistive Access uses innovations in design to distill apps and experiences to their essential features in order to lighten cognitive load. The feature reflects feedback from people with cognitive disabilities and their trusted supporters — focusing on the activities they enjoy — and that are foundational to iPhone and iPad: connecting with loved ones, capturing and enjoying photos, and listening to music.
Assistive Access creates a very easy-to-use interface with large buttons for the primary features a user might need. As an example, Assistive Access combines Phone and FaceTime into a single app that allows users to easily contact someone, whether by phone call or FaceTime. This will be a boon for those who need a simpler interface, letting them easily reach the functions of their device that matter most to them.
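Apple has not published the internals of Assistive Access, but the idea of distilling an interface down to a few large, easy-to-hit targets is easy to picture in code. The SwiftUI view below is a hypothetical sketch of that concept, not Apple's implementation; the SimplifiedHomeView name and the list of activities are my own.

```swift
import SwiftUI

// Hypothetical sketch of a distilled, large-button home screen in the spirit
// of Assistive Access: a handful of essential activities, each presented as a
// big, prominent target.
struct SimplifiedHomeView: View {
    // The essential activities the interface is reduced to (illustrative only).
    private let activities = ["Calls", "Messages", "Camera", "Photos", "Music"]

    var body: some View {
        ScrollView {
            VStack(spacing: 16) {
                ForEach(activities, id: \.self) { activity in
                    Button {
                        // Launch the corresponding simplified experience here.
                        print("Opening \(activity)")
                    } label: {
                        Text(activity)
                            .font(.largeTitle.bold())
                            .frame(maxWidth: .infinity, minHeight: 120)
                    }
                    .buttonStyle(.borderedProminent)
                }
            }
            .padding()
        }
    }
}
```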
Live Speech and Personal Voice
Live Speech is a forthcoming feature that will allow users to type out what they want to say and have it spoken aloud, whether during phone calls, FaceTime calls, or in-person conversations. For some, that alone might be enough, but Personal Voice goes further. From Apple's press release:
For users at risk of losing their ability to speak — such as those with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can progressively impact speaking ability — Personal Voice is a simple and secure way to create a voice that sounds like them.
Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on iPhone or iPad. This speech accessibility feature uses on-device machine learning to keep users’ information private and secure, and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones.
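The mechanics behind Personal Voice are not public, but the type-to-speak half of this, Live Speech, can be approximated with Apple's existing public speech synthesis API. The sketch below is a hypothetical illustration using AVSpeechSynthesizer; it is not Apple's implementation, and the TypeToSpeak helper is my own.

```swift
import AVFoundation

// Hypothetical sketch of a type-to-speak helper, in the spirit of Live Speech:
// the user types a phrase and the device speaks it aloud.
final class TypeToSpeak {
    private let synthesizer = AVSpeechSynthesizer()

    /// Speak the typed phrase aloud using the given voice, or a default
    /// system voice if none is supplied.
    func speak(_ typedPhrase: String, voice: AVSpeechSynthesisVoice? = nil) {
        let utterance = AVSpeechUtterance(string: typedPhrase)
        utterance.voice = voice ?? AVSpeechSynthesisVoice(language: "en-US")
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }
}

// Usage (for example, in a playground): speak a saved phrase during a call
// or an in-person conversation.
let speaker = TypeToSpeak()
speaker.speak("I'll be there in ten minutes.")
```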
Additional Features
The items outlined above are just the beginning. Apple has also outlined a number of additional features coming. These include:
- Deaf or hard-of-hearing users can pair Made for iPhone hearing devices directly to Mac and customize them for their hearing comfort.
- Voice Control adds phonetic suggestions for text editing so users who type with their voice can choose the right word out of several that might sound alike, like “do,” “due,” and “dew.” Additionally, with Voice Control Guide, users can learn tips and tricks about using voice commands as an alternative to touch and typing across iPhone, iPad, and Mac.
- Users with physical and motor disabilities who use Switch Control can turn any switch into a virtual game controller to play their favorite games on iPhone and iPad.
- For users with low vision, Text Size is now easier to adjust across Mac apps such as Finder, Messages, Mail, Calendar, and Notes.
- Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari.
- For VoiceOver users, Siri voices sound natural and expressive even at high rates of speech feedback; users can also customize the rate at which Siri speaks to them, with options ranging from 0.8x to 2x.
Closing Thoughts
Accessibility is a large part of Apple's operating systems, allowing users, particularly those who need some sort of accommodation, to customize their devices to work in the manner that functions best for them. For any operating system to be successful, accessibility cannot be something that is bolted on after the fact. It has to be a primary concern, and it should help users create the best experience for their needs.
The three features highlighted above, "Point and Speak in Detection Mode in Magnifier," "Assistive Access," and "Live Speech and Personal Voice," are just the latest in a long line of accessibility features across iOS, iPadOS, macOS, watchOS, and tvOS. I suspect we will see many more accessibility features added in the future.
Source: Apple Newsroom.