Apple Announces New Accessibility Features for Users With Disabilities
The new accessibility features will make it easier for users with disabilities to navigate their Apple devices.
Apple has announced new AI-powered accessibility features designed to give users with disabilities better tools for navigating their devices.
This is not the first time Apple has introduced features for people with disabilities. Just last year, it unveiled value-added features such as the Magnifier tool, VoiceOver, and Assistive Access with iOS 17.
These features allowed users with low vision, hearing loss, and other disabilities to customize their interfaces, making it easier for them to navigate their Apple devices.
Now, Apple has gone a step further by announcing new accessibility features, including Eye Tracking, Music Haptics, and Vocal Shortcuts, each offering different functionality. Eye Tracking, for example, gives users a built-in option for navigating their iPhone or iPad using only their eyes.
Music Haptics, meanwhile, lets users who are deaf or hard of hearing experience music on their iPhone through taps, textures, and refined vibrations synchronized with the song's audio.
Users with conditions that affect their speech can use Vocal Shortcuts to assign custom utterances that Siri can recognize, launching shortcuts and completing complex tasks. Listen for Atypical Speech can also enhance Siri's speech recognition across a wider range of speech patterns, including their own.
Users can expect access to these features with the release of iOS 18, iPadOS 18, macOS 15, and visionOS 2. The tech giant could also showcase them at its Worldwide Developers Conference (WWDC 2024), so keep an eye out.