iOS 18 Accessibility Features Look Interesting

I’m writing from the point of view of a Certified Professional in Accessibility Core Competencies, not as an Apple fanboy.[1] Here’s what looks the most interesting from my point of view:

  • Vocal Shortcuts and Listen for Atypical Speech. As we know, certain physical conditions affect the way speech is understood, especially for stroke survivors, those with cerebral palsy, and people with a range of other conditions. Most voice recognition systems fail outright for a significant portion of the population whose conditions affect the vocal folds. If what Apple is promising comes to fruition, this could make voice recognition far more effective for them, and might be a genuinely positive use case for artificial intelligence.
  • CarPlay finally gets color filters and bigger text. Now that I finally have a car that supports CarPlay, I’ve been wondering how it works for those with color blindness and other vision concerns. The image below is one of the standard CarPlay screens as presented by Apple in its press release about iOS 18 accessibility. The wallpaper is one of the default wallpapers in iOS 17, with gradients of red, orange, pink, purple, and blue from top to bottom. Note how the green of the Phone app and the green of the Messages app overlap with the red in the wallpaper. Color blindness fail! The bold and large text could be promising for all users: by default, bold and large text decreases the number of icons per screen, which in turn makes each tap zone bigger and thus easier to use.
Apple’s CarPlay screen with icons for various applications.

My hope is that Vocal Shortcuts and CarPlay can be combined: how nice would it be for someone with a vocal condition to finally be understood in a moving car?! Then again, I can also see this being a tool for power users, beyond what Apple’s existing shortcuts can already do.

  • Eye Tracking also looks interesting: this could make it easier for users to manipulate iPads and iPhones using only their eyes.
  • There’s also Music Haptics for deaf and hard of hearing users, which is nice, but the addition of an API (application programming interface) for developers to make music more accessible generally is the real story here. This could make it easier to build accessibility into apps by providing a secondary cue that works for deaf and hard of hearing users alongside existing sound cues.

I really can’t speak to the visionOS improvements in great detail, but the addition of Live Captions seems to be quite important.

I will say that I’m encouraged by this particular press release. Now to see how well it will play out beginning next month…

Notes

1. Trust me: my wallet does not like the price I’d have to pay for the MacBook Pro I want to replace my mid-2015 model! The memory upgrade prices are way too steep in my opinion. Why I need more memory and CPU cores is related to Stockfish’s ability to take advantage of them in chess analysis. And the home CFO has thus far vetoed the upgrade. But I digress…