iOS 18 brings a bunch of cool new accessibility features this year, including eye tracking on the iPhone for the first time. You can also feel haptic music, control your phone with custom voice commands, use your phone in the car without suffering from motion sickness, and improve the accuracy of Siri and dictation.
Check out our video to see it in action or keep reading below.
5 awesome accessibility features in iOS 18
Accessibility features are often some of the weirdest and most interesting parts of iOS. Designed to make Apple products easier to use for people with various disabilities, they typically fly under the radar. But they enable clever new ways of interacting with Apple devices, and often anyone can take advantage of them.
Many of these iOS 18 accessibility features can be added as toggles you can quickly turn on or off from your iPhone’s Control Center — or even from the Lock Screen.
These features are all new to iOS 18, which is available now via Settings > General > Software Update. The update works with any iPhone currently running iOS 17.
Table of contents: 5 awesome accessibility features in iOS 18
1. Eye Tracking
2. Music Haptics
3. Vocal Shortcuts
4. Vehicle Motion Cues
5. Listen for Atypical Speech
1. Eye Tracking
Eye tracking in iOS 18 is a remarkable and ambitious accessibility feature that lets you control your iPhone entirely with your eyes. Likely borrowing some of the tech from the advanced Vision Pro headset, it works completely hands-free. It can come in handy in a pinch, too, like when your hands are soapy from doing the dishes or grimy from working on a car or doing other dirty work.
Enable it in Settings > Accessibility > Eye Tracking, in the Physical and Motor section. Just like on the Vision Pro, you’ll need to calibrate your iPhone by following a dot around the screen with your eyes.
Once you activate iPhone eye tracking, you’ll see a floating on-screen cursor that follows your eyes. The cursor snaps to a button or toggle switch to select it, and if you stare at a button for half a second, it’ll activate. Staring at the floating on-screen button brings up gestures like scrolling, adjusting volume, and opening the Home Screen or Control Center. You can recalibrate eye tracking at any time by staring at the upper-left corner of the screen.
In testing, the feature doesn’t seem super-reliable, and the tracking is somewhat twitchy. That’s understandable: the Vision Pro puts multiple cameras within an inch of your face, while your iPhone is typically held at arm’s length and has only one front-facing camera, so the accuracy isn’t going to be the same. Plus, our testing took place on a beta.
For best results, put your phone on a MagSafe stand, get close to it, and make sure there aren’t a lot of reflections or glare if you’re wearing glasses.
2. Music Haptics
The new Music Haptics accessibility feature in iOS 18 adds another dimension to audio: vibration. Taking advantage of the incredible precision of the haptic motor in your iPhone, Music Haptics brings to life a specially created track of rhythmic vibrations and buzzing patterns timed to certain Apple Music songs. You can hold your iPhone in your hands and feel your music in a whole new way.
Enable it in Settings > Accessibility > Music Haptics, in the Hearing section. It requires a subscription to Apple Music. Plus, not every song in the catalog works with Music Haptics. If the feature is available, you’ll see a little Music Haptics button underneath the progress bar in the Music app.
There’s also an API for developers to use, so support may be added to your favorite third-party app in the future.
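Apple hasn’t detailed that API here, but to give a flavor of what haptic playback looks like in code, here’s a minimal sketch using Core Haptics, the existing general-purpose haptics framework (not the Music Haptics API itself). It plays a short rhythmic pattern of the kind a third-party music app might time to its own audio.

import CoreHaptics

// Minimal Core Haptics sketch: plays four evenly spaced "beats,"
// alternating strong and soft taps. Illustrative only; this is the
// general-purpose haptics framework, not the Music Haptics API.
func playRhythmPattern() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    var events: [CHHapticEvent] = []
    for beat in 0..<4 {
        let intensity = CHHapticEventParameter(parameterID: .hapticIntensity,
                                               value: beat % 2 == 0 ? 1.0 : 0.5)
        let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.7)
        events.append(CHHapticEvent(eventType: .hapticTransient,
                                    parameters: [intensity, sharpness],
                                    relativeTime: Double(beat) * 0.5))
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}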
3. Vocal Shortcuts
Vocal Shortcuts seems like one of the most useful accessibility features in iOS 18. It lets you set up your own audio cues: think “Hey Siri,” but for running your own custom actions from the Shortcuts app. (Here’s a quick rundown on Shortcuts, if you’re not familiar.) This can be a pretty powerful way to automate tasks on your iPhone with the power of your voice alone. You don’t need to make space on your Home Screen for a Shortcuts widget, type a shortcut’s name into Spotlight or, god forbid, open the Shortcuts app.
Set this up in Settings > Accessibility > Vocal Shortcuts, in the Speech section toward the bottom. Enable Vocal Shortcuts, then tap Add Action to create one. A Vocal Shortcut can either run a Siri command or a Shortcut:
- Create a Siri Request by entering a command you frequently give to Siri, like “Text my partner that I’m on my way home,” or “Play music by Driftless Pony Club,” or “Roll a D20.”
- Alternatively, scroll down to pick any Shortcut currently on your phone.
Then, enter the trigger phrase you want to use to activate the shortcut. This can be anything you want, from something simple like “Stock update” or “Favorite music” to something silly like “Ahoy, computer!” You’ll need to repeat the phrase a few times so your phone can learn it, but then it’ll be ready to go. I recommend saying the phrase at different pitches and holding your phone at different distances, so it learns to recognize all the different ways you might say your custom Vocal Shortcut aloud.
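If you build apps yourself, the actions that show up in the Shortcuts app (and can therefore be triggered by a Vocal Shortcut) come from Apple’s App Intents framework. Here’s a minimal, hypothetical sketch; the intent and its data layer are made up for illustration, not taken from any real app.

import AppIntents

// Hypothetical example: an intent like this appears as an action in the
// Shortcuts app, so a user could wrap it in a shortcut and fire it with
// a custom Vocal Shortcuts phrase like "Log my water."
struct LogWaterIntent: AppIntent {
    static var title: LocalizedStringResource = "Log a Glass of Water"
    static var description = IntentDescription("Records one glass of water.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        WaterLog.shared.add(glasses: 1)  // stand-in for the app's real logic
        return .result(dialog: "Logged one glass of water.")
    }
}

// Hypothetical stand-in for the app's data layer.
final class WaterLog {
    static let shared = WaterLog()
    private(set) var glasses = 0
    func add(glasses count: Int) { glasses += count }
}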
4. Vehicle Motion Cues
If you get motion sick easily, you’ve probably had a hard time using your iPhone during a bumpy car ride. One solution is to buy a car with expensive air-ride suspension. A much cheaper option is to tap the new Vehicle Motion Cues accessibility feature in iOS 18. To enable it, go to Settings > Accessibility > Motion > Show Vehicle Motion Cues. (Setting it to Automatic ensures it comes on whenever there’s significant motion and turns off when you’re sitting still.)
With Vehicle Motion Cues active, dots along the edges of your iPhone screen animate in sync with the motion of the plane, train or automobile you’re riding in. According to Apple, “Motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel,” and these animations “reduce sensory conflict.” Hopefully, the dots will let you read or watch TikTok on a long, winding road trip without feeling ill.
5. Listen for Atypical Speech
If you have a condition that affects your speech, iOS 18’s new Listen for Atypical Speech accessibility feature could offer a big improvement on how dictation and Siri understand your voice. According to the Apple press release, the feature will “recognize user speech patterns” on-device, “enhancing speech recognition for a wider range of speech.”
You can enable this iOS 18 accessibility feature in Settings > Accessibility > Siri > Listen for Atypical Speech. Just checking that box might improve the accuracy of your dictation.
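For the curious, third-party apps already have access to on-device speech recognition through Apple’s Speech framework. Here’s a minimal sketch of transcribing an audio file entirely on-device; whether the Listen for Atypical Speech setting affects this developer API is an assumption on my part, since Apple describes it only as a system feature for Siri and dictation.

import Speech

// Minimal sketch: transcribe an audio file on-device with the Speech
// framework. Whether Listen for Atypical Speech influences this API is
// an assumption; Apple describes it only as a system-level feature.
func transcribeOnDevice(fileURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.supportsOnDeviceRecognition else { return }

        let request = SFSpeechURLRecognitionRequest(url: fileURL)
        request.requiresOnDeviceRecognition = true  // keep audio off Apple's servers

        _ = recognizer.recognitionTask(with: request) { result, _ in
            if let result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}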
More top features in iOS 18
There’s much more in iOS 18 beyond its accessibility features: