On May 15, Apple unveiled accessibility features scheduled to ship in the second half of this year. They make it much easier to operate devices with your body, whether by writing, speaking, or listening. Accessibility began as a set of features for people with physical disabilities, but it has gradually grown into conveniences everyone can use. I suspect a new kind of interface may emerge from these accessibility updates, and if OpenAI and Apple do collaborate as recent rumors suggest, Siri seems likely to change even more.
Eye Tracking
You can control your iPhone and iPad using only your eyes, on Apple devices alone, with no additional hardware. Physical buttons, swipes, and other gestures can all be performed with your gaze. It runs on the device's own on-device machine learning, so none of the data is shared with Apple.
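Apple has not published an eye-tracking API, so the sketch below is only an assumption about the kind of on-device signal such a feature might build on: ARKit's existing face tracking already reports where the eyes converge (lookAtPoint), which is roughly what a gaze-driven pointer would need. The GazeTracker class and its names are hypothetical illustrations, not Apple's implementation.

import ARKit

// Hedged sketch: not Apple's eye-tracking feature, only the gaze signal
// that ARKit face tracking already exposes on supported devices.
final class GazeTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // lookAtPoint is the point in face-anchor space the eyes converge on;
        // projecting it onto the screen would give a rough gaze-cursor position.
        print("gaze target:", face.lookAtPoint)
    }
}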
Music Haptics
Lets users who are deaf or hard of hearing experience music through the Taptic Engine, which renders the sound of a song as delicate taps, textures, and refined vibrations. It works with millions of songs on Apple Music, and Apple plans to offer a developer API so more apps can make music accessible to everyone.
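The Music Haptics developer API was not yet available at the time of the announcement, so the following is only a rough sketch of the idea using the existing Core Haptics framework, which already drives the Taptic Engine with the kind of taps and textures described above. The function name and pattern values are made up for illustration.

import CoreHaptics

// Hedged sketch: not the Music Haptics API, just Core Haptics playing a few
// sharp "taps" spaced half a second apart, like a simple beat.
func playBeatPattern() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
    let engine = try CHHapticEngine()
    try engine.start()

    let events = (0..<4).map { i in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
            ],
            relativeTime: Double(i) * 0.5
        )
    }
    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}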
Voice Recognition
You can set up vocal shortcuts so Siri runs complex tasks from a custom phrase. The Listen for Atypical Speech feature uses on-device machine learning to recognize the speech patterns of people with conditions that affect speech, such as cerebral palsy, ALS (Lou Gehrig's disease), or stroke.
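Listen for Atypical Speech is a system setting rather than a public API, but for a sense of what on-device speech recognition already looks like to developers, here is a minimal sketch using the existing Speech framework; the function name and the file-based input are assumptions for illustration.

import Speech

// Hedged sketch: request on-device recognition with the Speech framework
// so audio and transcripts never leave the phone.
func transcribeOnDevice(fileURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(),
              recognizer.supportsOnDeviceRecognition else { return }

        let request = SFSpeechURLRecognitionRequest(url: fileURL)
        request.requiresOnDeviceRecognition = true   // keep processing on the device

        recognizer.recognitionTask(with: request) { result, _ in
            if let result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}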
Vehicle Motion Cues
Reduces motion sickness when using an iPhone or iPad in a car. Motion sickness comes from a sensory conflict: the body feels movement while the screen stays still. Animated dots move with the vehicle, so your eyes see the same motion your body feels while you keep looking at the content. Still, I do wonder whether I should be watching content while driving a car.
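Apple has not detailed how the cues are implemented, so this is only a guessed-at sketch of the idea using Core Motion and SwiftUI: read the device's measured acceleration and shift a row of dots so the motion your eyes see tracks the motion your body feels. The view names and the scaling factor are assumptions.

import SwiftUI
import CoreMotion

// Hedged sketch: offset a row of on-screen dots by the device's user
// acceleration so visual motion roughly matches vestibular motion.
final class MotionCueModel: ObservableObject {
    @Published var offset = CGSize.zero
    private let manager = CMMotionManager()

    func start() {
        guard manager.isDeviceMotionAvailable else { return }
        manager.deviceMotionUpdateInterval = 1.0 / 60.0
        manager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let accel = motion?.userAcceleration else { return }
            // Scale acceleration (gravity already removed) into a small screen offset.
            self?.offset = CGSize(width: accel.x * 40, height: accel.y * 40)
        }
    }

    func stop() { manager.stopDeviceMotionUpdates() }
}

struct MotionCueOverlay: View {
    @StateObject private var model = MotionCueModel()

    var body: some View {
        HStack(spacing: 12) {
            ForEach(0..<5, id: \.self) { _ in
                Circle().frame(width: 8, height: 8).opacity(0.4)
            }
        }
        .offset(model.offset)
        .animation(.linear(duration: 1.0 / 60.0), value: model.offset)
        .onAppear { model.start() }
        .onDisappear { model.stop() }
    }
}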
In addition, a Reader Mode in the camera zoom feature makes it easier to read text on objects you magnify with the camera, and there is a new method for on-screen Braille input and output. Hover Typing enlarges letters to your preferred size and style as you type, and Live Speech and Live Captions are also part of the update.