Apple accessibility 2024: Eye Tracking, Music Haptics & more!


Apple recently unveiled a suite of accessibility features coming later this year, further solidifying its commitment to inclusive design. These features harness Apple silicon, artificial intelligence, and machine learning to empower users with disabilities.

Eye Tracking: Control Your Device with Just Your Eyes


A standout addition, Eye Tracking lets users with physical disabilities navigate their iPad or iPhone using only their eyes. Powered by artificial intelligence, the feature uses the front-facing camera for setup and calibration in seconds. All data used for Eye Tracking is processed with on-device machine learning and never leaves the device, preserving user privacy.

Eye Tracking works across iOS and iPadOS apps and requires no additional hardware or accessories. By dwelling their gaze on an element, users can activate it and access additional functions such as physical buttons, swipes, and other gestures, all through eye control.

Music Haptics: A New Way to Experience Music

Music has a transformative power, and Apple’s Music Haptics aims to make that experience inclusive for users who are deaf or hard of hearing. When activated, the Taptic Engine in iPhone translates music into an array of taps, textures, and subtle vibrations, providing a new dimension of musical appreciation.

Music Haptics isn’t limited to Apple Music: it will also be available as an API, so developers can bring the feature to their own music apps and make music more accessible to everyone.


Please enter your comment!
Please enter your name here