Apple announces new accessibility features

Including “Eye Tracking,” “Music Haptics,” and “Vocal Shortcuts”

Apple today announced new assistive features launching later this year, including “Eye Tracking,” a way for users with physical disabilities to control iPad and iPhone using only their eyes. In addition, “Music Haptics” will give users who are deaf or hard of hearing a new way to experience music using the iPhone’s Taptic Engine; “Vocal Shortcuts” will allow users to perform tasks by making a custom sound; “Vehicle Motion Cues” will help reduce motion sickness when using iPhone or iPad in a moving vehicle; and visionOS will also introduce more accessibility features. These features combine the power of Apple hardware and software, harnessing Apple silicon, artificial intelligence (AI), and on-device machine learning to further Apple’s decades-long commitment to designing products for everyone.
Apple CEO Tim Cook said: “We firmly believe in the transformative power of innovation to enrich lives. That’s why for nearly 40 years, Apple has embedded accessibility at the heart of our hardware and software to champion inclusive design. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”
Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, said: “Each year, we break new ground when it comes to accessibility. These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”

Eye Tracking launches on iPad and iPhone

Powered by AI, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, “Eye Tracking” can be set up and calibrated in seconds using the front-facing camera, and with on-device machine learning, all data used to set up and control this feature is kept securely on the device and is not shared with Apple.
Eye Tracking works across iPadOS and iOS apps and does not require additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures with only their eyes.

“Music Haptics” makes it easier for users to experience songs

“Music Haptics” is a new way for users who are deaf or hard of hearing to experience music on iPhone. With this accessibility feature turned on, the iPhone’s Taptic Engine plays taps, textures, and refined vibrations in time with the audio of the music. Music Haptics works across millions of songs in the Apple Music catalog and will be available as an API for developers to make music more accessible in their apps.

New features for a wide range of speech

With “Vocal Shortcuts,” iPhone and iPad users can assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks. Another new feature, “Listen for Atypical Speech,” gives users an option for enhancing speech recognition for a wider range of speech. “Listen for Atypical Speech” uses on-device machine learning to recognize a user’s speech patterns. Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features build on the features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak, providing a new level of customization and control.

Mark Hasegawa-Johnson of the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign said: “AI has the potential to improve speech recognition for millions of people with atypical speech, so we are thrilled that Apple is bringing these new accessibility features to consumers. The Speech Accessibility Project was designed as a broad-based, community-supported effort to help companies and universities make speech recognition more robust and effective, and Apple is among the accessibility advocates who helped make the Speech Accessibility Project possible.”

“Vehicle Motion Cues” helps reduce motion sickness

“Vehicle Motion Cues” is a new experience for iPhone and iPad that can help reduce motion sickness for passengers in moving vehicles. Research shows that motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel, which can prevent some users from comfortably using iPhone or iPad while riding in a car. With “Vehicle Motion Cues,” animated dots on the edges of the screen represent changes in vehicle motion, helping to reduce sensory conflict without interfering with the main content. Using sensors built into iPhone and iPad, “Vehicle Motion Cues” recognizes when a user is in a moving vehicle and responds accordingly. The feature can be set to show automatically on iPhone, or can be turned on and off in Control Center.

CarPlay gains “Voice Control” and more accessibility updates

Accessibility features coming to CarPlay include “Voice Control,” “Color Filters,” and “Sound Recognition.” With “Voice Control,” users can navigate CarPlay and control apps with just their voice. With “Sound Recognition,” drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens. For users who are colorblind, “Color Filters” make the CarPlay interface visually easier to use, with additional visual accessibility features including bold text and large text.

visionOS will introduce accessibility features

This year, accessibility features coming to visionOS will include systemwide “Live Captions” to help everyone, including users who are deaf or hard of hearing, follow along with spoken dialogue in live conversations and in audio from apps. With “Live Captions” for FaceTime in visionOS, more users can easily enjoy the unique experience of connecting and collaborating using their Persona. Apple Vision Pro will add the capability to move captions using the window bar during Apple Immersive Video, as well as support for additional Made for iPhone hearing devices and cochlear hearing processors. Updates to vision accessibility will include the addition of “Reduce Transparency,” “Smart Invert,” and “Dim Flashing Lights” for users who have low vision, or those who want to avoid bright lights and frequent flashing.

These features join the dozens of accessibility features already available in Apple Vision Pro, which offers a flexible input system and an intuitive interface designed with a wide range of users in mind. Features such as “VoiceOver,” “Zoom,” and “Color Filters” can also provide users who are blind or have low vision access to spatial computing, while features such as “Guided Access” can support users with cognitive disabilities. Users can control Vision Pro with any combination of their eyes, hands, or voice, with accessibility features including “Switch Control,” “Sound Actions,” and “Dwell Control” that can also help those with physical disabilities.

Ryan Hudson-Peralta, a Detroit-based product designer, accessibility consultant, and cofounder of Equal Accessibility LLC, said: “Apple Vision Pro is without a doubt the most accessible technology I have ever used. As someone born without hands and unable to walk, I know the world was not designed with me in mind, so it has been incredible to see that visionOS just works. It is a testament to the power and importance of accessible and inclusive design.”

If you want to learn more, you can click the link below the video.
Thank you for watching. If you enjoyed this video, please like and subscribe. Thanks!

Original text:https://www.apple.com/tw/newsroom/2024/05/apple-announces-new-accessibility-features-including-eye-tracking/

Video:
