Soon, you’ll control your iPhone with a glance – Computerworld



The company has gone further this year, introducing something it calls “Listen for Atypical Speech.” This uses on-device machine learning to recognize user speech patterns, and is designed to make speech recognition systems easier to use and more accurate for people with acquired or progressive conditions such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke.

“Each year, we break new ground when it comes to accessibility,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”

Even more accessibility updates

Another new feature, Vehicle Motion Cues, can help reduce motion sickness for people using a device in a moving vehicle. The company also promised a range of additional accessibility features for visionOS, including Live Captions, which lets people follow live or recorded conversations with captions on the display.
