Apple is set to launch a range of new accessibility features for iPhone and iPad, including a tool that can replicate a user’s voice for phone conversations after just 15 minutes of training. Announced on Tuesday, the feature, called Personal Voice, has users read a series of text prompts aloud so the device can learn to mimic their voice. A companion feature, Live Speech, lets users type what they want to say and have it spoken aloud, including in their Personal Voice, during phone calls, FaceTime calls, and in-person conversations. Users can also save frequently used phrases for quick access mid-conversation.
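For readers curious about the underlying mechanism, the core of a Live Speech-style feature, turning typed text into spoken audio on the device, can be sketched with Apple's long-standing AVSpeechSynthesizer API. The sketch below is illustrative only: it uses a standard system voice, and whether Personal Voice is reachable through this API is an assumption, not something Apple confirmed in the announcement.

```swift
import AVFoundation

// Minimal sketch of speaking typed text aloud, the basic mechanism behind
// a Live Speech-style feature. AVSpeechSynthesizer is Apple's standard
// on-device text-to-speech API; hooking it up to Personal Voice is an
// assumption here, not confirmed by the announcement.
final class TypedSpeechController {
    private let synthesizer = AVSpeechSynthesizer()

    /// Speaks the text the user just typed.
    func speak(_ typedText: String) {
        let utterance = AVSpeechUtterance(string: typedText)
        // Default to the system voice for the current locale.
        utterance.voice = AVSpeechSynthesisVoice(
            language: AVSpeechSynthesisVoice.currentLanguageCode())
        utterance.rate = AVSpeechUtteranceDefaultSpeechRate
        synthesizer.speak(utterance)
    }

    /// The saved phrases the article mentions could simply be replayed strings.
    func speakSavedPhrase(_ phrase: String) {
        speak(phrase)
    }
}
```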
These tools aim to make Apple devices more accessible to individuals with cognitive, visual, hearing, and mobility challenges. Apple particularly emphasized that those with conditions causing gradual loss of voice, such as ALS, could greatly benefit from the new features.
“Accessibility is embedded in everything we do at Apple,” stated Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, in a blog post. She highlighted that these innovative features were developed with input from members of the disability community to ensure they meet diverse needs and foster connection in new ways.
The new features, set to launch later this year, come at a time when advancements in AI have raised concerns about deepfake technology, where synthetic audio and video can be used maliciously. Addressing privacy concerns, Apple emphasized that Personal Voice relies on on-device machine learning to maintain user privacy and security.
Apple isn’t the only tech company exploring AI voice replication. In 2022, Amazon announced plans to develop an Alexa feature that could mimic any voice, even that of a deceased relative, though this feature has yet to be released.
Beyond voice replication, Apple is introducing Assistive Access, a simplified interface that combines Phone and FaceTime into a single Calls app and offers streamlined versions of Messages, Camera, Photos, and Music. The interface uses high-contrast buttons and large text labels, and includes an option for an emoji-only keyboard in Messages. Users can also record video messages, giving them a range of ways to communicate.
Additionally, Apple is enhancing its Magnifier app for users with visual impairments. The app’s detection mode will gain Point and Speak, which helps users identify and interact with physical objects that carry text labels. For example, a user can point their iPhone camera at a microwave keypad, and the app will announce the text on each button as the user moves their finger across the keypad.
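The general pattern behind a "read out what the camera sees" feature pairs on-device text recognition with speech output. The sketch below illustrates that pattern using Apple's Vision framework on a single camera frame; it is not Apple's actual Magnifier implementation, and the class and method names are hypothetical.

```swift
import Vision
import AVFoundation

// Illustrative sketch: recognize text in a camera frame on-device, then
// speak the recognized strings. Apple has not said Magnifier's detection
// mode is built exactly this way.
final class LabelReader {
    private let synthesizer = AVSpeechSynthesizer()

    /// `pixelBuffer` would come from an AVCaptureSession video frame.
    func readText(in pixelBuffer: CVPixelBuffer) {
        let request = VNRecognizeTextRequest { [weak self] request, error in
            guard error == nil,
                  let observations = request.results as? [VNRecognizedTextObservation] else { return }
            // Take the top candidate for each piece of detected text.
            let lines = observations.compactMap { $0.topCandidates(1).first?.string }
            guard !lines.isEmpty else { return }
            self?.speak(lines.joined(separator: ", "))
        }
        request.recognitionLevel = .fast   // favor latency for a live camera feed

        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try? handler.perform([request])
    }

    private func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        synthesizer.speak(utterance)
    }
}
```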
With these updates, Apple is reinforcing its commitment to inclusivity, ensuring its devices can cater to a broader range of needs and enable better communication and interaction for all users.