Apple will bring live captions to iPhone, iPad and Mac and more gesture control to Apple Watch

Written by admin_3fxxacau

Tuesday morning, Apple announced a new wave of accessibility features for its various computing platforms, which are expected to roll out later this year as software updates for the iPhone, iPad, Mac, and Apple Watch.

Apple said it will beta test Live Captions capable of transcribing any audio (FaceTime calls, video conferencing apps with automatic attribution to identify the speaker, streaming video, or in-person conversations) in English on iPhone, iPad, and Mac. Google's push for live captioning started around the release of Android 10; the feature is now available in English on the Pixel 2 and later, as well as on "select" other Android phones, and in additional languages on the Pixel 6 and Pixel 6 Pro. So it's good to see the Apple ecosystem catching up and bringing it to even more people.

Live Captions beta on iOS.
Image: Apple

Similar to Android's implementation, Apple says its captions will be generated on users' devices, keeping the information private. The beta will launch later this year in the US and Canada for the iPhone 11 and later, iPads with the A12 Bionic processor and later, and Macs with Apple Silicon processors.

User demonstrates how to use pinch gestures to scroll through a menu.

AssistiveTouch's current default gestures are pinch to go forward and double pinch to go back.

The Apple Watch will expand the AssistiveTouch gesture recognition commands it added last year with quick actions that recognize a double pinch to end a call, dismiss notifications, take a photo, pause or play media, or start a workout. To learn more about what the gesture controls already do and how they work, we explain in more detail how to use your Apple Watch hands-free here.


Mirroring Apple Watch with iPhone.
Image: Apple

The Apple Watch is also becoming easier to use for people with physical and motor disabilities thanks to a new mirroring feature that will add remote control from a paired iPhone. Apple Watch Mirroring includes technology drawn from AirPlay, making it easier to access the watch's unique features without relying on the ability to tap its tiny screen or on what voice commands can activate.


Customizing sound recognition in iOS.
Image: Apple

Apple introduced Sound Recognition in iOS 14 to pick up specific sounds, like a smoke alarm or running water, and alert users who may be deaf or hard of hearing. Soon, Sound Recognition will allow tuning for personalized recognition of sounds. As this screenshot shows, it can listen for repeated alerts and learn to pick up ones specific to the user's environment, such as an unusual doorbell alert or device ding.

New enhancements to its VoiceOver screen reader, Speak Selection, and Speak Screen features will add support for 20 new "locales and languages," covering Arabic (World), Basque, Bengali (India), Bhojpuri (India), Bulgarian, Catalan, Croatian, Farsi, French (Belgium), Galician, Kannada, Malay, Mandarin (Liaoning, Shaanxi, Sichuan), Marathi, Shanghainese (China), Spanish (Chile), Slovenian, Tamil, Telugu, Ukrainian, Valencian, and Vietnamese. On Mac, VoiceOver's new text checker will look for formatting issues such as extra spaces or stray capital letters, while in Apple Maps, VoiceOver users can expect new audio and haptic feedback that indicates where to start walking routes.

Apple also says on-device processing will use the lidar sensor and cameras of an iPhone or iPad for door detection. The new feature coming to iOS will help users find the entrance at a new location, tell them where it is, and describe whether the door works with a button or a handle as well as whether it's open or closed.

This will all be part of the Detect Mode that Apple is adding to Magnifier in iOS, which also collects existing functionality that lets the camera zoom in on nearby objects and describe them, or recognize people nearby and alert the user with sounds, speech, or haptic feedback. Because they rely on the lidar sensor, people detection and door detection will require an iPhone Pro or iPad Pro model that includes it.


Adjust how long Siri pauses to wait for a response.
Image: Apple

Another new feature on the way is Buddy Controller, which combines two game controllers into one so a friend can help someone play a game, similar to the Copilot function on Xbox.

Finally, other tweaks include a Voice Control spelling mode with letter-by-letter input, controls to adjust how long Siri waits before responding to requests, and additional visual tweaks for Apple Books that can bold text, change themes, or adjust line, character, and word spacing to make reading easier.

The announcements are part of Apple's recognition this week of Global Accessibility Awareness Day on May 19th. The company notes that Apple Store locations will offer live sessions to help people learn more about existing features, and a new Accessibility Assistant shortcut is coming to Mac and Apple Watch this week to recommend specific features based on a user's preferences.
