Live Captions on iPhone: Many considered iOS 15 a modest, unremarkable update, but that isn’t the case: last year’s release brought plenty of features. It is on small things like these that future OS updates will be built; Apple no longer wants to take risks the way it did with iOS 7.
You’ve surely noticed that iOS 16 also contains a large number of innovations that don’t immediately catch the eye: for example, Live Captions, a live-subtitles feature that will certainly make using an iPhone or iPad more convenient. Here’s how the sound recognition works, whether it’s safe, and how to activate it.
Live Captions on iPhone: live subtitles in iOS 16
Live Captions is one of the features announced in iOS 16. It generates subtitles in real time for any sound coming from any application on the iPhone, so you can recognize the words of music, videos, podcasts, or FaceTime conversations. This happens thanks to the Neural Engine coprocessor.
Initially the feature was intended exclusively for people with hearing problems, but it turns out live captioning is useful for everyone: for example, when there are no headphones around, or late at night in bed when your partner is sleeping and you don’t want to make any noise.
Even in public places it’s easier to turn on live captions than to make the sound louder. In addition, the function helps hearing-impaired people in face-to-face conversations: the other person’s words are also displayed on the screen. And don’t confuse Live Captions with iOS 15’s Live Text; that’s a completely different feature that lets you copy text from images.
Speech-to-text accuracy on Apple devices is said to have improved dramatically over the years, but there are still problems with recognizing muffled dialogue. The company therefore notes that caption accuracy may vary, and the feature should not be relied on in high-risk situations.
Live Captions in FaceTime on iPhone
The most useful application of Live Captions for ordinary users is in FaceTime: you can literally read the conversation. Captions are attributed to individual call participants, so they’re easy to follow. And if you’re using FaceTime on a Mac, you can type a message and have it spoken aloud.
Live Captions works through on-device machine learning. Apple notes that Live Captions is secure: captions are generated on your device and stay there, never uploaded to Apple’s servers or transferred to other Apple devices.
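Apple doesn’t expose Live Captions itself to developers, but the on-device privacy model it describes is the same one the Speech framework offers through `SFSpeechRecognizer`. A minimal sketch under that assumption (the audio file path is a placeholder, and the authorization prompt is omitted for brevity):

```swift
import Speech

// Placeholder path: substitute a real audio file on the device.
let audioURL = URL(fileURLWithPath: "/path/to/audio.m4a")

// Create a recognizer for US English, the locale Live Captions launches with.
if let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
   recognizer.supportsOnDeviceRecognition {

    let request = SFSpeechURLRecognitionRequest(url: audioURL)
    // Keep audio and transcripts on the device, never sent to Apple's servers --
    // the same principle Apple describes for Live Captions.
    request.requiresOnDeviceRecognition = true

    recognizer.recognitionTask(with: request) { result, error in
        if let result = result {
            print(result.bestTranscription.formattedString)
        }
    }
}
```

In a real app you would first call `SFSpeechRecognizer.requestAuthorization` and check the user’s consent before starting a recognition task.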
iPhone Subtitles: Which iPhones will have Live Captions?
Although even the iPhone 8 will get iOS 16, the supported list starts with the iPhone 11. You might logically conclude that the A13 Bionic chip is required for the feature to work, but that’s not so: Live Captions will also work on iPads with the A12 Bionic and newer, namely the iPad Air 3, iPad mini 5, and iPad 8.
For some unknown reason, even the iPhone XS did not make the list. The explanation is probably not hardware differences (the iPhone XS has more RAM, 4 GB, than the iPad 8’s 3 GB) but device obsolescence.
The iPhone XS was released in 2018, while the iPad Air 3 and iPad 8 came out in 2019 and 2020, respectively. Apple deliberately drops older devices from the supported list and reduces the number of new iOS 16 features available on older iPhones.
How to Enable Live Captions on iPhone?
The feature already appears in the iOS 16 beta, so you can try it out. Note that for now Live Captions does not work in Russian; so far it’s available only in English for the US and Canada. But you can still enable it to try voice recognition.
- Go to Settings.
- Select Accessibility.
- Scroll down and select “Live Captions”.
- In the “Accessibility Shortcut” item, also select “Live Captions”: this way the feature can be toggled by triple-clicking the side button.
Besides that, you can add Live Captions to Control Center so that the toggle is always within easy reach.
- Go to Settings.
- Select Control Center.
- Find “Live Captions” at the bottom and tap the add button.
This is how Apple introduced a feature convenient for everyone, even though it was originally intended for people with disabilities. That said, for all its advantages, the feature is definitely not suited to recognizing lyrics in songs: the words turn to mush, and the track clearly loses its meaning, if it had any to begin with.
I am Bhumi Shah, a highly skilled digital marketer with over 11 years of experience in digital marketing and content writing in the tech industry.