On June 8 at WWDC, Apple announced a new feature called Live Text, which automatically recognizes and transcribes the text in photos, digitizing the text in every photo, with support for Chinese and other languages. This unlocks a range of conveniences, from converting handwritten notes into emails and messages to searching photos for receipts or recipes.

This is certainly not a new smartphone capability; Samsung, Google, and other companies have offered similar tools. But Live Text differs in some respects. With this feature, users can read any text in a photo or in the camera viewfinder and act on it immediately: copy and paste it, or search for it on the web. If it is a phone number, it can be dialed right away.

Apple executive Craig Federighi announced at WWDC that Live Text will arrive on iPhones running iOS 15. He showed several photos, one of a whiteboard after a meeting, and several that included a snapshot of a restaurant sign in the background.

Tapping the Live Text button in the lower-right corner underlines the detected text, which can then be selected and copied with a single swipe. In the whiteboard example, it captured several sentences of notes, including bullet points; it also captured a restaurant's phone number, which could be dialed or saved.

Apple said the feature is powered by a "deep neural network" and "device intelligence," the company's terms for its machine-learning capabilities.
This highlights Apple's privacy-focused approach to AI, which emphasizes processing data on the device rather than sending it to the cloud. Live Text runs on iPhone, iPad, and Mac, and supports seven languages.

Beyond extracting text from photos, iOS 15 will also let users perform visual search, which sounds much like Google Lens. The difference is that on an iPhone running the new system, the text in every photo is captured more or less passively; users do not have to enter a scanner mode or launch a separate application.

That is useful for anyone, but it is especially helpful for people with visual impairments. Apple did not elaborate on the feature at the WWDC keynote, but said the new tool can identify "art, books, nature, pets and landmarks" in photos.
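Apple has not published Live Text's internals, but the behavior described above implies a two-stage pipeline: on-device OCR first produces a raw string, and a second pass then classifies substrings into actionable items (a phone number the UI can offer to dial, a link it can open). As a rough illustration of that second stage only, here is a minimal Python sketch; the regex patterns and the `actionable_items` function are hypothetical stand-ins, not Apple's API.

```python
import re

# Hypothetical post-OCR step: given text recognized from a photo, find
# substrings the UI could act on. Patterns below are illustrative (US-style
# phone numbers, http(s) URLs), not what Apple actually ships.
PHONE_RE = re.compile(r"\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}")
URL_RE = re.compile(r"https?://\S+")

def actionable_items(recognized_text: str) -> list[tuple[str, str]]:
    """Classify substrings of OCR output into (action, value) pairs."""
    items = [("dial", m.group()) for m in PHONE_RE.finditer(recognized_text)]
    items += [("open", m.group()) for m in URL_RE.finditer(recognized_text)]
    return items

sample = "Blue Duck Cafe - call (415) 555-0123 or see https://example.com/menu"
print(actionable_items(sample))
# → [('dial', '(415) 555-0123'), ('open', 'https://example.com/menu')]
```

In the real feature this classification also happens on the device, consistent with the privacy stance described above; only the idea of mapping recognized text to actions is being sketched here.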