Google Lens gets the ability to translate text in offline mode for Android

Google Lens, a feature unveiled back in 2017 and initially reserved for Pixel phones, is now getting an update that allows it to translate text offline. Until now, Lens could only translate text with an internet connection, with an offline mode under development for a while. Now, Google is rolling out the ability to translate text in Lens for Android in offline mode.

The information was first reported by 9to5Google (via XDA). Offline translation support for Google Lens is rolling out to Android users. It is a server-side update, which means it could take a few days to reach everyone. Once you receive the update, you should see a new “Tap to download” prompt highlighting a new button next to most languages on the “Select language” screen. Once you tap this button, it will download the respective language pack for offline translations.

After the language pack is downloaded, you’ll see a checkmark next to the language to indicate that it’s ready for offline translations. Be careful not to tap the checkmark, as doing so will remove the language pack from your device. Once downloaded, the Google Lens app on Android will be able to translate text from the downloaded language to the target language even without an internet connection. It works like the “Instant” feature in Google Translate: it automatically translates the identified text as soon as you point your camera at it, so you don’t need to tap the shutter button to capture a still shot.

Once done, you’ll get a preview window with the translation overlaid on the live preview. The preview will also include a “Copy all” button to help you quickly copy the translated text onto the clipboard.

The post Google Lens gets the ability to translate text in offline mode for Android appeared first on Pocketnow.

Google Search, Lens, and Maps get new features to make your pandemic life easier

Google’s Search On live event was quite a busy one. The company introduced a slew of cool new features, such as ‘hum to search’ for discovering songs, along with meaningful upgrades for Google Lens, Search, and Maps. A majority of the features announced by Google touch upon aspects of life that have been affected by the coronavirus pandemic, and they sound really helpful. So, let’s quickly recap the most important ones:

Google Maps will keep you safe and prepared

The ‘busyness’ feature on Google Maps shows how busy a place is at a given time or day of the week, helping users plan their visit accordingly to avoid crowded spaces, something that is of vital importance in the battle against a deadly pandemic. Google says it will bring live busyness information to more types of places, such as beaches, pharmacies, and grocery stores, while also expanding its reach by five times.

Plus, busyness information will now be shown directly on Google Maps without even searching for a place, and while on the move as well. This feature will soon be available on Android, iOS, and desktop. In addition to real-time busyness information, users can also see a graph of how busy a place usually is over the course of a week.

Additionally, Google Maps will also show information about the health and safety precautions that are undertaken at a restaurant or shop. This information is contributed by businesses listed on Google Maps, but users will soon be able to add their personal experiences as well.

Lastly, users across the world will soon be able to use the Live View augmented reality feature to find more information about a place, such as when it opens, the busyness status, its star rating, and the safety measures it has put in place. All you have to do is just open Live View, point to a shop or building, and tap on the icon above it.

A smarter search experience

Google has announced that BERT language understanding is now used to process all search queries made in the English language. Plus, Google search now relies on a new spelling algorithm that can detect grammar and spelling errors more efficiently. As a result, it can find the right search results users are looking for.

Google search is also making it easier for users to find answers to questions that require some explanation. To do so, Google search now indexes individual passages on a webpage too, in addition to the webpage itself. Doing so will make it easier for the search algorithms to understand the relevancy of each passage and accordingly bring up results that can answer users’ queries.

Google has also started testing a new technology that will help users quickly find a particular moment or segment in a video. AI algorithms will automatically recognize key moments in a video and tag them accordingly, somewhat like chapters in YouTube videos. For example, a baseball game video would be labeled with time markers for moments such as home runs and strikeouts.

Google Lens is now even better

Google Lens is already capable of doing a lot of cool things such as recognizing objects, extracting text from photos, identifying codes, and a lot more. It is now getting even better, especially when it comes to education. Google Lens can now identify mathematics or science problems, and will accordingly show step-by-step solutions and guides to help students. This capability can be accessed from the Google app’s search bar on Android and iOS.

Another cool trick that is coming to Google Lens is an easier shopping experience, thanks to Style Engine technology. Now, when users long-press on an image while viewing it in the Google app or Chrome browser on Android (coming soon to the Google app on iOS too), Google Lens will show matching items listed on e-commerce platforms so that users can easily find more information or purchase them.


Google Lens will soon be able to solve math problems from a photo

Google has been rolling out tools and features to help students and their parents with homework. A couple of months ago, it launched an augmented reality feature that lets you view 3D anatomy models and cellular structures. Now, the company is adding another feature to its Google Lens app.

Google is said to be using technology from the mobile learning app Socratic, which will enable Lens to solve math problems from a photo. When the feature arrives, all you’ll need to do is snap a pic of your study material and highlight an equation or a particular problem you can’t seem to solve, and Lens will give you quick access to step-by-step guides and detailed explainers.

The idea is to make it easy to look up mathematical concepts that are giving you trouble. However, the company hasn’t said when it plans to roll out the feature. Meanwhile, Socratic itself is available as a standalone app for iOS and Android.

Via: Engadget


Google Assistant can now read and translate text on KaiOS

At Google I/O 2019, the company brought camera-based translation to Google Lens to help users understand information they find in the real world. With Lens, you can point your camera at text you see and translate it into more than 100 languages. It can also speak the words out loud in your preferred language. The feature was released for Google Go too.

Today, the company announced that it is extending this capability to the millions of Google Assistant users on KaiOS devices in India. From Assistant, they can select the camera icon, point their phone at real-world text (like a product label, street sign, or document), and have it read back in their preferred language, translated, or defined. Just long-press the center button from the home screen to get started with Assistant.

Within Google Assistant, KaiOS users can now use Google Lens to read, translate, and define words in the real world. The feature is currently available for English and several Indian languages including Hindi, Bengali, Telugu, Marathi, and Tamil, and will soon be available in Kannada and Gujarati.


Google Lens can now scan handwritten notes and paste the text to computer

Google has added a useful new feature to Google Lens that allows it to scan handwritten text, copy a highlighted portion, and paste it to a computer. While scanning handwritten text, Google Lens now shows a new “copy to computer” button that adds the highlighted text to the clipboard, from where it can be pasted into a document open on a computer.


But you need to make sure that you are running the latest version of Chrome on your computer and are logged in with the same Google account on both your PC and your smartphone. The feature is quite nifty, and while testing it, we found that “copy to computer” works quite well. Just make sure your handwriting is legible.

Learn the correct pronunciation

Google Lens has also gained a new pronunciation feature. Now, when you scan text, you can highlight a word or a particular phrase and tap the new “Listen” button to hear Google Lens read it out loud.

In-line search

Lastly, an in-line search feature has also arrived on Google Lens. While scanning text, just highlight a term or keyword you don’t understand, and Google Lens will pull up search results to help you get a grasp of it.


Source: Google Blog


What happened to responsibility at Google I/O 2018? | #PNWeekly 304 (LIVE at 1p ET)

Duplex was great and all. Android P made a big step towards wider, faster distribution. But what about privacy? We talk Google I/O on this week's show.


OnePlus phones getting Google Lens as slow migration continues

Google's visual recognition service is coming not in drips, but in occasional bursts. OnePlus phones have quickly been taken with Google Lens.


LG G7 ThinQ brings more noise, more screen, more AI

The speakers sound like a Boombox, the screen is actually a Super Bright Display and the AI CAM is more intelligent than ever before. LG officially makes the hard sell to consumers on the LG G7 ThinQ.


Google Lens preview officially spreads to iOS devices in Google Photos app

Previously available only on Android devices, the AI-powered Google Lens object recognition tool is now rolling out to both iPhone and iPad users as a neat Google Photos extension.


Google Lens officially starts its gradual rollout on non-Pixel Android devices

Available for Pixel and Pixel 2 users since November, the Google Lens "preview" experience is finally expanding to other Android phones, also "coming soon" on iOS.


Pixel phones get instant access to Google Lens through Google Assistant

Seeing barcodes, business cards, baroque art and B-rate movies? This will be pretty useful if you have any sort of Pixel phone.
