Also, if you set a timer on a smart speaker or display, Google is rolling out a feature that lets you stop the timer just by saying "stop", no "Hey Google" required.

Google is also opening Incognito searches up to everyone who uses Google's search engine on other browsers, like Firefox or Opera.
Tapping the picker lets you switch to a different account, add a new one, or manage the email addresses on your device; most importantly, it now includes the new "Manage your Google Account" option.
After you finish eating, Lens can help you with the bill, too.
"People have already asked Google Lens more than a billion questions about things they see", she wrote.
This means when Android Go users point their camera at text, Lens can now read it out loud.
Instead of reading about the solar system, Google Search can just show you how it works.
Google made the announcement during its annual Google I/O developer conference, at the Shoreline Amphitheatre in Mountain View, California.
Google Lens gets more visual

Google Lens taps into machine learning (ML), computer vision, and tens of billions of facts in the Knowledge Graph to answer users' questions.
The company is working with several organizations, including NASA, Samsung, Visible Body, and Wayfair, to deliver their content on Search using AR, so users can better understand the information presented. Google demonstrated what this looks like at the I/O event site: point the camera at an information sign, and virtual markers appear in the live camera view showing where to find things.
The Google Lens improvements weren't strictly focused on food, although it sometimes seemed that way. Lens can now read text out loud and even translate it from another language.
Google Assistant is getting an AI processing boost for faster performance on smartphones. The company also announced Tuesday that the technology has now been combined with Google Maps. With a new update for dealing with menus, for example, Lens can highlight the most popular dishes right on the real-time view of a restaurant's menu.