Google Assistant will be getting an AI makeover with the addition of Google Lens. This new feature is basically a set of vision-based computing capabilities within the Assistant.
"With Google Lens, your smartphone camera won't just see what you see," Google's Twitter announcement reads, "but will also understand what you see to help you take action." The feature was unveiled at the company's I/O 2017 conference this afternoon, and safe to say, we're pretty excited about it.
Google Lens is similar to Bixby Vision, which is exclusive to the Galaxy S8, in that it tells you more about what your smartphone lens is seeing. It will be rolled out via updates to Google Assistant and Google Photos. At this point, it's unclear whether Google will create a separate Lens app; for now, it's merely a built-in Assistant feature.
Google CEO Sundar Pichai's announcement was greeted by applause from an audience who were clearly very impressed with the innovative function. He explained that the AI technology allowed Google to analyze what your camera was seeing and relay that information to the user.
Today we are announcing a new initiative called Google Lens. Google Lens is a set of vision-based computing capabilities that can understand what you're looking at and help you take action based on that information. We'll ship it first in Google Assistant and Photos and it will come to other products.
So how does it work? For example, if you run into something and you want to know what it is – say a flower – you can use Google Lens from your Assistant and point your phone at it and we can tell you what flower it is. It's great for someone like me with allergies.
Pretty impressive, right? But we were sold when Pichai highlighted the fact that Google Lens would be able to tackle the problem of Wi-Fi codes: simply point your smartphone at the router information "and we can automatically do the hard work for you." Another First World Problem eradicated!
If you're walking down a street downtown and you see a set of restaurants across from you, you can point your phone. Because we know where you are, and we have our Knowledge Graph, and we know what you're looking at, we can give you the right information in a meaningful way.
Google Lens will certainly be a handy tool, as it enables your Assistant "to have a conversation about what you see," as Google's Scott Huffman points out, and converse beyond just words. Later in the conference, Huffman demonstrated how a Japanese sign was translated using this technology.
Another interesting aspect of this tool is the role it will play in your Photos. Anil Sabharwal at Google confirmed that "Lens will be rolling out to Google Photos later this year." The audience was understandably wowed by the demo, because Lens will also be able to identify features in your photos after you've taken them.
We'll all be walking encyclopedias with Google Lens and I personally can't wait.