Google I/O 2017 demonstrates how Google is doubling down on artificial intelligence to give its products an edge over the competition.
Announcing Google Lens
Google CEO Sundar Pichai kicked off I/O by announcing Google Lens, which will work as an extension of Google Assistant and Google Photos (Android, iOS) (with integration in other apps planned for the near future). It takes the input from your phone's camera and attempts to tell you more about it -- people, places, and objects of interest.
Google showed one example of a sandwich board on an Osaka sidewalk that mentions a restaurant menu item written in Japanese script. Google Lens translated this script into English and provided a stream of photos along the bottom of the screen that contained images of that menu item. So while there isn't an American English word that directly explains what sashimi is, for example, Google Lens could show you what sashimi looks like without your having to conduct a search.
Speaking of Google Assistant, the company announced that this AI has come to the iPhone and iPad, and it announced an Assistant SDK so that third-party developers can start integrating the Assistant into their own services and devices without having to make special arrangements with Google.
Google Home is where the heart is
Google is also pouring enhanced AI into Google Home (Android, iOS), its competitor to Amazon's virtual assistant devices like the Echo and the Echo Dot. Since Home's launch six months ago, Google has integrated Netflix and YouTube (with the help of Chromecast) so that you can get video up on your TV with voice commands. Google announced that a number of other video services are getting on board, including YouTube TV (its live TV streaming competitor to PlayStation Vue and DirecTV Now), HBO Now, Hulu, HGTV, and The Food Network.
Over the coming months, Google plans to add three more components to Home. One is called proactive assistance. In the same way that you get notifications on your phone, Home will indicate notifications by blinking the set of lights on top of the device. Then you can ask Home to read those notifications aloud (and the device will use your voiceprint to verify your identity).
Next is what Google calls Visual Responses. With a Chromecast plugged into your TV, you can tell Google Home to put your Google Calendar up on the screen, navigate YouTube and YouTube TV (Android, iOS), and even turn your TV off when you're done.
Last is hands-free calling, which Amazon debuted on its Echo devices just recently. Like Amazon's Alexa Calling, this service will let you use Home to call someone's phone or Home device, for free. Unlike Alexa Calling, it will give you a phone number unique to Home, though you can switch to your carrier's number if you choose.
Google Photos gets cool new features
In the coming weeks, Google Photos (Android, iOS) will get a new feature called Suggested Sharing. As its name implies, Photos will use machine learning to identify people in your contacts list whom you've taken pictures of, then offer to share a collection of those photos with those contacts. If those contacts don't have the Google Photos app installed, or they don't have a Google account, you can send them an SMS message or an email that invites them to create an account or download the app.
A new feature, Shared Libraries, is also coming: it lets you automatically share your photo library (or selected portions of it, such as all photos of specific people the app recognizes) with the people close to you. The photos show up in the app as though they were taken on the other person's device.
If that's not enough, Google also announced Photo Books, a service designed to eliminate the hassle of turning the photos on your phone into a physical album. Prices start at $10 for an album of 20 photos. This service is available now via the Web and will be available in-app soon.