Google’s annual I/O developer conference kicked off today, and the first big announcement of CEO Sundar Pichai’s keynote was something called Google Lens, essentially a photo-recognition feature that turns your camera lens into a search box. By pointing their phones at something they want to know more about, users can tap into Google’s massive reservoir of data — point Google Lens at a flower, Pichai explains, and it will identify what kind of flower it is. Demand for on-demand flower taxonomy may not be sky-high, so most users will probably be more excited about the next application he suggests: letting Google Lens automatically pull up a restaurant’s info as you pass by it on the street. Aim the camera at a restaurant, and the app won’t just recognize which restaurant it is — it’ll also populate Google’s rating and other relevant info right there on the screen.
With the tool installed, you can pan across the façades of restaurants within view, see what kind of food they serve, and decide whether they’re worth your while for a drink or a meal. Assuming it works the way Google shows in the GIF it tweeted, the recognition is instantaneous (thank the data in Google Street View for that), and you can even toggle back and forth between spots. Basically, it’s a souped-up version of the food-identifying app Erlich Bachman was dreaming of in Silicon Valley, before Jian-Yang gave him Not Hotdog instead.