Have you ever emerged from the subway or a large building uncertain of which way to go? Your directions say “head north” but which way is north? Google is combining walking directions with augmented reality to fix this problem.
During its I/O developer conference today, the tech giant demoed a feature coming to Google Maps that’ll tap into your smartphone’s camera to tell you which direction to walk. Essentially, it’ll meld your phone’s camera view with Google’s mapping service to point out when you need to turn left or right.
Unfortunately, Google didn’t say when the feature might arrive. Currently, Google Maps gives walking directions by showing a 2D map on your phone and drawing a blue trail to your destination. A virtual arrow on the map shows which way you’re headed.
However, the 2D view isn’t perfect; it can leave you walking in a random direction, hoping your path lines up with Google’s blue dots. “We’ve all been there,” Google vice president Aparna Chennapragada said at I/O. “So we asked ourselves, what if the camera can help us here?”
When you activate the feature, your phone’s camera view will tell you which street you’re on, along with the upcoming directions.
To help users find their way, Google is also experimenting with the addition of cute animated guides. For instance, a virtual fox might appear and show you which way to walk.
The camera-powered Google Maps can also act as a helpful search engine. Simply point your camera down a street, and the app can identify restaurants and display their user ratings.
The technology works by combining Street View with images from the smartphone owner’s camera. Chennapragada called it a “visual positioning system” that uses landmarks identified in the smartphone’s camera to pinpoint the user’s location.
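Google hasn’t published how its “visual positioning system” actually works, but the basic idea of pinning down a location from recognized landmarks can be sketched in a few lines. Everything below is an illustrative assumption: the landmark names, the coordinates, and the crude centroid estimate are all made up for the example, not Google’s method.

```python
# Toy sketch of landmark-based positioning (illustrative only).
# A real system would match camera features against Street View
# imagery and solve for pose; here we just average the known
# positions of whatever hypothetical landmarks were "recognized".

def estimate_position(detected, landmark_db):
    """Return a rough (lat, lon) from recognized landmarks,
    or None if nothing in the frame matched the database."""
    matches = [landmark_db[name] for name in detected if name in landmark_db]
    if not matches:
        return None  # no recognized landmarks: fall back to GPS
    lat = sum(p[0] for p in matches) / len(matches)
    lon = sum(p[1] for p in matches) / len(matches)
    return (lat, lon)

# Hypothetical database built from street-level imagery.
LANDMARKS = {
    "corner_cafe":  (40.7411, -73.9897),
    "theater_sign": (40.7413, -73.9891),
    "subway_entry": (40.7409, -73.9893),
}

print(estimate_position(["corner_cafe", "theater_sign"], LANDMARKS))
```

The real system presumably does far more (matching visual features, estimating which way the camera is pointing), but the payoff is the same: landmarks the camera can see constrain where you must be standing, which is exactly what GPS struggles with among tall buildings.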
Although impressive, the system doesn’t sound easy to implement, and what we saw at Google I/O may be vaporware. But in the meantime, the company is adding new features to another camera-powered service called Google Lens.
Google Lens was unveiled a year ago and lets you run visual searches through your smartphone’s camera. Snap a photo of a dog, flower, or painting, and the tool instantly pulls up relevant info about it.
In the coming weeks, Lens will add the ability to snap a picture of text in the real world and then copy and paste the words into your phone.
Another feature called “Style Match” will identify objects in your smartphone’s camera and offer suggestions of products you can buy that look similar. For example, if you find a dress you like, Google Lens will display links to clothes designed with a similar style.
Google Lens was initially limited to Pixel phones, but in February the company began rolling it out through the Google Photos app to select flagship devices from Samsung, Huawei, LG, and others.