New Delhi: Google I/O 2022 brought a string of announcements from the search giant covering new launches and updates. Along with new products such as the Google Pixel 6a smartphone, the Pixel Buds Pro TWS earphones, and the Pixel Watch, Google also highlighted several updates to its popular apps, including Google Maps and Google Search. While Google Maps looks to take on Apple Maps’ 3D view with a new ‘immersive view’, Google Search’s local multisearch and scene exploration features will help users do more with their Lens searches.


Google I/O 2022: Google Maps introduces ‘immersive view’


Apple Maps users can already explore a city in incredible detail via its 3D view mode. Google Maps looks to one-up the iPhone maker with its new ‘immersive view’, which will let users explore cities, landmarks, venues, restaurants, and other locations like never before. At its I/O 2022 event, Google said it developed the ‘immersive view’ by fusing together “billions” of images to render locations in minute detail. Users will also be able to use a ‘time slider’ to see what an area looks like at different points in time.


The ‘immersive view’ will also let users check local weather and traffic conditions to plan outings properly. Apart from an overall city view, users can also look inside restaurants and other major venues to get an idea of what they offer.


Google Maps’ ‘immersive view’ will only roll out to a handful of cities for now. Los Angeles, London, New York, San Francisco, and Tokyo will be the first to get ‘immersive view’ support later this year on both Android and iOS, with more cities to be added in the coming months.


Google I/O 2022: Google Search makes multisearch and scene exploration handier


In April, Google introduced multisearch, which builds on Lens’ image search by letting users add text to a visual query for more relevant results. At I/O 2022, Google said users will also be able to look up local listings via the multisearch feature.


For example, if you are looking for pasta restaurants near you, all you need to do is search an image of a plate of pasta with Lens and add “near me” as additional text via the new multisearch feature, and Google will show you related businesses located nearby.


With scene exploration, Google Search will let users pan their camera to capture a wider scene. Google will automatically recognise multiple objects in the frame at once and surface information about each of them. This can come in handy when you are browsing unfamiliar items at a store: you can use scene exploration to scan several products at once and pick the right one based on the search results.


While local multisearch will be available in English later this year, more languages are expected to be added starting in 2023. Google said scene exploration will roll out soon but didn’t provide an exact timeline.