Google Search On 22 as interpreted by designers


On September 29, Google held its Search On 22 event under the slogan "Search outside the box." The announcement offered a glimpse into the future of information exploration envisioned by Google, the king of search. Facing threats from services like TikTok and Amazon, Google unveiled several features that show where Google Search is heading, with "visual" as the central keyword. Over roughly 50 minutes, the presentations covered visual navigation, Google Maps, food, shopping, sustainability, and safety. From a designer's point of view, we summarize the most striking experience changes in this announcement.

Local search beyond reviews and information

Google Maps is one of the most widely used services in the world. New features will significantly narrow the gap between a flat map and the real world when you look for information about an area. Notably, Neighborhood Vibe, Immersive View, and Live View will be added.

Neighborhood Vibe combines AI with content from Google Maps users to show you the trendiest or must-visit places. Immersive View uses AI to fuse street view, aerial imagery, weather, traffic, and other information into a holistic view of an area; based on historical data, it can even show what the area will look like at a future time. Live View lets you find the information you want through your camera while on the move, so you no longer have to compare a flat map against your surroundings to figure out where a store is.

Among region-based search experiences, the map search Instagram added was disappointing, so I am curious what kind of experience Google Maps will deliver. Instagram centers on influencer content, so its sense of location context was weak; Google Maps, by contrast, has been a service for finding places from its inception, so updates that shorten that process can play a big role. I look forward to a local search experience filled with information that reviews alone cannot convey.

The future of shopping

Everyone is familiar with the competition between Google and Amazon. Because Amazon has eaten deeply into Google's search advertising business, Google in turn is investing heavily in shopping, announcing thoughtful features to get people to shop on Google instead of Amazon. Shop the Look shows products you can buy directly from images in Google search results: the technology interprets an image and automatically links it to additional information that previously had to be uploaded alongside the image as product details and links. Google also plans to add a 3D view for browsing products. It will be tested with a small number of retail partners, and machine learning is said to generate the 3D image from several product photos.

The experience of buying things differs from the experience of finding information. Imagining the moment you will use an object and paying for it is different from building knowledge about it. This update makes browsing information better, but I cannot yet picture how it will connect to the shopping experience. After browsing, most people buy from wherever is cheaper, ships faster, or earns more points. Getting people to buy a product will take more than good usability. Can Google really convert traffic and good usability into purchases?

Search takes a step closer to the real world

Multisearch is a feature that lets you search with text and images simultaneously through Google Lens. Until now you had to type letters to find something on Google, but search is evolving so you can search without typing at all: even if you don't know what you're looking for, you can find relevant information with a picture. Multisearch Near Me, announced at this event, connects this multisearch method to your surroundings. Starting with clothing shopping, it combines information such as a garment's image and color in search. As the scope expands, you could identify which country a dish you have never seen comes from, and even find a nearby store that serves it.

I saw this as a feature that strengthens search's connection with the real world. Just as the ways of entering a search query are diversifying, Google seems to be focusing on the experience of finding information and then acting on it in offline spaces. No matter how advanced the digital environment becomes, our bodies will not disappear, so investing in the activities our bodies must accompany feels like a natural direction.
