Google announces search engine redesign with more images and additional context on results

Google has announced an overhaul of its search tools, making results more visual and adding contextual information alongside them.

At its Search On event, the web giant also announced new features for Google Chrome and its AI-powered visual search tool, Google Lens.

The main aesthetic change is a more visually browsable results page for “searches where you need inspiration”, such as “painting ideas”, Google says, which will surface a series of images at the top of the results without users having to switch to the Images tab.

Search will also gain more contextual information, rolling out over the next few months, with a new “Things to know” section that covers “different dimensions that people typically look for”.


For those looking to paint with acrylics, for example, below the top result sits a series of drop-down results including a step-by-step guide, tips and styling options. Google will also add new “refine this search” and “broaden this search” options, so users can quickly jump between levels of contextual detail.

Google is adding more context to search in other ways by expanding its “About these results” panel, which is accessible via the three-dot icon on the right side of desktop or mobile search results.

Currently this shows a description of a source – such as “The Independent” or “from Wikipedia” – but Google will now let sites describe themselves “in [their] own words” too. Users will also be able to see what others have said about a website, including “news, reviews, and other useful context,” which the company says will help users better evaluate sources.

Whether or not this turns out to be true remains to be seen; Twitter has introduced a similar system called Birdwatch, where users can annotate misleading tweets, although many users simply offer opinions or mark baseless allegations – such as claims of voter fraud in the 2020 US election – as “not misleading”. The Independent has contacted Google for more information on how these claims will be verified.

In addition to search, Google’s updates to Lens will allow users to search for information based on the content of a photo, such as taking a photo of a pattern and asking Google for “socks with that pattern”, or a picture of a broken bike chain and asking Google “how to fix it”. Google’s machine learning will now recognize image content and search accordingly, using a technology it calls the Multitask Unified Model (MUM), which can better understand context and comparisons.

MUM will also be used for video search results, identifying topics that might be related to video content even if not explicitly mentioned.

For iPhone and Android users, the Lens update will be available in the Google app, which makes all images on a page viewable through Lens. Google clearly has an advantage with its own Android operating system, but the company told The Independent that the Google app is opened three billion times a month; for context, there are a billion iPhone users, who open many apps multiple times a day.

A new Lens update is also coming to Chrome on desktop in the coming months, where users “will be able to select images, videos, and text content on a website with Lens to quickly see search results in the same tab – without leaving the page you are on”.

Finally, Google’s Shopping tab is getting more information about in-store availability at local stores, with an “in stock” button to filter results. It launches today in the UK, US, Australia, Austria, Brazil, Canada, Denmark, France, Germany, Japan, the Netherlands, New Zealand, Norway, Sweden and Switzerland.