#TECH: Google to make 'search' better with AI help

KUALA LUMPUR: Google has announced the use of the latest innovations in artificial intelligence (AI) to make Google search more helpful than ever.

"Today we shared how we're bringing the most advanced AI into our products to further our mission to organise the world's information and make it universally accessible and useful," said Google's senior vice president for Google Search, Assistant, Geo, Ads, Commerce, and Payments products, Prabhakar Raghavan during the global livestream of its second annual Search On event.

During this year's Google I/O, an annual developer conference, the company announced that it had reached a critical milestone in understanding information with the Multitask Unified Model (MUM). Since then, Google has been experimenting with MUM's capabilities to enable entirely new, more natural and intuitive ways to search for information.

At the event early this morning, Google shared how the technology has redesigned the Search experience. MUM simultaneously understands information across a wide range of formats, like text, images and video, and can also draw insights from and identify connections between concepts, topics and ideas about the world.

With MUM, a search on a topic such as acrylic painting will surface the different dimensions other people typically search for. Launching in the coming months for English-language searches, the feature lets users unlock deeper insights — like "how to make acrylic paintings with household items" — and connects them with web content they wouldn't otherwise have found, by zooming in to more specific aspects of a topic or broadening out to more general ideas.

For searches where you need inspiration or want to explore information visually, Google also announced a redesigned page that makes it easy to visually browse to find what you're looking for.

Demoed at the same event was a new way to search with Google Lens, which will soon let users add text to visual searches and ask questions about just about anything they see. Whether it's a shirt you like, or the same pattern on socks instead, all you need to do is point your camera and ask.

WIDER LENS AVAILABILITY

Starting soon, iOS users will see a new button in the Google app that makes all the images on a page searchable through Google Lens. This means iOS users will be able to seamlessly search shoppable images on websites with Lens mode in the iOS Google app. The feature, however, will be limited to the US at this time.

Google is also bringing Google Lens to Chrome. Soon to be available globally, the feature will let users select images, video and text content on a website with Lens to quickly see search results in the same tab — without having to leave the current page.

A MORE SHOPPABLE SEARCH EXPERIENCE

Also, starting today, Google is making it easier to browse for apparel from your Search results. For example, when you search for "cropped jackets," now your search will give a visual feed of jackets in various colours and styles alongside other helpful information like local stores, style guides and videos. This new experience is powered by Google's Shopping Graph, a comprehensive, real-time dataset of products, inventory and merchants with over 24 billion listings.

Soon, by using the "in stock" filter on Google Search, users can see if nearby stores have specific items on their shelves. The feature is launching in English in the US and select markets, including the UK, Australia, Austria, Brazil, Canada, Denmark, France, Germany, Japan, the Netherlands, New Zealand, Norway, Sweden and Switzerland.

SOURCE AND INSIGHT

Starting today, users in the US will be able to find new insights about the sources and topics they find on Search, such as a description from Wikipedia, and read what a site says about itself in its own words, as Google expands its About This Result panels.

ON THE MAP

Last year, Google launched a wildfire boundary map powered by satellite data to help people easily understand the approximate size and location of a fire — right from their device.

Now, Google is increasing the coverage and bringing all wildfire information together with a new layer on Google Maps, including emergency websites, phone numbers, and evacuation information from local governments if they've been provided.

Launching globally on Android, iOS and desktop this October, the new layer also shows details about the fire, such as its containment, how many acres have burned, and when all this information was last reported.

After piloting the Environmental Insights Explorer (EIE) Tree Canopy tool in Los Angeles, California last year, Google will expand the tool to over 100 cities around the globe, including places like Guadalajara, London, Sydney and Toronto, during the first half of 2022.

Tree Canopy data uses aerial imagery and advanced AI capabilities to identify places in a city that are at the greatest risk of experiencing rapidly rising temperatures. With Tree Canopy data, local governments have free access to insights about where to plant trees in order to increase shade and reduce heat.

ADDRESSING THE UNDER-ADDRESSED

As part of the company's vision to map the world, Google has been helping governments and NGOs provide addresses to people and businesses around the world with Address Maker. Six years after the tool was first launched, Google will now use Plus Codes, an open-source system, to create unique, functioning addresses at scale and in much less time.
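Plus Codes (formally, Open Location Codes) derive a short address purely from latitude and longitude, with no street data needed. As a rough illustration — a minimal sketch based on the published open-source spec, not Google's production code — the encoder repeatedly divides the world into a 20×20 grid, emitting one base-20 digit pair per refinement step:

```python
# Minimal sketch of Plus Codes (Open Location Code) encoding.
# Latitude and longitude are repeatedly refined on a 20x20 grid,
# producing one base-20 digit pair per step.
ALPHABET = "23456789CFGHJMPQRVWX"  # the 20-character OLC digit set


def encode_plus_code(lat: float, lng: float) -> str:
    """Encode a lat/lng pair as a standard 10-digit plus code."""
    lat = min(max(lat, -90.0), 90.0)       # clip latitude to valid range
    lng = ((lng + 180.0) % 360.0) - 180.0  # normalise longitude
    lat_val, lng_val = lat + 90.0, lng + 180.0
    code, resolution = "", 20.0            # the first digit spans 20 degrees
    for _ in range(5):                     # five lat/lng digit pairs
        lat_digit = min(19, int(lat_val / resolution))
        lng_digit = min(19, int(lng_val / resolution))
        code += ALPHABET[lat_digit] + ALPHABET[lng_digit]
        lat_val -= lat_digit * resolution
        lng_val -= lng_digit * resolution
        resolution /= 20.0                 # each step is 20x finer
    return code[:8] + "+" + code[8:]       # '+' separates the final pair


print(encode_plus_code(0.0, 0.0))  # → 6FG22222+22
```

Because the code is just a compact grid reference, any two points in the same roughly 14m × 14m cell share a code — which is what lets Address Maker assign functioning addresses to buildings that have never had one.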

In a matter of weeks, Address Maker helps get under-addressed communities on the map — unlocking the ability to do things many people take for granted like vote, open a bank account, apply for a job, or even get packages delivered.

Governments and NGOs in The Gambia, India, South Africa, Kenya and the US are already using Address Maker, with more partners on the way.
