Google MUM: New Algorithm Preview

Multitasking is a quality that sets any company apart from its competitors. Google has introduced a new technology called MUM that gathers deep knowledge by understanding text, images, and video content.

Multitask Unified Model (MUM): Google's New Tech (1,000x More Powerful Than BERT!)

https://www.youtube.com/watch?v=s7t4lLgINyo

Prabhakar Raghavan, one of Google's top executives, presented a brand-new technology called the Multitask Unified Model (MUM) at Google I/O, the company's annual developer conference, last Tuesday.

Like BERT, MUM is built on a transformer architecture, but Google says it is 1,000 times more powerful, able to multitask and deliver information to users in new ways.
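Google has not published MUM's internals, but the transformer architecture it shares with BERT is well documented. As a loose illustration only (not Google's implementation), here is a minimal NumPy sketch of scaled dot-product self-attention, the core operation of a transformer layer:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of token embeddings X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # how strongly each token attends to every other
    weights = softmax(scores, axis=-1)  # rows sum to 1
    return weights @ V                  # each output is a weighted mix of all tokens

# Toy example: 4 "tokens" with 8-dimensional embeddings and random weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one contextualized vector per token
```

Stacking many such layers (plus feed-forward blocks) is what lets transformer models like BERT, and reportedly MUM, build rich contextual representations of language.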

Google further announced that it is currently running internal pilot programs using MUM; no public rollout date was specified.

Multitasking with MUM

One of MUM's best qualities is that it can handle many tasks at once. In fact, it is trained across more than 75 languages and performs its tasks simultaneously. This lets MUM build a deeper understanding of information and make it more readily available to everyone.

In his presentation at I/O, Raghavan provided a few examples of the tasks MUM can manage at the same time:

  • Gather deep knowledge of the world.
  • Understand and generate language.
  • Train across more than 75 languages.
  • Understand multiple modalities, allowing it to process different forms of information such as text, images, and video.

How MUM Works

On the I/O stage, Raghavan used this question: “I’ve trekked Mt. Adams and now want to trek Mt. Fuji by next year. What should I do differently to get ready?” A query like this would confuse any modern search engine and make it hard to return relevant results.

In the simulated search with MUM, however, Google could highlight the differences and similarities between the two mountains and surface insightful articles covering the proper equipment for trekking Mt. Fuji.

Because MUM is multimodal, it can understand images and video content as well as text. For example, a user could take a picture of their hiking equipment and ask whether it is suitable for trekking Mt. Fuji.

Raghavan said in his presentation: “MUM could understand the content of that picture and the intent behind your question.”

In this hypothetical case, the Multitask Unified Model would tell the user whether their equipment is appropriate and direct them to an article or blog post recommending the proper equipment for trekking Mt. Fuji.
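One common way multimodal systems handle a query like this (a toy sketch, not Google's actual method) is to map the image and the text into a shared embedding space, fuse them into one query vector, and rank candidate documents by similarity. The embeddings and article names below are purely hypothetical stand-ins:

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical pre-computed embeddings. In a real system these would come
# from trained image and text encoders that share one vector space.
rng = np.random.default_rng(1)
image_vec = rng.normal(size=16)         # photo of the user's hiking gear
text_vec = rng.normal(size=16)          # "is this suitable for Mt. Fuji?"

query_vec = (image_vec + text_vec) / 2  # naive fusion of the two modalities

# Hypothetical candidate articles with pre-computed embeddings.
articles = {
    "fuji-gear-guide": rng.normal(size=16),
    "adams-trail-map": rng.normal(size=16),
}
ranked = sorted(articles, key=lambda k: cosine(query_vec, articles[k]), reverse=True)
print(ranked)  # article names, best match for the fused query first
```

Averaging the two vectors is the simplest possible fusion; production systems use learned fusion layers, but the retrieval idea is the same.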

Bottom Line

Google juggles many different tasks around the clock to provide a complete search experience.

But if MUM succeeds, it could make information more widely available across modalities such as text, images, and video, and could even break down language barriers.

If MUM works the way Raghavan presented it at I/O, it may allow people to conduct searches they previously believed were too complex for an artificial intelligence to understand.

As we witnessed at the beginning of the global pandemic, businesses must adapt whenever search behavior changes. Unfortunately, we'll have to wait to learn how MUM may affect search behavior (if it does at all).

On the other hand, if Google rolls out this advancement, other search engines will undoubtedly face an even tougher challenge in growing their market share.
