MUM's predecessor is BERT, which stands for Bidirectional Encoder Representations from Transformers, a model that promised to improve machines' ability to understand human language.
MUM, in turn, is an evolution of BERT. BERT's main innovation was bidirectionality: it analyzes a sentence in both directions, taking into account the words to the left and to the right of a keyword, which allows search engines to understand context in greater depth (see the sketch below). Google, for its part, describes the leap that MUM represents: "MUM has the potential to transform the way Google helps you with complex tasks. MUM uses the T5 text-to-text framework and is 1,000 times more powerful than BERT. MUM not only understands language, but also generates it," explains Google's official website. It also matches queries to words based on context, much as people do in real-life conversations.
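To make bidirectionality concrete, here is a minimal sketch using the Hugging Face transformers library and the public bert-base-uncased checkpoint (the article names no specific toolkit, so this tooling is an assumption). BERT fills in a masked word by reading the context on both sides of it at once:

```python
# A minimal sketch of BERT's bidirectionality, assuming the Hugging Face
# transformers library (the article does not name a toolkit).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The model must use both the left context ("The ... was so crowded") and the
# right context ("could not find a table") to guess the masked word.
for prediction in fill_mask("The [MASK] was so crowded that we could not find a table."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

The right-hand context ("find a table") is what steers the model toward words like "restaurant"; a model reading only left-to-right would not have that information yet.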
"Language can be a significant barrier to accessing information. MUM has the potential to break down these boundaries by transferring knowledge between languages. It can learn from sources not written in the language in which you typed your search and help bring that information to you," Google adds on its updates page.
Furthermore, MUM is multimodal, which makes its search comprehension superior to BERT's: it can interpret content across a variety of formats and media, not just text. This opens up a much wider range of results and deeper personalization for Google Search users.
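Since MUM is not public, a rough sense of what "multimodal" means here can be had from OpenAI's CLIP, used below purely as a stand-in: a single model scores how well different text descriptions match an image.

```python
# A hedged sketch of text-image matching in the spirit of multimodal search.
# CLIP is a stand-in here; it is not MUM and is not affiliated with Google.
import requests
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# A public sample image (two cats) used in the transformers documentation.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

captions = ["a photo of two cats", "a photo of a dog", "a photo of hiking boots"]
inputs = processor(text=captions, images=image, return_tensors="pt", padding=True)

# One similarity score per caption; higher means a better text-image match.
probs = model(**inputs).logits_per_image.softmax(dim=1)
for caption, prob in zip(captions, probs[0].tolist()):
    print(f"{prob:.3f}  {caption}")
```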