This multitasking model is the latest update created by Google. It is built into the company's search algorithm and is based on artificial intelligence (AI). The improvement lies in tools that strengthen the search engine's ability to provide answers, making it more accurate. It also draws on text, images, videos, and podcasts to dig deeper and deliver a better answer.

MUM's predecessor is BERT, which stands for Bidirectional Encoder Representations from Transformers and promised to improve the ability to capture human language. MUM is an evolution of BERT, whose main innovation was bidirectionality: analyzing a sentence in two directions, that is, the words to the left and right of a keyword. This allows search engines to understand context more deeply. MUM, meanwhile, can be described as multidirectional.

"MUM has the potential to transform how Google helps you with complex tasks. MUM uses the T5 text-to-text framework and is 1,000 times more powerful than BERT. MUM not only understands language, but also generates it," explains Google's official website.

It also relates queries between words according to context, much like real-life conversations. "Language can be a significant barrier to accessing information. MUM has the potential to break down these barriers by transferring knowledge between languages. It can learn from sources not written in the language you used for your search and help deliver that information to you," Google adds on its updates page.

Furthermore, MUM is multimodal, so its search capabilities and its understanding of those searches make it superior to BERT. It can also interpret content in various formats and across different media. This opens up endless possibilities for results and full personalization for Google Search users.
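The bidirectionality described above can be illustrated with a toy sketch. This is not BERT or Google's implementation, just a minimal Python example (the function names are invented for illustration) showing the difference in visible context: a left-to-right model sees only the words before a keyword, while a bidirectional model sees the words on both sides.

```python
def left_context(words, i):
    """Context available to a left-to-right (unidirectional) model."""
    return words[:i]

def bidirectional_context(words, i):
    """Context available to a bidirectional model:
    the words to the left AND to the right of the keyword."""
    return words[:i] + words[i + 1:]

sentence = "the bank of the river was muddy".split()
i = sentence.index("bank")

print(left_context(sentence, i))           # ['the']
print(bidirectional_context(sentence, i))  # ['the', 'of', 'the', 'river', 'was', 'muddy']
```

With only the left context, "bank" is ambiguous (a financial institution or a riverbank); the right-hand words "of the river" resolve it, which is why seeing both directions lets a model understand context more deeply.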