How Google is Transforming the Fundamentals of Search
Imagine having an informative and coherent conversation with Google Assistant about any topic on your mind. Google’s latest search technology aims to do just that, understanding and responding to human queries with advanced language processing. The Multitask Unified Model (MUM), unveiled last month, answers the call for technology that can interpret and respond to complex queries instantly.
Why MUM is Expected To Transform the Way We Search
According to Google, it still takes us eight queries on average to tackle a task like planning for a hike with today’s search engines. With MUM, you can gather all the information you need from a single complex query, and receive more holistic answers based on your current situation and the knowledge MUM has gathered. This ability is unlocked by a number of powerful features MUM possesses.
Thanks to its advanced language understanding and ability to multi-task, MUM can respond to more complicated search queries. Its access to Google’s extensive resources allows it to interpret information in many forms, such as text, images and video, with audio capability to follow in the future. Unlike previous models, MUM can understand the world more comprehensively and identify helpful information for specific real-life contexts. Take the query “Can I plant an avocado tree in my backyard?”, for example. MUM could identify whether the conditions are suitable, and describe the ideal time, soil and location required for avocado plants to thrive.
MUM’s language ability is considered a thousand times more powerful than that of its predecessor, the BERT model. Both are built on the Transformer architecture, a neural network design for language processing that allows them to interpret and translate language more precisely. MUM is also trained across more than 75 languages and can generate information in any of them. Today, language barriers still cause great difficulties in our pursuit of information: search results largely return answers written in the language we searched in. MUM is expected to gather knowledge from the most credible sources, whether they are written in Spanish or Japanese, and present the answers in your preferred language. If it succeeds, MUM could give us access to previously undiscovered regional or historical knowledge.
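To make the cross-lingual idea concrete, here is a minimal sketch. It assumes an openly available multilingual transformer from the sentence-transformers library rather than MUM itself (which Google has not released): an English query and documents written in other languages are embedded into the same vector space, so the most relevant answer can be ranked highest regardless of its language.

```python
from sentence_transformers import SentenceTransformer, util

# Openly available multilingual embedding model (an assumption; this is not MUM).
model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

query = "What should I pack for a hike up Mount Fuji in autumn?"
documents = [
    "秋の富士山登山では防寒着と雨具が必須です。",  # Japanese: warm layers and rain gear are essential
    "Para subir al monte Fuji en otoño conviene llevar ropa de abrigo impermeable.",  # Spanish
    "The best beaches to visit in Okinawa during summer.",  # unrelated English page
]

# Embed the query and the documents into the same multilingual vector space.
query_emb = model.encode(query, convert_to_tensor=True)
doc_embs = model.encode(documents, convert_to_tensor=True)

# Rank documents by cosine similarity to the query, regardless of their language.
scores = util.cos_sim(query_emb, doc_embs)[0]
for doc, score in sorted(zip(documents, scores.tolist()), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {doc}")
```

The point of the sketch is only that a single multilingual model can compare meaning across languages; MUM is expected to do this at far greater scale and depth inside Search.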
In its latest keynote, Google highlighted the multi-modal capability of this Search breakthrough. By asking MUM to assess an image of a pair of boots and decide whether they are suitable for a hike up Mount Fuji, Google showcased its ability to read information presented in different formats simultaneously. For now, the example only covered MUM’s ability to understand users’ queries across various formats (partially similar to the concept behind Google Lens). Once MUM can scan through images and video in search of the best answers, it will be able to address users’ search queries in the most suitable format.
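As a rough illustration of how an image and a text question can be scored together, the sketch below uses OpenAI’s CLIP model via the Hugging Face transformers library (an assumption for demonstration purposes; MUM’s multimodal internals are not public). The file name my_boots.jpg is a hypothetical placeholder for any local photo.

```python
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Openly available image-text model (an assumption; this is not MUM).
model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Hypothetical photo of the boots in question; any local image file would do.
image = Image.open("my_boots.jpg")

# Candidate text descriptions to score the photo against.
texts = [
    "sturdy waterproof hiking boots suitable for mountain trails",
    "lightweight fashion sneakers for everyday city wear",
]

# Encode the image and both descriptions, then compare them in a shared space.
inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)

# A higher probability means the description matches the photo more closely.
probs = outputs.logits_per_image.softmax(dim=1)[0]
for text, p in zip(texts, probs.tolist()):
    print(f"{p:.2f}  {text}")
```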
What’s Next?
Various questions have arisen around whether AI, and MUM specifically, will completely replace traditional search engines in the future. The answer comes down to how long machine learning will take to absorb the richness and complexity of real-world contexts. According to Google, however, we can expect to see MUM-powered upgrades to Search in the coming months and years.
Stay tuned for more updates on innovations in the world of Digital Marketing and Search from ADMATIC’s passionate team, or reach out for more!