According to HubSpot, Google has 92.42 percent of the search engine market share worldwide. With approximately 2 trillion global searches per year, you’re likely one of the billions around the world using Google Search to answer life’s biggest (and smallest) questions.
In 2019, Google’s engineers identified a trend in how users phrased questions when the algorithm came up short. For example, instead of asking “Does pineapple belong on pizza?”, a person might type “pizza pineapple good or bad”. This shorthand eventually evolved into what one leading engineer labeled “keyword-ese” – shaping your question around the keywords you expect Google to understand.
Google’s Search team wanted to create an ever-learning search engine that could develop a genuine understanding of human language – one that could account for the fluff that pads many of our sentences and still grasp the meaning of what’s being asked. With that driving force, BERT was born and launched.
BERT, or Bidirectional Encoder Representations from Transformers, allows Google’s AI to understand the context of an entire sentence rather than simply matching keywords to search results. The advancements in natural language understanding (NLU) are impressive, to say the least. Through artificial comprehension, BERT amplifies Google’s ability to match users with the content they are looking for in a way that is more direct and accurate than ever before. However, Google isn’t the only tech giant getting in on the power of BERT.
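The “bidirectional” part is the key idea: an older left-to-right language model predicts each word from only the words before it, while BERT lets every word draw on the full sentence, both left and right. Here’s a minimal Python sketch of that difference – a toy illustration of the context each approach can see, not BERT’s actual architecture, and the function names are ours:

```python
def left_context(tokens):
    """Left-to-right model: each token sees only the tokens before it."""
    return {i: tokens[:i] for i in range(len(tokens))}

def bidirectional_context(tokens):
    """BERT-style: each token can attend to every other token in the sentence."""
    return {i: tokens[:i] + tokens[i + 1:] for i in range(len(tokens))}

tokens = "does pineapple belong on pizza".split()

# For the word "belong" (index 2), a left-to-right model sees only:
print(left_context(tokens)[2])           # ['does', 'pineapple']

# A bidirectional model also sees what comes after it:
print(bidirectional_context(tokens)[2])  # ['does', 'pineapple', 'on', 'pizza']
```

Seeing “on pizza” after “belong” is what lets a bidirectional model disambiguate words whose meaning depends on what follows them – the kind of nuance keyword matching alone misses.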
As an open-source project, BERT can be used by any team to advance not only their own technology but the NLU field as a whole. In addition to industry leaders like Microsoft and Facebook, teams around the world are adopting BERT to improve AI comprehension and better connect people to the content they’re searching for across every platform.
For the SEO pros out there ravenous for ways to optimize their content, it’s probably wise to pump the brakes and reel those expectations back in. The best “optimization” for BERT is to write naturally. Since the system takes into account the context and nuance of what it analyzes, high-quality writing that conveys meaning without contorted keywords and sentence structures is the safest bet. And as Google’s AI improves over time, what provides great results today might have less of an impact tomorrow.
The biggest takeaway from BERT is that search engines are always growing and evolving, changing their methodologies and tweaking their algorithms in pursuit of connecting users with the content they’re searching for. With new technologies and coding breakthroughs, new methodologies will be born. The best SEO practice for Google’s algorithm in 2020 is simply to keep up with the times.