This is part two of our three-part series on what our SEO experts took away from their time at Search Marketing Expo (SMX) in early December, as well as what they learned from working in SEO throughout the year.

You can read part one from Andrei Popa, the PPC and SEO expert in our Copenhagen office, or part three from Rick Elenbaas, a Digital Marketing Consultant in our Sydney office.

Learn how broad core updates operate

Glenn Gabe gave a great session at SMX on understanding the complexity of Google’s broad core algorithm updates. He shared several insights on how broad core updates work, how often they roll out, and how to improve if your site has been affected.

One key point is that these updates tend to roll out a few times per year and are made up of millions of smaller “baby” algorithms that combine to output a score. This is important to understand because SEO is made up of several different components, so you shouldn’t cherry-pick changes. Instead, focus on improving your website overall: fix everything to get the best results.

Glenn also highlighted his usual suspects, the issues that broad core updates typically take into account:

  • Low quality and/or thin content
  • Poor user experience
  • Advertising that is aggressive, disruptive, and/or deceptive
  • Lack of E-A-T (Expertise, Authoritativeness, and Trustworthiness)
  • Technical SEO issues
  • Relevance adjustments

Typically, if your website was negatively impacted by an update, you will only see significant movement during future broad core updates and cannot fully recover in between. Webmasters who haven’t made the effort to significantly improve their websites also tend to see additional negative movement in subsequent updates.

The recommendation is not to let things sit for too long. Be proactive and start optimizing your website as soon as possible so you can recover in time for the next update.

Focusing in on content optimization

Don’t overlook content as a key optimization, because search engines typically look to surface the highest-quality, most relevant content for each query. Content is king, and webmasters need to keep publishing high-quality, unique, original content that is engaging and relevant to the keywords they want to target.

Focusing on E-A-T (Expertise, Authoritativeness, and Trustworthiness) is another key optimization: your content must be authored by trusted experts who have authority on the subject.

This is especially true for YMYL (Your Money or Your Life) topics: pages that could potentially impact a person’s happiness, health, financial stability, or safety. Examples of YMYL topics are medical advice, information about people, shopping information, financial advice, government and law topics, and news or current events.

Content is very important to Google, so its broad core updates are especially critical of websites with inaccurate or deceptive content that could impact a reader’s happiness, health, financial stability, or safety.

When building content, Glenn suggested asking yourself:

  • Does your content creator have expertise in this subject?
  • Is the creator or website an authority on the topic?
  • Can people trust the creator and website? 

Lastly, Glenn recommended reading Google’s Quality Rater Guidelines to get a better idea of what Google considers great content. I agree with this advice. It’s 175 pages, but it’s full of great information on how Google’s search quality raters evaluate pages, how that feeds into search, and more detail on E-A-T.

You need to prepare for the Page Experience Update

Earlier this year, Google announced that user experience would become a Google ranking factor. This development introduces a new signal that combines Core Web Vitals with Google’s existing page experience signals to provide a holistic picture of the quality of a user’s experience on a web page. This new update will be called the Google Page Experience Update.

SMX had a detailed session on how to prepare for and capitalize on this update, and on how to build a user experience that will perform for your website for years to come.

Page experience is a set of signals that measure how users perceive the experience of interacting with a web page beyond its informative value. Core Web Vitals is a set of metrics that measures the loading performance, interactivity, and visual stability of a web page.

This also includes the existing search signals: mobile friendliness, safe browsing, HTTPS, and the intrusive interstitials guidelines.

Screenshot of how Core Web Vitals combine with existing search signals for page experience

SEO engineer Aleks Shklyar had several recommendations on what you can do to improve your Core Web Vitals metrics, including the following (a quick measurement sketch follows this list):

  • Largest Contentful Paint (LCP): Focus on the critical rendering path and optimize your server response to ensure the main content in the viewport loads within Google’s recommendation of 2.5 seconds.
  • First Input Delay (FID): Reduce Total Blocking Time and review the critical rendering path. Your goal should be under 100 milliseconds.
  • Cumulative Layout Shift (CLS): Reserve static space for images, videos, ads, etc., and aim for your pages to have a score of less than 0.1.
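
If you want to check where your own pages stand against these thresholds, tools like PageSpeed Insights and Chrome DevTools report them for you, but you can also capture the three metrics directly in the browser. Below is a minimal sketch using the standard PerformanceObserver API; the threshold constants simply mirror the recommendations above, and in practice Google’s open-source web-vitals library is the more robust option because it handles edge cases (backgrounded tabs, CLS session windows) that this sketch ignores.

```typescript
// Minimal sketch: observing the three Core Web Vitals with the browser's
// PerformanceObserver API. Threshold values mirror the recommendations above.

const LCP_THRESHOLD_MS = 2500; // "good" LCP: under 2.5 seconds
const FID_THRESHOLD_MS = 100;  // "good" FID: under 100 milliseconds
const CLS_THRESHOLD = 0.1;     // "good" CLS: under 0.1

// Largest Contentful Paint: the last entry reported is the current LCP candidate.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const lcp = entries[entries.length - 1];
  console.log(`LCP: ${lcp.startTime.toFixed(0)} ms`,
    lcp.startTime <= LCP_THRESHOLD_MS ? '(good)' : '(needs work)');
}).observe({ type: 'largest-contentful-paint', buffered: true });

// First Input Delay: the gap between the user's first interaction and the
// moment the browser can begin handling it. (These entry shapes aren't in the
// standard TypeScript DOM typings, hence the small interfaces.)
interface FirstInputEntry extends PerformanceEntry {
  processingStart: number;
}

new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as FirstInputEntry[]) {
    const fid = entry.processingStart - entry.startTime;
    console.log(`FID: ${fid.toFixed(0)} ms`,
      fid <= FID_THRESHOLD_MS ? '(good)' : '(needs work)');
  }
}).observe({ type: 'first-input', buffered: true });

// Cumulative Layout Shift: sum of layout-shift scores not caused by user input.
interface LayoutShiftEntry extends PerformanceEntry {
  value: number;
  hadRecentInput: boolean;
}

let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as LayoutShiftEntry[]) {
    if (!entry.hadRecentInput) cls += entry.value;
  }
  console.log(`CLS so far: ${cls.toFixed(3)}`,
    cls <= CLS_THRESHOLD ? '(good)' : '(needs work)');
}).observe({ type: 'layout-shift', buffered: true });
```

On the CLS point in particular, the quickest win is usually the one Aleks called out: reserve space for images, videos, embeds, and ad slots up front (for example, by always setting width and height attributes on images) so late-loading elements don’t push the content around.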

Aleks also spoke about the existing signals, such as:

  • Mobile friendliness: With mobile-only indexing, Google will only look at the mobile version of your pages. If your website is not mobile friendly, make sure to fix that as a starting point.
  • Safe browsing and HTTPS: Ensure your website provides a safe and secure browsing experience, and migrate to HTTPS if you haven’t already done so (see the redirect sketch after this list).
  • Intrusive interstitials: Avoid interstitials that cover the main content; they create a poor experience for your visitors and can lead to your site being penalized.
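
The HTTPS redirect itself is usually configured at the web-server or CDN level, but as an illustration of the pattern, here is a minimal sketch for a Node/Express app. The Express setup and the assumption that a proxy or load balancer sets the x-forwarded-proto header are mine, not something covered in the session.

```typescript
// Minimal sketch (assumes a Node/Express app behind a proxy or load balancer
// that sets the x-forwarded-proto header): permanently redirect any plain-HTTP
// request to its HTTPS equivalent.
import express from 'express';

const app = express();

app.use((req, res, next) => {
  const isHttps = req.secure || req.headers['x-forwarded-proto'] === 'https';
  if (!isHttps) {
    // 301 (permanent) so search engines consolidate signals on the HTTPS URLs.
    return res.redirect(301, `https://${req.headers.host}${req.originalUrl}`);
  }
  next();
});

app.get('/', (_req, res) => {
  res.send('Served over HTTPS');
});

app.listen(3000);
```

When you do migrate, remember to update internal links, canonical tags, and your XML sitemap to the HTTPS URLs so the redirect is only a safety net.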

Google also recently announced that this update will roll out in May 2021. For more information, check out Siteimprove's webinar, Are you ready for Google's Page Experience update?

How AI enhances search results

Back in October 2019, Google released a new update called BERT, which stands for Bidirectional Encoder Representations from Transformers. It is a neural network-based technique for Natural Language Processing (NLP), which was open-sourced by Google. 

This update helps Google better understand nuances in search queries and content to produce the most relevant search results, and it was designed to impact long-tail searches, including featured snippets.

During SMX, Dawn Anderson shared how these advancements are changing search and the implications for SEOs and digital marketers.

One section of the presentation highlighted new updates Google announced at its Search On 2020 event, which shared several advancements in search ranking made possible by the latest research in AI.

One highlight was that BERT now powers almost all English-language queries on Google Search, helping deliver more relevant search results.

Another advancement is that Google now has a greater ability to rank passages of content independently. Google can now understand the relevancy of a specific passage, not just the overall page content, which helps a searcher find that needle in a haystack.

An example of how Google highlights specific information to answer search questions

Another interesting development is subtopics. Google has applied neural nets to understand subtopics around an interest with the goal of delivering greater diversity of content for broad queries.

Lastly, Dawn was joined by Nuo Wang Pierse (Senior Applied Scientist at Microsoft), who gave great insight into two techniques employing BERT-like models that Bing currently uses.

I hope you find these key learnings from SMX helpful as you launch your 2021 SEO strategy.

Check out part one of this series by Andrei Popa, the PPC and SEO expert in our Copenhagen office, or part three from Rick Elenbaas, a Digital Marketing Consultant in our Sydney office.