How to Prepare for Google's June 2021 Core Web Vitals Update
We’ve been warned: In June 2021 Google will make an update to its algorithm that takes into account load speed, visual stability and interactivity. Google is not in the habit of issuing advance notices for algorithm updates, much less postponing an update to give businesses more time to prepare. For this impending update, it’s done both.
Understandably, it’s making anyone with a business that depends on stable search rankings quite nervous.
But just how big an impact will this update have on traffic (and thus conversion)? And what’s the best way to prepare?
We break down the facts, share our thoughts on projecting impact and suggest a plan of attack that your digital marketing and development teams can use to fit your timeframe and resources.
What exactly is the Core Web Vitals update?
Google has been measuring page experience—in other words, how users perceive their interaction with a page—for some time. But so far the search engine has only been considering the following signals to make that call:
- absence of malicious content
- a secure connection
- no intrusive interstitials (pop-ups)
This is a good start, but it’s basic gatekeeping to exclude deeply problematic content. It means that, currently, results are filtered only for content that is unsafe or rendered practically unusable by in-your-face pop-ups or unnavigable layouts. The actual experience that awaits users on the “passed” pages could still be pretty dodgy.
The impending Core Web Vitals update will add a new layer of signals to the above list by assessing the quality of the user’s engagement with the page. It will do this by adding three more metrics to the above page experience criteria, which focus on:
- load speed
- interactivity
- visual stability
Core Web Vitals promise to make the browsing experience better
If you’ve been accessing the web on any device, but especially your mobile phone, these three Core Web Vitals should strike a chord. They’re some of the biggest impediments to a smooth browsing experience.
No one wants content that takes ages to load, with buttons, sliders or toggles that aren’t immediately interactive and content that shifts around unpredictably and dizzyingly as it loads in.
As a user it’s easy to get on board with raising the bar to weed out these unpleasant experiences. But the thought that your content—content that is usually worthy of a top spot—could fail to make the cut is unsettling.
Google doesn’t want that either. It has a dedicated Web Vitals section on its development resources website that covers each metric in great technical depth, shares tools for measuring them and gives suggestions for improving Core Web Vital scores.
As a decision maker in your organisation, these are the things we think you need to know to support your teams in getting prepared.
It’s taken Google some time to work out how load speed, interactivity and visual stability can be accurately and fairly measured across all content on the web. This is not a simple yes/no, on/off, “you either have it or you don’t” consideration. And whilst there are many ways to measure speed, for example, not all approaches are consistent or can be standardised across content.
The technicality of the Core Vitals is inherent in Google’s official terminology for these metrics:
Largest Contentful Paint (LCP)
This is a measure of how long it takes for the largest piece of content to load on the screen.
First Input Delay (FID)
This is the time between a user first interacting with a page (a link, a button, a filter) and the browser being able to actually respond to that interaction.
Cumulative Layout Shift (CLS)
This is an aggregated score for the amount that visual elements have moved, unexpectedly, from their starting position.
Page performance will be rated on a scale
As might be expected from a more complex metric, the Core Web Vitals are measured on a scale. Each of the metrics have been assigned the following target scores:
- LCP: < 2.5 seconds
- FID: < 100 milliseconds
- CLS: < 0.1
Google then rates pages with scores within the target as “good” and those outside of it on a scale of “needs improvement” to “poor”. The traffic light design Google uses in its documentation makes this easier to digest: green for “good”, amber for “needs improvement”, red for “poor”.
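To make these bands concrete, here is a minimal sketch in Python of how a single measurement maps onto the traffic light scale. The “good” boundaries are the targets listed above; the “poor” boundaries (4 seconds, 300 milliseconds, 0.25) come from Google’s Web Vitals documentation, with “needs improvement” sitting in between:

```python
# Google's published Core Web Vitals boundaries: (good, poor).
# Values between the two fall into the "needs improvement" band.
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "FID": (100, 300),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless score
}

def rate(metric: str, value: float) -> str:
    """Rate one Core Web Vitals measurement on Google's traffic light scale."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

# Hypothetical measurements:
print(rate("LCP", 2.1))   # good
print(rate("FID", 180))   # needs improvement
print(rate("CLS", 0.31))  # poor
```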
Content will need to comply 75% of the time
To arrive at a score for a page, Google will look at the 75th percentile of page loads - in other words, 75% of page loads need to fit within the target to make the grade.
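In code terms, the 75th-percentile rule looks something like this. The LCP samples are hypothetical, and the nearest-rank method is a simplification of Google’s aggregation:

```python
import math

def percentile_75(samples):
    """75th percentile by the nearest-rank method (a simplification)."""
    ordered = sorted(samples)
    rank = math.ceil(0.75 * len(ordered))  # 1-based nearest-rank index
    return ordered[rank - 1]

# Hypothetical LCP measurements (seconds) from eight page loads:
lcp_samples = [1.9, 2.1, 2.2, 2.3, 2.4, 2.6, 3.0, 5.8]
print(percentile_75(lcp_samples))  # 2.6 - over the 2.5 s target, so the page fails LCP
```

Note how the slow tail matters: most of these loads are comfortably under 2.5 seconds, but the 75th percentile is not, so the page misses the target.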
Google goes into a lot of depth explaining how they have arrived at the thresholds in these scales: They have combined what we know about the expectations humans currently have of computers (for example, how long we’re theoretically willing to wait for content to load), with user experience data from “the field” (i.e. how long we’re actually waiting before we abandon a page), to get these numbers.
The cutoffs they’ve landed on (shared above) put roughly 10% of current web content into the “good” bracket and 10-30% in the “poor” one.
Core Web Vitals aren’t a filter for quality, they’re the north star for user experience
Core Web Vitals are a significant departure from the blunt and basic quality control measures that make up Google’s user experience criteria to date. Those filter out spam. These are ambitious objectives.
Through their thresholds, Core Web Vitals might appear to function like a filter. But the bigger picture is this: Google is unambiguously setting the bar for quality and is giving us a quantifiable way to assess and compare user experience.
With these Core Web Vitals, Google is saying: Currently your performance is down there. It should be up here.
OK, so how can you find out just how “down there” your content is?
How to measure Core Web Vitals
The good news is you won’t need to know the exact factors that go into each of the Core Vitals to get a measure of how your pages are stacking up.
Google’s Search Console has, since May 2020, included a Core Web Vitals report. The report gives a high-level overview of how your content is performing across those three metrics, following the traffic light system above.
This gives some sense of the extent of compliance with Google’s targets. But since the report assesses all pages as if they were equal, it may exaggerate or understate the risk or reward. Getting the nuance will be important. Google has more tools for that.
Lab vs. Real-World
There are five other Google tools that measure Core Web Vitals. Some of these, like the PageSpeed Insights tool, combine both lab data (collected in a controlled, simulated environment to show how a user would likely experience the content) and field data (real-world data on how visitors actually experience it).
Others give just lab or just field data.
Used in combination, these tools give the granular insight and feedback required to diagnose and address problems:
- PageSpeed Insights diagnoses both lab and field issues and the API can process a large number of URLs at the same time. (Great for SEO teams working out your priority pages and competitive landscape. We’ll get to that later.)
- Lighthouse, Chrome DevTools and Web Vitals Extension will help your technical team make and test changes in offline environments.
- Chrome UX Report (CrUX) can be used to set up a custom dashboard that gives you and your team a more granular and interactive way of monitoring updates than Search Console.
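As a sketch of how the PageSpeed Insights API can be scripted across a batch of URLs: the snippet below assumes the documented v5 response shape, where real-user field metrics sit under `loadingExperience.metrics`; the helper names are our own.

```python
import json
import urllib.parse
import urllib.request

# PageSpeed Insights API v5 endpoint; for bulk runs you'll want an API key.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def field_vitals(psi_response: dict) -> dict:
    """Pull the Core Web Vitals field (real-user) data out of a
    PageSpeed Insights v5 response."""
    metrics = psi_response.get("loadingExperience", {}).get("metrics", {})
    keys = {
        "LCP": "LARGEST_CONTENTFUL_PAINT_MS",
        "FID": "FIRST_INPUT_DELAY_MS",
        "CLS": "CUMULATIVE_LAYOUT_SHIFT_SCORE",
    }
    return {
        name: {"p75": metrics[key]["percentile"], "rating": metrics[key]["category"]}
        for name, key in keys.items()
        if key in metrics
    }

def audit(url: str, strategy: str = "mobile") -> dict:
    """Fetch and summarise the field Core Web Vitals for one URL."""
    query = urllib.parse.urlencode({"url": url, "strategy": strategy})
    with urllib.request.urlopen(f"{PSI_ENDPOINT}?{query}") as resp:
        return field_vitals(json.load(resp))

# Example (makes a live request): audit("https://example.com/")
```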
But what will all this mean in real terms? It’s hard to say exactly. Like most things Search, this needs to be put into your specific competitive context.
We suggest a two-stage approach:
1. Predict and prepare: identify the most at-risk pages and the traffic you might lose if they’re overtaken, and assess the competitive landscape to understand how likely you are to be outranked.
2. Execute: start the incremental process of updating content, beginning with your most at-risk content. Then keep going, because this will be the new normal.
Stage 1: Predict & Prepare
How to predict the impact of the Core Web Vitals update
Whilst the Core Web Vitals update could shake up rankings, this is not a deviation from Google’s basic intent. It’s just Google getting better at doing what it has set out to do from the start: Serve useful content on a pleasant platform. What’s significant is that how pleasant the platform is will now count more than before.
In practical terms, it means content that is useful but on a poorly performing page could be outranked by content that is a bit less useful but on a better page.
So, to get a gauge of the impact on your web pages, you will want to get a sense of how your key pages are stacking up to those of your competitors.
Know your high-risk pages
Start by understanding your most valuable search assets:
- Which pages have been delivering most of your organic traffic?
- And, which organic entry pages have delivered most of your conversions?
Those two data sets might not be the same, and depending on your business, you may value the one above the other, so be sure to consider both.
Next, get a good understanding of the types of queries that are driving that organic traffic:
- What proportion of organic page traffic is branded?
- What proportion of organic page traffic is non-branded?
Your most vulnerable content will be those pages where you are getting significant search traffic via non-branded search. If they are outranked in the update, the decrease in traffic will be tangible.
[Graphic: a vulnerability matrix plotting page type (landing pages delivering most traffic, landing pages leading to most conversions, pages with low to no search traffic) against traffic profile (high proportion non-branded vs. mostly branded)]
But don’t get complacent about your branded traffic either. If you are in an industry where you are already actively competing with third party sites for your brand name (for example a hotel that must compete with third party booking sites), there’s a chance your branded traffic may also be at risk of being overtaken.
Understand the competitive landscape
With your vulnerabilities clear, it’s time to understand just how big the threat is to those pages.
First, try to get a good measure of the types of organic queries (keywords) that are driving traffic to those pages.
Next, compile the top 3-5 competitors for each of those keywords/queries/pages.
Finally, run a Core Web Vitals audit of all competitors alongside your top content to see how your content stacks up to that of the competitors for LCP, FID and CLS.
Putting it together
Combining your insights, you should have a matrix that lays out high-value pages against competition and allows you to clearly identify a priority order for making changes:
|                             | High traffic/value pages | Low traffic/value pages |
|-----------------------------|--------------------------|-------------------------|
| High performing competition | Highest risk pages       |                         |
| Low performing competition  |                          | Lowest risk pages       |
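If you are scoring pages programmatically, that matrix reduces to a simple lookup. A minimal sketch, in which the two off-diagonal labels are our own illustrative additions:

```python
def risk_bucket(page_value: str, competition: str) -> str:
    """Place a page in the priority matrix.
    page_value:  "high" or "low" traffic/value for the page.
    competition: "high" or "low" performing competitors for its keywords."""
    matrix = {
        ("high", "high"): "highest risk",
        ("high", "low"): "monitor",        # illustrative label
        ("low", "high"): "improve later",  # illustrative label
        ("low", "low"): "lowest risk",
    }
    return matrix[(page_value, competition)]

print(risk_bucket("high", "high"))  # highest risk
```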
With a better idea of the extent of the impact and the required urgency, your team can begin to implement changes.
Stage 2: Execute
Preparing for the Core Web Vitals update
With your priorities clear, it’s time to rally a team of experts and get to work. This is going to be a dialogue between your development teams (front and back end) and your Search specialists. Depending on the setup, you may need to consult a specialist in page speed and layout.
There’s no one-size-fits-all solution. The appropriate response will be highly specific to your platform and the design and development legacy of your pages. Think surgery, not mechanics.
Triage and iterate
Start with the most at-risk pages to see how they can be improved.
It’s also very unlikely that the first pass is going to solve all issues. Prepare to compromise and revisit pages to incrementally improve those metrics. Approaching this as a long-term health plan for your site rather than a quick-fix crash-diet can help to shape expectations, plan resources responsibly and importantly, secure stability in the long run.
Know what not to prioritise
If there is content that you don’t expect to ever play a meaningful part in generating traffic from search, it might be best left for later. The Core Vitals metrics can still be a valuable guide for adapting that content so you don’t lose your users due to poor experience, but you won’t have to take the scores as gospel for those lower priority pages.
If there’s a chance that some or all of your key pages are going to suffer a temporary setback in traffic (and there likely is), ensure you’re planning ways to supplement traffic from other sources.
Get your SEO, PPC and social teams aligned on this so you’re not wasting resources, for example by setting up additional landing pages for campaigns when you could channel traffic to your search-optimised content, or by missing out on organic traffic off the back of demand you’ve created through a social campaign.
Don’t stop optimising
It may be between the lines, but through its meticulous explanation of how it has determined the initial thresholds in the Core Web Vitals metrics, Google is hinting that the bar will continue to be raised as user expectations evolve.
So plan to dedicate some resources to Core Web Vitals-focused improvements for the foreseeable future. Even when all your key pages are within the target range, it will be a worthwhile investment to continue to press your teams to find ways of making pages surpass expectations. That way, when today’s “good” becomes tomorrow’s “needs improvement” you won’t be playing catch up with Google’s algorithm.
But it’s not just for Google. If every second delay in mobile page load leads to a 20% fall in conversion, then staying ahead of the curve also means maximising conversion.
Ultimately, it’s still what’s inside that counts
However much the quality of experience currently dominates the SEO agenda, it’s still the quality of the content that’s going to pack the biggest punch. After all, it’s highly unlikely that Google is going to serve rubbish content just because it’s on a great platform. The sum total of that is still a bad experience.
For everyone involved in the complex dance required to make effective digital content, the course is still set to exactly the same trajectory:
If we continue to commit to creating content that respects our users—by giving them substance in a format that is so issue free they don’t even think about the user experience—we will weather this turbulence just fine.