Core Web Vitals: a guide for eCommerce websites
For years, SEO has been about the informational value of pages: on-page SEO, link profiles, the crawlability of sites through technical SEO and so on. Since 2021, although these factors remain crucial, Google has added a new focus: user experience.
A lot of attention has been given to the Page Experience Update and Core Web Vitals this year. Essentially, this is Google’s attempt to improve website performance for users, making sure that businesses deliver content quickly and offer great experiences and interactions on their websites.
Read on to find out what they are and why they’re causing such a stir in the SEO world.
It has to be said that unless you have a background in website development, an SEO specialist can only really point out what the issues are through audits. So if there is no budget or resource available from developers or webmasters to fix those issues, there’s little point in auditing them, unless you’re trying to build a case for why they should be prioritised.
So don’t let this be just another tick box exercise to send a client which will never lead to any actual website improvement. Make sure that there is a goal in mind when looking at Core Web Vitals.
What are Core Web Vitals?
SEOs have been looking at Page Speed measurements and some UX factors for a while now (think of page speed, dwell times, click through rates etc.). The Page Experience update (featuring Core Web Vitals) is basically Google making UX and page experience a key part of their algorithm to make marketers step up their game and give users a great online experience.
Think of it this way: Google wants to be the best search engine on the market. To do that, it needs to offer the best results to searchers. What better way to achieve that than by forcing marketers’ hands? If you don’t provide a great experience, chances are Google won’t rank you as highly as a competitor site that does.
The Core Web Vitals rollout began in June 2021 and is expected to finish by August of the same year. I believe this is especially important for eCommerce sites: the competition is fiercer, and there are more actions a user is likely to complete on your site, so they should be able to do so with ease.
It’s important to note that other ranking factors will still be crucial, but make sure to take these new ones on board to continue to compete. If you and a competitor are neck and neck in terms of on-page, off-page and technical SEO, but their page experience is better, they are now likely to outrank you.
Page Experience is all about the quality of a user’s experience on a page, looking at 5 main factors:
HTTPS (it has been for a while now - essentially Google prioritises businesses that protect user data with encryption through SSL certificates)
Mobile friendliness (also around for a while)
Safe Browsing (no malware or deceptive content flagged on your pages)
Non-intrusive interstitials (pages with loads of pop-ups truly make for an awful site experience and Google knows this)
Core Web Vitals
More specifically, the current set of core web vitals (these are likely to evolve over time) are loading, interactivity/responsiveness and visual stability. So it’s no longer about an arbitrary “how quickly does this site load” measure, but about how quickly and effectively a site loads for users in real time, with their internet connection and device.
Essentially, this aims to measure how good the user experience is in terms of how your pages load for them. It’s not just about the speed at which pages load (although this certainly does play into it) but about how they load and how easy they are to use in the meantime.
The specific metrics to be looked at:
Largest Contentful Paint
With LCP, you are measuring how quickly your page loads for a user: the quicker, the better. Keep in mind that this is not a measure of how quickly the entire page loads; you’re measuring how long the largest content element (image, text, video etc.) on that page takes to load.
It measures (in seconds) the time it takes from when the page begins to load and when the largest element (either a big text block, or a banner image for instance) is rendered on the user’s screen. Essentially, this helps to gauge how long it took for a page's contents to load - the lower the score, the better.
Once you know what it is, it becomes easier to optimise. Google’s guidelines recommend that the LCP should load within the first 2.5 seconds. Anything between 2.5 and 4 seconds needs improvement, and you can consider everything over 4 seconds as performing poorly.
Bear in mind that LCP is dynamic, as the first thing that loads might not immediately be that large image or text block etc. The LCP shifts to that large image when it appears on the screen.
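Those thresholds can be sketched as a small helper, with a browser-only observer attached to watch the LCP candidate change as the page loads. This is a sketch, not a full measurement setup: the observer part assumes the Largest Contentful Paint browser API and is simply skipped anywhere else.

```javascript
// Rating an LCP time against Google's published thresholds:
// good up to 2.5 s, needs improvement up to 4 s, poor beyond that.
function rateLcp(seconds) {
  if (seconds <= 2.5) return 'good';
  if (seconds <= 4.0) return 'needs improvement';
  return 'poor';
}

// Browser-only sketch: the guard means this block is skipped outside a browser.
if (typeof window !== 'undefined' && 'PerformanceObserver' in window) {
  new PerformanceObserver((list) => {
    const entries = list.getEntries();
    // The LCP candidate can change while the page loads; the latest entry wins.
    const latest = entries[entries.length - 1];
    const seconds = latest.startTime / 1000;
    console.log('LCP:', seconds.toFixed(2) + 's', '→', rateLcp(seconds));
  }).observe({ type: 'largest-contentful-paint', buffered: true });
}
```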
Google has declared that LCP is affected by:
Slow server response times: optimise your server, use a CDN, cache assets etc.
Slow-loading resources: optimise your images, preload resources, compress text files etc.
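As a sketch of the resource fixes above, preloading the largest image and opening the CDN connection early can look like this. The file name and CDN host are hypothetical placeholders:

```html
<head>
  <!-- Hypothetical file name: tell the browser about the LCP image early -->
  <link rel="preload" as="image" href="/images/hero-banner.jpg">
  <!-- Open the connection to a (hypothetical) CDN before any asset is requested -->
  <link rel="preconnect" href="https://cdn.example.com">
</head>
```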
A lot of this looks like the type of thing that you need to optimise to improve Page Speed scores, so if you optimised these for that, then you should be in the clear now. Either way, if you find that you have a low LCP score, Google has more documentation on the background of LCP and how to optimise for it.
First Input Delay
FID looks at the interactivity and responsiveness of a page (for instance, can a person click on an image or link and get what they want, how long does the site take to “react” to an action?).
Essentially, you’re looking at how long the browser takes to respond to your first action on the site - you shouldn’t be waiting around to be able to click onto a new page or play that video: that’s frustrating for the user. In fact, for a good score, the browser needs to respond to a user’s first interaction within 100 ms.
An important note: FID can only be measured if there is a user interaction. Tools cannot simulate it; it has to be measured with field data from real users. This also means it’s affected by all sorts of things, such as people’s internet connections, the devices they use, their own response times etc. - far less controlled than lab data. That is why scores fluctuate between tools: anything based on user data is going to change a lot.
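A field measurement of FID can be sketched like this. The 100 ms “good” line is the one cited above; the 300 ms boundary for “poor” comes from Google’s published scale (an assumption here, not stated in this post). The observer only runs in a browser, since FID needs a real interaction:

```javascript
// Rating an input delay (ms): good up to 100 ms,
// needs improvement up to 300 ms, poor beyond that.
function rateFid(ms) {
  if (ms <= 100) return 'good';
  if (ms <= 300) return 'needs improvement';
  return 'poor';
}

// Browser-only sketch: listen for the first-input entry.
// The guard skips this block outside a browser.
if (typeof window !== 'undefined' && 'PerformanceObserver' in window) {
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      // Delay = when the browser could start handling the event,
      // minus when the user actually interacted.
      const delay = entry.processingStart - entry.startTime;
      console.log('FID:', delay.toFixed(1), 'ms →', rateFid(delay));
    }
  }).observe({ type: 'first-input', buffered: true });
}
```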
Improving FID is often the hardest part, though. Most sites can gain a lot by:
Lazy loading some scripts
Breaking up complex tasks
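The second of those can be sketched in plain JavaScript: one long loop blocks the main thread, so split the work into chunks and yield back to the event loop between them. The chunk size and the idea of a `processItem` callback are illustrative choices, not a prescribed pattern:

```javascript
// Break one long task into chunks so the main thread stays responsive.
// When the input fits in a single chunk, the work completes synchronously.
function processInChunks(items, processItem, chunkSize = 50) {
  return new Promise((resolve) => {
    let i = 0;
    function work() {
      const end = Math.min(i + chunkSize, items.length);
      for (; i < end; i++) {
        processItem(items[i]);
      }
      if (i < items.length) {
        // Yield: lets the browser handle pending input before the next chunk.
        setTimeout(work, 0);
      } else {
        resolve();
      }
    }
    work();
  });
}
```

In a browser you might call this with, say, a long list of products to render; between chunks the page can respond to clicks and scrolls instead of appearing frozen.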
Cumulative Layout Shift
Finally, CLS looks at the visual stability of pages. We’ve all been there before: you load a page, want to click on something but as the page is still loading it jumps and you end up clicking on the wrong thing - sometimes finding yourself in an infinite loop of impatience as it keeps happening.
Here, CLS looks at various frames of your page loading to see how the elements move (or don’t) across the page as it loads. It takes all the points at which layout shifts happen and calculates the extremity of those movements. Google considers anything below 0.1 good, while anything from 0.1 to 0.25 needs work. You can consider everything above 0.25 as poor.
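Those thresholds can be captured the same way. The browser-only observer below keeps a simple running sum of layout-shift scores, ignoring shifts caused by recent user input (which don’t count against you); note that Google’s current definition groups shifts into session windows, so a plain sum is a slight simplification:

```javascript
// Rating a cumulative layout shift score against Google's thresholds:
// good up to 0.1, needs improvement up to 0.25, poor beyond that.
function rateCls(score) {
  if (score <= 0.1) return 'good';
  if (score <= 0.25) return 'needs improvement';
  return 'poor';
}

// Browser-only sketch: sum layout-shift entries as they come in.
if (typeof window !== 'undefined' && 'PerformanceObserver' in window) {
  let cls = 0;
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      // Shifts right after a user interaction are expected and excluded.
      if (!entry.hadRecentInput) cls += entry.value;
    }
    console.log('CLS so far:', cls.toFixed(3), '→', rateCls(cls));
  }).observe({ type: 'layout-shift', buffered: true });
}
```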
Another major culprit: images. When building a page, developers should specify the width and height of an image in the code - otherwise they are leaving it up to the browser to figure out how the image should appear on the screen.
Never leave something to someone else to figure out when it is within your power to specify.
On a page with some images and text, the text will appear on the screen first, followed by the images. If no space has been reserved for the images within the code, the top part of the loading page will be filled with text, prompting the user to start reading. The images, however, load later and appear in the spot where the text first was.
We’ve all experienced this, the text gets pushed down, leaving us frustrated, and sometimes quite annoyed.
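The fix described above is a one-line change in the markup; the image source, dimensions and alt text here are placeholders:

```html
<!-- width and height let the browser reserve the right amount of space
     (and work out the aspect ratio) before the image has downloaded -->
<img src="/images/product-hero.jpg" alt="Product hero" width="1200" height="600">
```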
Tools to Measure Core Web Vitals
Rule number one: mobile and desktop scores may differ significantly, so make sure to test them on both, always. As for fixes, do make sure to prioritise mobile scores.
Google Search Console
Google Search Console used to have a “speed” report which has now been renamed Core Web Vitals (you’ll be able to find it in the Experience section). This breakdown will look at all the URLs on your site and categorise them as Good, Needs Improvement and Poor.
In order to pass the Core Web Vitals assessment, you need to score “good” on as many URLs as possible. For those that need improvement or score poorly, clicking into the report will show you exactly which metric is the issue and which URLs are affected.
I would not rely solely on Google Search Console’s reports - they’re pretty limited so I’d use some of the following tools to audit your pages.
PageSpeed Insights includes both field and lab data to measure core web vitals and gives you advice on what to improve and guidance on how to do so.
Lighthouse includes multiple audits that PageSpeed Insights doesn’t have, and it even has some SEO checks in there for good measure.
Pro tip: if you have a paid Screaming Frog license, you can use it in conjunction with the PageSpeed Insights API in order to see your scores for all pages at once. Use this awesome step by step guide to see how.
There are more tools that you can use, but I think that these are more than enough. Using more than one is key as each one will show you different measures and give you different scores and guidance on what to improve, so looking at your pages with more than one tool at a time will give you some comprehensive answers on what needs to change.
As previously stated, improvements to these scores can only really be made by a developer, but it’s important to be able to explain why they matter. Hence this blog post, which aims to break down what the update is about, what needs to be looked at, and the tools with which to do so.