Alexei Kutsko from WP Speedguru and Miroslav Misho Meduric from WP Usability have joined forces to write an article that attempts to answer an important question about trusting Google. Hint: yes, you can trust it.
I love Google. Google Search, Apps for Business, Analytics, Search Console, etc., etc. Great stuff. It could be a bit more user-friendly, but great anyway. Google seems to know everything about everything. If Google says or measures something, it must be true, right?
Google Almighty’s authority is rarely questioned. And that at times creates a lot of anxiety when Google tells us that we’re doing something wrong – like optimizing our sites for speed. So let’s take a close look at what really takes place when you use Google PageSpeed Insights (PSI).
The first thing to notice is that no actual page load is being timed. What is measured is server response time. Yours may or may not satisfy Google, which judges it against an arbitrary threshold. Server response will vary depending on what hosting you're using, and it will also vary between test runs; Google caches your test results for 30 seconds, after which you can retry and compare. The figure Google treats as a good server response time is low. As it should be, in an ideal world. But the world we live in is far from ideal, and when using PSI it helps to remember that its recommendations (and that is all they are) will not necessarily apply to YOUR site.
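For the curious, PSI results can also be pulled programmatically through the public PageSpeed Insights v5 API. Below is a minimal Python sketch that builds a request URL for that API; the endpoint is real, but the page URL and the optional API key are placeholders you would substitute yourself.

```python
from urllib.parse import urlencode

# Public PageSpeed Insights v5 endpoint (a plain GET API).
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile", api_key=None):
    """Build a PSI v5 request URL for the given page.

    strategy is "mobile" or "desktop"; the API key is optional
    for occasional, low-volume use.
    """
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return f"{PSI_ENDPOINT}?{urlencode(params)}"

# Fetching the JSON report is then an ordinary GET, e.g.:
#   import json, urllib.request
#   report = json.load(urllib.request.urlopen(
#       psi_request_url("https://example.com")))
```

The report you get back contains the same audits the web UI shows, so you can track your scores over time instead of eyeballing them.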
What PSI does is scan your site looking for ways to improve things. It often cannot tell what it is looking at: it tells you to leverage browser caching, and on closer inspection you see that Google's own fonts and Analytics scripts are causing the perceived "issue". It tells you to eliminate render-blocking JS and CSS and/or to minify, but when you attempt it... it breaks your site. No matter what you do, Google PSI never seems happy enough. Don't lose sleep over it. At the end of the day, all you need to keep Google happy is speed.
I'm not an SEO expert by any stretch, but here are my personal observations. It looks to me like Google is steadily shifting away from "old school" signals like domain age, backlinks, etc., and relying more and more on user experience: bounce rates, time on site, and the number of pages visited and shared. So if you make your visitors happy (read: have good content and a site that is fast and easy to use), your chances of ranking high are much better today than they were even a year ago.
So can we trust Google with speed recommendations? Absolutely. Can we always apply ALL of them to our sites? More often than not the answer is no.
Take care of the things you can fix and forget the rest. In the end Google will reward you with higher rankings for having a fast site, not a high score.
Google says that "Bounce Rate is the percentage of single-page sessions (i.e. sessions in which the person left your site from the entrance page without interacting with the page)."
This looks like a very useful metric, and among dozens of other metrics, it is important enough to be displayed in the default overview.
And there lies a problem.
Ok, somebody came to the site, landed on a single page and went away. But how long did they stay? Did they find what they came for? Maybe they noted the store address and happily went shopping an hour later, while we count them as a bounce. Maybe they read the whole 5,000-word post we wrote, and they are still counted as a bounce.
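To make the blind spot concrete, here is a minimal Python sketch (an illustration, not anything GA-specific) of how bounce rate is computed: only the number of pages per session matters, and the time spent never enters the formula.

```python
def bounce_rate(sessions):
    """Bounce rate = single-page sessions / all sessions, in percent.

    Each session here is just a list of pageview paths. Duration is
    ignored entirely, which is exactly the blind spot described above.
    """
    if not sessions:
        return 0.0
    bounces = sum(1 for pages in sessions if len(pages) == 1)
    return 100.0 * bounces / len(sessions)

sessions = [
    ["/5000-word-post"],       # read for twenty minutes, still a bounce
    ["/", "/contact"],         # two pages, not a bounce
    ["/store-address"],        # noted the address, went shopping: a bounce
    ["/", "/shop", "/cart"],
]
print(bounce_rate(sessions))   # 50.0
```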
Another problem is crawler and ghost spam. If not filtered out, it shows up with a 100% bounce rate, adding a lot of weight to the average.
Then there is "spam" from unrelated countries. If we are running a single-language Swedish store, someone landing from Hungary will most probably bounce quickly.
So bounce rate is useless?
Well, not exactly. Consider a scenario where somebody lands in the middle of the sales funnel. You want them to move to the beginning or the end of it; a bounce is probably an indicator that the page could be better optimized. That page also needs a low Exit rate (the share of visitors who leave the site from that page after arriving at it from another page). And if the page has a clear call to action, a bounce (and an exit) means that the CTA was not followed.
Google Analytics data is not a ranking factor, and bounce rate is also generally considered not to be a (direct) ranking factor. But there is one thing worse than a bounce, known as "pogo sticking": a visitor comes to the page, immediately leaves for a competitor's related page, and stays there. That means our site failed to deliver while the other one succeeded. If it happens consistently, it is almost certainly a ranking factor.
How to measure bounce rate more precisely
First of all, get rid of crawler and ghost spam; it often severely skews the bounce rate percentage. (Note that filtering will not affect historical data. If we want to see our past traffic without spam, we need to apply the same rules in a segment.)
Once we have gotten rid of the spam, we need to decide which countries we will measure, then create a segment or view that filters out all the irrelevant ones.
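The filtering logic of such a segment can be sketched in a few lines of Python. The hostnames and countries below are hypothetical, standing in for the Swedish store example above; the key idea is that ghost spam never touches your real site, so its hits carry a hostname that is not yours.

```python
# Hypothetical values for the Swedish-store example.
VALID_HOSTNAMES = {"example.se", "www.example.se"}
TARGET_COUNTRIES = {"Sweden", "Norway", "Denmark"}

def clean_sessions(sessions):
    """Drop ghost-spam hits (wrong hostname) and out-of-market countries.

    Mirrors the kind of include/exclude rules a GA segment or view
    filter applies; crawler spam is typically handled the same way,
    with an exclude list of known spam sources.
    """
    return [s for s in sessions
            if s["hostname"] in VALID_HOSTNAMES
            and s["country"] in TARGET_COUNTRIES]

sessions = [
    {"hostname": "www.example.se",      "country": "Sweden",  "pages": 3},
    {"hostname": "free-seo-hits.xyz",   "country": "Russia",  "pages": 1},  # ghost spam
    {"hostname": "www.example.se",      "country": "Hungary", "pages": 1},  # out of market
]
print(len(clean_sessions(sessions)))  # 1
```

Only the genuine Swedish session survives, so the bounce rate computed afterwards reflects real prospects rather than noise.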
Now we need to identify the important click-through pages on our site, prioritize them, and see how they perform. There is no magic formula for what a good bounce rate is: a better bounce rate is simply one lower than what we currently have. 🙂
How to lower the bounce rate
There are countless methods for optimizing the bounce rate, depending on the context and purpose of the page. Almost any change will move the page's average bounce rate, for better or worse.
If there is no clear CTA, placing one will almost certainly reduce the bounce rate. We should also never forget that almost any page is a landing page. People don't enter the site only through the front door we call the homepage; they arrive through organic and paid search, and through links, which means they can land anywhere on the site. So every page on the website needs to clearly direct the visitor into the desired funnel.
Some bounces happen simply because a certain percentage of visitors never scroll below the fold; if something important sits down there, some of them will never see it. This may seem like a small thing, but consider a shop making $10 million per year that loses 2% of its conversions this way: that is roughly $200,000 a year. Anyone who has seen the scroll maps of big web shops remembers the face-palm moment of realizing how much money was being lost.
Blog posts are also good optimization material. Some articles will have a higher bounce rate than others. Things that can be done here include interlinking related articles with descriptive anchor text, a clear CTA, and related posts at the end. We just need to make sure those posts are genuinely related, for both UX and SEO purposes.
I remember once, on a very popular and authoritative blog, deciding to go after a very questionable keyword that promised high traffic. It was somewhat related to the topic the business was about, but the people it targeted were so far from the persona the business was aiming for that they were essentially useless. The keyword ended up #1 on Google, and enormous traffic started to pour in, with a 91% bounce rate!
What I did to solve this problem was create even more content for this new persona, and then direct them from the main article to the new content via related posts. The bounce rate dropped to 38%! The site gained a lot from this traffic, allowing other strategically important keywords to rank even better. The goal for this persona was never to earn money directly, but traffic; in the end that brought more money anyway, through the other personas.
Session duration metric in Google Analytics
This metric is somewhat related, so it is a good thing to measure. We have all had that moment of going through our Analytics data and suddenly seeing, under Behaviour/Engagement, something like half of our site visitors with a session duration of up to 10 seconds! What! 10 seconds! All is lost...
All is not lost.
What is actually happening is that Google simply cannot measure how long someone stayed on the last page of their visit. If that last page was also the first one (they stayed on one page and bounced off), the session is automatically filed as 0-10 seconds. So they could have read that 5,000-word blog post and gone away, yet they will be filed not only as a bounce but also as a 0-10 second stay!
On the other hand, the "Average Time on Page" metric (found under Behaviour/Site content/All pages) can be somewhat more accurate, but if the page we are checking also has a high Exit rate, this metric gets distorted too, because more "last pages" are mixed into the calculation, and those pages contribute no measured time.
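Both quirks follow from how timing is derived from pageview timestamps: each page's time is the gap to the next hit, and the last page has no next hit. Here is a small Python sketch of the mechanism (an illustration, not GA's actual code):

```python
def session_metrics(hit_times):
    """Derive GA-style timing from pageview timestamps (in seconds).

    Time on a page = the next hit's timestamp minus this page's.
    The last page has no next hit, so it contributes nothing; a
    one-page session therefore measures 0, whatever the visitor did.
    """
    times_on_page = [b - a for a, b in zip(hit_times, hit_times[1:])]
    session_duration = sum(times_on_page)
    return session_duration, times_on_page

# Someone reads a single long post for twenty minutes, then leaves:
print(session_metrics([0]))           # (0, []) -> filed under 0-10 seconds
# Three pages, leaving from the last one after an unknown time:
print(session_metrics([0, 60, 200]))  # (200, [60, 140]) -- last page uncounted
```

This is why a page with many exits looks "faster" than it really is: its longest, final views simply never enter the math.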
In the end, we cannot go wrong if we focus on engaging our visitors and solving their unique problems. All the metrics will be just fine then.