How Improving PageSpeed Can Increase Organic Traffic

Tuesday, May 29th, 2018

How Brolik used Google’s PageSpeed Insights and Lighthouse scores to highlight areas of improvement in JavaScript, HTML, CSS, images, and resource delivery, resulting in a 60% increase in organic traffic.

Recently, Google has been making a heavy push toward website page speed and mobile trends when it comes to page rankings. From the introduction of mobile-first indexing to the development and inclusion of Lighthouse in Google Chrome’s Audit tools, it’s clear that paying more attention to mobile devices is important for your website’s SEO.

With top-of-the-line 4G LTE speeds currently around 36 Mbps [1], and the national ISP average in the United States at 18.7 Mbps [2], having an optimized website is important not only for mobile, but for desktop as well. That said, it’s been my recent experience that Google’s push toward “mobile first” puts a much larger emphasis on page load time than before.

In early 2015, Brolik began building a new website for one of our clients. We had been doing marketing work for them for around a year prior to launching the new site, and we were seeing around 180 unique visitors to their site per day. Following the launch of the new site in 2016, we saw a small drop in organic search traffic after missing a few 301 redirects, but we were quickly able to recover.

By using a combination of content creation and optimization throughout 2017, we were able to raise the number of unique visitors to the site from organic search traffic to around 700 per day. In early 2018 we began looking at the website’s optimization to further increase traffic. We ran the site through Google’s PageSpeed Insights and got some pretty low scores.

Google PageSpeed Insights

PageSpeed Insights focuses on a few key areas to improve load time:

  1. Render-blocking JavaScript and CSS
  2. Enabling GZIP compression on the server
  3. Browser caching
  4. Minifying CSS, HTML, and JavaScript
  5. Optimizing images

You can also get a far more in-depth report by using Google Chrome’s Audit tool.
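GZIP compression and browser caching, for example, can both be handled at the server level. A hypothetical Apache `.htaccess` sketch might look like the following, assuming the `mod_deflate` and `mod_expires` modules are enabled (nginx has equivalent directives; the cache lifetimes here are just illustrative):

```apache
# GZIP-compress text-based responses before sending them
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json image/svg+xml
</IfModule>

# Tell browsers how long they may cache static assets
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png "access plus 1 year"
</IfModule>
```

If you version your assets with a query string (as in the `?v=0.0.1` examples below), long cache lifetimes are safe because a new version changes the URL.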

Below, I’m going to quickly go over what some of these things mean, and how you can approach improving them to increase your PageSpeed Insights score.

Render-Blocking JavaScript

This can be as simple as moving all of your JavaScript includes to the footer and adding the `defer` attribute.

There are other things you can do to improve these includes, including:

  1. Serving them from a CDN
  2. Adding a version query string to the URL to control caching
  3. Combining all of your files into a single JavaScript include

All of these allow the content of the page to load and render without having to wait for the larger JavaScript files to finish.

<script defer src="//ajax.googleapis.com/ajax/libs/angularjs/1.6.9/angular.min.js"></script>
<script defer src="/app/bundle.js?v=0.0.1"></script>
<script defer src="/app/home.js?v=0.0.1"></script>

Render Blocking CSS

Google suggests inlining a small amount of CSS at the top of the page and asynchronously loading the bulk of the styles after the rest of the page has loaded. This can be a double-edged sword: on one hand you improve loading time, but you can get content jumps, where an unstyled site appears and then snaps into place as the CSS finishes loading. Personally, I’m OK with this, but many people aren’t.

There are a few ways you can handle this. First of all, you should know that the initial TCP congestion window is typically around 14.6kB when compressed, and anything beyond this requires additional round trips between your server and the browser. Roughly, this means you should try to keep your above-the-fold content and inline CSS under about 14.6kB. Doing this helps minimize any style jump and allows content to be delivered to the user as quickly as possible.
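As a sketch of the pattern (file names here are hypothetical), the critical styles go inline in the `<head>`, and the full stylesheet is swapped in after it finishes downloading; the `preload`/`onload` swap needs a `<noscript>` fallback for browsers without JavaScript:

```html
<head>
  <!-- Small critical CSS inlined so above-the-fold content renders immediately -->
  <style>
    /* critical, above-the-fold rules only (~14.6kB compressed budget) */
    body { margin: 0; font-family: sans-serif; }
  </style>

  <!-- Load the full stylesheet without blocking render -->
  <link rel="preload" href="/css/main.css" as="style" onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```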

Minifying HTML, CSS, and JavaScript

If you use a CSS preprocessor, minifying your styles should be as simple as setting the output to `compressed` or an equivalent setting.
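To illustrate what a compressed setting actually does, here’s a toy JavaScript sketch that strips comments and collapses whitespace. Treat this as illustration only: real preprocessors handle many more edge cases (strings, nesting, trailing semicolons) than these three regexes.

```javascript
// Toy illustration of CSS minification: strip comments,
// collapse whitespace, and drop spaces around punctuation.
function minifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')   // remove /* ... */ comments
    .replace(/\s+/g, ' ')               // collapse runs of whitespace
    .replace(/\s*([{}:;,])\s*/g, '$1')  // drop spaces around punctuation
    .trim();
}

const input = `
/* header styles */
.header {
  color: #333;
  margin: 0 auto;
}
`;
console.log(minifyCss(input)); // → .header{color:#333;margin:0 auto;}
```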

JavaScript is a bit different. You’ll want to uglify your source code, which is the process of removing excess whitespace and shortening identifiers. You can use an online tool, or you can set up a build system with Gulp, Grunt, or Webpack to minify your JavaScript.
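Since we use Gulp, a minification task might look something like this sketch, assuming the `gulp-uglify` and `gulp-concat` plugins are installed (the paths and task name are hypothetical):

```javascript
var gulp = require('gulp');
var concat = require('gulp-concat');
var uglify = require('gulp-uglify');

gulp.task('scripts:min', function() {
  return gulp.src('src/js/**/*.js')
    .pipe(concat('bundle.js'))    // combine into a single include
    .pipe(uglify())               // strip whitespace, shorten names
    .pipe(gulp.dest('dist/js'));
});
```

Combining and minifying in one task also covers item 3 from the render-blocking list above: a single, small JavaScript include.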

Similarly, HTML can be compressed with a project build tool or through PHP directly with `ob_start()`. This can potentially cause an issue if you have any inline JavaScript, so be sure to watch out for that.

Placing the following snippet at the top of your PHP template will minify the HTML your project outputs. It’s crude, and it can cause issues with inline CSS or JavaScript, but it works well if you’re deferring your JavaScript includes correctly.

<?php // Min HTML
function sanitize_output($buffer) {
    $search = array(
         '/\>[^\S ]+/s',  // strip whitespaces after tags, except space
         '/[^\S ]+\</s',  // strip whitespaces before tags, except space
         '/(\s)+/s',      // shorten multiple whitespace sequences
         '/<!--(.|\s)*?-->/' // Remove HTML comments
    );
    $replace = array('>', '<', '\\1', '');
    $buffer = preg_replace($search, $replace, $buffer);
    return $buffer;
}
ob_start("sanitize_output");
?>

Optimizing Images

This is only one step of what you should be doing with images, but you can use a tool like ImageOptim or Photoshop to size down your images before uploading them to your site. In a perfect world, you should be doing this in addition to “lazy loading” them.
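One common lazy-loading approach (a sketch, with hypothetical file names) is to keep the real source in a `data-src` attribute and swap it in with an `IntersectionObserver` as the image nears the viewport; older browsers would need a polyfill or a scroll-event fallback:

```html
<!-- Placeholder markup: the real source lives in data-src -->
<img class="lazy" data-src="/images/hero.jpg" alt="Hero" width="1200" height="600">

<script>
  // Swap data-src into src once the image enters the viewport
  var lazyImages = document.querySelectorAll('img.lazy');
  var observer = new IntersectionObserver(function(entries) {
    entries.forEach(function(entry) {
      if (entry.isIntersecting) {
        var img = entry.target;
        img.src = img.dataset.src;
        observer.unobserve(img); // done with this image
      }
    });
  });
  lazyImages.forEach(function(img) { observer.observe(img); });
</script>
```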

If you’re using a build system, you can work this into the build process. Keep in mind that this won’t work if the images are hosted on a different server, like an S3 bucket. Because we use Gulp, we do this with the imagemin npm package.

gulp.task('images:min', function() {
  return gulp.src(settings.img.input)
    .pipe(cache('minification'))
    .pipe(imagemin([
      imagemin.gifsicle({ interlaced: true }),
      imagemin.jpegtran({ progressive: true }),
      imagemin.optipng({ optimizationLevel: 6 })
    ]))
    .pipe(gulp.dest(settings.img.output));
});

The Results

By reducing our image sizes, deferring the loading of much of our JavaScript, and minifying the CSS and HTML further, we were able to improve our scores from 45/34 to 70/79. Around a week after doing so, we saw a jump in visits from organic search of roughly 60%, to around 1,200 daily users with no other marketing changes happening in that same week.

It’s difficult to attribute the change in organic search traffic entirely to the improvement in page load time, but the increase was dramatic — certainly enough to warrant looking into these improvements for our other clients’ sites as well.

References
