When you’re practicing SEO, it’s incredibly important not only to know what opportunities and challenges to look for, but also how to find them quickly and easily. So we’re going to take a moment to dive into the Google Chrome developer tools to help us find whether a page and its resources are compressed, as well as to check an advanced tactic for providing search engine indexing directives. I’m going to walk us through a bit of what both of them mean, where to look for them, and how to set up a way to surface those details without having to dig into every resource.
Finding Content Encoding
To start out, what is content-encoding and why are we looking for it?
Without spiraling out of the realm of my expertise, in technical SEO, content-encoding describes the type of compression applied to a file. Or, more appropriately, whether a page is compressed at all (we don’t need to split hairs too often about the type). By default, most modern browsers accept three types of content-encoding: deflate, gzip, and brotli (often shortened to br). It is most common to see gzip or br out on the web and less common to see deflate, largely because gzip itself uses the deflate algorithm under the hood.
Let’s pull ourselves back out of the weeds. The reason we are looking for content-encoding is that it is good practice to compress the pages, CSS, and scripts on your site to reduce load time. This becomes far more important for users with slower internet connections. We want to take an Ikea approach: pack the disassembled bookshelf in as small a box as we can so you can load it in your car, take it home, unpack it, and put it together, as opposed to hauling a fully assembled bookshelf, which might require you to rent a larger vehicle and get a friend to help you move it into your third-floor apartment. Rough analogy, but you get the point.
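If you want to see the savings for yourself, here’s a minimal Python sketch using only the standard library. It compresses a blob of repetitive made-up HTML with deflate and gzip (brotli isn’t in the standard library, so it’s left out, but it generally squeezes a little tighter than gzip):

```python
import gzip
import zlib

# Some repetitive placeholder HTML to stand in for a real page.
html = ("<div class='product-card'><h2>Bookshelf</h2>"
        "<p>Flat-packed, some assembly required.</p></div>" * 500).encode("utf-8")

deflated = zlib.compress(html)   # deflate stream with a small zlib wrapper
gzipped = gzip.compress(html)    # gzip = deflate plus a gzip header and checksum

print(f"original: {len(html):>7,} bytes")
print(f"deflate:  {len(deflated):>7,} bytes")
print(f"gzip:     {len(gzipped):>7,} bytes")
```

On repetitive markup like this, both formats shrink the payload to a small fraction of the original, which is exactly the smaller box we’re after.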
Let’s dive in. You can follow along on this page. There are a ton of ways to get to the Chrome Dev Tools. You can go up to the Chrome menu > More tools > Developer tools. You can press Ctrl + Shift + I. You can hit F12. Or you can right-click on the page and go to Inspect.
With dev tools open, go to the tab called Network. It will appear blank and tell you to refresh the page, so go ahead and do that. (I’m going to be using my robots.txt file in the example screenshots to reduce the clutter.)
At the very top (you might have to scroll back up), you’ll see the page name. Select that and it will open the details. This will be broken down into sections: General, Response Headers, and Request Headers. We’re going to be looking at the response headers, which represent what the browser gets back from the website. This is the section where we’ll find content-encoding. If you’re following along on my site, you’ll see content-encoding: br.
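If you’d rather do this check from a script than from DevTools, here’s a small sketch using the third-party requests library (the URL is just a placeholder). In my experience, requests advertises gzip and deflate by default (and br only if a brotli package is installed) and transparently decompresses the body, but the Content-Encoding response header is still there to inspect:

```python
import requests

url = "https://example.com/"  # swap in the page you're auditing

# requests decompresses the body for you, but the original
# Content-Encoding header remains visible in response.headers.
response = requests.get(url)

print("Status code:     ", response.status_code)
print("Content-Encoding:", response.headers.get("Content-Encoding", "(not set)"))
```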
If you don’t see this, double check the status code in the General section. I use caching, so if you see Status Code: 304, you need to force a cache refresh of the page. The best way to do that is Ctrl + F5, Ctrl + Shift + R, or Cmd + Shift + R (on Mac).
This is a tedious process to repeat every time you want to see the content-encoding, especially if you’re checking multiple pages, subdomains, or sites (usually every page on a subdomain will follow the same content-encoding pattern). The good news is that we can add the content-encoding header as a column in the non-detailed view of the Network tab. X out of the details, right-click on the column headers, drop down to Response Headers, and select Content-Encoding.
Now, everywhere you go on the web, your Network waterfall in Dev Tools will show content-encoding at a quick glance!
Adding X-Robots-Tag Column
Same questions as before: what is the X-Robots-Tag, and why are we looking for it?
This is slightly more advanced technical SEO as well. You might be familiar with the meta robots tag for setting indexing directives. The X-Robots-Tag is no different at its core, except it is a response header instead of a meta tag in the <head> section of an HTML page. It’s particularly useful if you don’t want search engines indexing non-HTML resources like PDFs or images.
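To make the “header, not a tag” distinction concrete, here’s a hypothetical sketch of how a server could attach the directive to a PDF. I’m using Flask purely as an example framework, and the route and filename are made up; in practice you’d more likely set this in your web server or CDN configuration:

```python
from flask import Flask, send_file

app = Flask(__name__)

@app.route("/reports/annual-report.pdf")  # hypothetical route
def annual_report():
    # A PDF has no <head> for a meta robots tag, so the noindex
    # directive travels as a response header instead.
    response = send_file("annual-report.pdf")
    response.headers["X-Robots-Tag"] = "noindex"
    return response
```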
The default X-Robots-Tag directives are index and follow, so you might not come across this header very often. Since those are the default behavior, explicitly adding an X-Robots-Tag for them is unnecessary.
I’ll save us a screenshot: you can find the X-Robots-Tag in the same details panel where we found the content-encoding. In fact, it’s in the same screenshot above, where my robots.txt indexing directive is noindex, follow (frankly, again, the follow is unnecessary in this case since it is the default directive).
Adding it to the non-detailed view is not quite as straightforward. We will go through similar motions, though. Right-click the column headers and drop down to Response Headers; you’ll notice X-Robots-Tag is not there. Select Manage Header Columns. This will pop open a box with the header columns. Again, X-Robots-Tag eludes us. Select Add custom header…, which will open a small input field. Type in there, verbatim, X-Robots-Tag, and select Add.
And there we have it! Now, anywhere you go on the web, you’ll be able to quickly find whether pages are compressed and whether they allow indexing. Keep in mind, this column only checks the X-Robots-Tag header and will not check the meta robots tag.
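And if you ever have more pages to spot-check than you care to click through, here’s a rough sketch of the same audit in Python with the requests library. The URL list is just a placeholder, and the same caveat applies: it reads the X-Robots-Tag header only, not any meta robots tag in the HTML.

```python
import requests

# Placeholder list - drop in whatever pages, PDFs, or images you want to audit.
urls = [
    "https://example.com/",
    "https://example.com/style.css",
    "https://example.com/whitepaper.pdf",
]

for url in urls:
    response = requests.get(url)
    encoding = response.headers.get("Content-Encoding", "(none)")
    robots = response.headers.get("X-Robots-Tag", "(not set)")
    print(f"{url}\n  status: {response.status_code}  "
          f"content-encoding: {encoding}  x-robots-tag: {robots}")
```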
Eric is a Python SEO with a passion for data. He uses Python to automate tasks and analyze large data sets. He is often the go-to for spreadsheet challenges, too. He sees every challenge as an opportunity to learn.