The “Googlebot cannot access CSS and JS files” Error


If you recognize the title of this article because of an email you received in the last week, then you may have recently been filled with dread. Website owners from around the world were left quaking in their boots upon opening this daunting-looking email from Google. The words “detected an issue” and “suboptimal rankings” were enough to leave many of us scrambling for a solution. Luckily, it isn’t quite as complex as Google has made it sound.

The Google Warning

Let’s kick off by taking a look at that email. The one that has got everyone talking. At the end of July, Google began sending out a warning to website owners around the world. The title simply states that Google cannot access the CSS and JS files on your website, which was enough to leave the not-so-techie among us scratching our heads. You may also have found the same warning (referred to as [WNC-658001]) in your Google Search Console. The email goes on to explain that Google’s systems have recently detected an issue (that daunting phrase again) with your homepage. Essentially, Googlebot is unable to access your CSS or JavaScript files because of certain restrictions in your robots.txt file. They then come in with the killer blow: “Blocking access to these assets can result in suboptimal rankings.”

Cue the mass panic of webmasters everywhere, who have been working night and day to build up their search engine rankings. This isn’t exactly news to many experts in the industry, however. Google said a number of times, particularly around the mobile-friendly changes, that these assets should be unblocked, and it added the same advice to its technical guidelines last year, warning that blocking your CSS and JS files could hurt rankings. So it’s not really something that has only just come about. However, it is something that now needs to be dealt with.

Locating Blocked Resources with Fetch As Google

The first thing you’re going to want to do is identify any blocked resources. In the warning message you’ll have been given the option to do this via the ‘Fetch as Google’ tool. What they don’t tell you is that you’ll have to check each and every page individually. I don’t know about you, but this sounds like it could take some time. What about the websites that publish dozens of articles a week? What about those with thousands of landing pages? Luckily, there is an easier way to check which resources have been blocked.

Head over to the Google Search Console and click on ‘Blocked Resources’. You’ll find this under the ‘Google Index’ tab. Here you’ll not only be able to see which resources are currently blocked, but you can also monitor how things have changed over time. This means you’ll be able to work out whether this is a recent issue or something that has been ongoing. If you click on each URL in the list, you will be able to see how many other pages have been affected by these blocked resources. You may find that the most common culprits are certain WordPress plugins and setups. Now that we’ve located the issue, we need to fix the problem.
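If you’d like a quick sanity check outside of Search Console, a short script can fetch your robots.txt and flag any ‘Disallow’ rules that mention script or stylesheet extensions. Treat the sketch below as a rough first pass rather than a replacement for the Blocked Resources report, and note that the domain is a placeholder you would swap for your own:

# Rough sketch: fetch robots.txt and flag Disallow rules that touch CSS/JS assets.
# The domain is hypothetical; replace it with your own site.
from urllib.request import urlopen

ROBOTS_URL = "https://www.example.com/robots.txt"

with urlopen(ROBOTS_URL) as response:
    lines = response.read().decode("utf-8", errors="replace").splitlines()

asset_extensions = (".js", ".css", ".inc", ".php")
suspects = [
    line.strip()
    for line in lines
    if line.strip().lower().startswith("disallow")
    and any(ext in line.lower() for ext in asset_extensions)
]

if suspects:
    print("Rules that may be blocking page assets:")
    for rule in suspects:
        print("  " + rule)
else:
    print("No obvious CSS/JS blocking rules found.")

Anything this prints is a candidate for the clean-up steps below, but the Search Console report remains the final word on what Googlebot actually sees.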

Fixing the Google Error Message

Strangely enough, the issue is much easier to solve than you may think. Although everything in the email and in the Google Search Console may have seemed confusing, the solution is quite simple. First of all, you’re going to need to locate your website’s robots.txt file. It sits in the root directory of your site, so you can usually view it by visiting yourdomain.com/robots.txt, and how easy it is to edit depends on what platform you’re using. WordPress users can also install a plugin that will allow them to edit this file directly. Search through the root files of your website and you should come across it.

Found it? Excellent! Now, let’s get rid of the code that’s causing all of the problems.

Removing the Blocking Code from Robots.txt

Only follow this next step if you’re comfortable editing your own robots.txt file. It is a good idea to back up your website before making any changes. If you feel uneasy doing it yourself, it may be worth asking a web developer to do it for you. Once you’re in, search for code that looks like any of these:
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.php$

To make things easier, you should be able to use CTRL + F to locate any of the above. These little lines are what’s causing such big problems, and if you’re curious why such short rules have such a wide reach, there’s a small sketch below. The solution is simply to remove them.
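The snippet below mimics the wildcard matching Google describes for robots.txt, where ‘*’ matches any run of characters and a trailing ‘$’ anchors the pattern to the end of the URL path. It is purely illustrative: the file paths are made-up WordPress examples and Google’s real parser has extra rules (for instance, the longest matching rule wins when ‘Allow’ and ‘Disallow’ conflict):

import re

def matches(pattern, path):
    # Approximate Google-style robots.txt matching:
    # '*' is a wildcard, a trailing '$' anchors the end of the URL path.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "^" + re.escape(body).replace(r"\*", ".*") + ("$" if anchored else "")
    return re.match(regex, path) is not None

disallow_rules = ["/*.js$", "/*.css$", "/*.inc$", "/*.php$"]

# Hypothetical asset paths that a typical WordPress theme loads on every page.
for path in ("/wp-includes/js/jquery/jquery.js",
             "/wp-content/themes/example/style.css"):
    blocked = any(matches(rule, path) for rule in disallow_rules)
    print(path, "->", "blocked for Googlebot" if blocked else "allowed")

Run it and both paths come back as blocked, which is exactly why a handful of characters in robots.txt can stop Googlebot from rendering your pages properly.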

Adding the Correct Rules in Robots.txt

Not everyone affected has these lines in their robots.txt file. In fact, some people have barely any code in there at all. If this is the case, then you need to add the following:

#Googlebot
User-agent: Googlebot
Allow: *.css
Allow: *.js

This piece of code will come in handy if you’ve removed the usual suspects above but Google is still seeing blocked resources. Once you have banished any ‘disallow’ lines and included your ‘allow’ lines, you can then update your robots.txt file in the Google Search Console. The update can take a little while to have any impact, so give it a short while before you check again.
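Before heading back to the reports, you can also run a quick local sanity check with Python’s built-in robots.txt parser. One caveat: the standard library does not evaluate Google’s wildcard extensions, so this only confirms that nothing in the Googlebot section is still disallowing your assets outright; the robots.txt tester in Search Console remains the authoritative check. The asset URLs below are placeholders:

from urllib import robotparser

# The edited file, with the Disallow lines gone and the Allow rules added.
ROBOTS_TXT = """\
#Googlebot
User-agent: Googlebot
Allow: *.css
Allow: *.js
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Hypothetical asset URLs; swap in real ones from your own pages.
for url in ("https://www.example.com/wp-content/themes/example/style.css",
            "https://www.example.com/wp-includes/js/jquery/jquery.js"):
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", verdict)

Both URLs should print as allowed; if one still shows as blocked, there is a leftover ‘Disallow’ line hiding somewhere in the file.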

Finally, you’re going to want to go back to step one: checking for blocked resources. You’ll now be able to see whether your changes have resolved the issue. In most cases, you should now be fine. If you do still find that you’re having problems, it may be worth talking to a web developer who has experience with your particular platform. All being well, those threats of ‘suboptimal rankings’ will suddenly feel a million miles away. Well, until the next Google warning email anyway.

About the Author

Michael Keating


Mike is a prolific digital marketing strategist, entrepreneur and SEO specialist who understands how to drive results using integrated digital strategies. He is one of the founders of Octatools and is excited about the opportunity to help DIY SEOs and business owners get results online.
