
How to Handle Google Search Console Warnings for Blocking JavaScript and CSS

On July 28th, 2015, webmasters began reporting new notifications sent by Google through Google Search Console regarding websites that were blocking key CSS and JavaScript resources. While these notifications do not indicate a penalty, they are a clear warning that preventing Google from seeing all of your site could result in a drop in keyword rankings.

Here’s what the warning looked like in Google Search Console:

[Screenshot: the JavaScript and CSS warning message in Google Search Console]

How did we get here?

For many in the SEO industry, this new notification hardly comes as a surprise: Google has repeatedly voiced its displeasure with webmasters blocking CSS and JavaScript, going as far back as 2012.

Contrary to popular belief, Google is no longer “as blind as a bat” when it comes to rendering our sites; it is capable of fetching and rendering pages in their complete form.

It is also unsurprising that these notifications were released so soon after the latest Panda update, the algorithm update designed to address sites with low-quality content.

John Mueller recently recommended that JavaScript and CSS resources not be blocked so that Google can more easily recognize a site’s content and “give it the credit that it deserves”. This sentiment also ties into the recent mobile algorithm update, in which Google sought to reward sites with better mobile user experiences. Blocking CSS files – which dictate responsive design elements on a site – from being crawled would prevent a site from earning that positive mobile experience credit.

This all directly follows from Google’s update of their webmaster guidelines back in 2014 where they had specifically warned that “Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings”.

Who is mainly being affected?

It appears that many WordPress sites are being affected by these notifications, largely due to the common practice of blocking the /wp-includes/ folder in robots.txt. Another popular WordPress robots.txt practice has been to block the /wp-content/ folder. Both create problems for Google’s spiders because these folders contain the plugins, themes, and images that are crucially involved in rendering a site. Plugins, in particular, rely heavily on JavaScript and CSS, so blocking the /wp-content/ folder is directly at odds with Google’s desire to access all JavaScript and CSS files.
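
To make this concrete, here is a minimal sketch of the kind of WordPress robots.txt that triggers these notifications (the exact directives vary from site to site):

User-agent: *
Disallow: /wp-includes/
Disallow: /wp-content/

Because plugin scripts, theme stylesheets, and images live under those folders, both Disallow lines keep Googlebot from fetching resources it needs to render the page.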

So now that you have all the requisite background knowledge, let’s get right down to it and answer the question that brought you here.

What to do when you have received the JavaScript and CSS notification in Google Search Console?

1. Perform Fetch and Render tests in Google Search Console to view any discrepancies between the way Googlebot is rendering the site and the way a browser is rendering the site. Fetch and Render can be found under the Crawl section in the Search Console side menu.

[Screenshot: the Fetch and Render tool under the Crawl menu in Search Console]
2. Next, view the results of the Fetch and Render test, where you can find the list of resources that Googlebot couldn’t access. Test each individual blocked resource by clicking through to the robots.txt Tester.

[Screenshot: Fetch and Render results listing blocked resources]
3. The robots.txt Tester will show which line of the robots.txt file is blocking any given resource. In the example below, we can see that the resource identified in our Fetch and Render test (wp-includes/js/wp-emoji-release.min.js?ver=4.2.3) is being blocked by the line Disallow: /wp-includes/.
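
The match works by path prefix: the blocked file’s URL path begins with the disallowed folder, so the rule applies. A simplified sketch of the example above:

User-agent: *
# /wp-includes/js/wp-emoji-release.min.js begins with /wp-includes/, so it is blocked
Disallow: /wp-includes/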

At this point we can either remove that line from the robots.txt (taking care that doing so does not expose URLs we don’t want Google to crawl) or allow individual resources with the “Allow:” directive. Google’s Gary Illyes also posted on Stack Overflow a simple way of unblocking JavaScript and CSS files so that Googlebot can crawl all of those resources:

# Gary Illyes’s suggested rules letting Googlebot crawl JavaScript and CSS files
User-Agent: Googlebot
Allow: .js
Allow: .css
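
If you would rather keep the folder disallowed and open up only what Googlebot needs, a more targeted variant is possible. This sketch assumes the blocked scripts sit under /wp-includes/js/ and relies on Google honoring the longest, most specific matching rule:

User-Agent: Googlebot
# The more specific Allow rule overrides the broader Disallow for files under /wp-includes/js/
Allow: /wp-includes/js/
Disallow: /wp-includes/

Either approach works; the blanket Allow rules are simpler, while the targeted variant keeps the rest of the folder out of Google’s crawl.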

You can test the effectiveness of these changes before making adjustments to your live robots.txt file by re-running the robots.txt Tester to ensure that the resources are unblocked correctly.

[Screenshot: robots.txt Tester showing the resources unblocked]

4. Finally, you’ll want to re-run the Fetch and Render tool to ensure that Googlebot is now rendering your site correctly.

[Screenshot: completed Fetch and Render test]

Though past warnings may have gone unheeded, it is clear that Google is now strongly encouraging webmasters to unblock JavaScript and CSS files and give its crawlers complete access to a site’s resources. In light of the recent Panda and Mobile Usability algorithm updates, it is increasingly important for webmasters to ensure their sites can be crawled in full so that their content is given the credit it’s due.

Have any questions about this process? If so, I’ll be around to help troubleshoot any issues you may be having.


Nima Alagha
Nima is an SEO specialist at Forthea and joins us all the way from merry old England. He enjoys putting his business & politics degree to good use to understand consumer behavior and will happily discuss the latest trends in economic data with you over a good cup of tea. You can follow his thoughts on Twitter @nemofindsworld.
