Here’s what the warning looked like in Google Search Console:
How did we get here?
Contrary to popular belief, Google is no longer “as blind as a bat” when it comes to rendering our sites; it is now capable of fetching and rendering pages in their complete form.
It is also unsurprising that these notifications were released so soon after the latest Panda update, the algorithm update designed to address low-quality content sites.
Who is mainly being affected?
So now that you have all the requisite background knowledge, let’s get right down to it and answer the question you’re all here to ask.
1. Perform Fetch and Render tests in Google Search Console to view any discrepancies between the way Googlebot is rendering the site and the way a browser is rendering the site. Fetch and Render can be found under the Crawl section in the Search Console side menu.
2. Next, view the results of the Fetch and Render test, where you can find the list of resources that Googlebot couldn’t access. Test each individual resource being blocked by clicking through to the robots.txt Tester (if there are many, a bulk-checking sketch appears after these steps).
3. The robots.txt Tester will show which line of the robots.txt file is blocking any given resource. In the example below we can see that the resource identified in our Fetch and Render test (wp-includes/js/wp-emoji-release.min.js?ver=4.2.3) is being blocked by the line Disallow: /wp-includes/.
You can test the effectiveness of any proposed changes before applying them to your live robots.txt file by re-running the robots.txt Tester and confirming that the resources are no longer blocked. (A sketch of one possible adjustment for this example appears after these steps.)
4. Finally, you’ll want to re-run the Fetch and Render tool to ensure that Googlebot is now rendering your site correctly.
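As a reference for step 3, here is a minimal sketch of the kind of robots.txt adjustment that resolves the WordPress example above. It assumes a site that blocks /wp-includes/ wholesale; the Allow lines rely on Google resolving conflicts in favour of the most specific (longest) matching rule, so the robots.txt Tester in Search Console remains the authority on how Googlebot actually interprets your file.

```
User-agent: Googlebot
# Keep the directory blocked in general...
Disallow: /wp-includes/
# ...but let Googlebot fetch the JS and CSS it needs to render pages.
# Google applies the most specific (longest) matching rule.
Allow: /wp-includes/js/
Allow: /wp-includes/css/
```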
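And for step 2, if the Fetch and Render report lists more than a handful of blocked resources, clicking through the robots.txt Tester one URL at a time gets tedious. Here is a rough sketch, using Python’s standard urllib.robotparser, of how you could check a list of resource URLs in bulk. Note that this parser does not understand Google’s wildcard extensions (* and $), so treat it as a first pass and confirm anything borderline in the Search Console tester. The domain and file paths below are placeholders.

```python
from urllib.robotparser import RobotFileParser

# Placeholder site and resources - swap in your own domain and the
# URLs listed as blocked in your Fetch and Render report.
SITE = "https://www.example.com"
RESOURCES = [
    "/wp-includes/js/wp-emoji-release.min.js?ver=4.2.3",
    "/wp-includes/css/dashicons.min.css",
    "/wp-content/themes/mytheme/style.css",
]

# Load and parse the live robots.txt file.
parser = RobotFileParser(SITE + "/robots.txt")
parser.read()

# Report whether Googlebot is allowed to fetch each resource.
for path in RESOURCES:
    allowed = parser.can_fetch("Googlebot", SITE + path)
    status = "allowed" if allowed else "BLOCKED"
    print(f"{status:8} {path}")
```

If you want to sanity-check a draft of your robots.txt before publishing it, RobotFileParser.parse() accepts the lines of a local copy instead of reading the live file, which pairs nicely with the “test before you go live” advice in step 3.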
Have any questions about this process? If so, I’ll be around to help troubleshoot any issues you may be having.