
Solution: Googlebot Cannot Access CSS & JS Warning


Google recently updated its Webmaster Guidelines: to optimize how your site is rendered and indexed, you should allow Googlebot to access the JavaScript, CSS, and image files used by your pages. Google's crawler now renders pages, including their JavaScript and CSS, so do not block .js and .css files on the web server that hosts your site.

If your robots.txt blocks JS and CSS files, you might receive an email from Google Search Console saying that Googlebot cannot access your JavaScript and/or CSS files. These files help Google verify that your site is working properly.

Here is the warning email I got:

To: Webmaster of,

Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.

If Googlebot is not allowed to access your CSS and JS files, your ranking in Google Search may suffer, so do not block JS and CSS files in robots.txt. When those files are blocked, Googlebot cannot render the page the way visitors see it, so the recommended practice is to allow access to JavaScript and CSS files in your robots.txt.
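For context, a WordPress robots.txt that triggers this warning often looks something like the following (a hypothetical example; blocking these directories also blocks the theme and plugin assets stored inside them):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
```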

Here I am going to show you how to fix this issue on a WordPress blog.

How to fix the problem using robots.txt

There is a very simple fix: allow Googlebot to access your CSS and JS files by updating your robots.txt file. The robots.txt file sits in the root directory of your server and tells search engines what to crawl and what not to crawl.

Here's what to add inside robots.txt:
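The exact snippet varies by setup; one commonly suggested variant, relying on the * and $ wildcards that Googlebot supports, is:

```
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```

Note that under Google's longest-match precedence a longer Disallow rule can still override a shorter Allow rule, so after editing, verify the result with the robots.txt Tester in Search Console.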

After adding the above lines to your site's robots.txt file, go to Google Webmaster Tools and open the "Fetch as Google" tool:

Then click "Fetch and Render" to start the test; the results will show whether your page's CSS and JS are accessible to Googlebot and confirm whether the problem is fixed. In addition, you can also use the robots.txt Tester in Search Console to identify any other crawling issues on your site.
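You can also sanity-check your rules locally before uploading. The sketch below uses Python's standard urllib.robotparser, which follows the original robots.txt rules and does not understand Google's * and $ wildcards, so plain path prefixes are used; the paths are hypothetical:

```python
from urllib import robotparser

# Hypothetical robots.txt: block the admin area but leave the bundled
# scripts under /wp-includes/js/ crawlable. robotparser checks rules
# in the order they appear and returns the first match.
rules = """\
User-agent: Googlebot
Disallow: /wp-admin/
Allow: /wp-includes/js/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Blocked: the path falls under /wp-admin/
print(parser.can_fetch("Googlebot", "https://example.com/wp-admin/load-styles.css"))
# Allowed: no Disallow rule matches this path
print(parser.can_fetch("Googlebot", "https://example.com/wp-includes/js/jquery.js"))
```

Running this prints False for the blocked stylesheet and True for the allowed script, mirroring what the Fetch and Render test reports.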

What is Fetch and Render?

Fetch and Render tells Googlebot to crawl your page and then shows how the page would look to visitors viewing it in a web browser. Googlebot fetches all the resources from your site (such as images, CSS, and JavaScript files) and runs the corresponding code in the required order.


You may have been worried when you suddenly got this email from Google Webmaster Tools, but you can deal with the issue easily after reading this article. If you have any query or confusion about this, you can ask in the comment section.

Comments (1)

  1. Hi, very informative article. I too got the message from Google that 16 URLs are blocked and I need to unblock the resources. When I looked into the error, I found the blocking comes from wp-admin. Now my question is: can I allow wp-admin for Google and keep it disallowed for other search bots? Can you suggest what to do? If I allow wp-admin, will there be any security issues for my website? Need your suggestion.

