Can I disallow crawling of my CSS and JavaScript files?

A fun question from SEOmofo in Simi Valley. They ask “If I externalize all CSS style definitions and JavaScript scripts and disallow all user agents from accessing these files (via robots.txt), would this cause problems for Googlebot? Does Googlebot need access to these files?”

I personally would recommend not blocking that. For example, the White House recently rolled out a new robots.txt, and I think they blocked the images directory, or CSS, or JavaScript, or something like that. You really don’t need to do that. In fact, sometimes it can be very helpful if we think something spammy is going on with JavaScript, or if somebody is doing a sneaky redirect or something like that. So my personal advice would be to let Googlebot go ahead and crawl that, and it’s not like these files are huge anyway, so it doesn’t consume a lot of bandwidth. My personal advice is just to give Googlebot access to all that stuff, and most of the time we won’t ever fetch it. But on the rare occasion when we’re doing a quality check on behalf of someone, or we receive a spam report, we can go ahead and fetch it and make sure your site is clean and doesn’t have any sort of problems.
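To make the question concrete, here is a minimal sketch using Python’s standard urllib.robotparser. The site www.example.com and the /css/ and /js/ directories are hypothetical, and the Disallow rules simply stand in for the kind of blocking the question describes; the snippet shows that a crawler obeying those rules, Googlebot included, could not fetch the externalized files.

    from urllib import robotparser

    # Hypothetical robots.txt of the kind the question describes:
    # every user agent is blocked from the directories that hold
    # the externalized CSS and JavaScript files.
    rules = [
        "User-agent: *",      # applies to all crawlers, Googlebot included
        "Disallow: /css/",    # hypothetical directory for externalized CSS
        "Disallow: /js/",     # hypothetical directory for externalized JavaScript
    ]

    parser = robotparser.RobotFileParser()
    parser.parse(rules)

    # A crawler that obeys these rules cannot fetch the blocked files.
    print(parser.can_fetch("Googlebot", "http://www.example.com/css/site.css"))  # False
    print(parser.can_fetch("Googlebot", "http://www.example.com/js/app.js"))     # False
    print(parser.can_fetch("Googlebot", "http://www.example.com/index.html"))    # True

Dropping the two Disallow lines (or leaving those directories out of robots.txt entirely) is all it takes to follow the advice above and let Googlebot reach the CSS and JavaScript files when it needs to.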
