Article: Can I disallow crawling of my CSS and JavaScript files? - Last Review: May 31, 2011
Can I disallow crawling of my CSS and JavaScript files?
Solution:
I would personally recommend not blocking those files. For example, the White House recently rolled out a new robots.txt, and I believe they blocked their images directory, CSS, JavaScript, or something along those lines. You really don't need to do that. In fact, letting Googlebot crawl those files can sometimes be very helpful, for instance if we suspect something spammy is going on with JavaScript, or that somebody is doing a sneaky redirect. So my personal advice would be to let Googlebot go ahead and crawl them. These files aren't huge anyway, so crawling them doesn't consume a lot of bandwidth. In short, just let Googlebot access all of that.
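To make this concrete, here is a sketch of the robots.txt pattern being discouraged, followed by the alternative. The directory paths (`/css/`, `/js/`, `/private/`) are hypothetical examples, not paths from the article:

```
# Pattern to avoid: blocking stylesheet and script directories.
User-agent: *
Disallow: /css/
Disallow: /js/

# Better: leave CSS and JavaScript crawlable. If other paths must
# stay blocked, simply omit the CSS/JS directories from Disallow:
User-agent: *
Disallow: /private/
```

With no Disallow rule matching the CSS and JavaScript paths, Googlebot is free to fetch them, which is the behavior the answer recommends.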