{"id":310,"date":"2007-01-20T04:15:00","date_gmt":"2007-01-20T08:15:00","guid":{"rendered":"http:\/\/www.searchenginegenie.com\/blog-seo\/is-yahoo-slurp-misbehaving\/"},"modified":"2012-09-20T03:44:09","modified_gmt":"2012-09-20T07:44:09","slug":"is-yahoo-slurp-misbehaving","status":"publish","type":"post","link":"https:\/\/www.searchenginegenie.com\/blog-seo\/is-yahoo-slurp-misbehaving\/","title":{"rendered":"Is Yahoo Slurp misbehaving?"},"content":{"rendered":"<p><strong>Complaints<\/strong><br \/>Webmasters complain that Yahoo Slurp is not obeying the generic (wildcard) rules given in a robots.txt file such as the following:<\/p>\n<p>User-agent: msnbot<br \/>Disallow: \/bloop\/<br \/>Disallow: \/blop\/<\/p>\n<p>User-agent: googlebot<br \/>Disallow: \/bloop\/<br \/>Disallow: \/blop\/<\/p>\n<p>User-agent: Slurp<br \/>Disallow: \/bloop\/<br \/>Disallow: \/blop\/<\/p>\n<p>User-agent: *<br \/>Disallow: \/shop\/<br \/>Disallow: \/forum\/<br \/>Disallow: \/cgi-bin\/<\/p>\n<p>Yahoo Slurp obeys the agent-specific record and therefore does not crawl the \/bloop\/ and \/blop\/ directories, whereas it still crawls the directories listed only under the generic rule.<\/p>\n<p><strong>Discussions and Suggestions<br \/><\/strong><br \/>Webmasters found that not only Yahoo but other search engines as well behave in this manner.<\/p>\n<p>Once a bot matches a specific User-agent record, it ignores the generic (User-agent: *) record entirely; the two records are not combined.<\/p>\n<p><span style=\"color:#3333ff;\">Suggestion 1:<\/span> Put the wildcard record with the generic rules first, covering the less specific bots, and then place the agent-specific records.<\/p>\n<p><span style=\"color:#3333ff;\">Suggestion 2:<\/span> The main idea is that each major bot follows its corresponding User-agent record, while the rest carry on with the wildcard record.<\/p>\n<p><strong>Conclusion<\/strong><\/p>\n<p>The bots are following the robots.txt specification, not misbehaving.
<br \/>The fact is that each major bot, including Slurp, uses only its respective User-agent record, and the rest continue with the generic rule.<\/p>\n<p>The complete robots.txt for the above case would be:<\/p>\n<p>User-agent: Slurp<br \/>Disallow: \/bloop\/<br \/>Disallow: \/blop\/<br \/>Disallow: \/shop\/<br \/>Disallow: \/forum\/<br \/>Disallow: \/cgi-bin\/<br \/>Disallow: \/badbottrap\/<\/p>\n<p>User-agent: msnbot<br \/>Disallow: \/bloop\/<br \/>Disallow: \/blop\/<br \/>Disallow: \/shop\/<br \/>Disallow: \/forum\/<br \/>Disallow: \/cgi-bin\/<br \/>Disallow: \/badbottrap\/<\/p>\n<p>User-agent: googlebot<br \/>Disallow: \/bloop\/<br \/>Disallow: \/blop\/<br \/>Disallow: \/shop\/<br \/>Disallow: \/forum\/<br \/>Disallow: \/cgi-bin\/<br \/>Disallow: \/badbottrap\/<\/p>\n<p>User-agent: *<br \/>Disallow: \/shop\/<br \/>Disallow: \/forum\/<br \/>Disallow: \/cgi-bin\/<br \/>Disallow: \/badbottrap\/<\/p>\n<p><strong>This can be shortened to:<br \/><\/strong><br \/>User-agent: Slurp<br \/>User-agent: msnbot<br \/>User-agent: googlebot<br \/>Disallow: \/bloop\/<br \/>Disallow: \/blop\/<br \/>Disallow: \/shop\/<br \/>Disallow: \/forum\/<br \/>Disallow: \/cgi-bin\/<br \/>Disallow: \/badbottrap\/<\/p>\n<p>User-agent: *<br \/>Disallow: \/shop\/<br \/>Disallow: \/forum\/<br \/>Disallow: \/cgi-bin\/<br \/>Disallow: \/badbottrap\/<\/p>\n","protected":false},"excerpt":{"rendered":"<p>ComplaintsWebmasters complain that yahoo slurp is not tripping the filter given in robots.txt for the following User-agent: msnbotDisallow: \/bloop\/Disallow: \/blop\/ User-agent: googlebotDisallow: \/bloop\/Disallow: \/blop\/ User-agent: SlurpDisallow: \/bloop\/Disallow: \/blop\/ User-agent: *Disallow: \/shop\/Disallow: \/forum\/Disallow: \/cgi-bin\/ Yahoo slurp obeys the agent specific rule and hence does not crawl the directories \/bloop\/ and \/blop\/directories where as it crawled the 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[9],"tags":[],"class_list":["post-310","post","type-post","status-publish","format-standard","hentry","category-yahoo"],"_links":{"self":[{"href":"https:\/\/www.searchenginegenie.com\/blog-seo\/wp-json\/wp\/v2\/posts\/310","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.searchenginegenie.com\/blog-seo\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.searchenginegenie.com\/blog-seo\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.searchenginegenie.com\/blog-seo\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.searchenginegenie.com\/blog-seo\/wp-json\/wp\/v2\/comments?post=310"}],"version-history":[{"count":1,"href":"https:\/\/www.searchenginegenie.com\/blog-seo\/wp-json\/wp\/v2\/posts\/310\/revisions"}],"predecessor-version":[{"id":1421,"href":"https:\/\/www.searchenginegenie.com\/blog-seo\/wp-json\/wp\/v2\/posts\/310\/revisions\/1421"}],"wp:attachment":[{"href":"https:\/\/www.searchenginegenie.com\/blog-seo\/wp-json\/wp\/v2\/media?parent=310"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.searchenginegenie.com\/blog-seo\/wp-json\/wp\/v2\/categories?post=310"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.searchenginegenie.com\/blog-seo\/wp-json\/wp\/v2\/tags?post=310"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}
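
The record-selection behavior the post describes can be checked with Python's standard `urllib.robotparser`. This is a minimal sketch: the robots.txt text mirrors the file from the post, while the bot name `SomeOtherBot` and the `example.com` URLs are illustrative assumptions.

```python
# Sketch: a bot that matches a specific User-agent record uses only that
# record; the generic (*) record applies only to bots with no specific match.
from urllib.robotparser import RobotFileParser

# robots.txt from the post (SomeOtherBot and example.com below are made up).
ROBOTS_TXT = """\
User-agent: msnbot
Disallow: /bloop/
Disallow: /blop/

User-agent: googlebot
Disallow: /bloop/
Disallow: /blop/

User-agent: Slurp
Disallow: /bloop/
Disallow: /blop/

User-agent: *
Disallow: /shop/
Disallow: /forum/
Disallow: /cgi-bin/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Slurp matches its own record, so /bloop/ is blocked for it...
print(rp.can_fetch("Slurp", "http://example.com/bloop/page.html"))       # False
# ...but the generic record is NOT also applied, so /shop/ stays crawlable:
print(rp.can_fetch("Slurp", "http://example.com/shop/page.html"))        # True
# A bot with no specific record falls back to the wildcard record:
print(rp.can_fetch("SomeOtherBot", "http://example.com/shop/page.html")) # False
```

This reproduces the complaint exactly: Slurp "ignoring" `/shop/` is standard behavior, which is why the corrected file above repeats the generic paths inside each agent-specific record.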