How Google Handles Scrapers - Information from the Google Team

It's been a long battle between Google, webmasters, and the content thieves who scrape information from a website and display it on their own sites to get traffic. Many webmasters had been complaining about this problem for a long time. As far as I know, Google is already doing a good job with content thieves and scraper sites, and now it has opened up about the inner workings of how it tackles this problem.



Google tackles two types of duplicate-content problems: duplicates within a site and duplicates on external sites. Duplicate content within a site can easily be fixed, since we webmasters have full control over it: find all areas that might create two pages with the same content, then prevent one version from being crawled or remove any links to the duplicate pages.
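Finding those internal duplicates can be automated. Here is a minimal sketch that groups URLs serving identical content by hashing their bodies; the URLs and page bodies are hypothetical stand-ins for what a crawl of your own site would return:

```python
import hashlib

# Hypothetical crawl results: URL -> page body. In practice these would
# come from crawling your own site.
pages = {
    "http://example.com/article?id=1": "<html>Widget guide</html>",
    "http://example.com/article/1": "<html>Widget guide</html>",  # same content, second URL
    "http://example.com/about": "<html>About us</html>",
}

def find_duplicates(pages):
    """Group URLs whose page bodies are byte-for-byte identical."""
    by_hash = {}
    for url, body in pages.items():
        digest = hashlib.sha256(body.encode("utf-8")).hexdigest()
        by_hash.setdefault(digest, []).append(url)
    # Groups with more than one URL are duplicate-content candidates.
    return [sorted(urls) for urls in by_hash.values() if len(urls) > 1]

for group in find_duplicates(pages):
    print(group)
```

Each printed group is a set of URLs where you would keep one version and block or de-link the others. Real pages rarely match byte-for-byte, so a production version would normalize the HTML before hashing.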



External sites are always a problem since we don't have any control over them. Google says it is now effectively tracking down potential duplicates, giving maximum credit to the original source and filtering out the rest of the duplicates.

If you find a site that is ranking above you using your own content, Google says:

  1. Check if your content is still accessible to our crawlers. You might unintentionally have blocked access to parts of your content in your robots.txt file.
  2. Check your Sitemap file to see whether you made changes for the particular content that has been scraped.
  3. Check whether your site is in line with our webmaster guidelines.
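The first check above can be automated with Python's standard `urllib.robotparser`. This is a sketch under assumed rules; in practice you would load your live robots.txt and test the paths of the scraped pages:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules; in practice, fetch them from
# http://yoursite.com/robots.txt instead.
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Googlebot must be able to reach the content you want credited to you.
for path in ("/articles/my-post.html", "/private/draft.html"):
    allowed = parser.can_fetch("Googlebot", path)
    print(path, "allowed" if allowed else "BLOCKED")
```

If a path you expect to rank comes back BLOCKED, Google cannot see your original copy, and a scraper's copy may be treated as the only accessible version.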
