What is duplicate content for search engines? – search engine optimization tip, March 8th 2005
Different search engines have different thresholds for duplicate content. Some, like Yahoo and Exalead, seem unable to detect duplicate content across sites; they detect it within a single site but not between sites. It is best to make each page at least 5 to 7% different from the other pages on your site.
Google is the best search engine at detecting duplicate content. It strips away the site's main template and feeds only the remaining content into its algorithm. We recommend making each page at least 8 to 15% different from the others to avoid a duplicate content penalty on that page. If you can't make the pages truly unique, remember to give them descriptive file names: file names are indexed by search engines, and a good 5- or 6-word file name adds to a page's uniqueness.
Overall, making pages about 10% different is your best bet.
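Search engines haven't published their exact duplicate-detection methods, but word-shingle overlap (Broder's shingling) is a classic, well-documented technique for this kind of comparison. Below is a minimal Python sketch of how you might estimate the "percent different" figures above for two pages on your own site. The shingle size (k = 4 words) and the sample body texts are illustrative assumptions, not any engine's documented parameters.

```python
# Minimal sketch: estimating how "different" two pages are using
# word-shingle (k-gram) overlap, a published duplicate-detection
# technique (Broder's shingling). Real engines' thresholds and
# algorithms are not public; the ~10% target mirrors the rule of
# thumb in this post, not any documented cutoff.

import re


def shingles(text: str, k: int = 4) -> set[tuple[str, ...]]:
    """Return the set of k-word shingles (overlapping word k-grams)."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}


def percent_different(page_a: str, page_b: str, k: int = 4) -> float:
    """Dissimilarity as 100 * (1 - Jaccard similarity of shingle sets)."""
    a, b = shingles(page_a, k), shingles(page_b, k)
    if not a and not b:
        return 0.0
    jaccard = len(a & b) / len(a | b)
    return 100.0 * (1.0 - jaccard)


if __name__ == "__main__":
    # In practice, strip the shared site template first, since (per
    # the post) Google compares only the remaining body content.
    body_a = "blue widgets ship free to all of our customers in europe"
    body_b = "red widgets ship free to all of our customers in europe"
    print(f"Pages are ~{percent_different(body_a, body_b):.0f}% different")
```

Run this against the body text of any two pages on your site; if the result comes back well under 10%, that pair is a candidate for rewriting or for a more descriptive file name.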
SEO Blog Team