SEO

Is it bad to duplicate keywords and content in meta description tags? – search engine optimization tip – March 16th 2005

Meta description tags play a very small role in search engine ranking these days. It doesn't matter whether the meta description tags are duplicated across your pages; just make sure your important keywords and phrases are present in them.

It is common for sites to have similar meta description tags, especially on ecommerce sites, and there is nothing wrong with it. Duplicated meta descriptions don't hurt, but remember they don't help much either. So it is best to simply optimize them and leave them as they are.
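To check whether meta descriptions are duplicated across a site, you can compare the tags programmatically. A minimal sketch using Python's standard-library HTML parser follows; the page names and HTML snippets are made-up examples, not real pages.

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Collects the content of a page's <meta name="description"> tag."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

def meta_description(html):
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description

# Hypothetical pages: two of them share the same description.
pages = {
    "index.htm":   '<html><head><meta name="description" content="Buy widgets online"></head></html>',
    "widgets.htm": '<html><head><meta name="description" content="Buy widgets online"></head></html>',
    "about.htm":   '<html><head><meta name="description" content="About our widget shop"></head></html>',
}

seen = {}
for page, html in pages.items():
    seen.setdefault(meta_description(html), []).append(page)

for desc, group in seen.items():
    if len(group) > 1:
        print("Duplicated description on:", ", ".join(group))
```

As the post says, finding duplicates like this is not an emergency; the point of a scan like this is to spot pages whose descriptions are missing their important keywords.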

SEO Blog Team

Potential reasons for sites crawled but not indexed – search engine optimization tip – March 15th 2005

Many of us have noticed that search engines sometimes keep visiting a site regularly but don't index it or show it in the site: command.

There are various reasons for this to happen.

1. The domain is an expired domain. If a domain expires and is not re-registered for a certain period of time, Google imposes an expired-domain penalty on it. The domain is left to suffer for a certain number of months; during that period Googlebot keeps visiting the site, but it is not indexed and does not show up in the site: command either.

2. Another reason is that the domain is new. With a new domain, the crawler sometimes visits the site regularly, yet the site doesn't show up in the index for a long time. There is nothing wrong with this; the Google index is probably taking longer to expand, and you just have to wait until Google updates its index.

3. Another possible reason is that the site is banned from the search engines for a particular on-page factor. In that case the search engines periodically check whether the on-page spam tactics have been removed, and as soon as they see the spam is gone they may re-include the site in the index. So if your site was previously indexed and listed in Google but suddenly disappeared from the index while Googlebot keeps visiting, it is good to look at your on-page work and check for spam tactics like hidden text, cloaking, keyword stuffing, etc.

4. Another important reason could be that the site was permanently banned from the search engines. Even then, search engine crawlers visit the site by following existing links to it, but they don't index it because it is banned. This is common with Yahoo Slurp, Yahoo's robot, which is known to keep visiting a site without indexing it when the site is banned.

List of top search engine user-agents – search engine optimization tip – March 14th 2005

We get a lot of mail from people who want to know the names of the leading user-agents, and we are pleased to give the information. Identifying user-agents is an important part of search engine optimization: regular visits by search engine robots like Googlebot and Yahoo Slurp are a good sign.

Here is the list of top search engine crawlers:

Google – Googlebot/2.* (googlebot@googlebot.com)
Yahoo – Yahoo Slurp
MSN – Msnbot
Lycos – Lycos_Spider_(T-Rex)/3.
Teoma – Mozilla/2.0 (compatible; Ask Jeeves/Teoma)

For a more detailed list of search engine crawlers, contact us and we will send it to you for free.

Do search engines crawl JavaScript links? – search engine optimization tip – March 13th 2005

Scripts are complex code that some browsers and crawlers find difficult to read; some tricky JavaScript is handled only by advanced browsers. Search engine crawlers are not advanced browsers, and the parsers the search engines use struggle with script-heavy pages, so it is best to avoid too much JavaScript.

Avoid creating menus in JavaScript or any other scripting language; build menus in plain HTML or another crawler-readable format. JavaScript menus won't be crawled by search engines, so it is best not to put clickable menus and other important inner-page links in JavaScript.

If you cannot avoid using JavaScript menus, add the links to a sitemap and link the sitemap from the homepage or any other important page.
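You can see why JavaScript menus are invisible to crawlers by running a page through a simple HTML parser, which is roughly what a basic crawler does. A sketch in Python; the two "links" in the sample HTML are made up for illustration.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from plain <a> tags, as a simple crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Hypothetical page: one plain HTML link and one JavaScript-only "link".
html = """
<a href="/products.htm">Products</a>
<span onclick="window.location='/contact.htm'">Contact</span>
"""

extractor = LinkExtractor()
extractor.feed(html)
print(extractor.links)  # only the plain HTML link is found
```

The JavaScript-driven element never shows up as a link, which is why inner pages reachable only through script menus need a plain HTML sitemap.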

SEO Blog Team

Do outgoing links affect your site? – SEO tip – March 12th 2005

Outbound links are links going out from one site to another. The whole internet was built upon links; inbound and outbound links together make up the web. Outbound links are good for maintaining the quality of a site: search engines like links, and they like outbound links too. If you link out to a collection of quality sites, there is definitely a small boost to that page.

Jon M. Kleinberg proposed that hubs are pages that link out to collections of quality resources. More information on hubs and authorities is in this paper: http://www.cs.cornell.edu/home/kleinber/auth.pdf

Outbound links are good for usability too. For certain references it is important to cite the source so that people are guided the right way. Non-commercial sites in particular should link out freely, so that people can find relevant information that lives elsewhere on the web.

SEO Blog Team

Do tracking parameters dilute the ranking of a page? – search engine optimization tip – March 10th 2005

Many people wonder whether tracking URLs like the one below dilute PageRank, link popularity, or ranking:

www.myURL.com/page.htm?trackingid=theirURL

No, they don't dilute much. If you think they are diluting links, the better option is to 301-redirect them to the main URL; that way all link popularity is passed to the main URL and no duplicate pages are formed.
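Deciding where such a 301 redirect should point is just a matter of stripping the tracking parameter to get the canonical URL. A minimal sketch with Python's standard urllib; the parameter name "trackingid" follows the example URL above, and the set of tracked parameter names is an assumption you would adjust for your own site.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameter names treated as tracking-only (hypothetical; adjust per site).
TRACKING_PARAMS = {"trackingid"}

def canonical_url(url):
    """Drop tracking parameters so every URL variant collapses to one address,
    i.e. the target a 301 redirect would point to."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), fragment))

print(canonical_url("http://www.myURL.com/page.htm?trackingid=theirURL"))
# -> http://www.myURL.com/page.htm
```

Non-tracking parameters survive, so a URL like page.htm?id=3&trackingid=x canonicalizes to page.htm?id=3.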

Do search engines index Flash? – search engine optimization tip – March 9th 2005

Flash is not good for search engines. Most search engines have difficulty parsing content out of complex Flash files. Google has recently been reported to follow links from Flash .SWF files; we have seen Google read text within a Flash file and follow links from it.

But Yahoo doesn't read Flash; it is not sophisticated enough to do so. Similarly, MSN, Gigablast, and many other search engines don't follow Flash. The best bet is to avoid Flash sites if you are planning to do search engine optimization.

Flash has always been a hindrance for search engines, so it is better to avoid designing full sites in Flash.

SEO Blog Team

What is duplicate content for search engines? – search engine optimization tip – March 8th 2005

Various search engines have various thresholds for duplicate content. Some search engines like Yahoo and Exalead are unable to detect duplicate content across sites; they seem to detect it within a site but not across sites. It is best to make each page at least 5 to 7% different from the other pages of the site.

Google is the best search engine at detecting duplicate content: it strips away the main template of the site and takes only the remaining part into consideration in its algorithm. We recommend making each page at least 8 to 15% different from the other pages to avoid a duplicate-content penalty on that page. Remember to give proper file names if you can't make the pages very unique; file names are indexed by search engines, and a good 5- or 6-word file name adds to the unique content.

Overall, making pages about 10% different is the best bet.
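A rough way to sanity-check percentages like these is to compare the visible text of two pages with Python's difflib. This is only a sketch under simplifying assumptions: the sample texts are made up, and a real check should run on the pages' visible text with the shared template stripped out, since that is roughly what Google is described as doing above.

```python
from difflib import SequenceMatcher

def percent_different(text_a, text_b):
    """Rough difference estimate: 0% for identical text, 100% for no overlap."""
    similarity = SequenceMatcher(None, text_a, text_b).ratio()
    return (1.0 - similarity) * 100

# Hypothetical page texts that differ by one word.
page_a = "Red widgets in all sizes, shipped worldwide from our warehouse."
page_b = "Blue widgets in all sizes, shipped worldwide from our warehouse."

print(f"{percent_different(page_a, page_b):.1f}% different")
```

Pages scoring near 0% against a sibling page are the ones worth rewriting first.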

SEO Blog Team

Is Alexa ranking worth looking at? – search engine optimization tip – March 7th 2005

Alexa has a unique ranking system: it measures the ranking of a site based on visits by users of the Alexa toolbar. Alexa's ranking can give you an idea of how good a site is and how well it draws traffic from various sources.

But it is not a definitive measure, since there are not many Alexa toolbar users. Only toolbar users are counted in Alexa site rankings, so it is not a reliable measure on its own.

Still, Alexa gives an idea of the quality of a site. If Alexa reports a spike in traffic to a particular site, we can be reasonably sure there is some increase in popularity for that site.

SEO Blog Team

Does spamming guestbooks, phpBBs and wikis work? – SEO tip – March 6th 2005

Does spamming forums, blog comments, guestbooks, and wikis work? Search engines like Google have already written effective algorithms to ignore links from these sources, so whether it still works is a big question. It might work to a certain extent, but whether it helps competitive rankings is doubtful.

Apart from that, Google, Yahoo, MSN, and several blog providers together recently launched the rel=nofollow attribute, which tells search engine crawlers to ignore the particular link on which it is used. Wikis and certain blog providers have already implemented it automatically, so spamming will be harder in the future.
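On the publisher side, applying rel=nofollow is just a matter of how the template renders visitor-submitted links. A minimal sketch of how a blog or guestbook template might do it in Python; the function name, URL, and link text are hypothetical examples.

```python
from html import escape

def comment_link(url, text):
    """Render a commenter-supplied link with rel="nofollow" so search
    engines are told to ignore it. Escapes the URL and text for HTML."""
    return '<a href="%s" rel="nofollow">%s</a>' % (escape(url, quote=True), escape(text))

print(comment_link("http://example.com/", "visitor site"))
# -> <a href="http://example.com/" rel="nofollow">visitor site</a>
```

With the attribute in place, links dropped into comments pass no weight, which is exactly why comment spamming loses its appeal.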

The better option is to avoid it, work in accordance with search engine guidelines, and build quality sites that search engines will be proud to rank.

SEO Blog Team

Request a Free SEO Quote