SEO BLOG: 03/06/2005 - 03/13/2005
 

 

Friday, March 11, 2005

Do search engines index Flash? - Search Engine Optimization Tip March 9th 2005

Flash is not good for search engines. Most search engines have difficulty parsing the compiled code inside Flash (.SWF) files. Google has recently been reported to follow links from Flash .SWF files, and we have seen Google read text within a Flash file and follow links from it.



Yahoo, however, does not read Flash; its crawler is not sophisticated enough to do so. Similarly, MSN, Gigablast and many other search engines do not follow Flash. The best bet is to avoid Flash-only sites if you are planning to do search engine optimization.



Flash has always been a hindrance for search engines, so it is better to avoid designing entire sites in Flash.

SEO Blog Team

Another excellent post by Brett Tabke of webmasterworld.com on duplicate content issues with search engines

A great post on WebmasterWorld by Brett Tabke explains how search engines treat duplicate content. It is worth a read by everyone.



What is dupe content?
a) Strip duplicate headers, menus, footers (e.g. the template). This is quite easy to do mathematically; you just look for string patterns that match on more than a few pages.
b) Content is what is left after the template is removed. Comparing content is done the same way, with pattern matching. The core is the same type of routines that make up compression algos like Lempel-Ziv (LZ). This type of pattern matching is sometimes referred to as a sliding dictionary lookup. You build an index of a page (dictionary) based on (most probably) words. You then start with the lowest denominator and try to match it against other words in other pages.

How close is duplicate content?
A few years ago, an intern (*not* Pugh) who helped work on the dupe content routines (2000?) wrote a paper (now removed). The figure 12% was used. Even after studying it, we are left to ask how that 12% is arrived at.

Cause for concern with some sites?
Absolutely. People that should worry: a) repetitive content for language purposes; b) those that do auto-generated content with slightly different pages (such as weather sites, news sites, travel sites); c) geo-targeted pages on different domains; d) multiple top-level domains.

Can I get around it with random text within my template?
Debatable. I have heard some say that if a site of any size (more than 20 pages) does not have a detectable template, you are subject to another quasi-penalty.

When is dupe content checked?
I feel it is checked as a background routine. It is a routine that could easily run 24x7 on hundreds of machines if they wanted to crank it up that high. I am almost certain there is a granularity setting to it where they can dial up or dial down how closely they check for dupe content. When you think about it, this is not a routine that would actually have to be run all the time, because once they flag a page as a dupe, that would take care of it for a few months until they came back to check again. So I agree with those that say it isn't a set pattern.
Additionally, we also agree that G's indexing isn't as static as it used to be. We are into the "update all the time" era, where the days of GG pressing the button are done because it is pressed all the time. The tweaks are on-the-fly now - it's pot luck.

What does Google do if it detects duplicate content?
Penalizes the second one found (with caveats). (As with almost every Google penalty, there are exceptions we will get to in a minute.) What generally happens is that the first page found is considered to be the original, prime page. The second page will get buried deep in the results.
The exception (as always) - we believe - is high PageRank. It is generally believed by some that mid-PR7 is considered the "white list" where penalties are dropped on a page - quite possibly an entire site. This is why it is confusing to SEOs when someone says they absolutely know the truth about a penalty or algo nuance. The PR7/whitelist exception takes the arguments and washes them.

Who is best at detecting dupe content?
Inktomi used to be the undisputed king, but since G changed their routines (late 2003/Florida?), G has detected everything from the tiny page to the large duplicate page without fail.
On the other hand, I think we have all seen some classic dupe content that has slipped by the filters with no apparent explanation. For example, these two pages:
The original: http://www.webmasterworld.com/forum3/2010.htm
The duplicate: http://www.searchengineworld.com/misc/guide.htm
The 10,000 unauthorized rips (10k is the best count, but probably higher): Successful Site in 12 Months with Google Alone

All-in-all, I think the dupe content issue is far overrated and easy to avoid with quality original content. If anything, it is a good way to watch a competitor get penalized.
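
For readers who want to experiment, here is a minimal Python sketch of the shingle/sliding-dictionary idea Brett describes: break each page's visible text into overlapping word n-grams and compare the overlap. This is only an illustration of the general technique, not Google's actual routine, and the shingle size and any threshold you pick are arbitrary.

# Simplified sketch of shingle-based duplicate detection, loosely in the spirit
# of the "sliding dictionary" comparison described above. Not Google's routine.

def shingles(text, size=5):
    # Break a page's visible text into overlapping word n-grams ("shingles").
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(page_a, page_b, size=5):
    # Jaccard overlap of the two shingle sets: 1.0 = identical, 0.0 = no overlap.
    a, b = shingles(page_a, size), shingles(page_b, size)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Two pages that share most of their wording score close to 1.0, so a crude
# "dupe" flag might fire above some chosen threshold (e.g. 0.9).
if __name__ == "__main__":
    original = "cheap widgets for sale buy our quality widgets today and save money"
    near_copy = "cheap widgets for sale buy our quality widgets now and save money"
    print(round(similarity(original, near_copy, size=3), 2))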




What is duplicate content for search engines? - Search Engine Optimization Tip March 8th 2005

Different search engines have different thresholds for duplicate content. Some search engines, like Yahoo and Exalead, are unable to detect duplicate content across sites; they seem to detect it within a site but not across sites. The best approach is to make each page at least 5 to 7% different from the other pages of the site.



Google is the best search engine at detecting duplicate content. It strips away the main template of the site and feeds the remaining part into its algorithm. We recommend making each page at least 8 to 15% different from the other pages to avoid a duplicate content penalty for that page. Remember to give pages proper file names if you cannot make them very unique; file names are indexed by search engines, and a good 5 or 6 word file name adds to a page's uniqueness.
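
If you want a rough, do-it-yourself check of how different two of your own pages are, here is a quick sketch using Python's standard difflib library. The sample page bodies are placeholders, and the percentage it prints is only a crude approximation, not whatever the engines actually measure.

import difflib

def percent_different(page_a: str, page_b: str) -> float:
    # SequenceMatcher.ratio() returns 1.0 for identical text, 0.0 for nothing alike.
    ratio = difflib.SequenceMatcher(None, page_a, page_b).ratio()
    return (1.0 - ratio) * 100.0

# Placeholder page bodies (after stripping the shared template), not real pages.
page_one = "Red widget, 3 inch, steel body, ships in two days. Great for home projects."
page_two = "Blue widget, 3 inch, steel body, ships in two days. Great for home projects."
print(f"Pages differ by about {percent_different(page_one, page_two):.1f}%")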



Overall, making pages about 10% different is the best bet.

SEO Blog Team

Wednesday, March 09, 2005

GoogleGuy responds to attack on Google for potential cloaking of some pages of their site

Many forums and blogs accused Google of keyword stuffing and cloaking the titles of some pages in its support section. Those pages have since been fixed and removed from the Google index. GoogleGuy immediately responded, explaining what had happened behind the scenes:



Hey everyone, I'm sorry that it took me a while to post about this. I wanted to make sure I completely understood what was going on first.

Those pages were primarily intended for the Google Search Appliances that do site search on individual help center pages. For example, https://adwords.google.com/support has a search box, and that search is powered by a Google Search Appliance. In order to help the Google Search Appliance find answers to questions, the user support system checked for the user agent of "Googlebot" (the Google Search Appliance uses "Googlebot" as a user agent), and if it found it, it added additional information from the user support database into the title.

The issue is that in addition to being accessed via the internal site-search at each help center, these pages can be accessed by static links via the web. When the web-crawl Googlebot visits, the user support system thinks that it's the Google Search Appliance (the code only checks for "Googlebot") and adds these additional keywords.

That's the background, so let me talk about what we're doing. To be consistent with our guidelines, we're removing these pages from our index. I think the pages are already gone from most of our data centers--a search like [site:google.com/support] didn't return any of these pages when I checked. Once the pages are fully changed, people will have to follow the same procedure that anyone else would (email webmaster at google.com with the subject "Reinclusion request" to explain the situation).



http://slashdot.org/article.pl?sid=05/03/08/1621206
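
To make the mechanics concrete, here is a small Python sketch of the kind of naive user-agent check GoogleGuy describes: because the code only looks for the substring "Googlebot", it cannot tell the internal Google Search Appliance apart from the public web crawler, so both receive the keyword-padded titles. The function and user-agent strings below are illustrative, not Google's actual support-system code.

def build_title(base_title: str, extra_keywords: str, user_agent: str) -> str:
    # Naive check: any visitor whose UA contains "Googlebot" gets the extras.
    if "Googlebot" in user_agent:
        return f"{base_title} - {extra_keywords}"
    return base_title

appliance_ua = "Googlebot (Google Search Appliance)"        # hypothetical internal UA
web_crawler_ua = "Mozilla/5.0 (compatible; Googlebot/2.1)"  # public web crawler

print(build_title("AdWords Help", "billing cpc keywords adwords help", appliance_ua))
print(build_title("AdWords Help", "billing cpc keywords adwords help", web_crawler_ua))
# Both print the keyword-padded title -- the behavior that looked like cloaking.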

Monday, March 07, 2005

Is Alexa ranking worth looking at? - Search Engine Optimization Tip March 7th 2005

Alexa has a unique ranking system: it measures the ranking of a site based on visits from users of its toolbar. Alexa's ranking can give you an idea of how well a site is doing and how much traffic it gets from various sources.



But it is not a definitive measure, since there are not many Alexa toolbar users. Only Alexa toolbar users are counted toward Alexa site rankings, so on its own it is not a reliable measure.



Still, Alexa gives a rough idea of the quality of a site. If Alexa reports a spike in traffic to a particular site, we can be fairly sure there has been some increase in popularity for that site.

SEO Blog Team,

Does spamming guestbooks, phpBB forums and wikis work? - SEO Tip March 6th 2005

Does spamming forums, blog comments, guestbooks and wikis work? Search engines like Google have already written effective algorithms to discount links from these sources. Whether it still works is a big question: it might work to a certain extent, but whether it helps with competitive rankings is doubtful.



Apart from that, Google, Yahoo, MSN and several blog providers recently launched the rel=nofollow attribute together. It tells search engine crawlers to ignore any link that carries the attribute. Wikis and certain blog providers already apply it automatically, so this kind of spam will become more difficult in future.
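
As a rough illustration of what those platforms do, here is a small Python sketch that adds rel="nofollow" to links in user-submitted comment HTML. Real blog and wiki software uses proper HTML sanitizers; the regex here is only a simplified stand-in.

import re

def nofollow_links(comment_html: str) -> str:
    # Add rel="nofollow" to anchor tags that do not already carry a rel attribute.
    return re.sub(r'<a (?![^>]*\brel=)', '<a rel="nofollow" ', comment_html)

spammy_comment = 'Nice post! <a href="http://example.com/casino">cheap pills</a>'
print(nofollow_links(spammy_comment))
# -> Nice post! <a rel="nofollow" href="http://example.com/casino">cheap pills</a>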



The better option is to avoid it, work in accordance with the search engine guidelines, and build quality sites that search engines will be proud to rank.

SEO Blog Team,

What is an Authority site? - Search Engine Optimization Tip March 5th 2005

An authority site is determined by the number and quality of links pointing to it and by the quality of its content.



If a site has lots of quality content, it attracts lots of quality inbound links, and this is what makes it an authority site.

For example, Microsoft is a huge authority for computer-related and Windows-related questions; it gets backlinks to most of its inner pages from various sources, which makes it an authority.



It does not just get links to its homepage but to its hundreds of thousands of inner pages too. Simply put, an authority site is a great source of information that attracts all sorts of visitors.

SEO Blog Team,

Google Desktop Search API now open for developers to build unique tools

Google recently introduced an API for Desktop Search, which enables developers to build unique applications on top of desktop search.



http://desktop.google.com/developer.html

Read the developer guide for more information on how to work with the API.



Does the number of links per page matter these days for search engines? - SEO Tip March 4th 2005

The number of links per page has been a major topic of debate among search engine optimizers.



There are no definite rules when it comes to the number of links per page. Google suggests keeping it to around 100 links per page, but that is a suggestion rather than a rule; Googlebot is known to follow more than 1,000 links on a page. Google also used to limit indexing to the first 101 KB of a page, but that limit appears to have been removed: Google now indexes pages larger than 101 KB and follows all of their links. If the PageRank of the page carrying the links is high, there is no problem with those links being followed or with PageRank being passed to them.



Similarly, Yahoo and MSN do not have any fixed rules either; both are known to follow more than 1,000 links per page. So rather than asking what is good for search engines, look at your end users: if the number of links on a page is good for your users, then use them.
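
If you want to see where your own pages stand, here is a quick Python sketch that counts the links on a page using only the standard library. The URL is a placeholder; point it at one of your own pages.

from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Count every anchor that actually has an href attribute.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

page_html = urlopen("http://www.example.com/").read().decode("utf-8", errors="ignore")
counter = LinkCounter()
counter.feed(page_html)
print(f"Links on page: {counter.count}")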

SEO Team,

Are Google Toolbar and other search engine toolbars useful? - Search Engine Optimization Tip March 3rd 2005

Google Toolbar is a BHO (Browser Helper Object) with a PageRank display and other unique features. Some of its features are AutoLink, a highlight button, a "Blog This" button, the PageRank display, a news button, a backlinks button, AutoFill, and a spell checker.



AutoLink: AutoLink automatically turns certain text on the page you are browsing into hyperlinks to other sites. This feature sometimes points those links to specific maps or to other sites like Amazon.

PageRank Display: PageRank is Google's technology for judging the quality of a page based on the links pointing to it. The more links there are, and the better their quality, the more PageRank is passed. The PageRank display helps in judging the quality of a site to a certain extent.



Highlight Button: highlights occurrences of the terms typed into the search box on the current page.

Search Box: lets you search the web directly from the toolbar.

AutoFill: helps in filling out online forms much faster.

Yahoo, MSN, A9 and Teoma all have their own toolbars.

Search engine toolbars help with search engine optimization in various ways: the highlight button helps in judging the keyword density and keyword distribution of a page, the PageRank display helps in determining the PageRank of a site, and so on.
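
As a rough companion to the highlight button, here is a small Python sketch that computes keyword density, i.e. what share of a page's words a given keyword accounts for. The sample text is a placeholder, and this simple word count ignores HTML markup.

from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    # Share of the page's words that match the keyword, as a percentage.
    words = [w.strip(".,!?():").lower() for w in text.split()]
    counts = Counter(words)
    return 100.0 * counts[keyword.lower()] / max(len(words), 1)

sample = "Search engine optimization tips: our optimization guide covers optimization basics."
print(f"{keyword_density(sample, 'optimization'):.1f}% of the words are 'optimization'")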

Previous Posts & Archives
Search Our Site
Featured Links
PageRank 10 sites
SEO Search Engine Copywriting
Traffic power Aggressive SEM Company
Services
 Search Engine Optimization
 Web Design
 Link Building
 Search Engine Marketing
 Internet Marketing
 SEO Consulting
 Ecommerce  Implementation.
 Pay Per Click Services
 Graphic Design.
 Shopping Feeds Optimization.
 Shopping Cart Customization
 Product Development.
 Online Forms & Database       Integration.
 PHP Programming  Services
 Programming Services Java,J2EE
 .NET Application Development      Programming Services
 Business Process OutSourcing
 Offshore Outsourcing
Articles
Froogle Feeds
About Froogle
Submission Services
Get Listed In Froogle
Froogle Listing Benefits
Why Us
FAQ Froogle
Pricing in Froogle
Froogle Merchant Information
Froogle optimization
Froogle Request Quote
SEO Plans
Company
Genie Magic
***Links***
SEO Information

 

 

 

 

Search Engine Optimization SEO Company | Privacy Policy | Term of Service | Copyright
Search Engine Genie is an Ethical Search Engine Optimization Company Specializing in Search Engine Marketing, Search Engine Promotion and Search Engine Ranking Services.