New Bigdaddy datacenters report huge increase in search result pages

Compared to the previous Google index, the new improved Googlebot that powers the Bigdaddy datacenters shows a huge increase in indexed data. Could this be because of the new Googlebot's (Google's crawler) more effective way of crawling sophisticated JavaScript, Flash, frames, etc.?



A search for *.* used to show 9,660,000,000 results; now it shows 25,270,000,000 (that's more than 25 billion).


A search for "the" used to show 8,660,000,000 results; now it shows 22,010,000,000.



Is the new Bigdaddy datacenter infrastructure a step ahead in Google's crawling technology?

Have Sitemaps killed my site? - interesting thread on WebmasterWorld

An interesting thread on webmasterworld.com discusses a site whose pages went URL-only after the owner signed up for Google Sitemaps. So did Google Sitemaps affect the indexing of his site's pages?



Webmasterworld.com member mrmister says:

"I noticed that there are about 8 pages that are URL only. I've never seen this happen before. I did a standard Google search for my "green widgets" category page by searching for [green widgets]. It used to appear somewhere on page 1 but it was no longer there. I clicked to page 2 and I found my "history of green widgets" page sitting at the top of page 2. The "history of green widgets" page is the only subcategory page that is still listed as non-URL only. I checked my logs and Google has yet to crawl this page.
The site itself is a small site (about 80 pages). It's been going in various guises for about 10 years and therefore has a fair amount of inbound links (mainly to internal pages rather than the home page). For most categories it gets to page 1 of the SERPs for two work keyphrases. I gave it a design overhaul about a year back to convert it from tag soup to clean valid HTML and CSS. I changed the linking structure and the URLs using 301s. I also improved the prominence of Adsense ads (that I'd been trailing for about a year previous) It weathered the change fine (some previous number 1 results dropped down a few places but nothing major).
There have been no major changes since then. I've added a few categories. It's all been fine, every page has been indexed in a timely fashion.
However a week ago, I signed up to Google Sitemaps. I am wondering if this is connected in any way with the URL-only pages that I'm starting to see. "




So what could have caused this? Submitting to Google Sitemaps might just have been a coincidence.

Vanessa Fox of Google Engineering reports on effective ways of moving domain names

We have already given good suggestions on moving domains in our blog.



Nothing is more effective than getting information from a Google employee. Matt Cutts, senior Google engineer, has already discussed 301 redirects ( https://www.mattcutts.com/blog/the-little-301-that-could ).

Now Vanessa Fox reports on the Sitemaps blog about effective 301 redirects.

Here is an extract from the Google Sitemaps blog:



"Recently, someone asked me about moving from one domain to another. He had read that Google recommends using a 301 redirect to let Googlebot know about the move, but he wasn't sure if he should do that. He wondered if Googlebot would follow the 301 to the new site, see that it contained the same content as the pages already indexed from the old site, and think it was duplicate content (and therefore not index it). He wondered if a 302 redirect would be a better option. I told him that a 301 redirect was exactly what he should do. A 302 redirect tells Googlebot that the move is temporary and that Google should continue to index the old domain. A 301 redirect tells Googlebot that the move is permanent and that Google should start indexing the new domain instead. Googlebot won't see the new site as duplicate content, but as moved content. And that's exactly what someone who is changing domains wants. He also wondered how long it would take for the new site to show up in Google search results. He thought that a new site could take longer to index than new pages of an existing site. I told him that if he noticed that it took a while for a new site to be indexed, it was generally because it took Googlebot a while to learn about the new site. Googlebot learns about new pages to crawl by following links from other pages and from Sitemaps. If Googlebot already knows about a site, it generally finds out about new pages on that site quickly, since the site links to the new pages. I told him that by using a 301 to redirect Googlebot from the old domain to the new one and by submitting a Sitemap for the new domain, Googlebot could much more quickly learn about the new domain than it might otherwise. He could also let other sites that link to him know about the domain change so they could update their links."
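To make the 301-versus-302 difference concrete, here is a minimal sketch (not Google's or any real server's implementation; the function name, path and domain are made-up examples) of how a server-side redirect for a domain move might be expressed:

```python
# Sketch: building the response for a domain move. A 301 tells crawlers
# the move is permanent (index the new domain); a 302 says it is
# temporary (keep indexing the old one).

def redirect_for_move(path, new_domain, permanent=True):
    """Return the status code, reason phrase, and headers for a redirect."""
    status = 301 if permanent else 302
    reason = "Moved Permanently" if permanent else "Found"
    headers = {"Location": f"http://{new_domain}{path}"}
    return status, reason, headers

status, reason, headers = redirect_for_move("/about.html", "new-example.com")
print(status, reason, headers["Location"])
```

Any real web server (Apache, IIS, etc.) can emit the same status line and Location header through its own redirect configuration; the point is simply that the status code is what signals "permanent" to Googlebot.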

Do redirects pass PageRank?

Many people have the question: do redirects pass Google PageRank?



Answer:

It depends on the type of redirect used. A normal 302 redirect rarely passes PageRank; a 301 redirect will pass PageRank.



Passing of PageRank depends on the type of redirect used. If the redirect is hidden inside JavaScript, it is impossible for the link to pass PageRank; similarly, links from folders blocked by the robots.txt file will not pass PageRank.
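Since links from robots.txt-blocked folders won't pass PageRank, it can be worth checking which of your paths are actually blocked. A small sketch using Python's standard-library robots.txt parser; the rules and URLs below are hypothetical examples (a real check would fetch the site's live /robots.txt instead of a hard-coded list of lines):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for an example site.
rules = [
    "User-agent: *",
    "Disallow: /private/",
    "Disallow: /cgi-bin/",
]

parser = RobotFileParser()
parser.parse(rules)

# Pages under a disallowed folder are blocked for all crawlers.
print(parser.can_fetch("Googlebot", "http://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/articles/seo.html"))  # True
```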

Google reaching new heights - today's market cap 127 billion USD

Google has reached new heights: its market capitalization is around 127 billion USD, which is huge.



Today's stats:

GOOGLE (NasdaqNM:GOOG) Delayed quote data
Last Trade: 428.20
Trade Time: 2:17PM ET
Change: 5.29 (1.22%)
Prev Close: 433.49
Open: 429.39
Bid: 428.10 x 200
Ask: 428.20 x 200
1y Target Est: 476.31
Day's Range: 425.00 - 433.28
52wk Range: 172.57 - 475.11
Volume: 6,620,322
Avg Vol (3m): 11,177,700
Market Cap: 126.55B
P/E (ttm): 94.86
EPS (ttm): 4.51
Div & Yield: N/A (N/A)



SEO Blog Team,

Getting banned from AdSense for running clickbots

A webmaster reports in a forum that his AdSense account was closed because of fraudulent clicks on his site.



He said:

"An ex-friend of mine recently created a bot. This bot visits a site, clicks on an ad, changes its IP and comes back to click again. I was wondering why my adsense revenue for that day shot up so dramatically. My google account has now been banned and revenue lost for several of my sites. Presumably there is no chance of getting my account reinstated? Additionally, whats to stop him running this bot on sites he wants to get banned? I am very very unhappy with this guy. "



This is really unfortunate. We recommend everyone monitor their AdSense account actively; if you see fraudulent clicks, we recommend shutting down your ads temporarily. Google is very strict on this issue.

carcaserdotcom seocontest test page - just checking to see how this site ranks in the carcaserdotcom seocontest

Just checking to see how this site ranks for carcaserdotcom seocontest, organized by carcasher.com. We don't want to exchange links with sites running SEO contests, so please don't contact us with link exchange offers for the carcaserdotcom seo contest.



Thousands of sites will soon jump into this carcaserdotcom contest. Good luck to all.




New Search Engine Optimization (SEO) Contest starting Feb 1st, 2006

carcasher.com is organizing a new SEO contest where the keyword to be ranked is "carcasherdotcom seocontest".



carcasherdotcom seocontest is a weird keyword to select. The rules of the contest:

SEO Promotion Rules:
No dirty SEO techniques! (read the Google Webmaster Guidelines.) SEOs caught using any of the following: doorway pages, spam, hidden links, etc. WILL NOT qualify for the prize, and the prize will go to the next best SEO.
Every website HAS TO HAVE an email address published ON THAT SITE so that nobody else can claim your prizes.
We will pay prizes through PayPal, Money Order or CC, whichever applicable.
In order to qualify for the prize, a web page must include ONE of the following:
a) link to the rules of the contest http://www.carcasher.com/SEOContest/, OR
b) link to the www.carcasher.com homepage, OR
c) the following text (no link): We support CarCasher.Com



The great prizes include:

$14,000, a 42" Plasma TV, a Sony PSP, and an iPod in prizes, PLUS a 12-month, $12,000 SEO contract for 2007.

Up for the challenge? Go for it at www.carcasher.com.

Picasa now available in 25 different languages

Picasa, Google's image editing software, is now available in 25 different languages.



The new interface languages for Picasa are Bulgarian, Croatian, Czech, Danish, Estonian, Finnish, Greek, Hungarian, Icelandic, Indonesian, Latvian, Lithuanian, Norwegian, Tagalog, Polish, Romanian, Serbian, Slovak, Slovenian, Catalan, Swedish, Thai, Turkish, Ukrainian, and Vietnamese.



Picasa is great photo organizing software; it is a must-download for people who want to organize their photos effectively.

Download Picasa here: picasa.google.com

Can Google AdSense and Yahoo Publisher Network ads be published on the same page?

Everybody who runs contextual ads on their site wonders whether they can publish AdSense and YPN on the same site / page.



AdSense Advisor says AdSense and YPN can be published on the same site, but not on the same page.

This is what he said on webmasterworld.com:

"YPN ads are considered competitive ads, as are any other contextually targeted ads, or ads that mimic Google ads in appearance. For a more detailed description, see 'Competitive ads and services' on our policies page (http://google.com/adsense/).
While you can use both AdSense and YPN on the same site, our program policies prohibit publishers from using them on the same page. "


Website penalized due to spamming - website penalty due to potential spam

A webmasterworld.com member reports that his site got banned for using link spam tactics.



He says:

"My website got penalized by Google. (Congratulations to me :-) ) Things started from this Jan (when Google was scheduled to upgrade its PR algo? I am not sure if it's true.) Traffic dropped. So did the earnings. Many people felt changes in their rankings as well as earnings in this particular month. On the other hand, others felt more income and traffic (obviously).
Reason of penalty: I was using my own automatic link partnering tool and in a few instances posted the link partnering email on forums and blogs too. (accidentally) Google detected this as "SPAMMING" and the traffic that I am getting on this very website (from Google only) has decreased drastically.
However, other search engines seem to be BLIND to this "SPAMMING" (Yes! I agree that in some sense what I did is spamming). Rather, MSN and Yahoo have started sending more traffic.
Though the website is a quality website (quality content and all), my greed to get more links has put this website down. I will have to relaunch it now. Furthermore, I will start experimenting and will be enhancing my link partnering tool algorithms to detect these irrelevant websites and filter out blogs and forums.
Moral of the story: 1 - Google has applied some SERIOUS spam-combating algorithms. It's growing more and more conscious about SPAM. So be careful. 2 - Other search engines are still a long way from what Google is doing. I salute Google for their dedication and for taking SPAM websites seriously. 3 - Don't post your website links in the comments sections of blogs or forums. (which I did accidentally) 4 - Look out for other ways of SPAMMING that Google has become resistant to. Kindly post them here for further evaluation by the WebmasterWorld community."




Please be careful with Google; they are pretty good at detecting link spam.

Evidence reveals MSN using editors to maintain quality in its search results

Some of our referral logs show visits from this URL: http://64.4.8.28/hrsv3/Judging.aspx . At first look it seems like a spam site, but that is not the case.

We see a login screen when we visit that URL ( mshrs.search.msn-int.com/hrsv3/Login.aspx ); it looks like a human review area for MSN search results.



Google has been doing human review using eval.google.com for a long time.

What GoogleGuy said about eval.google.com:

"walkman, your comment illustrates a misconception that I've seen in a couple places. The system that was up at eval.google.com was a console to evaluate quality passively, not to tweak our results actively. But when Henk van Ess submitted his own blog to Slashdot, he asserted "Real people, from all over the world, are paid to finetune the index of Google," and that made it sound like people were reaching in via this console to tweak results directly, which just isn't true at all.
I have serious reservations about Henk van Ess taking information from one of his own students (who presumably signed a non-disclosure agreement when the student agreed to help rate the quality of our results) and posting that information online. I also believe these web pages said things like "Google Proprietary and Confidential," but it appears that the screenshots have been cropped to exclude that information. Those are the two things that really made me sad, not the "breaking news" the Google evaluates its own results quality. It shouldn't be a surprise that Google evaluates the quality of its results in lots of ways--the fact is that every major search engine evaluates its relevance in many ways. "



GoogleGuy later followed up. Quoting his earlier statement ("But when Henk van Ess submitted his own blog to Slashdot, he asserted "Real people, from all over the world, are paid to finetune the index of Google," and that made it sound like people were reaching in via this console to tweak results directly, which just isn't true at all.") and Henk van Ess's reply ("Google Guy, do I read between the lines that you think my postings are irrelevant and misleading? That would be a shame."), he wrote:
I don't believe they're irrelevant, but yes: I do believe that the assertions you've made are misleading. In my original post, I was replying to walkman, who asked "ok, so how do you know you've been manually hit by this?" which implies that walkman thought that eval.google.com was responsible for sites being hit. Likewise, I have a ton of respect for Tara Calashain at ResearchBuzz. But her post about your site says "Basically what Henk seems to have found is a part of Google that allows humans to tweak search results to ostensibly get rid of spam and let the most contextually-relevant search results rise to the top." Again, Tara wonders whether your posts said that results were being directly tweaked. Then there are assertions from your site like "The Google testers are paid $10 - $20 for each hour they filter the results of Google." "Filter" again makes it sound like an active process. And your self-submission to Slashdot ("Real people, from all over the world, are paid to finetune the index of Google"), which also gives the impression that people used eval.google.com to change our search results.
So yes, I looked at the wording from when you submitted your own site to Slashdot, plus the use of active verbs such as "filter" on your own site, plus the comments of smart people such as Tara and walkman and how they interpreted what you wrote, and in my opinion your posts have been misleading. Again, this was not a console in which people could directly fine-tune, tweak, filter, or otherwise modify our search results. eval.google.com was for "eval," i.e. passive evaluation.
Your follow-up question was "Why pay them for something if it has no effect om the index? Must be charity then." Why are you surprised that we would pay people to rate search results? The job posting has been public, after all. We do provide ways for people to volunteer to help Google (e.g. see our translation console at http://services.google.com/tc/Welcome.html ), but to rate search results consistently and well takes time and training. I think it's perfectly normal to pay people for their time.
When you quoted me on your site, you said "Google Guy: I've serious reservations about Henk van Ess" and in your post you said "Google's spokesmen Google Guy, who I love to read, has serious reservations about me." Just to be clear, that's not accurate: I don't have reservations about you personally, Henk. I think I stated clearly that I have serious reservations about two of your actions. I mentioned those two specific things in my first post, and I'll reiterate them: you took information from one of your students, and you posted information that (in my opinion) was clearly proprietary/confidential. Regarding the first, I believe you wrote in a comment on your own site that this information came from a student of yours? Regarding the second, I'm quite surprised that you assert "I'm not aware of restrictions." Besides the copyright symbol that you mentioned earlier, the very first picture you posted has a link "An NDA Reminder..." on the left in the Important Announcements section, where NDA stands for non-disclosure agreement. Are you honestly saying that if you had realized there were restrictions, you wouldn't have done five blog posts (so far), posted screenshots, posted employee's real names on the web without consulting them, and posted two training documents? In that case, I'll ask politely. Henk, this information was for ratings training. It's copyrighted, and I'm sure that the evaluation group considers it proprietary/confidential. I'd appreciate it if you would stop posting these documents.
By the way, I apologize in advance if this post comes across as strident. I hate he-said-she-said stuff, and normally I try not to post when I'm at ruffled at all. But I do think that things like posting an innocent employee's name from internal training documents is rude and unnecessary. Henk, feel free to include this entry on your blog, but if you do, I'd appreciate if you'd quote the entire post.

Then we have Yahoo's human review of search results; we can see referrals from the corp.yahoo domain.

Now we have MSN's human review of search. I think it's mostly for quality control purposes. Anyway, it's good to see them hand-reviewing search results; theirs are spammed a lot by search engine spammers.

Yahoo update - Yahoo updates its search index on January 23rd

Webmasterworld.com members report a Yahoo update: webmasterworld.com/forum35/3814.htm



Check your rankings with our Yahoo rank checker:



https://www.searchenginegenie.com/yahoo-rank-checker.html

SEO Advice: Spell-check your web site - a typical joke from Matt Cutts

Matt Cutts seems to be in a happy mood today: he blogged about a site that offers a 100% money-back guarantee. The site that posted this offer has a banner full of typos. So how do you get business when your banner has spelling errors? Nice that Matt noted this.



See the spelling error in this message:




Boris Floricic shuts down German Wikipedia even after his death

Boris Floricic was a German hacker and phreaker. He died a mysterious death in 1998. Many reported his death as a suicide; some members of his family claimed he was murdered by an intelligence agency.



Read more in the Wikipedia entry here - Boris Floricic, hacker

Due to a court ruling, the German wikipedia.de reads as follows:



(translated from German)

"Dear friends of free knowledge,
By a preliminary injunction obtained before the Amtsgericht (district court) Berlin-Charlottenburg on 17 January 2006, the association Wikimedia Deutschland – Gesellschaft zur Förderung Freien Wissens e.V. has been prohibited from forwarding visitors from this domain to the German-language edition of the free encyclopedia Wikipedia (wikipedia.org).
We are currently having our lawyers examine all possible steps so that we can offer you uncomplicated access to the free encyclopedia Wikipedia again as quickly as possible. Please understand that, for legal reasons, we will not be making any further statements on this matter for the time being."

Ask Jeeves releases cache dates - when will MSN and Yahoo add cache dates to their caches?





The Search Engine Watch blog reports that Ask Jeeves is adding cache dates to cached pages in its results. This is a bold improvement; Yahoo and MSN still haven't added this facility. Hope they will follow soon.



Google was the first search engine to introduce this option.

Google Talk now open federation - the SEO Blog news

Google Talk, Google's IM service, now offers open federation, a new way to communicate with friends who use different services:
https://googleblog.blogspot.com/2006/01/open-federation-for-google-talk.html



As Google says, you are allowed to interact with users who have any ID.

On the help page:

"We currently support open federation with any service provider that supports the industry standard XMPP protocol. This includes Earthlink, Gizmo Project, Tiscali, Netease, Chikka, MediaRing, and thousands of other ISPs, universities, corporations and individual users. "

What Google says:



"We've just announced open federation for the Google Talk service. What does that mean, you might be wondering. No, it has nothing to do with Star Trek. "Open federation" is technical jargon for when people on different services can talk to each other. For example, email is a federated system. You might have a .edu address and I have a Gmail address, but you and I can still exchange email. The same for the phone: there's nothing that prevents Cingular users from talking to Sprint users."

Some sites report MSN showing weird search behavior for the current SEO contest

Lots of blogs are monitoring the ongoing SEO contest. Some blogs report that MSN is showing weird behavior for the search http://search.msn.com/results.aspx?q=V7ndotcom+elursrebmem&FORM=QBRE .

At first many thought MSN was filtering out results for this keyword; now it turns out that's not the case. MSN search looks smarter than we thought: it seems MSN search first tries to understand the meaning of the words, and if it can't find the words in its dictionary, it treats them as adult search keywords.



Now the search V7ndotcom elursrebmem returns 41,000 results, but we see the following note below the search bar:

Web Results: Page 1 of 41,216 results containing V7ndotcom elursrebmem (0.06 seconds) (with SafeSearch: Moderate)

You can see the mention of SafeSearch: Moderate. Usually this gets displayed when adult-related keywords are searched.



So is MSN that concerned about po*n in its results? If so, MSN looks great.

Million Dollar Homepage back after a severe and malicious DDoS attack

We reported in our previous posting that the Million Dollar Homepage was suffering from a DDoS attack; now they are back online.





This is what Alex says on his blog:

"I can confirm that MillionDollarHomepage.com has been subjected to a Distributed Denial of Service (DDoS) attack by malicious hackers who have caused the site to be extremely slow loading or completely unavailable since last Thursday, 12th January 2006.
I can also confirm that a demand for a substantial amount of money was made which makes this a criminal act of extorsion. The FBI are investigating and I'm currently working closely with my hosting company, Sitelutions, to bring the site back online as soon as possible. More news soon.
"




Million Dollar Homepage down for the past 2 days - reports say it's because of a major DDoS attack launched against the site

The Million Dollar Homepage, which is pledged to stay online for at least 5 years, has now been down for a considerably long time. Most reports say the site went down because of a major DDoS (distributed denial of service) attack against the site.



Infoworld reports:

"The wildly successful pixel-powered Web page of a British university student is coming under increasingly intense DDOS (distributed denial of service) attacks trying to knock down the profitable brainstorm

The site is hosted on a server in Ashburn, Virginia, in a data center run by Equinix, where InfoRelay has much of its hardware, Weiss said. The high bandwidth use didn't cause problems for InfoRelay, as the company has a multigigabit network and provides bandwidth for a major search engine, Weiss said.

But The Million Dollar HomePage attracted malicious attention, coming under DDOS fire late Tuesday night and early Wednesday morning, he said.

Officials from InfoRelay met to figure out what they could do to stem the attacks within the constraints of Tew's service package, Weiss said. Tew wasn't on an enterprise-level deal that often includes advanced hardware, from vendors such as Cisco Systems (Profile, Products, Articles), used to prevent the effects of DDOS attacks, Weiss said.



Administrators implemented several proprietary internal techniques to slow and alleviate the effects, he said. "We sort of volunteered our time to do what we could," Weiss said.

The attacks are coming from computers worldwide, including the U.S., Europe and Asia, Weiss said. The attacks could be the work of a botnet -- a network of computers illegally commandeered for sending spam and DDOS attacks.

"

Sitelutions.com, their web host, says the site was burning 200 megabits per second of Internet bandwidth (oops):

"We are honored that Alex selected us to host his site. With so much worldwide coverage, Alex needed a host that could support the site's intense bandwidth requirements," said Paul Singh, Assistant Director of Operations at InfoRelay Online Systems, Inc.

Sitelutions has hosted The Million Dollar Homepage since September, 23, 2005. At that time, Mr. Tew's site had grossed approximately $108,000. Since Sitelutions has hosted the site, Mr. Tew's site has received up to 16 million hits per day, 500,000 unique visitors per day, and has utilized up to 200 megabits per second of Internet bandwidth."

So when will the site be online?

Sketchy testimonial of an SEO company - fake SEO testimonial

We got a link exchange request from a new SEO company. When we checked their site we saw something very funny: the testimonial listed on every page of their site is totally fake.



It reads:

"My experience with Guide SEO has been exceptional. Our website, www.xyz.com moved from virtually no page ranking and listing on the search engines to being in the top ten in many of our most important keywords. Daily traffic to the site has increased eightfold in just a few months. My experience with Guide SEO has been exceptional. I highly recommend the group." Salmon.Mc Donald, President, e-learn, inc.

Optimized www.xyz.com?? Look at that: xyz consulting owns xyz.com, and they don't do any SEO.



e-learn, INC??? Who owns e-learn inc? It belongs to passged.com, and Salmon.Mc Donald is definitely not the president. So how many SEO companies have sketchy testimonials?

Very interesting.

Screen shot:



v7n SEO contest kicks off - keyword v7ndotcom elursrebmem to be ranked by May 15th, 2006 for prize money of $4,000

The v7n.com SEO contest has started, and we can see many webmasters and SEOs getting directly involved. We will be actively monitoring this contest to bring you the latest updates and developments.



v7ndotcom elursrebmem is the keyword phrase that needs to rank by May 15th, 2006. The prize money is $4,000.

First prize also includes an Apple iPod.

The rules for the competition are as follows:
1. In order to win the first prize, you must place first in Google (organic rankings) for the search term on May 15, 2006, noon, Pacific standard time.
2. Prizes for 2nd place through 5th place will be awarded based on web page placing in the corresponding positions in Google on May 15, 2006, noon, Pacific standard time.
3. For the purpose of this competition, indented listings in the SERPs will not be counted.
4. In order to qualify for the prize, a web page must include one of the following:
a. A link back to the V7N home page. The link can be in any manner you wish, any anchor text you wish, with nofollow, without nofollow, JavaScript, cloaked or fried up and served with potatoes.
b. One of the Official V7N SEO Contest banners. The banners may link to V7N, or not link to V7N. Linking the banner to any domain other than v7n will disqualify the contestant. Just to make this very clear, the banner may be unlinked. You do not need to link the banner graphic to v7n. For those who do not speak English: you do not need to link el-banner-o to v7n-o.
c. The following text: We support v7n.com
Please remember, in order for the web page to be compliant, it must contain at least one of the elements listed above. If datacenters show different results on the 15th of May, the winner will be declared solely at the discretion of V7N. If there are any disputes as to the validity of the winner declaration, V7N will be the sole arbiter.



Good luck everyone with the SEO contest v7ndotcom elursrebmem; we wish everyone all the best.

Good day,
SEO Blog Team,

Million Dollar Homepage owner Alex is a new millionaire

Alex of the Million Dollar Homepage has sold all his pixels. Alex came up with a great concept at milliondollarhomepage.com: he sold a million pixels on his homepage at $1 per pixel, and he sold the final pixels in an eBay auction for more than $38,000.



He came up with one of the greatest concepts in the history of the Internet, and has been featured by BBC, CNN and other top news sites.

In SEO it's all about natural linking and doing something unique that makes people link to you. Alex did this, and we hope someone else does something even more interesting. We would love to hear from our readers any good suggestions that can bring in lots of natural links.



Care to share?

Is triangular linking / 3-way linking helpful for a site? Is three-way link exchange helpful?





3-way linking is not a great strategy, and we at SEG don't endorse 3-way link exchanges; it is not a healthy way to gain links. In most of the 3-way link exchange requests we get, people propose a backlink from some spam directory while we have to give a link from our main legitimate site. That is pretty unfair.



Plus, it doesn't work properly. We recommend that no one do 3-way linking.

Yahoo Site Explorer - a nice friend for SEO companies

Yahoo Site Explorer is a tool that helps you search a site's backlinks, its pages listed in Yahoo search, etc. It presents the data in a simpler form than a normal search, it is very quick, and it is very helpful for serious research.



The export feature of Yahoo's Site Explorer is very helpful in search engine optimization: you can do a linkdomain: search and just export all the data to a spreadsheet.
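As a rough illustration of that workflow (the URLs and file name here are made-up examples, and Site Explorer's own export has its own format), backlink rows can be written to a CSV file that opens directly in Excel:

```python
import csv

# Hypothetical backlink rows as they might come out of a linkdomain: export:
# (page that links to us, page on our site being linked to)
backlinks = [
    ("http://example.org/resources.html", "http://www.oursite.com/"),
    ("http://blog.example.net/post-12", "http://www.oursite.com/tools.html"),
]

# Write a header row plus one row per backlink.
with open("backlinks.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["linking page", "linked page"])
    writer.writerows(backlinks)
```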

Yahoo's approach is just the opposite of Google's.

Google just doesn't show proper backlinks for a site. Also, its ordering of site: search results is not logical at all; apart from the homepage, the other pages are not arranged properly.



Yahoo's site: ordering is more logical; it is based algorithmically on the quality of each page listed.

We hope Yahoo continues this great service for a long time.

Pages falling out of MSN search for no reason - WebmasterWorld members report pages dropping from MSN results

Webmasterworld.com members report pages falling out of MSN's index. Some notice their homepages dropping; others notice their pages dropping in a pattern.

Most notice a huge flux in MSN results since Christmas. Is MSN onto something? Let's wait and see.



Rich of WebmasterWorld says:

"Pages are falling off for no reason. On using the site command it returns 650 pages which is less than 1% of the sites content.
I noticed that some pages return the page title with the following in the description field rather than the meta description on the page:-
"To get the most out of the XXXX site please either enable Javascript or download a newer version of your favourite browser - Mircosoft Explorer Mozilla Firefox Alternatively please follow "
On watching the recent video clip of the two msn guys talking about the search engine learning from itself i do wonder if the have a major problem with this search facility.
I would say they are trying to be smart by reading a sites pages in a different browser and it clearly doesn't work - rather than trying to be cleaver they should try and get the basic search right first - such as deep indexing of websites for a start
In this example msn clearly does not cash all of the sites pages despite the site being an established site and authority with thousands of backlinks nor does the bot fully index the site when it visits.
On the face of it i would say that the site has triggered some sort of filter and that's why its a)not deep spidering the site and b) not indexing the pages and c) removing pages it already has in its index - whats causing this i am clueless of!
Currently based on other sites we work on i would say that if you knock together a thin site low on content, with keyword domain name you will rank top for that keyword very easily. In one example a site with one page but loads of links to it ranks high in an extremely sought after sector.
All in all, its still early days for the search but i think its got serious problems and issues that need fixing"



More here: webmasterworld.com/forum97/716.htm

The amazing Google proxy - Google's proxy helps you surf the web safely





It's amazing to see Google offering proxy services. Though this is slightly old news, I just heard of it. It offers safe browsing; it doesn't hide your IP, but it still protects your system from potential viruses, spyware, trojans, adware, malware, scumware, etc.



https://www.google.com/gwt/n

SEO blog Team,

Del.icio.us - A great site to get tagged on - massive traffic flow experienced

Del.icio.us is a great site, recently acquired by Yahoo. It is a bookmarking site where people bookmark a page or site they like.



If your site gets into the popular listing at del.icio.us/popular, you can expect massive traffic; plus, many other sites just pick up the story. To get into del.icio.us/popular, many people around the world have to bookmark it at around the same time.




List of TLD (top level domain) extension codes

Here is a list of TLD codes. Each extension shows which country it belongs to. The list was compiled by ICANN.



ICANN - Internet Corporation for Assigned Names and Numbers

IANA - Internet Assigned Numbers Authority

.ac – Ascension Island
.ad – Andorra
.ae – United Arab Emirates
.af – Afghanistan
.ag – Antigua and Barbuda
.ai – Anguilla
.al – Albania
.am – Armenia
.an – Netherlands Antilles
.ao – Angola
.aq – Antarctica
.ar – Argentina
.as – American Samoa
.at – Austria
.au – Australia
.aw – Aruba
.az – Azerbaijan
.ax – Aland Islands
.ba – Bosnia and Herzegovina
.bb – Barbados
.bd – Bangladesh
.be – Belgium
.bf – Burkina Faso
.bg – Bulgaria
.bh – Bahrain
.bi – Burundi
.bj – Benin
.bm – Bermuda
.bn – Brunei Darussalam
.bo – Bolivia
.br – Brazil
.bs – Bahamas
.bt – Bhutan
.bv – Bouvet Island
.bw – Botswana
.by – Belarus
.bz – Belize
.ca – Canada
.cc – Cocos (Keeling) Islands
.cd – Congo, The Democratic Republic of the
.cf – Central African Republic
.cg – Congo, Republic of
.ch – Switzerland
.ci – Cote d'Ivoire
.ck – Cook Islands
.cl – Chile
.cm – Cameroon
.cn – China
.co – Colombia
.cr – Costa Rica
.cs – Serbia and Montenegro
.cu – Cuba
.cv – Cape Verde
.cx – Christmas Island
.cy – Cyprus
.cz – Czech Republic
.de – Germany
.dj – Djibouti
.dk – Denmark
.dm – Dominica
.do – Dominican Republic
.dz – Algeria
.ec – Ecuador
.ee – Estonia
.eg – Egypt
.eh – Western Sahara
.er – Eritrea
.es – Spain
.et – Ethiopia
.eu – European Union
.fi – Finland
.fj – Fiji
.fk – Falkland Islands (Malvinas)
.fm – Micronesia, Federal State of
.fo – Faroe Islands
.fr – France
.ga – Gabon
.gb – United Kingdom
.gd – Grenada
.ge – Georgia
.gf – French Guiana
.gg – Guernsey
.gh – Ghana
.gi – Gibraltar
.gl – Greenland
.gm – Gambia
.gn – Guinea
.gp – Guadeloupe
.gq – Equatorial Guinea
.gr – Greece
.gs – South Georgia and the South Sandwich Islands
.gt – Guatemala
.gu – Guam
.gw – Guinea-Bissau
.gy – Guyana
.hk – Hong Kong
.hm – Heard and McDonald Islands
.hn – Honduras
.hr – Croatia/Hrvatska
.ht – Haiti
.hu – Hungary
.id – Indonesia
.ie – Ireland
.il – Israel
.im – Isle of Man
.in – India
.io – British Indian Ocean Territory
.iq – Iraq
.ir – Iran, Islamic Republic of
.is – Iceland
.it – Italy
.je – Jersey
.jm – Jamaica
.jo – Jordan
.jp – Japan
.ke – Kenya
.kg – Kyrgyzstan
.kh – Cambodia
.ki – Kiribati
.km – Comoros
.kn – Saint Kitts and Nevis
.kp – Korea, Democratic People's Republic
.kr – Korea, Republic of
.kw – Kuwait
.ky – Cayman Islands
.kz – Kazakhstan



.la – Lao People's Democratic Republic
.lb – Lebanon
.lc – Saint Lucia
.li – Liechtenstein
.lk – Sri Lanka
.lr – Liberia
.ls – Lesotho
.lt – Lithuania
.lu – Luxembourg
.lv – Latvia
.ly – Libyan Arab Jamahiriya
.ma – Morocco
.mc – Monaco
.md – Moldova, Republic of
.mg – Madagascar
.mh – Marshall Islands
.mk – Macedonia, The Former Yugoslav Republic of
.ml – Mali
.mm – Myanmar
.mn – Mongolia
.mo – Macau
.mp – Northern Mariana Islands
.mq – Martinique
.mr – Mauritania
.ms – Montserrat
.mt – Malta
.mu – Mauritius
.mv – Maldives
.mw – Malawi
.mx – Mexico
.my – Malaysia
.mz – Mozambique
.na – Namibia
.nc – New Caledonia
.ne – Niger
.nf – Norfolk Island
.ng – Nigeria
.ni – Nicaragua
.nl – Netherlands
.no – Norway
.np – Nepal
.nr – Nauru
.nu – Niue
.nz – New Zealand
.om – Oman
.pa – Panama
.pe – Peru
.pf – French Polynesia
.pg – Papua New Guinea
.ph – Philippines
.pk – Pakistan
.pl – Poland
.pm – Saint Pierre and Miquelon
.pn – Pitcairn Island
.pr – Puerto Rico
.ps – Palestinian Territories
.pt – Portugal
.pw – Palau
.py – Paraguay
.qa – Qatar
.re – Reunion Island
.ro – Romania
.ru – Russian Federation
.rw – Rwanda
.sa – Saudi Arabia
.sb – Solomon Islands
.sc – Seychelles
.sd – Sudan
.se – Sweden
.sg – Singapore
.sh – Saint Helena
.si – Slovenia
.sj – Svalbard and Jan Mayen Islands
.sk – Slovak Republic
.sl – Sierra Leone
.sm – San Marino
.sn – Senegal
.so – Somalia
.sr – Suriname
.st – Sao Tome and Principe
.sv – El Salvador
.sy – Syrian Arab Republic
.sz – Swaziland
.tc – Turks and Caicos Islands
.td – Chad
.tf – French Southern Territories
.tg – Togo
.th – Thailand
.tj – Tajikistan
.tk – Tokelau
.tl – Timor-Leste
.tm – Turkmenistan
.tn – Tunisia
.to – Tonga
.tp – East Timor
.tr – Turkey
.tt – Trinidad and Tobago
.tv – Tuvalu
.tw – Taiwan
.tz – Tanzania
.ua – Ukraine
.ug – Uganda
.uk – United Kingdom
.um – United States Minor Outlying Islands
.us – United States
.uy – Uruguay
.uz – Uzbekistan
.va – Holy See (Vatican City State)
.vc – Saint Vincent and the Grenadines
.ve – Venezuela
.vg – Virgin Islands, British
.vi – Virgin Islands, U.S.
.vn – Vietnam
.vu – Vanuatu
.wf – Wallis and Futuna Islands
.ws – Western Samoa
.ye – Yemen
.yt – Mayotte
.yu – Yugoslavia
.za – South Africa
.zm – Zambia
.zw – Zimbabwe
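The list above is essentially a lookup table, so as a small sketch it can be turned into code. Only a handful of entries are reproduced here, and `country_for` is a hypothetical helper, not anything official:

```python
# A tiny lookup table built from a few entries in the list above;
# extend it with the full list as needed.
CCTLD = {
    "ac": "Ascension Island",
    "de": "Germany",
    "jp": "Japan",
    "uk": "United Kingdom",
    "us": "United States",
}

def country_for(hostname):
    """Return the country for a hostname's ccTLD, or None if unknown."""
    tld = hostname.rstrip(".").rsplit(".", 1)[-1].lower()
    return CCTLD.get(tld)

print(country_for("www.example.de"))  # Germany
```

This is handy, for example, when bucketing referrer hostnames in a log file by country.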

What Causes Sandbox filter? New experiment reveals why sandbox filter exists. Is sandbox a side effect of trust rank?



There have been numerous discussions about Google's sandbox filter, which holds sites back for up to 16 months before they start ranking well in Google results. So what causes this filter? After losing patience ranking client sites, we at Search Engine Genie conducted a test across about 15 sites. The test revealed interesting results; what follows is a summary of our Google sandbox filter experiment.



1. Does the sandbox filter really exist?
Based on our experiment, the sandbox filter does exist. But it doesn't affect all sites; it affects sites that are artificially linked in order to rank, i.e. through search engine optimization.

2. What causes sandbox filter?

This is an important question, asked many times in forums, message boards and blogs, and no one has been able to give a definite answer. Even we had to struggle a bit to figure out what this sandbox is all about. Finally, through our experiment using different strategies on about 15 sites, we were able to find the cause of the sandbox filter. Our experiments suggest that about 90% of the theories circulating in forums and articles are wrong. The sandbox filter is caused purely by links and nothing else. It is abnormal link growth, in the eyes of Google's algorithm, that results in a site being placed in the sandbox.

As we all know, the internet is built on natural linking. Search engines and their ranking algorithms are the main cause of artificial linking. Search engines like Google have now woken up and are combating link spam with sophisticated algorithms like the sandbox filter. The sandbox is a natural consequence for sites using any sort of SEO-type method to gain links; the next topic explains why. The sandbox filter relies on trust rank to identify the quality of links, not PageRank as it did before.

Trust rank is a new algorithm active at Google. The name "trust rank" is common to Google, Yahoo! and other search engines, so we can use the term "trusted links". Trusted links are links that are hand-edited for approval, or algorithmically given maximum credit to vote for other sites - links from reputed sites, links from authorities in the industry such as .gov sites, etc. Google's new algorithm looks at what type of trusted links a site has, especially if the site is new. If a site starts off with SEO-type links rather than naturally gained trusted links, it will be placed in the sandbox for a period of up to a year or even more.



3. What factors / methods lead to sandbox filter?

All types of SEO-type link building will trigger the sandbox filter.

a. Reciprocal link building:

Reciprocal link building is one method that will lead to a definite sandbox / aging filter; a new site that starts off with reciprocal link building will almost certainly be sandboxed. First of all, reciprocal link building is not the way to build trusted links. Sites that are trusted / hand-edited for approval don't have to trade links with other sites; they voluntarily link out, and no one can force them to trade a link. Most sites involved in reciprocal link building are very weak themselves, so a site doing reciprocal link building is not going to gain trusted links, and a new site growing on untrusted links will be placed in the aging filter.

We don't blame reciprocal link building - in fact we do it all the time for our clients - but by itself it adds no value to end users, and it is done purely to manipulate search engine result pages. So Google is not to be blamed for coming out with an algorithm that fights aggressive link building so effectively. For a new site we don't recommend reciprocal link building immediately: first build trust for your site, then do reciprocal link building; there are no issues at that point. How do you know you have built trust with Google's algorithm? Rankings prove it. If your site ranks well for competitive and non-competitive terms, you can safely assume you are no longer in the sandbox. We are talking about a minimum of one month of ranking for good terms, not a day or a week.

b. Buying links:

This is another method that will definitely lead to a severe sandbox filter. Our experiment showed that buying links for a new site hurts it badly in Google. When you start a site, don't immediately go and buy a lot of links: bought links are not the way to gain trusted links, and most of the sites where you can buy links are actively monitored by Google. Even a link bought from a great site in your field still won't be counted as a trusted link. Site-wide links (links placed throughout a site) are especially dangerous for a new site; they will delay its ranking to a great extent. If you do buy a link from a great site, make sure you get a link from only one page, and make sure it is not placed in the footer - it should sit somewhere inside the content so it looks more natural. It is not worth buying links for new sites; give the site time to grow with naturally gained backlinks.

c. Directory submission:

Directory submission has proved worthless when it comes to avoiding the sandbox. Directory submissions don't directly cause the sandbox, but those links will definitely affect the reputation of a site, especially a new one. As discussed before, it is important to gain trusted links while the site is new, so we recommend against making directory submissions the first step in building links: when search engines see those links, they will place your site into the aging filter. Most directories are newbie and start-up directories; we can name only 4 or 5 trustworthy enough to be worth a link. Ask yourself: would you go to a directory to find relevant sites today? I'm afraid not. Directories are a thing of the past, and people use search engines to find information. That is why search engines prefer not to list directories; the two exceptions are dmoz.org and the Yahoo directory.

If you get a link from dmoz.org, consider it the best trusted link on the internet - but DMOZ can take up to 15 months to list a site; even they hold a site until it grows to a certain quality level. The Yahoo directory is not as powerful as DMOZ, most likely because it is paid, and much of the paid information on the internet is corrupted. Still, next to DMOZ, the Yahoo directory is a safe and trusted place to get a listing. Other than those, we don't recommend any directory, whether paid or free.

We don't dismiss links from directories altogether, but it is not a good idea to get links from useless / spammy / unworthy directories, especially if the site is new.

d. Links from comment spam / blog spam / guest book spam / message board spam:

If you are an aggressive SEO who has been using bad tactics, then Google - or at least the new Google - is not for you. With a new site whose initial links come from spam sources, you will wait forever for top rankings in Google. These link tactics still work with Yahoo and MSN, but not with Google anymore. If you launch a new site and take this path, expect a sandbox period of about 2 years, and by that time your site may be caught in some other, more severe penalty. Better not to try this with Google.

e. Links through article reproduction:

Links through article reproduction have long been a good way to build links, but not anymore for good sites. Google has become very good at finding duplicate copies across the web, and if you rely on article reproduction for backlinks, you are dealing with links from duplicate copies that Google deems unworthy of its index. Take a good article snippet and search Google: apart from 2 or 3 main copies, all the other copies will be supplemental results. This is one way to tell that an article is being treated as duplicate content. Links from these duplicate copies will do nothing for the trustworthiness of your site. So it is better to avoid article reproduction; instead, get people to link from other sites to a good article on your own site.

f. Links from a network of sites you own.

Always avoid this with a new site. Some people, especially SEO companies, tend to connect a new client site to an existing network of other active clients, plus sites they own or have a tie-up with. Just avoid this for new sites: it doesn't work anymore, and it only prolongs the sandbox period. The network you have access to doesn't have the trust to vote for a new site, so avoid linking from your own network.

g. Don't participate in co-op link networks like digital point ad network or link vault:

Almost all the sites involved in these ad networks are unworthy. Yahoo is very severe with these types of networks, and Google also has good algorithms for identifying these links. They never work for new sites and only prolong the aging filter. Better to avoid these link networks.

4. Is it some sort of hidden penalty?

Yes, it is a kind of penalty. "Penalty" is a strong word for this filter, but it is effectively a hidden one. Google treats the sandbox filter as a penalty of sorts, holding the site back until it proves its trust to the web; Google holds the site to watch its link growth patterns.

5. What is the inner working of the sandbox filter?

In our research we were able to work out how this sandbox / aging filter operates. When Google first finds a site, it assigns an inception date (the date Google first discovered the site). From that point it watches the growth pattern of the site's links. If the algorithm sees a lot of untrustworthy links coming into the site (especially a new site), it will hold the site back from ranking for anywhere between 3 and 16 months. Once the site is in the sandbox, Google's algorithm keeps watching the link growth. If it sees normal growth of both trusted and normal links, it will release the site from the sandbox sooner; if it sees growth of untrusted links, the site's ranking will be delayed much longer.
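The behaviour described above can be modelled very roughly in code. This is purely an illustrative toy based on our description - Google's actual algorithm is not public, and `months_in_sandbox` and its threshold are invented for the sketch:

```python
def months_in_sandbox(monthly_links, trusted_share=0.3):
    """Toy model of the aging filter described above.

    monthly_links: list of (trusted, untrusted) new-link counts per
    month after the site's inception date. The site is 'released'
    after the first month in which trusted links make up at least
    `trusted_share` of the new links; otherwise it stays held.
    """
    for month, (trusted, untrusted) in enumerate(monthly_links, start=1):
        total = trusted + untrusted
        if total and trusted / total >= trusted_share:
            return month
    return None  # still in the sandbox

# A site that starts on bought/traded links and slowly earns trust:
print(months_in_sandbox([(0, 50), (2, 40), (10, 20)]))  # 3
```

The point of the toy is only to show the shape of the idea: the ratio of trusted to untrusted new links over time, not the total link count, decides when the hold ends.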

6. Is sandbox based on whois registration date?

No. Based on our experiment, the sandbox filter doesn't rely on whois data to determine the age of a site.

7. How to detect if the site is in sandbox filter?

There are various ways to know if your site is in sandbox.

a. One long-standing method is the allinanchor: keyword1 keyword2 search. If your site ranks for the allinanchor: search but not for the normal keyword search, then it is most probably in the sandbox filter.

b. The next method is to check your rankings in the other major search engines. If your site ranks exceptionally well in Yahoo and MSN but not in Google, that is another possible sign the site is in the sandbox filter. But remember, Google's algorithm is more sophisticated than Yahoo's or MSN's, and judging that your site is in the sandbox by this signal alone is absolutely wrong.

c. Check the quality of your backlinks and compare them with your competitors'. There are numerous tools that show link comparisons. Compare them: see what your competitors have and where the links to their sites are placed.

d. Check rankings for both competitive and non-competitive terms. If your site ranks for neither, it is most probably sandboxed.

e. Check for duplicate content on your site. If your site has duplicate content, fix it first before investigating sandbox problems. As I said before, the sandbox filter is caused purely by links; go looking for link problems only once your on-page factors are fine and of Google quality. Remember, all of this advice is for sites with quality content, a unique business, etc. These instructions are not for sites built on scraped content, made-for-AdSense sites, aggressive affiliate sites, or sites using other spam tactics. This is purely for sites with good quality domains that still have problems ranking in search engines.
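The comparison in method (a) above is easy to script. Here is a small sketch that just builds the two query URLs to open side by side; `sandbox_check_queries` is a hypothetical helper, and the plain `/search?q=` URL format is assumed:

```python
from urllib.parse import quote_plus

def sandbox_check_queries(phrase):
    """Return the pair of Google queries compared in method (a):
    the allinanchor: search and the plain keyword search."""
    base = "http://www.google.com/search?q="
    return base + quote_plus("allinanchor:" + phrase), base + quote_plus(phrase)

anchor_q, plain_q = sandbox_check_queries("green widgets")
print(anchor_q)
print(plain_q)
```

If the site appears in the results of the first URL but not the second, that is the sandbox symptom described above.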

8. Does sandbox filter affect only certain areas of search?

No. The sandbox filter affects all areas of search. It is not keyword- or topic-based but link-based; links are everywhere, so the filter is applied everywhere.

9. Can a competitor sabotage a ranking using Google's evil sandbox filter?

This is a very good question; having discussed all of the above, people would fire back at us if we didn't address it. Our research focused on this issue too. We included some older domains in the test, and we found that a competitor CANNOT sabotage a site's existing ranking through these link filter algorithms. But why not?

Here I explain from the conclusion of our experiment:

What happens with Google is that they want a site to be trusted before it starts ranking very well, so in order to rank well a site must demonstrate its trust to Google's algorithm. But this trust is a one-time process: once a site establishes trust with Google and starts ranking well, it doesn't have to prove itself again as long as the trusted links exist.

For example, if a new site starts off with good trusted links, it doesn't get sandboxed at all and will start ranking very well. Say, after a month, an aggressive competitor who has been watching plans to destroy the good site's ranking: he spams blogs, message boards and guest books, posts sitewide links on many sites, and so on. Will he succeed in sabotaging the ranking? No. Since the good site has already established trust with Google's algorithm, all these links from bad pages will only boost its ranking. There is no way they will harm the good site; Google's algorithm knows this very well.

So why can't a competitor do this while the site is still new and not yet ranking?
First, because he won't be aware that such a site is growing strong. Until a new site ranks well for targeted phrases, it is not a problem in anyone's eyes; only once it ranks will it catch the attention of evil SEO companies and competitors, and only then will they plan to kill its rankings. But by that time it is too late, since the site has already established trust with Google's algorithm through the quality backlinks it possesses. So if a site owner thinks of sabotaging a site's existing ranking by sending bad spam backlinks to it, he should remember that he is actually helping it rank better than ever. This is one reason Google always preaches that no one can harm a site's ranking other than the people responsible for the site.

So what is the proof for the above statement?

Our experiment is the proof. Though we are an ethical SEO company, we know where the bad backlinks are. We pointed thousands of them at sites we monitor that currently rank well, kept those links alive for 2 or 3 months, and were able to conclude that the bad links did nothing harmful; in fact they were boosting those sites' rankings, and Google's algorithm never cared about them.

We are not discussing canonical issues, 302 hijacking, 301 problems, etc. here; we are only addressing whether a competitor can sabotage a site's ranking through link spam. There have been cases where 302 hijacking by external domains hurt a site - the blog of Matt Cutts (a senior Google engineer) is proof: the Dark SEO team hijacked it through a tricky 302 redirect. But that is a different issue, and this article deals with the sandbox filter only.

10. Is there a risk of losing credit for untrusted links gained by a new site before it receives trusted links?

Yes, this is a major risk. It sometimes happens that most of a new site's links are traded or bought: the site owner has read in forums that reciprocal link building or buying links is the best way to build links, then realizes his site isn't ranking. He finally decides he needs quality links, goes for branding, writes great articles, and brings interesting material to the site, which starts attracting high-quality natural links. The site improves steadily, and at this point the owner risks losing credit for the reciprocal / bought links that were there before the site started acquiring trusted links. We have seen this happen with Google. So we recommend that site owners and SEO companies not use link building methods that bring untrustworthy links to new sites. For a new site, think of natural ways to build links. Consider how great sites grew to their level: not all of them started with 10,000 links. Successful sites take time to build their brand, and that is how search engines want new sites to behave. Think of ways to attract great links, and once your site is established, use every ethical link building strategy in the book.

11. How to escape the sandbox filter?

a. Don't use any artificial link building methods on a new site. Methods to avoid initially include reciprocal link building, directory submissions (excluding DMOZ and the Yahoo directory), buying links, article reproduction, links from co-op ad networks, links from a network of your own sites, links from spam sites, sitewide links, etc.

b. Make the links grow more naturally at a slow rate for new sites.

c. Show a steady growth of links to Google. Make them understand that your site is trustworthy.

d. Attract people to link to your site naturally - think of the famous Million Dollar Homepage concept. If a college student can attract 60,000 links in a few months, why not you? Even 10% of that in quality natural links would do your site a lot of good.

12. How to get the site out of sandbox?

This is an important question which many people ask.

First, think of ways to attract natural links; don't run a site with a purely commercial focus. Today both users and search engines want information and commerce mixed. Take Search Engine Genie, for example: a mixed site where we have tools, a blog, articles and much more to give to users, alongside our commercial focus. Other great SEO sites do the same - SEOmoz, SEOcompany.ca, etc. Though these sites are new, they have come out very strong simply because they are valuable resources to search engines and users.

Submit to DMOZ. First make sure your site is good enough to be accepted by reading their guidelines, then submit your site.

Write great articles and get people to link to them. Make people aware that a good article exists on your site: participate in related forums, post your article there and ask for opinions. If people like it you will get a ton of traffic, and some, as a token of appreciation, will link from their site to your article.

Start a blog and share with the world what you know about your business; show them your expertise. Blogs attract lots of links today, and you can even give your content to other blogs as feeds.

Think of a great strategy like the million dollar homepage to build backlinks.

As Matt Cutts suggests, even basic interviews will attract good backlinks.

We have listed a few ideas, but you can think of more. Use all the best possible ways to build natural links until your site gains trust; once it has the necessary trust, you can use every backlink strategy.

Wait at least 3 months, because natural backlinks take time to register with Google's algorithm.

13. Does Search Engine Genie's experimental finding work?

Our experiment was tested across sites on various topics. We have since expanded the research to many more sites we handle, and for the new sites among them this way of escaping the sandbox has worked well.

You will only believe it once you see it happen for your own site, so if you have a new site, test it for yourself and post your results in the comments on this post.

14. My site has never been in the Sandbox. Why not?

There are a couple of reasons for that.

1. Your site may be an older site. Sites from before May 2004 didn't have this problem; back then we could rank a site with any sort of backlinks within a month. That is no longer the case.

2. You may never have bothered with SEO-type backlinks.

3. You might have registered an expired domain that was ranking well and had / has trusted links from before it came to you. Google's expired domain penalty doesn't work properly for some sites. A domain belonging to one of our clients expired three months ago. We tried contacting the client numerous times before the expiration, but he was very ill, in hospital for 6 months, and could not be reached. We tried backordering, but we were stuck in the queue. It was a popular domain, listed in a PageRank 7 category of DMOZ. It entered the pending-deletion period, finally expired, and the person who first preordered it got it; the domain is www.theautomover.com. It is sad that the expired domain penalty doesn't work, because that site is ranking well now even though the content was completely changed and it belongs to a new owner. We appeal to Google to monitor these expired domains more actively.

This article addresses the latest changes in Google relating to the sandbox filter; in fact, some of the points discussed here come from the latest Google update, Jagger.

Online shoppers turn to Google for search - report

A new report shows Google is the leading search engine among online shopping searchers.



Online shoppers picked Google Inc. as their search engine of choice this December while making their holiday Web purchases, according to a report issued on Wednesday.
Internet measurement firm Hitwise found that 11.1 percent of all December shopping-related visits originated with Google, a 28 percent jump over last year. But online auction giant eBay Inc was the biggest driver of traffic to shopping sites, generating more than 13 percent of retail traffic.
Search engines Yahoo! Search and Microsoft Corp.'s MSN Search drove 4.05 percent and 0.79 percent of retail visits, respectively.
Online retail giant Amazon.com, which is the second most visited online shopping site after eBay, generated only 0.75 percent of visits to other shopping retailers, Hitwise found.




MSN reverse IP tool - MSN offers a new tool to check sites hosted on the same IP

MSN has released a great tool. This functionality is currently offered by whois.sc as a paid tool; MSN lets us check which sites are hosted on the same IP.



Lots of servers today host sites on shared IPs - we have heard of sites sharing their IP with 500 other sites. Now the MSN tool lets us check that. It's documented here:



http://search.msn.com/docs/help.aspx?t=SEARCH_REF_AdvSrchOperators.htm#op_ip
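As an illustration, building the `ip:` query can be scripted. The `results.aspx?q=` URL format for MSN Search is an assumption here, and `msn_reverse_ip_query` is a hypothetical helper; the IP in the example is made up:

```python
import socket
from urllib.parse import quote_plus

def msn_reverse_ip_query(ip):
    """Build the MSN Search ip: query that lists sites hosted on the given IP."""
    return "http://search.msn.com/results.aspx?q=" + quote_plus("ip:" + ip)

# To check your own server, resolve the hostname first, e.g.:
# ip = socket.gethostbyname("www.example.com")
print(msn_reverse_ip_query("207.46.130.108"))
```

Opening the resulting URL shows which other sites share that IP with you.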

Google Earth now available for Macs

Google has always focused its products on Windows users. It's true that Windows users are the majority of their search users,



but Google has drifted a bit and is now offering Google Earth to Mac users.

Google says:

"Available in beta for 10.4.x OS [that's Tiger], Google Earth for Mac offers the same features as the PC version, such as animated driving directions, zoom in and out capabilities, 3D buildings view, and more. Google Earth Plus and Google Earth Pro are not yet available for Mac."

The Google Earth download page has the following message:



"Download Google Earth - Mac or PC
Google Earth is a broadband, 3D application that not all computers can run.
Desktop computers older than 4 years old may not be able to run it.
Notebook computers older than 2 years old may not be able to run it. "


Google update Bigdaddy on and off - Google's upcoming update is being tested repeatedly





Google's upcoming update "Bigdaddy" (named by Matt Cutts of Google) is being tested. The test datacenter http://66.249.93.104/ comes online and goes offline repeatedly. As Matt says, these results are expected to go live in another 2 weeks, so check your rankings in this DC.




Matt Cutts asking for feedback on various issues related to Google

Matt Cutts of Google has asked for feedback on various topics related to Google.



He asks for feedback on the following issues:

Feedback: Webspam in 2006?
Feedback: Search quality in 2006?
Feedback: Products/features in 2006?
Feedback: Webmaster services in 2006?
Feedback: Communication/Goodwill in 2006?
Feedback: What did I miss?

Give your feedback on Matt's blog:



mattcutts.com/blog/

Review of the new Google update - Bigdaddy - New Year special

Review of New Google Update

Big Daddy:

Matt Cutts, Google's senior engineer, has asked for feedback on the new results, which were live on test datacenters 64.233.179.104, 64.233.179.99, etc.

Matt Cutts has asked semi-officially for feedback on the new index. Currently the new results are live on http://66.249.93.104, http://66.249.93.99, etc. The 66.249 range is steady and has been showing new results since the 1st of January 2006. Even though Matt said 64.233.179.104 would also be a test DC, that datacenter does not appear to be a steady one; test results come and go there.



Looking at the results on the test DC, we see many issues fixed.

1. Better logical arrangement of pages in site: and domain-only searches

Whether deliberately or by mistake, Google was unable to logically arrange site: searches and domain-only searches. The homepage has been the most important page for many sites, yet Google buried it when doing a site:domain.com search or a domain-only search. In the new datacenter we see a better logical arrangement: a site's pages are listed in proper order compared to other datacenters. Yahoo has been excellent at arranging site: searches and still shows the best URL arrangement for a site. Google has been known to hide a lot of useful information from end users, especially in the link: search, where it deliberately shows fewer than 5% of a site's links, confusing many webmasters. They probably do the same with the site: search, but at least the homepage issue is fixed in the new Bigdaddy datacenters.

2. Removal of URL-only entries / URL-only supplementals

For a very long time, Google has shown URL-only listings in SERPs (search engine result pages) for some sites. The most likely causes of URL-only listings / supplemental results are: duplicate pages within a site or across sites; pages with no links pointing to them; pages that were once crawlable but later were not; pages that no longer exist; pages Google has not crawled for a very long time, possibly because of some sort of automated penalty; pages that were once crawled well but became uncrawlable due to web host problems; redirect URLs; and so on. It seems Google has fixed this major issue. URL-only listings don't add value to Google's index, and it is good that they have been removed from the search results. Though we still see some supplemental pages hanging around here and there, most of them seem to be caused by duplicate content across sites.

Recently Google has been very severe on duplicate content, especially syndicated articles: if many copies of the same article are published across sites, Google turns some of the duplicate copies into supplemental results. Only these seem to stick in the new DC; other than that, no supplemental results are found.



For example, this is the result on the main www.google.com DC:


Result in the updated datacenter, New DC: 66.249.93.104 (verified):




Good indexing of pages for a site:

Recently Google has been showing vague page counts for indexed sites. For one site we work on, Google has been showing 16,000 pages indexed even though the site itself has no more than 1,000 pages. The new DC shows only 550 pages indexed, which is a good sign: all 550 are unique pages without supplementals, and we can expect them to rank soon. This is a very good improvement, and we have seen the same accuracy across many sites we monitor; the new DC gives much better page counts.

Better URL canonicalization:

We see a big improvement in Google's understanding of the www and non-www versions of a site; we have checked this across a lot of sites. For example, for the searches dmoz.org and www.dmoz.org, Google now lists only one version of the site, dmoz.org. Before, it showed one version as URL-only and the other as a real listing, which caused a lot of duplicate issues. We see the same fix across the sites we monitor. Most of the results are now very good because Google handles redirects pretty well, which is a very good sign for a lot of sites. The biggest beneficiaries are IIS-hosted sites, which don't have .htaccess: many of them were unable to 301 redirect to a single version of the URL, either www or non-www. Now they don't have to, because Google can understand that both URLs belong to the same site.
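A rough sketch of the kind of host canonicalization described above (our own illustration, not Google's actual code): collapse the www and non-www variants of a URL into one key before indexing, so both hostnames count as a single page.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_key(url: str) -> str:
    """Collapse www/non-www variants of the same page into one key,
    roughly as a crawler deduplicating host variants might."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    path = parts.path or "/"  # treat "" and "/" as the same homepage
    return urlunsplit((parts.scheme.lower(), host, path, parts.query, ""))

def dedupe(urls):
    """Keep only the first URL seen for each canonical key."""
    seen, out = set(), []
    for u in urls:
        k = canonical_key(u)
        if k not in seen:
            seen.add(k)
            out.append(u)
    return out

print(dedupe(["http://www.dmoz.org/", "http://dmoz.org/", "http://dmoz.org/about"]))
# → ['http://www.dmoz.org/', 'http://dmoz.org/about']
```

With this in place, www.dmoz.org and dmoz.org no longer compete as two separate pages, which is the duplicate problem the new DC appears to have fixed.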

Better handling of 302 redirects:

The new datacenter is doing well at handling sneaky 302 redirects, which used to confuse Googlebot a lot. There have been numerous discussions of this 302 redirect issue; Google has been closely monitoring the problem, and it has finally come out with a good fix in the new Bigdaddy update DCs.

For example check here,

https://www.google.com/search?hl=en&q=nigerianspam

you can see a page from anybrowser.com carrying the same title as the nigerianspam homepage,




http://66.249.93.104/search?hl=en&lr=&q=nigerianspam&btnG=Search

while here the anybrowser.com page with the same title as the nigerianspam homepage is removed / missing.
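The mechanics behind 302 hijacking can be sketched in a few lines (a toy model for illustration, not Google's actual logic): under HTTP, a 301 means "moved permanently", so an indexer credits the content to the target URL, while a 302 means "moved temporarily", so a naive indexer keeps the content filed under the source URL. That loophole let a hijacker's redirect URL claim a victim's content.

```python
def url_to_index(source_url: str, status: int, target_url: str) -> str:
    """Which URL a naive indexer files the fetched content under."""
    if status == 301:      # permanent move: content belongs to the target URL
        return target_url
    if status == 302:      # temporary move: content stays with the source URL,
        return source_url  # the loophole that 302 hijackers exploited
    return source_url      # no redirect: index the URL that was fetched

# A hijacker 302-redirects his own URL to a victim's page; the naive
# indexer files the victim's content under the hijacker's URL:
print(url_to_index("http://hijacker.example/jump", 302, "http://victim.example/"))
# → http://hijacker.example/jump
```

The fix the new DCs appear to apply is to stop trusting the 302 source blindly when the two URLs belong to different sites.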




PageRank of canonical-problem URLs not fixed

Google has not fixed the PageRank of sites with canonicalization problems. We hope it will be fixed in the coming days.

Some jump scripts that use 302 redirects are not fixed,

but we hope this will improve soon. Example:

http://66.249.93.104/search?q=inurl:www.searchenginegenie.com+-site:www.searchenginegenie.com&hl=en&lr=&start=10&sa=N




Search relevancy not accurate in either the current results or the new DC

When we searched for DVD (digital versatile disc) in Google, we found google.co.uk (Google's UK regional domain) ranking in the top 5. That is bad; Google has nothing to do with DVDs. We hope it will be fixed soon.

https://www.google.com/search?hl=en&lr=&q=dvd

http://66.249.93.104/search?hl=en&lr=&q=dvd

Possible duplication caused by Google's new, improved page indexing

Google used to have a 101 KB indexing limit per page; now it has relaxed that limit and can crawl more than 500 KB. But it seems the old 101 KB truncated index is still around and often visible in search; we see it across our client sites. Here is an example on a Wikipedia page:

https://www.google.com/search?hl=en&lr=&q=%22en.wikipedia.org%2Fwiki%2F2004_Indian_Ocean_earthquake



Not fixed in the new datacenter either:

http://66.249.93.104/search?hl=en&lr=&q=%22en.wikipedia.org%2Fwiki%2F2004_Indian_Ocean_earthquake
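The truncation effect can be pictured with a tiny sketch (our own illustration; the 101 KB and roughly 500 KB figures come from the observations above): anything past the indexer's byte cap is simply invisible to search, so the old and new caps index very different slices of a long page.

```python
OLD_CAP = 101 * 1024   # the old per-page indexing limit discussed above
NEW_CAP = 500 * 1024   # the roughly relaxed new limit

def indexable_text(page: bytes, cap: int = OLD_CAP) -> bytes:
    """Naive model: only the first `cap` bytes of a page get indexed,
    so words beyond the cap can never match a query."""
    return page[:cap]

page = b"x" * (300 * 1024)                 # a 300 KB page
print(len(indexable_text(page)))           # → 103424 (truncated at 101 KB)
print(len(indexable_text(page, NEW_CAP)))  # → 307200 (the whole page fits)
```

If both the truncated and the full versions of a long page linger in the index, the duplication seen on large pages like the Wikipedia example above would follow naturally.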



That is all for the review for now; more will follow soon.

SEO Blog Team,

danny sullivan? - Who is danny sullivan? Editor of Search Engine Watch.

Who is danny sullivan?

Danny Sullivan is the Managing Editor of the famous Search Engine Watch. A well-known authority on search engines, he is widely known as the "Search Engine Guru". His creativity and knowledge shine through his articles. Danny is also a well-known speaker at conferences and is known for his eloquence.
Danny worked at newspapers such as the LA Times and the Orange County Register before turning his attention to the web. Danny Sullivan started his career in the web world in 1995, managing a web marketing company which later became Search Engine Watch.



Danny strongly believes search engine marketing has a future. At one conference where he spoke, Danny compared search engines to a 'reverse broadcast network'.
Danny successfully provides website consulting services, from site conception to internet publicity. He also authors the monthly Search Engine Report newsletter, and he writes expert session content for the Search Engine Strategies conferences focusing on search engine marketing issues.
Danny Sullivan keeps climbing heights in the field of search engine marketing, alert to changes in trends and making new innovations in the field.




tim mayer - who is tim mayer? the search king of yahoo.

who is tim mayer?



Behind the product direction of Yahoo! Search technology is the brilliant brain of Tim Mayer. He brings over 10 years of search experience to Yahoo, on top of his achievements at Overture, Inktomi, and FAST Search & Transfer.
Tim was the project manager of the SAEGIS and NameStake search platforms. A literature graduate with an M.B.A. from Babson College, he has been successful in all his career endeavors.
Tim spent two years in product management and product marketing in the web search division at Inktomi. Then Tim became Vice President and General Manager of the FAST Web Search division.
At Overture he successfully handled product direction and also managed the major web search affiliate partner relationships.




matt cutts AKA googleguy - who is matt cutts ?

Matt Cutts is no new name in the online search engine marketing industry. Matt's name is not limited to one specific field: his credits include a respectable job at Google, a blog that interests all kinds of readers, and a commendable fondness for insects!



Matt Cutts joined Google in January 2000 as a software engineer and got a fabulous chance to implement the first test of the AdWords user interface. Matt has spent most of his time in Google's quality group and eventually implemented Google's SafeSearch, the family filter.
Matt has an M.S. from UNC-Chapel Hill and a double degree in Mathematics and Computer Science.
The blog Matt maintains has become a great resource for those fascinated by search engine news. It also provides a channel of communication with webmasters.



Matt used to post under the nickname GoogleGuy on searchenginewatch and webmasterworld, and even now he continues to post with that ID.

Matt's site: www.mattcutts.com/blog/

Scottie Claiborne - who is Scottie Claiborne? review on scottie of successful-sites

Who is Scottie Claiborne?

There are some people who live in a dream world, and there are some who face reality; and then there are those who turn one into the other, and Scottie Claiborne belongs to that third group.



Right from her childhood she was in touch with sales and marketing, taking her first lessons in marketing from her mother's jewelry shop. Thus her sales ideas grew: she found new strategies in the game of marketing, caught the tactics behind the trade, and finally made it her vocation.

She enjoyed watching different sales techniques, seeing how they worked and how they failed at times. Scottie found it interesting how the chain of marketing ran successfully, starting with a message and reaching an audience through merchandise.

Scottie took her first step into online marketing once she became confident of managing sales at any stage. Starting in 1998, Claiborne developed corporate intranets and managed an online content management system for Levity Technologies, Inc. as Vice President of Product Development. She then started Right Click Web Consulting in 2001 with the idea of maximizing the potential of the web.

Right Click Web Consulting is the result of clear planning, right management, and bright execution of ideas. She knew what a website needed and was upright in the decisions she took.



"Succeeding on the web is all about timing, creativity, and usability. It's more than marketing since a website acts as the billboard, storefront, and processing center all in one," says Scottie with a deep and strong knowledge about the networking world. Her Masters Degree in IT is an added feather to her colorful success.

Scottie Claiborne also contributes newsletters and articles and takes an authoritative tone on the concepts she deals with. "Staying Ahead of Competitors", "Web Analytics Terminology", and "Hiring a Search Marketer" are good examples of her creative contributions to the field of online marketing. Scottie's articles carry an experienced pitch, and she can confidently handle any subject she is familiar with.

Scottie has proved herself unbeatable. As days pass by, she may open up new doors in the field of online marketing.

Scottie's sites: www.rightclickwebs.com,
www.successful-sites.com

Do visitors from Google Images convert? Very interesting thread discussion on webmasterworld.com

There is an interesting thread on webmasterworld.com discussing whether visitors from Google Images convert. In my view it all depends on the site you run: if the site is basically built around images and art, it is best to feature in Google Image search.

Here is an interesting post from the same thread:



My site was basically built around a large photo gallery, so image search is important to me, and generates a big share of my traffic. But like a lot of image galleries, the visitor may not be actually looking to buy anything. Sometimes they are actually looking for images, prints, posters, etc, none of which I sell. But I do have ads on the site and a modest number of visitors click the ads on any given image page.

What has worked well for me is providing links to pages about stuff some percentage of visitors coming in through image searches might be interested in, and putting ads on those pages that they are more likely to click.

As a side note, because my images get "borrowed" at a fairly high rate, I decided to label most of them with my url. Those "borrowed" images appear to generate a pretty decent level of fairly targeted traffic. Basically I turned the images themselves into ads for my site. That might be harder to do with an image that comes from a product description page, but it might be worth giving some thought as to how to make it work in that situation.




Hacking mania continues - another PHPBB forum hacked by notorious hackers

Another phpBB forum was hacked recently by notorious hackers; these hackers are everywhere, and they hack everything.
We saw the following message,



Haha hack, hack, hack, you've been hacked, friend. Was this your fate?

ThE By DeİsGeN HaCKeD_VaTaNİsT

Deface Team => HaCKeD_SeRvO CyBerNıghtmare VaTaNİsT

This site was hacked by HaCKeD_VaTaNİsT!!!

HaCKeD_VaTaNİsT

when we visited this forum:

netscalped.com/phpbb%202.11/phpBB2/index.php?sid=652df0aa9ac66fe48ec761b8689a2e52




