What Causes the Sandbox Filter? A new experiment reveals why the sandbox filter exists. Is the sandbox a side effect of trust rank?
There have been numerous discussions about Google's sandbox filter, which holds sites back for up to 16 months before they start ranking well in Google's results. So what causes this filter? After losing patience trying to rank client sites, we at Search Engine Genie conducted a test across about 15 sites. The test revealed interesting results; below is a summary of our anti-Google-sandbox-filter experiment.

1. Does the sandbox filter really exist?
Based on our experiment, the sandbox filter does exist. But it doesn't affect all sites; it affects sites that have been artificially linked to rank through search engine optimization.

2. What causes the sandbox filter?
This question has been asked many times in forums, message boards and blogs, and no one has been able to give a definite answer. Even we had to struggle a bit to figure out what this sandbox is all about. Finally, by applying different strategies across about 15 sites, we were able to find the cause of the sandbox filter. Our experiments showed that about 90% of the theories circulating in forums and other articles are wrong. The sandbox filter is caused purely by links and nothing else: it is abnormal link growth, as seen by Google's algorithm, that results in a site being placed in the sandbox.
As we all know, the internet is based on natural linking; search engines and their ranking algorithms are the main cause of artificial linking. Now search engines like Google have woken up to this and are combating link spam with sophisticated algorithms such as the sandbox filter. The sandbox is a natural consequence for sites that use any sort of SEO-type method to gain links, as the next topic explains.

The sandbox filter relies on trust rank, not PageRank as it did before, to identify the quality of links.
Trust rank is a new algorithm active at Google. The name "trust rank" is common to Google, Yahoo! and other search engines, so we can use the term "trusted links". Trusted links are links that are hand-edited for approval or algorithmically given maximum credit to vote for other sites: links from reputed sites, links from authorities in the industry such as .gov sites, and so on. Google's new algorithm looks at what type of trusted links a site has, especially if the site is new. If a site starts off with SEO-type links rather than naturally gained trusted links, it will be placed in the sandbox for up to a year or even more.

3. What factors / methods lead to the sandbox filter?

All types of SEO-type link building will lead to the sandbox filter.

a. Reciprocal link building:
Reciprocal link building is one important method that will lead to a definite sandbox / aging filter. When a site starts off with reciprocal link building, it will definitely be sandboxed. First of all, reciprocal link building is not the way to build trusted links. Sites that are trusted / hand-edited for approval do not have to trade links with other sites; they link out to other sites voluntarily, and no one can force them to link out or trade for a link. Most sites involved in reciprocal link building are very weak themselves, so a site doing reciprocal link building is not going to gain trusted links, and a new site that grows with untrusted links will be placed in the aging filter. We don't condemn reciprocal link building; in fact we do it all the time for our clients. But reciprocal link building by itself doesn't add any value to end users, and it is done purely to manipulate search engine result pages. So Google is not to be blamed for coming out with such an amazing algorithm that fights aggressive link building so effectively. For a new site we don't recommend reciprocal link building immediately: first build trust for your site, then do reciprocal link building, and there are no issues at that point. So how do you know you have built trust with Google's algorithm? It shows in your rankings. If your site is ranking well for both competitive and non-competitive terms, you can safely assume you are no longer in the sandbox. We are talking about a minimum of a month of ranking for good terms, not a day or a week.

b. Buying links:
This is another important method that will definitely lead to a severe sandbox filter. Our experiment showed that buying links for a new site will hurt it badly in Google. When you start a site, don't immediately go and buy a lot of links; buying links is not the way to gain trusted links. Most of the sites where you buy links are actively monitored by Google. Even if you buy a link from a great site in your field, it still won't be considered a great trusted link. Site-wide links (links placed throughout a site) are especially dangerous for a new site; this type of link will delay the ranking of a site to a great extent. If you do buy a link from a great site, make sure you get a link from only one page, and make sure the link is not placed in the footer; it should be placed somewhere inside the content to look more natural. It is not worth buying links for new sites. Give the site time to grow with naturally gained backlinks.

c. Directory submission:
Directory submission has proved worthless when it comes to avoiding the sandbox. Directory submissions don't directly cause the sandbox, but those links will definitely affect the reputation of a site, especially a new one. As discussed before, it is important to gain trusted links when a site is new, so we recommend against directory submissions as the first step in building links: when search engines see those links, they will place your site in the aging filter. Most directories are newbie and start-up directories; we can name only four or five that are trustworthy. Ask yourself: would you go to a directory to find relevant sites today? I am afraid not. Directories are a thing of the past, and people use search engines to find information. That is why search engines don't prefer to list directories in their results; two exceptions are dmoz.org and the Yahoo directory.
If you get a link from dmoz.org, consider it the best trusted link on the internet. But dmoz takes up to 15 months to list a site; even they hold a site until it grows to a certain quality level. The Yahoo directory is not as powerful as dmoz, most likely because it is paid, and much of the paid information on the internet is corrupted. Still, next to dmoz, the Yahoo directory is a safe and trusted place to get a listing. Other than those, we don't recommend any directory, whether paid or free.
We don't dismiss links from directories altogether, but it is not recommended to get links from useless / spam / unworthy directories, especially if the site is new.

d. Links from comment spam / blog spam / guest book spam / message board spam:
If you are an aggressive SEO who has been using bad tactics, then you are not what Google wants, or at least not the new Google. You are going to wait forever for top rankings in Google with a new site if your initial links come from spam sources. These link tactics still work with Yahoo and MSN, but not with Google anymore. If you launch a new site and take this path, I would expect a sandbox period of about 2 years, and by that time your site may be caught in some other, more severe penalty. Better not to do this with Google.

e. Links through article reproduction:
Links through article reproduction have always been a good way to build links, but not anymore for good sites. Google has become very good at finding duplicate copies across the web. If you rely on article reproduction for backlinks, you are dealing with links from duplicate copies of an article, which Google deems unworthy of its index. Take a good article snippet and search Google: apart from 2 or 3 main copies, all the other copies will be supplemental results. This is one way to find out whether a reproduced article is being treated as spam. Links from these duplicate copies will in no way help the trustworthiness of your site. So it is better to avoid article reproduction; instead, make people link from their own sites to a good article on your site.

f. Links from a network of sites you own:
Always avoid this when you run a new site. Some people, especially SEO companies, tend to connect a new client site to an existing network of other active clients, plus sites they own or have a tie-up with. Avoid this for new sites. It just doesn't work, and it only prolongs the sandbox period. The network you have access to doesn't have the trust to vote for a new site, so avoid linking from your own network.

g. Don't participate in co-op link networks like the Digital Point ad network or Link Vault:
Almost all the sites involved in these ad networks are unworthy. Yahoo is very severe with these types of networks, and Google also has good algorithms to identify these links. They never work for new sites; they just prolong the aging filter. Better to avoid these types of link networks.

4. Is it some sort of hidden penalty?
Yes, it is a kind of penalty for a site. "Penalty" is a tough word for this filter, but it is true that it is some kind of hidden penalty. Google treats the sandbox filter as a holding penalty: it keeps the site held back until the site proves its trust to the web, watching its link growth patterns in the meantime.

5. What is the inner working of the sandbox filter?
In our research we were able to find how this sandbox / aging filter works. When Google first finds a site, it assigns it an inception date (the date Google first discovered the site). From that point it watches the site's link growth pattern. If the algorithm sees a lot of non-trustworthy links coming into the site (especially a new site), it will hold the site back from ranking for anywhere between 3 and 16 months. Once the site is in the sandbox, Google's algorithm continues to watch its link growth. If it sees normal growth of both trusted and ordinary links, it will release the site from the sandbox sooner; if it sees growth of untrusted links, the site's ranking will be delayed much longer.

6. Is the sandbox based on the whois registration date?
No. Based on our experiment, the sandbox filter doesn't rely on whois to determine the age of a site.

7. How to detect if the site is in the sandbox filter?

There are various ways to know if your site is in the sandbox.
a. One very good method, which has long been in existence, is the allinanchor: keyword1 keyword2 search. If your site ranks for the allinanchor: search but not for the normal keyword search, then most probably your site is in the sandbox filter.
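As a purely illustrative sketch, the allinanchor: comparison above could be written down as a small helper. The function name, the 30-position threshold, and the example rank numbers below are our own inventions for illustration; Google publishes nothing of the sort, and the rank positions would come from checking the two searches by hand.

```python
def looks_sandboxed(allinanchor_rank, normal_rank, threshold=30):
    """Heuristic version of the allinanchor: test described above.

    allinanchor_rank -- the site's position for `allinanchor:kw1 kw2`
                        (None if it does not appear)
    normal_rank      -- the site's position for the plain `kw1 kw2`
                        search (None if it does not appear)

    A site that ranks well for the allinanchor: query but is absent,
    or far behind, in the normal results fits the sandbox pattern.
    The 30-position gap is an arbitrary illustrative choice.
    """
    if allinanchor_rank is None:
        return False  # no anchor-text strength measured; test inconclusive
    if normal_rank is None:
        return True   # strong anchors, yet invisible in normal results
    return (normal_rank - allinanchor_rank) >= threshold

# Site at #4 for allinanchor: but nowhere in the normal results:
print(looks_sandboxed(4, None))   # prints True
# Site ranking similarly in both searches:
print(looks_sandboxed(4, 7))      # prints False
```

This only mechanizes the comparison; the judgment call about what counts as "ranking well" is still yours, which is why the threshold is exposed as a parameter.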
b. The next method is to check your rankings in the other major search engines. If your site is ranking exceptionally well in Yahoo and MSN but not in Google, that is another possible sign that the site is in the sandbox filter. But remember, Google's algorithm is more sophisticated than Yahoo's or MSN's, and judging that your site is in the sandbox from this sign alone is absolutely wrong.
c. Check the quality of your backlinks and compare it with your competitors'. There are numerous tools that show link comparisons. Compare the links, see what your competitors have, and note where on the page each link to a competitor's site is placed.
d. Check rankings for both competitive and non-competitive terms. If your site ranks for neither, then most probably it is sandboxed.
e. Check for duplicate content on your site. If your site has duplicate content, fix it first before checking for sandbox problems. As I said before, the sandbox filter is caused purely by links; go looking for link problems only once your on-page factors are fine and of Google quality. Remember, everything discussed here is for sites with quality content, a unique business, etc. These instructions are not for sites that use scraped content, made-for-AdSense sites, aggressive affiliate sites, or sites that use other spam tactics. This is purely for sites with good quality domains that still have problems ranking in search engines.

8. Does the sandbox filter affect only certain areas of search?
No, the sandbox filter affects all areas. It is not keyword- or topic-based but link-based; links are everywhere, and the filter is applied everywhere.

9. Can a competitor sabotage a ranking using the evil sandbox filter of Google?
This is a very good question; having discussed all of the above, people would take us to task if we didn't address it. Our research focused on this issue too. We included some older domains in the test, and we found that a competitor CANNOT sabotage the existing ranking of a site through these link filter algorithms. But why can't a competitor sabotage a ranking? Here is the conclusion of our experiment:
Google wants a site to be trusted before it starts ranking very well, so in order to rank well a site must demonstrate that trust to Google's algorithm. But this trust is a one-time process: once a site establishes its trust with Google and starts ranking well, it doesn't have to prove it again as long as the trusted links exist.
For example, if a new site starts off with good trusted links, it doesn't get sandboxed at all. It will start ranking very well. Say it has been ranking for a month, and an aggressive competitor who has been watching plans to destroy its ranking. He spams blogs, message boards and guest books, posts sitewide links on many sites, and so on. Will he succeed in sabotaging the ranking of the good site? No. Since the good site has already established trust with Google's algorithm, all these links from bad pages will only boost its ranking; in no way will they harm the good site. Google's algorithm knows this very well.
So why can't a competitor do this when the site is still new and not yet ranking?
First, because he won't be aware that such a site is growing strong. Until a new site ranks well for its targeted phrases, it is not on anyone's radar. Only when the site ranks will it catch the eye of evil SEO companies and competitors, and only then will they plan to kill its rankings. But by that time it will be too late, since the site has already established trust with Google's algorithm based on the quality backlinks it possesses. So anyone thinking of sabotaging a site's existing ranking by sending bad spam backlinks to it should remember that they are actually helping that site rank better than ever. This is one reason Google always preaches that no one can harm a site's ranking other than the people responsible for the site.

So what is the proof for the above statement?
Our experiment is the proof. Though we are an ethical SEO company, we know where the bad backlinks are. We sent thousands of those backlinks to sites we monitor that are currently ranking well. We kept those links alive for 2 or 3 months, and we were able to conclude that the bad links did nothing harmful to the sites; in fact they boosted those sites' rankings, and Google's algorithm never cared about those backlinks.
We are not discussing canonical issues, 302 hijacking, 301 problems, etc. here; we are only addressing whether a competitor can sabotage a site's ranking through link spam. There have been cases where 302 hijacking by external domains hurt a site; the blog of Matt Cutts (a senior Google engineer) is proof of this. The Dark SEO team hijacked Matt Cutts's blog through a tricky 302 redirect, but that is a different issue, and this article deals with the sandbox filter only.

10. Do we risk losing credit for untrusted links gained for a new site before it receives trusted links?
Yes, this is a major risk. It sometimes happens that most of a new site's links are traded or bought: the site owner has read in forums that reciprocal link building or buying links is the best way to build links, and then realizes his site is not ranking. He finally decides he needs quality links, goes for branding, writes great articles, and brings interesting content to the site, which starts attracting high-quality natural links. His site steadily improves, and at this point he risks losing credit for the reciprocal / bought links that existed before his site started acquiring trusted links. We have seen this happen with Google. So we recommend that site owners and SEO companies not use link building methods that bring non-trustworthy links to new sites. For a new site, think of natural ways to build links. Consider how great sites grew to their level: not all of them started with 10,000 links. Successful sites take time to build their brand, and that is how search engines want new sites to be. Once your site is good enough, think of ways to bring in great links and use every ethical link building strategy in the book.

11. How to escape the sandbox filter?
a. Don't use any artificial link building methods for a new site. Methods to avoid initially include reciprocal link building, directory submissions (excluding dmoz and the Yahoo directory), buying links, article reproduction, links from co-op ad networks, links from a network of your own sites, links from spam sites, sitewide links, etc.
b. Let the links grow naturally, at a slow rate, for new sites.
c. Show a steady growth of links to Google. Make them understand that your site is trustworthy.
d. Attract people to link to your site naturally; think of the great Million Dollar Homepage concept. If a college-going youngster can attract 60,000 links in a few months, why not you? Even 10% of that in quality natural links would do your site a world of good.

12. How to get the site out of the sandbox?
This is an important question which many people ask.
First, think of ways to attract natural links. Don't run a site with a purely commercial focus; today, both users and search engines want information and commerce mixed. Take Search Engine Genie, for example: a mixed site where we have tools, a blog, articles and much more to give to users, while we also have a commercial focus. There are great SEO sites that do the same, such as SEOmoz and SEOcompany.ca. Though these sites are new, they have come out very strong simply because they are a valuable resource to search engines and users.
Submit to dmoz. First make sure your site is good enough to be accepted into dmoz by reading their guidelines and then submit your site.
Write great articles and make people link to them. Make people aware that a good article exists on your site: participate in related forums, post your article there, and ask for opinions. If people like your article you will get a ton of traffic, and some people, as a token of appreciation, will post a link from their site to your article. Start a blog and share what you know about your business with the world; show them your expertise. Blogs attract lots of links today, and you can even give your content to other blogs as feeds.
Think of a great strategy like the million dollar homepage to build backlinks.
As Matt Cutts suggests, even basic interviews will attract good backlinks.
We have listed some ideas, but you can think of more. Use all the best possible ways to build natural links until your site gains trust; once it has the necessary trust, you can use all the backlink strategies.
Wait for at least 3 months, because even when you are building natural backlinks, it takes time for them to kick in with Google's algorithm.

13. Does Search Engine Genie's experimental finding work?
Our experiment has been tested across sites dealing with various topics. We have since expanded the research to many more sites we handle, and for the new sites among them, this way of escaping the sandbox has worked well.
You can verify this only after you see it happening for your own site, so if you have a new site, test it for yourself and post the results in the comments on this post.

14. My site has never been in the sandbox. Why not?

There are a couple of reasons for that.
1. Your site may be an older site. Sites from before May 2004 didn't have this problem; in fact, at that time we were able to rank a site with any sort of backlinks within a month. That is no longer the case.
2. You may not have bothered with SEO-type backlinks.
3. You might have registered an expired domain that was ranking well and had / has trusted links from before it came to you. Google's expired domain penalty doesn't work properly for some sites. One of our clients' sites expired 3 months ago. We tried contacting the client numerous times before the domain expired, but it turned out he was very sick and in hospital for 6 months, and we could not reach him. We tried backordering, but we were in a queue. The domain is a popular one and was listed in a PageRank 7 category of dmoz. It entered the pending-deletion period, finally expired, and the person who preordered it first got it; the domain is theautomover.com. It is sad that the expired domain penalty doesn't work, because that site is ranking well now even though the content was completely changed and it belongs to a new owner. We appeal to Google to monitor these expired domains more actively.
This article addresses the latest improvements in Google in relation to the sandbox filter. In fact, some of the points discussed here come from the latest Google update, Jagger.