Google

Official statement – what Google thinks about hidden text

A nice write-up by a Google employee on how they view hidden text, with examples:

“In our “Popular Picks” thread, Burchman asked for some clarification on what Google considers to be hidden text, as our Webmaster Guidelines explicitly state that you should avoid hidden text or hidden links. We have a few examples of how text can be hidden in this Help Center article: http://www.google.com/support/webmasters/bin/answer.py?answer=66353
As I’ve noticed other users with similar questions in this group, such as “What if my navigation menu uses display:none to hide text until a button is rolled over?” I figured this would be a good topic to cover in “Popular Picks.” The reason we perceive hidden text as a problem is that it serves content to search engines which users don’t see, and presenting different content to search engines and users makes it more difficult to properly rank pages. If we detect that this text is intended to deceive search engines, there could be a negative effect on how your site is indexed and ranked in Google, including removal from our index.
Because such strong action may be taken on sites violating this guideline, it’s understandable that many webmasters have expressed concerns about the possibility of Google incorrectly detecting legitimate content as hidden text. When trying to figure out if a page may have hidden text that Google would consider malicious, start by thinking about why the guideline was written in the first place: Google wants to see what the user sees. If the text that Google sees is the same that a normal user is supposed to see, then you should be good to go. If Google is seeing text that is intentionally hidden from the user in an effort to manipulate search engine rankings, you have some work to do.
Let’s try this approach with a page you may have seen before: http://www.google.com/
In the top-left corner, you’ll see a line of text: “Web Images Video News Maps Gmail more.”
Google sees this text, and so do you, the user. So far, so good.
Next, let’s make sure nobody wrote “search engine search find crawl index rank” in white text on the white background, with the intention of ranking for those terms. Google would see that, but a normal user wouldn’t. Take off your “normal user” hat for a second and do a “Select All” on the page (by hitting CTRL-A on a PC, or COMMAND-A on a Mac, for instance). This will make any white on white text appear. As you can see, no hidden text.
But let’s try one more thing: Render the page again without CSS enabled. The Web Developer extension for Firefox lets you do this pretty easily. Without CSS, you’ll see several words we didn’t see before: “Blog Search Blogger Books Calendar Documents Finance Groups Labs Orkut Patents Photos Products Reader Scholar.”
You may have also noticed that these words appear in Google’s text-only cache of itself, which is a good indication of how Google “sees” a site. But before you blog about your discovery of hidden links on a PR 10 site =), take a look at the page again with CSS enabled. This time, click on the “more” link, and voilà, the no-longer-hidden text appears. This text is part of the page’s functionality, and it is meant for the user to read and use, not just for search engines to index. This text adds value for the user, which Google rewards, so Google would not hurt this site’s ranking or remove it from the index for that reason. Many sites use similar methods for navigational menus and other functional elements, so please rest assured that the existence of display:none on your site is not on its own a one-way ticket out of Google’s index.
When thinking about your own site, ask yourself if all of the text is there for the user. If the answer is “yes,” great job! If the answer is “no,” do your best to change it to a “yes,” and call on your webmaster community (this group!) for advice if you need it. CSS menus and white space without hidden text should not be a cause of concern when it comes to Google indexing and ranking; the only thing you should be concerned about is how they affect your visitors.
In the “Popular Picks” thread we asked for non-site-specific questions, but now that this has been separated into its own thread, here’s your chance to ask about a site you are still unsure about. Please also let me know if you would like further clarification on particular aspects of this topic.”

Source: Google Groups
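
As an aside, here is a minimal Python sketch (our own illustration, not Google's detector) of the kind of check described above. It flags inline styles commonly used to hide text; it only inspects inline style attributes, so external stylesheets and class-based hiding are missed. And as the post stresses, a hit such as display:none may be perfectly legitimate menu code, so treat matches only as candidates for a human look:

```python
# Heuristic hidden-text scan: a rough sketch, not Google's actual detection.
# Flags inline styles often used to hide text from users.
from html.parser import HTMLParser

SUSPICIOUS = ("display:none", "visibility:hidden", "font-size:0",
              "color:#fff", "color:white")

class HiddenTextScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        # Normalize the inline style so "display: none" matches "display:none"
        style = (dict(attrs).get("style") or "").replace(" ", "").lower()
        for pattern in SUSPICIOUS:
            if pattern in style:
                self.findings.append((tag, pattern))

page = '<div style="display: none">search engine search find crawl index rank</div>'
scanner = HiddenTextScanner()
scanner.feed(page)
print(scanner.findings)  # [('div', 'display:none')] -- review manually
```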

Is the TPR penalty lifted for some sites?

Some WebmasterWorld members are noticing that the Toolbar PageRank penalty has been lifted for their sites. Google started imposing Toolbar PageRank penalties on sites that sell or buy links around January this year; now the penalty seems to be lifted for some sites. Though it's being reported in forums, we never noticed anything like that across our client sites, probably because, as a matter of policy, we don't sell or buy links for our clients.

Forum discussion here: http://www.webmasterworld.com/google/3729425.htm

Olympics and SEO – we will get you multiple top 10 rankings.

Michael Phelps and SEO top 10 rankings

Search Engine Genie will get you lots of multiple No. 1 rankings; we have an image portraying it 🙂

More information on 404 errors

The Google Webmaster Central blog has been posting interesting material on 404s for a while; this time they posted on how they treat 410 errors. According to the official Google webmaster blog, a 410 error is treated the same way as a 404 error. More from the webmaster blog:

How do you treat the response code 410 “Gone”?
Just like a 404.

Do you index content or follow links from a page with a 404 response code?
We aim to understand as much as possible about your site and its content. So while we wouldn’t want to show a hard 404 to users in search results, we may utilize a 404’s content or links if it’s detected as a signal to help us better understand your site. Keep in mind that if you want links crawled or content indexed, it’s far more beneficial to include them in a non-404 page.

What about 404s with a 10-second meta refresh?

Yahoo! currently utilizes this method on their 404s. They respond with a 404, but the 404 content also includes a meta refresh that sends the user to the homepage after 10 seconds. We feel this technique is fine because it reduces confusion by giving users 10 seconds to make a new selection, only offering the homepage after 10 seconds without the user’s input.

Should I 301-redirect misspelled 404s to the correct URL?

Redirecting/301-ing 404s is a good idea when it’s helpful to users (i.e. not confusing like soft 404s). For instance, if you notice that the Crawl Errors section of Webmaster Tools shows a 404 for a misspelled version of your URL, feel free to 301 the misspelled version of the URL to the correct version. For example, if we saw this 404 in Crawl Errors:

http://www.google.com/webmsters <-- typo for "webmasters"

we may first correct the typo if it exists on our own site, then 301 the URL to the correct version (as the broken link may occur elsewhere on the web):

http://www.google.com/webmasters

Have you guys seen any good 404s?

Yes, we have! (Confession: no one asked us this question, but few things are as fun to discuss as response codes. :) We’ve put together a list of some of our favorite 404 pages. If you have more 404-related questions, let us know, and thanks for joining us for 404 week!
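
The two answers above lend themselves to a short sketch. Here is a minimal, hypothetical Python http.server example (ours, not from the Google post): it 301-redirects a known misspelled path to the correct URL, and serves any other unknown path as a genuine 404 whose body carries a Yahoo-style 10-second meta refresh to the homepage. The paths and port are made up for illustration:

```python
# Minimal sketch of the two quoted techniques: a 301 for a known typo URL,
# and a real 404 status whose body offers a 10-second refresh to "/".
from http.server import BaseHTTPRequestHandler, HTTPServer

TYPO_REDIRECTS = {"/webmsters": "/webmasters"}  # misspelling -> correct URL

NOT_FOUND_PAGE = b"""<html><head>
<meta http-equiv="refresh" content="10;url=/">
</head><body>
<p>Page not found. Sending you to the homepage in 10 seconds.</p>
</body></html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in TYPO_REDIRECTS:
            self.send_response(301)  # permanent redirect for the typo
            self.send_header("Location", TYPO_REDIRECTS[self.path])
            self.end_headers()
        else:
            # In a real server, known pages would be served here; for the
            # sketch, everything else gets a hard 404 with a helpful body.
            self.send_response(404)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(NOT_FOUND_PAGE)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```

Note that the not-found page still returns a real 404 status code, so it never becomes a soft 404.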

How to start a multilingual site: help from Google to make a Google-friendly multilingual site

This post is all about how to start a multilingual site and the various advantages of having one. A multilingual site is a site that serves its content in different languages. The first thing you'll want to consider is whether it makes sense for you to acquire country-specific top-level domains (TLDs) for all the countries you plan to serve. This option is beneficial if you want to target the countries each TLD is associated with, a method known as geotargeting. Geotargeting is different from language targeting. Geotargeting refers to sites whose main audience is in a particular region of the world, and it allows you to set different geographic targets for different subdirectories or subdomains (e.g., /de/ for Germany). Language targeting, on the other hand, aims to reach all speakers of a particular language around the world, in which case you probably don't want to limit yourself to a specific geographic location and shouldn't use the geographic target tool. Since it's difficult to maintain and update multiple domains, it's better to buy one non-country-specific domain that hosts all the different versions of your website. In that case, two options are recommended:
The first option is to place the content for every language on a different subdomain. For our example, you would have en.example.com, de.example.com, and es.example.com.
The second option is to place the content for every language in a different subdirectory. This is easier to handle when updating and maintaining your site. For our example, you would have example.com/en/, example.com/de/, and example.com/es/.
Some may wonder whether the same content posted in different languages will count as duplicate content. Definitely not, but you should make sure your site is well organized. Avoid mixing languages on each page, as this may confuse Googlebot as well as your users; it's always good to keep the navigation and content on each page in the same language. You can also find out how many of your pages are recognized in a certain language by performing a language-specific site search. A multilingual site benefits both the owner and the visitors of the site, since visitors get information in their own language. For example, when a person wants to find fashion design institutions in London, he may type that query into search along with the language he wants the page displayed in, and he is far more comfortable when he gets the information in a language he knows and understands.

Official post http://googlewebmastercentral.blogspot.com/2008/08/how-to-start-multilingual-site.html
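
As a small illustration of the two recommended layouts, here is a Python sketch using the post's example.com placeholder and example languages:

```python
# Sketch of the two URL layouts for a multilingual site on one domain.
LANGS = ["en", "de", "es"]

def subdomain_url(lang, path="/"):
    # Option 1: one subdomain per language, e.g. de.example.com
    return f"http://{lang}.example.com{path}"

def subdirectory_url(lang, path="/"):
    # Option 2: one subdirectory per language, e.g. example.com/de/
    return f"http://example.com/{lang}{path}"

for lang in LANGS:
    print(subdomain_url(lang), subdirectory_url(lang))
```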

Google bans WebPosition Gold position checker

Google has always warned against automated position checkers, which use a lot of its resources. Now Google has taken a stronger hand and blocked the WebPosition Gold software from performing automated ranking requests on Google. Automated rank requests create a lot of junk queries and consume a lot of Google's server resources. Google has been issuing warnings not to use WebPosition Gold, but people continued to use it. Now Google has taken action and blocked all WebPosition Gold queries. WebPosition Gold has a unique way of sending queries to Google, and it seems Google was able to detect it using their bot-filter software.

We at Search Engine Genie never use bulk keyword rank checkers. Our rank checkers are search engine friendly and allow only a limited number of queries per day.
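
For illustration, here is a minimal sketch of the kind of per-day cap such a tool can enforce; the class name and the 100-query limit are made up, not our actual implementation:

```python
# Sketch of a daily query quota for a polite rank checker.
import datetime

class DailyQuota:
    def __init__(self, max_queries_per_day=100):  # illustrative limit
        self.max_queries = max_queries_per_day
        self.day = datetime.date.today()
        self.used = 0

    def allow(self):
        today = datetime.date.today()
        if today != self.day:          # new day: reset the counter
            self.day, self.used = today, 0
        if self.used >= self.max_queries:
            return False               # over quota: skip this rank check
        self.used += 1
        return True

quota = DailyQuota(max_queries_per_day=100)
if quota.allow():
    print("OK to run one rank query")
```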

Another company wants a piece of the Google pie – Mediaset

First we had Viacom, then the Belgian newspaper group, and now another company is suing Google. Mediaset, a media company, is suing Google and YouTube for using copyrighted materials on their website.

According to Reuters:

“Mediaset, controlled by Prime Minister Silvio Berlusconi, joins other broadcasters seeking compensation from YouTube, a video-sharing website, for copyright infringement.
Mediaset filed suit in a Rome court, the company said in a statement on Wednesday. A YouTube spokeswoman said it did not see the need for the legal case.
“YouTube respects copyright holders and takes copyright issues very seriously,” the spokeswoman said in London. Google bought YouTube in 2006.
“There is no need for legal action … We prohibit users from uploading infringing material and we cooperate with all copyright holders to identify and promptly remove infringing content as soon as we are officially notified,” Google said in a separate statement.
Lawsuits and trials in Italy are often lengthy and it is difficult to forecast the outcome.
Mediaset said a sample analysis of YouTube as of June 10 found “at least 4,643 videos and clips owned by us, equivalent to more than 325 hours of transmission without having rights”.
Mediaset said this was equal to the loss of 315,672 days of broadcasting by its three TV channels.”

Well, I have always said a lawsuit against YouTube.com is not the best idea, since YouTube is a public resource and should not be threatened. We will lose some of the freedom of the internet if YouTube loses its way because of lawsuits.

Google knows the web is big – an informative post on the Google blog

Google is one of the biggest websites, and Google has known for a long time that the web is big. The first Google index in 1998 already had 26 million pages, and by 2000 the Google index reached the one billion mark. Over the last eight years, they've seen a lot of big numbers about how much content is really out there. Recently, even their search engineers stopped in awe at just how big the web is these days, when the systems that process links on the web to find new content hit a milestone: 1 trillion unique URLs on the web at once! So how many unique pages does the web really contain? No one knows, and for practical purposes the number of pages out there is effectively infinite. Google doesn't index every one of those trillion pages; many of them are similar to each other or represent auto-generated content. But Google is proud to have the most comprehensive index of any search engine, and their goal has always been to index all of the world's data. To keep up with this volume of information, their systems have come a long way since the first set of web data Google processed to answer queries. Back then, they did everything in batches: one workstation could compute the PageRank graph on 26 million pages in a couple of hours, and that set of pages would be used as Google's index for a fixed period of time. Today, Google downloads the web continuously, collecting updated page information and reprocessing the entire web-link graph several times per day. This graph of one trillion URLs is similar to a map made up of one trillion intersections, so multiple times every day, Google does the computational equivalent of fully exploring every intersection of every road in the United States. Google's distributed infrastructure allows applications to efficiently traverse a link graph with many trillions of connections, or quickly sort petabytes of data, just to prepare to answer the most important question: your next Google search.

http://googleblog.blogspot.com/2008/07/we-knew-web-was-big.html
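
To make that batch computation concrete, here is a toy power-iteration PageRank in Python over a three-page graph. Real systems operate on trillions of links with distributed infrastructure, so this is only a sketch of the idea:

```python
# Toy power-iteration PageRank over a tiny link graph.
DAMPING = 0.85

def pagerank(links, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                 # dangling page: spread evenly
                for p in pages:
                    new[p] += DAMPING * rank[page] / len(pages)
            else:
                for target in outlinks:      # share rank across outlinks
                    new[target] += DAMPING * rank[page] / len(outlinks)
        rank = new
    return rank

links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(links))  # ranks sum to 1.0
```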

Yahoo and Microsoft gain market share, but Google still leads way ahead

Some of the most popular search sites are Google, Yahoo, Microsoft, AOL, and Ask. Among these search engines, Yahoo and Microsoft showed gains while Google declined slightly compared to the previous month. The percentages of searches handled in the US by the five search engines were 61.5%, 20.9%, 9.2%, 4.1%, and 4.3% respectively. Finally, a change: Google slips while Yahoo and Microsoft gain. Now the question arises whether Google is in trouble. The answer is obviously no, because in raw number of searches, June 2008 was another record breaker for Google. But Google dropped from 61.8% in May 2008 to 61.5% in June 2008, the first share drop over the past year since December 2007 (when it went from 58.6% to 58.4%). On the other hand, Microsoft showed its first gain in the past year: after many months of incremental decline, Microsoft rose from an 8.5% share in May 2008 to 9.2% in June 2008. Its cashback program is likely a factor in this rise, but Microsoft is hoping the program will generate more than the 0.7% share gain it has produced so far. Clearly the program isn't the massive initial game changer some thought it would be; if cashback is going to be a success, it will be something that happens over time. Let's see if that indeed happens in the coming months. Yahoo, too, is showing a rise: after months of drops with the occasional gain, Yahoo posted two straight months of gains, from 20.4% in April 2008 to 20.6% in May, then 20.9% in June 2008. There's a notable rise in both Yahoo and Microsoft. Let's look at the actual number of searches each engine handled versus market share:
Google: 7.1 billion
Yahoo: 2.4 billion
Microsoft: 1.1 billion
Ask: 501 million
AOL: 471 million
From these numbers we can see that in raw terms Google still shows a gain, going over the 7-billion-searches-served mark. Yahoo, at 2.4 billion searches, and Microsoft, at just over 1 billion, didn't break any past records, but at least got closer to territory they held a year ago. On the whole, Yahoo and Microsoft gained share compared to previous months, while Google declined a bit in share but still set a record in raw searches!
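
As a quick check, the reported shares can be approximately reproduced from these raw counts:

```python
# Reproducing the market-share percentages from the raw search counts above.
searches = {  # June 2008 US searches, in billions (from the figures above)
    "Google": 7.1, "Yahoo": 2.4, "Microsoft": 1.1, "Ask": 0.501, "AOL": 0.471,
}
total = sum(searches.values())
for engine, count in searches.items():
    print(f"{engine}: {100 * count / total:.1f}%")
# Google ~61.4%, Yahoo ~20.7%, Microsoft ~9.5%, Ask ~4.3%, AOL ~4.1% --
# close to the published 61.5% / 20.9% / 9.2%; rounding in the raw counts
# explains the small gaps.
```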

How to submit a reinclusion request – Google's official video, transcribed below

Requesting reconsideration in Google: how to get a banned site reincluded – video transcript

Posted by Mariya Moeva, Search Quality Team

Hi, I am Mariya Moeva from the Google Search Quality Team, and I'd like to talk to you about reconsideration requests. In this video we will go over how to submit a reconsideration request for your site. Let's take a webmaster as an example: Ricky, a hard-working webmaster who works on his ancient-politics blog every day; let's call it example.com. One day he checks and sees that his site no longer appears in Google search results. Let's look at some things he should check before submitting a reconsideration request. First, he needs to check whether his site's disappearance from the index may be caused by access issues. You can do that too by logging into your Webmaster Tools account: on the overview page you will be able to see the last time Googlebot successfully accessed your webpage. Here you can also check whether there were any crawling errors; for example, if your server was busy or unavailable when we tried to access your site, you would get a "URL unreachable" message. Alternatively, there may be URLs on your site blocked by your robots.txt file; you can see these under "URLs restricted by robots.txt".

If these URLs are not what you expected, you can go to Tools and select "Analyze robots.txt". Here you can see whether your robots.txt file is properly formatted and only blocking the parts of your site that you don't want Google to crawl. If Google has no problems accessing your site, check whether there is a message waiting for you in the Message Center of your Webmaster Tools account. This is the place Google uses to communicate with you about the sites you manage. If we see that there is something wrong with your site, we may send you a message there detailing the things you need to fix to bring your site back into compliance with Google's webmaster guidelines. Ricky logs into his Webmaster Tools account and checks; he doesn't find any messages. If you don't find any message in the Message Center, check whether your site is in violation of Google's webmaster guidelines; you can find them in the Help Center under the topic of creating a Google-friendly site and making your site perform best in Google. If you are not sure why Google is not including your site, a great place to look for help is our Google Webmaster Help Group, where you will find many friendly and knowledgeable webmasters and Googlers who will be happy to look at your site and give suggestions on what you might need to fix.
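
(As an aside from the transcript: outside of Webmaster Tools, you can run a similar sanity check yourself with Python's standard-library robots.txt parser. The domain and paths below are placeholders:)

```python
# Check which paths a robots.txt blocks for Googlebot, similar in spirit
# to the "Analyze robots.txt" tool described above.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

for path in ["/", "/private/", "/blog/post-1"]:
    ok = rp.can_fetch("Googlebot", "http://example.com" + path)
    print(path, "allowed" if ok else "blocked", "for Googlebot")
```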

You can find links to both the Help Center and the Google Help Group at google.com/webmasters. To get to the bottom of why his site has disappeared from the index, Ricky opens the webmaster guidelines and starts reading. The quality guidelines specifically mention completely avoiding hidden text or hidden links on a page. He remembers that at one point he hired a friend named Liz, who claimed to know something about web design and said she could make the site rank better in Google. He then scans his site completely and finds blocks of hidden text in the footer of all his pages. If your site is in violation of Google's webmaster guidelines, and if you think this might have affected the way your site is ranked in Google, now would be a good time to submit a reconsideration request. But before you do that, make changes to your site so that it falls within Google's webmaster guidelines.

Ricky removed all the hidden text from his pages; now he can go ahead and submit a request for reconsideration. Log in to your Webmaster Tools account, click on "Request reconsideration" under Tools, and follow the steps. Make sure you explain what was wrong with your site and what steps you have taken to fix it.

Once you have submitted a request, you will receive a message from us in the Message Center confirming that we have received it. We will then review your site for compliance with the Google webmaster guidelines. So that's an overview of how to submit a reinclusion/reconsideration request. Thanks for watching, and good luck with your webmastering and ranking.
