Google's blind eye towards paid links:

Recently we have been seeing more and more websites casually buying links to boost their rankings. Google, which openly opposes paid links, is nowadays turning a blind eye towards mass link buying. We are an SEO company, but we always stay away from buying links and recommend that our clients not buy them either. That's because we know paid links are one of the biggest plagues hurting search engines, yet the search engines don't seem to realize it.



Yahoo is the only search engine that seems serious about link buyers; Google never seems interested. They keep saying they want to fight paid links algorithmically, and in the process they end up penalizing some innocent sites. Almost a year ago our own rankings were devalued because Google thought the links from our widget were paid links (https://www.searchenginegenie.com/widget/seo_statistics_widget.php).

More and more people resort to link buying these days. Almost every client who comes to us having done SEO before has paid links pointing to their site, placed by previous SEO companies. So is automated detection working? I see it as a big failure: people just find innovative ways to hide paid links from Google's algorithmic detection, and the result hurts Google.

I expect Google to take stronger action on link buying, since it hurts people like us who don't buy links. Our clients push us hard to buy links, but we still stay away in order to remain within Google's guidelines. If Google continues to turn a blind eye towards paid links, companies like us may also resort to buying links, since we are left without an ethical option for tackling aggressive competition.



A few things Google should do this New Year

I want Google to introduce a few more services and products this year, and here is what I wish those new services and products would look like:
  • Connect all services and apps together. For example, when you Gmail a photo to a friend, I want Google to upload the same photo to Picasa Web Albums as well, and allow more integration and more storage space on Picasa Web.
  • Google Presentations should be compatible with PowerPoint. It should allow saving and editing of PDF attachments from Gmail in Google Docs.
  • Google should allow users to get an OpenID, like Blogger offers, for Google Accounts and Google Apps.
  • Google Profiles should be expanded and integrated, much like Facebook.




  • Google Sites should be more or less similar to Blogger and allow HTML, scripts, and iframes. It should also allow posting any number of comments, like Blogger's comment function.
  • Google should release Chrome for Mac.
  • It should digitize more old newspapers from around the world.
  • It should also digitize microfilm and books from the Family History Library in Salt Lake City, UT.
  • There should be Sitemaps for images, so that site owners can get more images into image search with the keywords they provide.
  • Google should update Google Music search to include videos and song samples. Google should allow graphical AdSense ads, but without animations.
  • Google should introduce a separate drive called GDrive. More coverage in Google Earth is also needed.
  • Google should allow users to send even .exe files from Gmail.
  • Synchronization of iTunes/iPhone with Google Calendar.


  • Another thing I have always wanted to see from them is a tool to locate and integrate datasets with similar axes and then build reports and graphs based on them, plus GrandCentral coverage in the UK and Europe.
  • I want to see a much better bookmark manager.
  • Google should have a central console containing all the Google web applications a user uses: you log on to the console, press one button to open whichever application you choose within a shell, and it is easy to swap between them.
  • It should add Gears support for Gmail and Calendar.
  • There should be integration between Grand Central and Gmail contacts.
  • They should improve Android to support more than one Google Apps account.
  • Google should take over Microsoft so that we can get an even better operating system. SMS support in Gmail for India.
  • There should be community chat rooms in Orkut.
  • They should now introduce movies on YouTube. There should be transparency in how they rank search results; it is the only long-term way to prove they are not tampering with the search engine to make more money.

These are a few wishes I want Google to fulfill as soon as possible; it would help more and more users adopt Google products.



Message center info through Google API

Webmaster Tools has just launched the Message Center GData API as part of the Webmaster Tools API. For those of you who are not familiar with GData, it's a protocol for reading and writing data on the web. GData makes it very easy to communicate with many Google services, like Webmaster Tools. The Webmaster Tools GData API has been updated to let you get even more out of Webmaster Tools, such as setting a geographic location or your preferred domain. If your site targets users in a particular geographic location, you can use the geographic target tool to provide information that helps determine how your site appears in country-specific search results and improves results for geographic queries. This feature can only be used for sites with a generic top-level domain, such as .com or .org; if no information is entered in Webmaster Tools, Google will continue to make geographic associations largely based on the top-level domain and the IP address of the web server from which the content was served.

The Webmaster Tools GData API already allows you to add and verify sites for your account and to submit Sitemaps programmatically. With the Webmaster Tools GData API you can also access and update site-specific information, which is useful when you have a large number of sites. With the API you can perform hundreds of operations in the time it would take to add and verify a single site through the web interface.
The Message Center is simply a way to get personalized information from Google in your webmaster console. Initially the messages will refer to search quality issues, but over time Google will use the Message Center as a communication channel for more types of information. For webmasters outside the US, the Message Center is capable of delivering messages in all supported Webmaster Tools languages (French, Italian, German, Spanish, Danish, Dutch, Swedish, Russian, Chinese-Simplified, Chinese-Traditional, Korean, Japanese, etc.), across all countries.

Until now you could only access these messages through the Message Center section of Webmaster Tools, but now you can also access them through GData. This way you don't need to continually check your messages in Webmaster Tools; you can retrieve the messages feed automatically and be informed as soon as possible of any critical issues regarding your sites.
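To give an idea of what retrieving the messages feed could look like, here is a minimal sketch using Python and the third-party requests library. The feed URL and the pre-obtained authentication token are placeholder assumptions for illustration only; check the Webmaster Tools GData API documentation for the exact endpoint and authentication flow.

```python
# Minimal sketch: fetch the Message Center feed over plain HTTP.
# FEED_URL and AUTH_TOKEN are placeholder assumptions, not documented values;
# consult the Webmaster Tools GData API docs for the real ones.
import requests

AUTH_TOKEN = "your-gdata-auth-token"  # hypothetical token obtained separately
FEED_URL = "https://www.google.com/webmasters/tools/feeds/messages/"  # assumed endpoint

response = requests.get(
    FEED_URL,
    headers={
        "Authorization": "GoogleLogin auth=" + AUTH_TOKEN,  # legacy GData-style header
        "GData-Version": "2",
    },
)
response.raise_for_status()
print(response.text)  # Atom XML listing your Message Center messages
```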

Source: Google Webmaster Central blog



Googlebot and If-Modified-Since


An interesting letter-style response from the Google folks on how Googlebot handles If-Modified-Since and other server response codes.


"Hello Jimmy, Let's pretend there are no anachronisms in your letter, and get to the meat of the matter. Firstly, let's look at links coming from other sites. Obviously, these can be a great source of traffic, and you don't want visitors presented with an unfriendly 'Page not found' message. So, you can harness the power of the mighty redirect. There are two types of redirect—301 and 302. Actually, there are lots more, but these are the two we'll concern ourselves with now. Just like 404, 301 and 302 are different types of responses codes you can send to users and search engine crawlers. They're both redirects, but a 301 is permanent and a 302 is temporary. A 301 redirect tells me that whatever this page used to be, now it lives somewhere else. This is perfect for when you're re-organising your site, and also helps with links from offsite. Whenever I see a 301, I'll update all references to that old page with the new one you've told me about. Isn't that easy? If you don't know where to begin with redirects, let me get you started. It depends on your webserver, but here are some searches that may be helpful:Apache: https://www.google.com/search?q=301+redirect+apacheIIS: https://www.google.com/search?q=301+redirect+iisYou can also check your manual, or the README files that came with your server. As an alternative to a redirect, you can email the webmaster of the site linking to you and ask them to update their link. Not sure what sites are linking to you? Don't despair - my human co-workers have made that easy to figure out. In the "Links" portion of Webmaster Tools, you can enter a specific URL on your site to determine who's linking to it. My human co-workers also just released a tool which shows URLs linking to non-existent pages on your site. You can read more about that here.Yours informationally,"




New record in processing 1 PetaByte of Data

According to the official Google blog:

"At Google we are fanatical about organizing the world's information. As a result, we spend a lot of time finding better ways to sort information using MapReduce, a key component of our software infrastructure that allows us to run multiple processes simultaneously. MapReduce is a perfect solution for many of the computations we run daily, due in large part to its simplicity, applicability to a wide range of real-world computing tasks, and natural translation to highly scalable distributed implementations that harness the power of thousands of computers.In our sorting experiments we have followed the rules of a standard terabyte (TB) sort benchmark. Standardized experiments help us understand and compare the benefits of various technologies and also add a competitive spirit. You can think of it as an Olympic event for computations. By pushing the boundaries of these types of programs, we learn about the limitations of current technologies as well as the lessons useful in designing next generation computing platforms. This, in turn, should help everyone have faster access to higher-quality information.

We are excited to announce we were able to sort 1TB (stored on the Google File System as 10 billion 100-byte records in uncompressed text files) on 1,000 computers in 68 seconds. By comparison, the previous 1TB sorting record is 209 seconds on 910 computers.

Sometimes you need to sort more than a terabyte, so we were curious to find out what happens when you sort more and gave one petabyte (PB) a try. One petabyte is a thousand terabytes, or, to put this amount in perspective, it is 12 times the amount of archived web data in the U.S. Library of Congress as of May 2008. In comparison, consider that the aggregate size of data processed by all instances of MapReduce at Google was on average 20PB per day in January 2008.

It took six hours and two minutes to sort 1PB (10 trillion 100-byte records) on 4,000 computers. We're not aware of any other sorting experiment at this scale and are obviously very excited to be able to process so much data so quickly."
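As a toy illustration of the map/shuffle/reduce pattern behind this kind of distributed sort (nothing like Google's actual implementation), here is a minimal single-machine sketch in Python: the map step emits a partition key for every record, the shuffle step groups records by that key, and the reduce step sorts each partition and reads the partitions back in key order.

```python
# Toy single-machine illustration of sorting with a map/shuffle/reduce pattern;
# Google's MapReduce applies the same idea across thousands of machines and
# far larger data.
from collections import defaultdict

records = ["banana", "apple", "cherry", "apricot", "blueberry"]

# Map: emit (partition_key, record); here we partition by first letter.
mapped = [(rec[0], rec) for rec in records]

# Shuffle: group records by partition key.
partitions = defaultdict(list)
for key, rec in mapped:
    partitions[key].append(rec)

# Reduce: sort within each partition, then read partitions in key order.
sorted_records = []
for key in sorted(partitions):
    sorted_records.extend(sorted(partitions[key]))

print(sorted_records)  # ['apple', 'apricot', 'banana', 'blueberry', 'cherry']
```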



Vulnerable site alerts from Google now shown in Google Webmaster Tools

Google now shows security hole / vulnerable site warning messages inside Webmaster Tools itself.

According to the Webmaster Central blog:

Recently we've seen more websites get hacked because of various security holes. In order to help webmasters with this issue, we plan to run a test that will alert some webmasters if their content management system (CMS) or publishing platform looks like it might have a security hole or be hackable. This is a test, so we're starting out by alerting five to six thousand webmasters. We will be leaving messages for owners of potentially vulnerable sites in the Google Message Center that we provide as a free service as part of Webmaster Tools. If you manage a website but haven't signed up for Webmaster Tools, don't worry. The messages will be saved and if you sign up later on, you'll still be able to access any messages that Google has left for your site.

One of the most popular pieces of software on the web is WordPress, so we're starting our test with a specific version (2.1.1) that is known to be vulnerable to exploits. If the test goes well, we may expand these messages to include other types of software on the web. The message that a webmaster will see in their Message Center if they run WordPress 2.1.1 will look like this:



Quick note from Matt: In general, it's a good idea to make sure that your webserver's software is up-to-date. For example, the current version of WordPress is 2.6.2; not only is that version more secure than previous versions, but it will also alert you when a new version of WordPress is available for downloading. If you run an older version of WordPress, I highly encourage you to upgrade to the latest version.



Google blog search gets a new look



According to the official Google blog:

"Did you know that millions of bloggers around the world write new posts each week? If you're like me, you probably read only a tiny fraction of these in Google Reader. What's everybody else writing about? Our Blog Search team thought this was an interesting enough question to look into. What we found was a massive mix: entertaining items about celebrities, personal perspectives on political figures, cutting-edge (and sometimes unverified) news stories, and a range of niche topics often ignored by the mainstream media. Today, we're pleased to launch a new homepage for Google Blog Search so that you too can browse and discover the most interesting stories in the blogosphere. Adapting some of the technology pioneered by Google News, we're now showing categories on the left side of the website and organizing the blog posts within those categories into clusters, which are groupings of posts about the same story or event. Grouping them in clusters lets you see the best posts on a story or get a variety of perspectives. When you look within a cluster, you'll find a collection of the most interesting and recent posts on the topic, along with a timeline graph that shows you how the story is gaining momentum in the blogosphere. In this example, the green "64 blogs" link takes you inside the cluster and shows you all the blog posts for a story. We've had a great time building the new homepage and we hope you enjoy using it. "





How Google evaluates search - a Google engineer talks

How Google evaluates search:

Scott of Google has given good insight into how Google evaluates search results. Read the following to get a good idea of how Google handles search evaluation:

"Evaluating search is difficult for several reasons.
  • First, understanding what a user really wants when they type a query -- the query's "intent" -- can be very difficult. For highly navigational queries like [ebay] or [orbitz], we can guess that most users want to navigate to the respective sites. But how about [olympics]? Does the user want news, medal counts from the recent Beijing games, the IOC's homepage, historical information about the games, ... ? This same exact question, of course, is faced by our ranking and search UI teams. Evaluation is the other side of that coin.

  • Second, comparing the quality of search engines (whether Google versus our competitors, Google versus Google a month ago, or Google versus Google plus the "letter T" hack) is never black and white. It's essentially impossible to make a change that is 100% positive in all situations; with any algorithmic change you make to search, many searches will get better and some will get worse.

  • Third, there are several dimensions to "good" results. Traditional search evaluation has focused on the relevance of the results, and of course that is our highest priority as well. But today's search-engine users expect more than just relevance. Are the results fresh and timely? Are they from authoritative sources? Are they comprehensive? Are they free of spam? Are their titles and snippets descriptive enough? Do they include additional UI elements a user might find helpful for the query (maps, images, query suggestions, etc.)? Our evaluations attempt to cover each of these dimensions where appropriate.



  • Fourth, evaluating Google search quality requires covering an enormous breadth. We cover over a hundred locales (country/language pairs) with in-depth evaluation. Beyond locales, we support search quality teams working on many different kinds of queries and features. For example, we explicitly measure the quality of Google's spelling suggestions, universal search results, image and video searches, related query suggestions, stock oneboxes, and many, many more."
Source: Google Blog: http://googleblog.blogspot.com/2008/09/search-evaluation-at-google.html



Duplication problems with Google - how Google handles duplicates

The official Google Webmaster Central blog has an interesting post on how Google handles duplicates.
Susan of the Webmaster Central team states:

"When we detect duplicate content, such as through variations caused by URL parameters, we group the duplicate URLs into one cluster.
We select what we think is the "best" URL to represent the cluster in search results.
We then consolidate properties of the URLs in the cluster, such as link popularity, to the representative URL.
Here's how this could affect you as a webmaster:

In step 2, Google's idea of what the "best" URL is might not be the same as your idea. If you want to have control over whether www.example.com/skates.asp?color=black&brand=riedell or www.example.com/skates.asp?brand=riedell&color=black gets shown in our search results, you may want to take action to mitigate your duplication. One way of letting us know which URL you prefer is by including the preferred URL in your Sitemap.
In step 3, if we aren't able to detect all the duplicates of a particular page, we won't be able to consolidate all of their properties. This may dilute the strength of that content's ranking signals by splitting them across multiple URLs.

In most cases Google does a good job of handling this type of duplication. However, you may also want to consider content that's being duplicated across domains. In particular, deciding to build a site whose purpose inherently involves content duplication is something you should think twice about if your business model is going to rely on search traffic, unless you can add a lot of additional value for users. For example, we sometimes hear from Amazon.com affiliates who are having a hard time ranking for content that originates solely from Amazon. Is this because Google wants to stop them from trying to sell Everyone Poops? No; it's because how the heck are they going to outrank Amazon if they're providing the exact same listing? Amazon has a lot of online business authority (most likely more than a typical Amazon affiliate site does), and the average Google search user probably wants the original information on Amazon, unless the affiliate site has added a significant amount of additional value.

Lastly, consider the effect that duplication can have on your site's bandwidth. Duplicated content can lead to inefficient crawling: when Googlebot discovers ten URLs on your site, it has to crawl each of those URLs before it knows whether they contain the same content (and thus before we can group them as described above). The more time and resources that Googlebot spends crawling duplicate content across multiple URLs, the less time it has to get to the rest of your content.

In summary: Having duplicate content can affect your site in a variety of ways; but unless you've been duplicating deliberately, it's unlikely that one of those ways will be a penalty. This means that:

You typically don't need to submit a reconsideration request when you're cleaning up innocently duplicated content.
If you're a webmaster of beginner-to-intermediate savviness, you probably don't need to put too much energy into worrying about duplicate content, since most search engines have ways of handling it.
You can help your fellow webmasters by not perpetuating the myth of duplicate content penalties! The remedies for duplicate content are entirely within your control. Here are some good places to start.
"



How To Write Winning Meta Titles: write great titles to keep yours the best-converting ones

How To Write Winning Meta Titles:

There are many tips for writing good meta titles. A meta title is the title, or name, of your page. The title is shown by the browser, usually at the top of your screen, and tells a reader what page they are on. Meta titles are "read" by search engine robots and viewed by site visitors.

The meta title is vital for helping the page rank high in search engine results and should be written with search engine robots in mind, not just site visitors. Meta titles should make sense to the reader, but the wording should be driven by keyword search popularity and relevance to the rest of the web page, its other meta data, and its content.

The four worst mistakes you can make when creating a meta title for your page are:

Not creating any title at all;
Naming your page the same name as your website;
Naming all your pages the same name, or names very similar to each other; and
Naming the page without tying it to your content and other meta data.

Be certain to use keyword selector tools and keyword density tools to help you write your Meta title.
Examples of "Bad" Meta Titles:
The following example meta title is too vague and does not give either robots or site readers enough information:
Flowers
Examples of Good Meta Titles:
· Flowers – How to Plant Flowers
· Population Statistics – 2008 United Kingdom Population Statistics
· Dessert Recipes - Best pudding Recipes
· Tax Tips – tips on how to pay less tax

The above Title tags accomplish three things:

· They help robots understand what is most important about the content on the page by repeating part of the keyword phrases found in the article titles and content;
· They make sense to people reading them; and
· By using plurals when prudent, they capture more possible keyword searches (both singular and plural forms of major keywords).

How Long Should a Meta Title Be?
Normally, a title should be long enough to be clear but short enough to avoid being "truncated." Truncation happens when a title is very long: search engine robots will only read so many characters and then move on. Different search engines read different numbers of characters, but if you keep your titles under 150 characters you will keep the most important search engine robots happy.
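As a quick illustration of the truncation point, here is a minimal sketch that pulls the <title> out of a page's HTML and flags titles longer than the 150-character figure mentioned above (that limit is simply the number from this post, not an official one).

```python
# Minimal sketch: extract the <title> from an HTML document and flag titles
# longer than the 150-character figure mentioned in this post.
import re

MAX_TITLE_LENGTH = 150  # the limit suggested above; not an official figure

def check_title(html: str) -> None:
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    if not match:
        print("Mistake: no title at all")
        return
    title = match.group(1).strip()
    if len(title) > MAX_TITLE_LENGTH:
        print(f"Title is {len(title)} characters; it may be truncated: {title[:60]}...")
    else:
        print(f"Title looks fine ({len(title)} characters): {title}")

check_title("<html><head><title>Flowers - How to Plant Flowers</title></head></html>")
```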
Tips on How to Create Meta Titles
When creating Meta titles:
Do repeat your keyword phrases;
Do tie these phrases to your content and other meta data;
Do use plurals when feasible;
Do limit the use of punctuation; and
Do use initial caps throughout the title.



Official statement - what Google feels about hidden text

A nice write-up by a Google employee on how they feel about hidden text, with examples:

"In our "Popular Picks" thread, Burchman asked for some clarification on what Google considers to be hidden text, as our Webmaster Guidelines explicitly state that you should avoid hidden text or hidden links. We have a few examples of how text can be hidden in this Help Center article: https://www.google.com/support/webmasters/bin/answer.py?answer=66353

As I've noticed other users with similar questions in this group, such as "What if my navigation menu uses display:none to hide text until a button is rolled over?" I figured this would be a good topic to cover in "Popular Picks." The reason we perceive hidden text as a problem is that it serves content to search engines which users don't see, and presenting different content to search engines and users makes it more difficult to properly rank pages. If we detect that this text is intended to deceive search engines, there could be a negative effect on how your site is indexed and ranked in Google, including removal from our index.
Because such strong action may be taken on sites violating this guideline, it's understandable that many webmasters have expressed concerns about the possibility of Google incorrectly detecting legitimate content as hidden text. When trying to figure out if a page may have hidden text that Google would consider malicious, start by thinking about why the guideline was written in the first place: Google wants to see what the user sees. If the text that Google sees is the same that a normal user is supposed to see, then you should be good to go. If Google is seeing text that is intentionally hidden from the user in an effort to manipulate search engine rankings, you have some work to do.
Let's try this approach with a page you may have seen before: https://www.google.com/
In the top-left corner, you'll see a line of text: "Web Images Video News Maps Gmail more."
Google sees this text, and so do you, the user. So far, so good.
Next, let's make sure nobody wrote "search engine search find crawl index rank" in white text on the white background, with the intention of ranking for those terms. Google would see that, but a normal user wouldn't. Take off your "normal user" hat for a second and do a "Select All" on the page (by hitting CTRL-A on a PC, or COMMAND-A on a Mac, for instance). This will make any white on white text appear. As you can see, no hidden text.

But let's try one more thing: Render the page again without CSS enabled. The Web Developer extension for Firefox lets you do this pretty easily. Without CSS, you'll see several words we didn't see before: "Blog Search Blogger Books Calendar Documents Finance Groups Labs Orkut Patents Photos Products Reader Scholar."
You may have also noticed that these words appear in Google's text- only cache of itself, which is a good indication of how Google "sees" a site. But before you blog about your discovery of hidden links on a PR 10 site =), take a look at the page again with CSS enabled. This time, click on the "more" link, and voilà, the no-longer-hidden text appears. This text is part of the page's functionality, and it is meant for the user to read and use, not just for search engines to index. This text adds value for the user, which Google rewards, so Google would not hurt this site's ranking or remove it from the index for that reason. Many sites use similar methods for navigational menus and other functional elements, so please rest assured that the existence of display:none on your site is not on its own a one-way ticket out of Google's index.
When thinking about your own site, ask yourself if all of the text is there for the user. If the answer is "yes," great job! If the answer is "no," do your best to change it to a "yes," and call on your webmaster community (this group!) for advice if you need it. CSS menus and white space without hidden text should not be a cause of concern when it comes to Google indexing and ranking; the only thing you should be concerned about is how they affect your visitors.
In the "Popular Picks" thread we asked for non-site specific questions, but now that this has been separated into its own thread, here's your chance to ask about a site you are still unsure about. Please also let me know if you would like further clarification on particular aspects this topic. "

Source: Google Groups
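In the spirit of the checks described above, here is a minimal sketch that scans a page's HTML for inline display:none styles so you can review whether each hidden element is genuinely functional (menus, "more" links) rather than stuffed keywords. It is a rough heuristic of my own, not Google's detection logic.

```python
# Minimal sketch: list elements that carry an inline display:none style so a
# webmaster can review whether each one is functional (menus, "more" links)
# or keyword stuffing. A rough heuristic only, not Google's detection logic.
import re

def find_display_none(html: str):
    pattern = re.compile(
        r'<([a-zA-Z0-9]+)[^>]*style="[^"]*display\s*:\s*none[^"]*"[^>]*>',
        re.IGNORECASE,
    )
    return [m.group(0) for m in pattern.finditer(html)]

sample = '''
<div id="menu" style="display:none">Blog Search Blogger Books Calendar</div>
<p style="color:#fff">text that may be invisible on a white background</p>
'''

for tag in find_display_none(sample):
    print("Review this hidden element:", tag)
```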



More information on 404 errors

The Google Webmaster Central blog has been posting some interesting material on 404s for a while; this time they had a post covering 404 errors and how they treat 410 responses. According to the official Google webmaster blog, a 410 response is treated the same way as a 404. More from the webmaster blog:

How do you treat the response code 410 "Gone"?
Just like a 404.

Do you index content or follow links from a page with a 404 response code?
We aim to understand as much as possible about your site and its content. So while we wouldn't want to show a hard 404 to users in search results, we may utilize a 404's content or links if it's detected as a signal to help us better understand your site. Keep in mind that if you want links crawled or content indexed, it's far more beneficial to include them in a non-404 page.

What about 404s with a 10-second meta refresh?

Yahoo! currently utilizes this method on their 404s. They respond with a 404, but the 404 content also shows a notice that the user will be taken to the homepage. We feel this technique is fine because it reduces confusion by giving users 10 seconds to make a new selection, only offering the homepage after 10 seconds without the user's input.

Should I 301-redirect misspelled 404s to the correct URL?

Redirecting/301-ing 404s is a good idea when it's helpful to users (i.e. not confusing like soft 404s). For instance, if you notice that the Crawl Errors section of Webmaster Tools shows a 404 for a misspelled version of your URL, feel free to 301 the misspelled version of the URL to the correct version. For example, if we saw this 404 in Crawl Errors: https://www.google.com/webmsters <-- typo for "webmasters", we may first correct the typo if it exists on our own site, then 301 the URL to the correct version (as the broken link may occur elsewhere on the web): https://www.google.com/webmasters

Have you guys seen any good 404s?

Yes, we have! (Confession: no one asked us this question, but few things are as fun to discuss as response codes. :) We've put together a list of some of our favorite 404 pages. If you have more 404-related questions, let us know, and thanks for joining us for 404 week!
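As a small illustration of 301-ing a known misspelled path to its correct URL, here is a minimal sketch using Python's standard-library HTTP server; the paths and port are placeholders, and on a real site you would normally configure this in the web server itself (Apache, IIS, etc.).

```python
# Minimal sketch: answer a known misspelled path with a 301 pointing to the
# correct URL, and everything else with a plain 404. Paths and port are
# placeholders; real sites usually configure this in Apache/IIS instead.
from http.server import BaseHTTPRequestHandler, HTTPServer

REDIRECTS = {"/webmsters": "/webmasters"}  # misspelled path -> correct path

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            self.send_response(301)                      # permanent redirect
            self.send_header("Location", REDIRECTS[self.path])
            self.end_headers()
        else:
            self.send_response(404)                      # honest "not found"
            self.end_headers()
            self.wfile.write(b"Page not found")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()
```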



How to start a multilingual site: help from Google on making a Google-friendly multilingual site

This post is all about how to start a multilingual site and the various advantages of having one. A multilingual site is a site that serves its content in different languages. The first thing you'll want to consider is whether it makes sense for you to acquire country-specific top-level domains (TLDs) for all the countries you plan to serve. This option is beneficial if you want to target the countries each TLD is associated with, a method known as geotargeting. Geotargeting is different from language targeting. Geotargeting is for sites whose main audience is in a particular region of the world, and it allows you to set different geographic targets for different subdirectories or subdomains (e.g., /de/ for Germany). Language targeting, on the other hand, aims to reach all speakers of a particular language around the world, where you probably don't want to limit yourself to a specific geographic location; in this case you don't want to use the geographic target tool. Since it's difficult to maintain and update multiple domains, it's often better to buy one non-country-specific domain that hosts all the different versions of your website. In that case, there are two recommended options:

The first option is to place the content for each language on a different subdomain. For our example, you would have en.example.com, de.example.com, and es.example.com.
The second option is to place the content for each language in a different subdirectory. This is easier to handle when updating and maintaining your site. For our example, you would have example.com/en/, example.com/de/, and example.com/es/.
Some may wonder whether posting the same content in different languages counts as duplicate content. Definitely not, but you should make sure that your site is well organized. Always avoid mixing languages on a page, as this can confuse Googlebot as well as your users; it's best to keep the navigation and content in the same language on each page. You can also see how many of your pages are recognized in a certain language by performing a language-specific site search. A multilingual site benefits both the site owner and its visitors, who get information in their own language. For example, when someone wants to find fashion design institutions in London, they may type that query into search along with the language they want the page displayed in, and it is much more comfortable to get the information in a language they know and understand.
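As a small sketch of the two URL layouts described above, here is how you might generate the subdomain and subdirectory variants for a list of language codes; example.com and the language codes are just the placeholders used in this post.

```python
# Minimal sketch: generate the subdomain and subdirectory URL layouts
# discussed above for a set of language codes. example.com and the codes
# are just the placeholder values used in this post.
LANGUAGES = ["en", "de", "es"]
DOMAIN = "example.com"

subdomain_urls = [f"https://{lang}.{DOMAIN}/" for lang in LANGUAGES]
subdirectory_urls = [f"https://{DOMAIN}/{lang}/" for lang in LANGUAGES]

print("Option 1 (subdomains):    ", subdomain_urls)
print("Option 2 (subdirectories):", subdirectory_urls)
```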

Official post https://googlewebmastercentral.blogspot.com/2008/08/how-to-start-multilingual-site.html
