Maile Ohye of Google defines IP delivery, geolocation, and cloaking

Hi, my name is Maile Ohye, and I am a Support Engineer with Google Webmaster Central. Today I want to discuss the fairly advanced topic of IP delivery. We'll talk about the background and some considerations if you choose to use IP delivery on your site.

Today's topic will cover four areas. First, we'll discuss the background of IP delivery and why a webmaster might choose to implement a technique like IP delivery. Next, I'll talk about how Google.com serves our users, and you'll see some of the ways we serve users based on their IP addresses. After you've seen some of the techniques used at Google.com, I'll show you some techniques not to use: these are examples of major websites that use IP delivery in sub-optimal ways, and they are things to shy away from. Last, we'll recap design considerations for when you want to design your site for IP delivery. So first: why would a webmaster choose IP delivery?

One major reason is that IP delivery helps target information to users. Let's say you have a .com URL, your business is all in English, and you are doing very well in the United States. If you want to broaden your marketplace and perhaps serve users in Europe, you'll realize that a potential customer in Germany has different needs than an American user. For example, they might speak a different language, and they'll have different regional concerns, such as "what is the shipping tax when you ship a product to my country?" That's where IP delivery comes into play. IP delivery is the process of delivering specific content to users based on their IP address. If you can detect a user's IP address when a request comes in and work out what region they are coming from, you can target specific content, such as ads, that is more pertinent to their region. For instance, for a user coming from California based on their IP address, you might say "we have low shipping costs to California," while for a user detected to be in Germany, you might say "no taxes or handling fees for users in Germany."

Let's make this more concrete by looking at what Google does. Consider a scenario where a user is in Switzerland, so they have a Switzerland-based IP address, and their browser is set to German, the language of the region. If they visit www.google.com, rather than being shown the content of www.google.com they will likely be redirected to www.google.ch, which is on Switzerland's top-level domain, and the content there will be in German as well. In this instance Google utilizes not only the IP address but also the language settings.

Now, a slight variation on this scenario: let's say the user is in Switzerland, perhaps an American just vacationing there. They still have a Switzerland IP address, but their browser is set to English. When this user visits www.google.com, instead of being redirected, their URL will usually remain www.google.com. They see content similar to what most of us see in the United States, but because they have a Switzerland IP address, the page is updated with a link saying "Go to Google Switzerland." Here is an area where Google uses an IP address to serve the user better information.

So now you've seen some of the things Google does. What are some of the how-not-to's? These are mishaps we see on the web. One example is a website that decides to broaden its market by translating all of its existing content, but then serves the translated content on the same URLs without modifying its site structure. This is problematic, because each URL should serve largely the same content. Several issues arise when it doesn't. For one, users cannot share URLs with people in a different IP range: if I see a great product on your site from America and send the URL to a friend in Japan, they might be shown something completely different, perhaps entirely in Japanese, when what we wanted was to look at the same content. Another side effect of serving different content, such as different languages, on the same URLs is that search engine crawlers can come from all over the world.
Those crawlers can come from a number of IP addresses. Let's say your .com site serves 90% of its users in English and you are trying to reach the 5% who speak German, so you would rather have search results show your English content. If a search engine crawls you from a German IP and you serve it German content for those URLs, it is very possible that the search engine will overwrite your English data with that German content, which can result in titles and descriptions in a different language than you intended.

Another how-not-to is serving Googlebot different content than you serve to users. This is called cloaking, and it is a violation of our webmaster guidelines. Remember: if you are implementing IP delivery, you want to serve Googlebot the same content you serve to users with a similar IP address.

So now you've seen some things Google does and some things not to do; let's turn to design considerations. As we discussed earlier, keep each URL consistent and serve largely the same content on each URL. This means that if you have dynamic portions, you contain them, or limit them to small areas.
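The rule above can be made concrete in code: the content decision depends only on the requester's IP region, never on the user agent, so a crawler from a German IP sees exactly what a German visitor sees. Here is a minimal sketch in Python; the prefix-to-region table and notice strings are purely illustrative (a real site would use a GeoIP database such as MaxMind's):

```python
# Minimal sketch of non-cloaking IP delivery. The prefix table is a
# stand-in for a real GeoIP lookup; the prefixes below are hypothetical.

REGION_BY_PREFIX = {
    "84.": "CH",   # hypothetical Swiss range
    "91.": "DE",   # hypothetical German range
    "64.": "US",   # hypothetical US range
}

REGIONAL_NOTICE = {
    "CH": "Go to Google Switzerland",
    "DE": "No taxes or handling fees for users in Germany",
    "US": "Low shipping costs to California",
}

def region_for_ip(ip: str) -> str:
    """Return a region code for an IP address, defaulting to US."""
    for prefix, region in REGION_BY_PREFIX.items():
        if ip.startswith(prefix):
            return region
    return "US"

def notice_for_request(ip: str, user_agent: str) -> str:
    """Pick the small regional element for a request.

    user_agent is deliberately ignored: Googlebot crawling from a
    German IP gets the same content as a German user, which keeps
    this IP delivery rather than cloaking."""
    return REGIONAL_NOTICE[region_for_ip(ip)]
```

Because the user agent never influences the output, the dynamic portion stays contained to one small regional element while the rest of the page is identical for every visitor to that URL.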

For example, on Google we have a link that says "Go to Google Switzerland." Another approach is to show everyone the same product but vary small regional elements, like coupons that say "low shipping costs to Germany." In tandem with that idea, you can create separate URLs for more varied content. If you have translated your content into different languages, remember to create subdomains or subdirectories, or even obtain a top-level domain, for that information. For example, you might put German content on example.com/de or on the top-level domain example.de. And if you use subdomains or subdirectories, remember that once they are verified you can use the geotargeting feature in Webmaster Tools: there you can take example.com/de and target it to Germany. Last, keep in mind that knowing your users' IP addresses and utilizing IP delivery doesn't solve every problem. You also need to understand users and their browser settings, because you might have an English-speaking user who is on vacation in Germany. Remember that you can use the Accept-Language header that comes with the request to give your users the most optimized results.

Thanks so much for watching this section on IP delivery. For more webmaster information, please check out Webmaster Central at google.com/webmasters.
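The language header mentioned above, HTTP's Accept-Language, lists the user's preferred languages with optional quality ("q") values. A small parsing sketch in Python (the helper name is my own; the header format itself comes from the HTTP specification):

```python
def preferred_languages(accept_language: str) -> list[str]:
    """Parse an Accept-Language header value into language tags
    ordered by their q values, highest first. A missing q defaults
    to 1.0, per the HTTP specification."""
    ranked = []
    for part in accept_language.split(","):
        piece = part.strip()
        if not piece:
            continue
        tag, _, q = piece.partition(";q=")
        try:
            quality = float(q) if q else 1.0
        except ValueError:
            quality = 0.0  # malformed q value: rank it last
        ranked.append((tag.strip(), quality))
    ranked.sort(key=lambda item: item[1], reverse=True)
    return [tag for tag, _ in ranked]
```

For the Swiss-IP visitor whose browser sends `en-US,en;q=0.8,de;q=0.5`, this puts English first, so the site can keep the page in English and merely offer a "Go to Google Switzerland" link instead of redirecting.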

Maile Ohye

How to prevent Google Bowling – Interesting discussion in WMW

I came across an interesting discussion on WebmasterWorld about how to prevent damage to a site from links on other sites. You can read more about the discussion here: http://www.webmasterworld.com/google/3677877.htm

Tedster, a long-time member and WebmasterWorld administrator, answers it well. I am very impressed with his answer and agree 100%.

Even if you block IP addresses or redirect pages, the links that point to your website are still there, and they’re on other sites that you can’t control. Those links will have whatever effect they have with Google’s algo. There is one thing that protects a website against Google Bowling – a solid backlink profile of its own. The more your “real” quality backlinks grow, the less anyone else’s malicious actions can affect it.

In other words, as Tedster says, a good backlink profile of your own will automatically outweigh any harm from bad links pointed at your site.

PageRank craze: whitebar, greybar, or greenbar

The picture says it all. People like to have more green in their Google Toolbar PageRank bar.

Martin Buster: good post on Brett's link theme pyramid

Martin Buster, a WebmasterWorld moderator, made an interesting post where he discusses links using Brett Tabke's link theme pyramid. He says:

“1. Anchor text should match the page it’s linking to. If the anchor says red widgets, particularly for a page meant to convert for red widgets, it should have the phrase red widgets on the page.
I know some people will say this opens you up to OOP but I think as long as there are variations in the links, then you’re good to go. Because of the natural non-solicited links I’ve received on some sites, I’ve become a believer in the ability of the linking sites relevance to a query being able to transfer over to the linked-to page.
Why would one consider a page about red and blue widgets to be relevant for blue widgets? Looking at it from the point of view of relevance to the query, does it make sense to return a page about red and blue when the user is looking for blue? Looking at it from the point of conversions, if someone is querying for blue doesn’t it make sense to return a page dedicated to blue?
PPC advertisers understand the value of having a landing page that matches the query. PPC advertisers understand the value of an optimized ad for inspiring targeted and converting click-through. Organic SEO should follow suit. A dedicated organic page can utilize a specific title and meta description for the same purpose. This means building specific links to specific pages.
I don’t think it’s adequate for the search user to query babysitting for boys and get a page for babysitting in general. So why build links to a general page when a specific page will not only be more relevant but convert better?
2. Hubs Getting back to Brett’s theme pyramid, imo general anchors should point to general pages. Specific anchors should point to the specific pages. I don’t understand why people are trying to obtain specific anchors to general pages.
Why are hub pages being created that are simply a big page-o-links to specific pages? Hubs are great starting points, imo they should be more than a page of links. I think this is especially critical for e-commerce where high level topics include brands or kinds of products and sub-pages include models or specific manufacturers.
These second level pages can be cultivated to perform for more general terms, but also in conjunction with, for example buy-cycle long tail phrases like reviews, comparison, versus, etc. Take that into account for the link building.
3. Is the home page really the most relevant page of the link? Here is another place where link building is wasted, imo. I think it makes sense to focus on relevance/links to supporting pages that then create a groundswell of relevance back to the home page for the more general terms.
Reviewing affiliate conversions and AdSense earnings, it’s been my experience that specific pages perform better than general home pages. If you’re lucky or by design people will click through to the pages they are looking for. But shouldn’t you be showing those pages to the user first? And don’t you think the search engines want to show those specific pages too? I think this may explain some ranking drops some people are experiencing for home pages that used to rank for multiple terms.
4. Longtail Matching This is where on page SEO comes into play. This refers to geographic and buy-cycle phrases. Building partial matches works, imo. Someone showed me a site that was a leader in specific searches but those pages would perform better if they had the names of cities and provinces on the page. Ranking for Babysiting for Boys is fine, but Babysiting for Boys + (on page) Tampa is better. “

Source: webmasterworld.com/link_development/3676520.htm

Matt Cutts says widgets are OK as long as they're not abused

In a recent interview with Eric, Matt Cutts, the head of Google's webspam team, agreed that widgets are a legitimate form of link bait when used properly.

We at Search Engine Genie provide a PageRank button, a useful widget that lets visitors see the PageRank of your page without installing the Google Toolbar; they can view the PageRank right from the button we provide. Our PageRank button uses our custom code to query Google for the PageRank of a page and display it on your website. We are in the process of developing more widgets, especially one that will query Google, Yahoo, and MSN for the number of pages indexed and the number of backlinks and display them on your page. It will be released in a week.

Matt Cutts also reiterated that widgets built for the purpose of spamming the search engines are not acceptable. Examples include hiding links in a web counter, or linking to random sites to push the linking page's rankings. Links placed in non-embedded tags or in noscript when using widgets are also spam. We at Search Engine Genie never resort to those types of tactics.

We have already discussed how many aggressive SEOs contact WordPress theme developers and insert their link as a credit back to their site. This again is an aggressive tactic, and Matt Cutts has warned against using such tactics to boost search engine rankings.

You can expect a lot more widgets from Search Engine Genie in the coming months; our programmers are working on them. We want Search Engine Genie to be a useful hub for webmasters and site owners.

Good Luck,
Search Engine Genie Team

Not enjoying all this spam

I get really ****** off when I see emails like this ending up in our company mail ID.

Providing you 100% Manual Directory Submissions With Page Rank Details.

I am Niraj from India.
Providing you 100% Manual Directory Submissions With Page Rank Details.
I have one year of experience of directory submissions work.
and I am done 1000 directory submissions work all are manually in 8 to 10 hours or so.
I am doing it every day. It’s my Job and passions as well.
And therefore I get $500 (Rs. 20000) per month. All because of Directory Submissions work.
You would be quite get surprise that my salary increase because of that below reason.
Before six month my salary is $250 (10,000) and I get other same kind of directory submissions job in one of our competitor company suddenly I put resignation in my firm.
And my manager told me that why u should leave our firm. My answer is that I will get $375 (Rs. 15000) per month suddenly my manager double my salary put my increment 100% and give me $500 (Rs. 20000) per month. And told me that never think to leave our company.

I am the only person who done 1000 directory submissions in a one day (8 to 10 Hours).

And then I got an idea to start my business own.

And providing manually Directory Submissions Service.

At this below price I will get some order in small amount directory submissions work like 5 to 10 sites directory submissions in one month.
100 Directories: $10.00
250 Directories: $20.00
500 Directories: $35.00
750 Directories: $45.00
1000 Directories: $65.00

But I can’t get any regular client yet for directory submissions work.
I think you are one that client that’s why I describe you all that.

R u give me regularly that kind of work.
I have a staff of five person they can submit 500 Directory per day as well.
I am sure u will find grate quality as well.
Regards
Niraj Patel”

I would have expected at least some decent English. This letter is horribly written; I can't believe he expects to get any business from it.

Google and Microsoft still fighting for Yahoo

Microsoft Google Fight for Yahoo

Google has recently signed a contract with Yahoo to serve ads on Yahoo's search results. Though critics call the deal unethical and say it will spoil competition in the search engine industry, Yahoo is ready to go ahead with it.

On the other hand, Microsoft offered Yahoo a billion dollars annually for advertising in Yahoo's search results. According to a Reuters news source:

“Microsoft Corp offered Yahoo Inc $1 billion in cash to buy its search business in a deal that would have delivered $1 billion in additional annual operating income to Yahoo, a source familiar with Microsoft’s thinking said on Friday.
In an alternative to a full acquisition, Microsoft would have taken control of Yahoo’s search business, delivering the company better rates for advertisements tied to its search results than Yahoo’s current Panama advertising system, the source said.
Microsoft would have also paid $8 billion to take a 16 percent stake in Yahoo, which would have valued the company’s stock at $35 a share, the source said.”

Matt Cutts reiterates that the Yahoo Directory has plenty of PageRank internally

WebmasterWorld has an interesting discussion about why the Yahoo Directory is showing a grey bar for a lot of pages. We have already discussed why a web page shows a grey PageRank display, and most probably the reason here is one of the reasons we addressed in our article. Some WebmasterWorld members are now speculating that Google has purposely imposed a grey-bar penalty on the Yahoo Directory to reduce competition. For major sites like the Yahoo Directory, though, a grey PageRank display could simply be the same issue inner pages of many established sites face: Google appears to apply some sort of automated filter to inner pages that lack good external links or that it doesn't consider valuable. We see this across many of the pages on our own site. I feel this is just temporary, and Matt Cutts has cleared it up.

Matt Cutts replied in the thread: ” It looks like it’s just a matter of canonicalizing upper vs. lowercase as to why some of the subdirectories look the way they do in the toolbar. I just wanted to reiterate that the Yahoo Directory has plenty of PageRank in our internal systems.”
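Canonicalizing upper vs. lowercase, as Matt describes it, just means collapsing URL variants that differ only in letter case to one canonical form before aggregating signals like PageRank. A toy sketch in Python (the function name is my own, and treating the path as case-insensitive is an assumption that fits directory-style sites; in general, URL paths are case-sensitive):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize_case(url: str) -> str:
    """Lowercase the scheme, host, and path of a URL so variants
    differing only in case collapse to one canonical URL. The query
    string is left untouched, since query values may be case-sensitive."""
    parts = urlsplit(url)
    return urlunsplit((
        parts.scheme.lower(),
        parts.netloc.lower(),
        parts.path.lower(),
        parts.query,
        parts.fragment,
    ))
```

With this in place, signals for a mixed-case subdirectory URL and its lowercase twin accrue to the same page instead of being split between two toolbar-grey variants.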

Search Engine Genie's May traffic reached 140,000 uniques

We are proud to announce that we reached 140,000 unique visitors in the month of May. It was our biggest month yet, and we are very happy to have become a good hub for webmasters. The site totalled 3,622,285 hits.

Our favourite and most visited area is our tools section. If you haven't checked out that page, please do: http://www.searchenginegenie.com/seo-tools.htm

June ranking drop reported for a lot of sites

WebmasterWorld members are reporting ranking losses for a lot of sites in the first week of June. Some of our client sites were shaken, but not to the extent discussed in the thread here:

www.webmasterworld.com/google/3668739.htm

A member posted the following message, which gives a little insight into what is being discussed:

“Initially, I thought there were some problems with the geo filtering being screwed up with regards to the UK, but I’m not sure about this statement in isolation. Maybe it’s a bug and maybe it’s not.
Since it’s hard to pinpoint we really need a lot of information to decipher the problem or adjustment that G has put in place . If it is only UK related sites, then my hunch is that this could be rolled out in other regions with more aggressive filtering.
Some things I’d like to know or qualify , especially from the long established members here or folks with long established sites, are things like this :
– is this purely selective – is it only high PR sites and if so what level of PR – is it only UK related sites and does that mean the TLD and/or hosting [ chief suspect is geo filtering issues ] – do these sites have an inbalance of linking techniques e.g. lot’s of navigation IBL’s , footer IBL’s , – do these sites publish frequent content
My observation so far is that it has been
– highly selective in a competitve niche to a minority of sites [ total stability around the site i watch ]
– effected by a combination of factor [ not sure what right now ie one factor in isolation isn’t enough to send a site down, more one event combined with another is. This is because I’m observing others using the same techniques which are unaffected ]
-effected by introduction of high PR link/s , leading to – recent upward PR increases [ not visible on toolbar ]which have caused a re assessment of the site’s “trust” rank. – it only effects UK TLD’s [ need more info on this ] – a further discounting of low value pages bringing the overall PR down – thin affiliate sites or sites with aggregated content been only effected [ not 100% on this – just some sites I’m watching ] – linking stagnation or momentum altered – lack of fresh , original content
Actually, my feeling is that it’s nothing new , it’s just more aggressive in it’s selection of sites
In the case of one site I’m observing, for any phrase or content [ exact match and broad match ] the whole site has been tanked to between -40 to -60 on phrases that should rank. Not one single exception. When it initially disappeared from Google it was completely off the index for 3 days.
This might correspond to recent discussion by Googler John Mu about the – 60 penalty which appears to have been acknowledged , which would seem to be based on a recent change at G. Some folks have reported improvements coming back quickly with a site clean up.
But there’s not enough reports to be sure.
My concern is that the recovery could be indefinite and have effected the trust rank of effected sites – and of course communicating with G through WMT leaves webmasters are exposed to hand checks which could expose other inadvertent nasties and one way communication. “
