Online shoppers turn to Google for search – report
A new report shows Google is the leading search engine among online shopping searchers.
Online shoppers picked Google Inc. as their search engine of choice this December while making their holiday Web purchases, according to a report issued on Wednesday.
Internet measurement firm Hitwise found that 11.1 percent of all December shopping-related visits originated with Google, a 28 percent jump over last year. But online auction giant eBay Inc was the biggest driver of traffic to shopping sites, generating more than 13 percent of retail traffic.
Search engines Yahoo! Search and Microsoft Corp.’s MSN Search drove 4.05 percent and 0.79 percent of retail visits, respectively.
Online retail giant Amazon.com, which is the second most visited online shopping site after eBay, generated only 0.75 percent of visits to other shopping retailers, Hitwise found.
For more details, read the full Hitwise report.
Google Earth now available for Macs
Google has always focused its products on Windows users, and it is true that Windows users make up the bulk of its search audience,
but Google has shifted a bit and is now offering Google Earth to Mac users.
Google says:
“Available in beta for 10.4.x OS [that’s Tiger], Google Earth for Mac offers the same features as the PC version, such as animated driving directions, zoom in and out capabilities, 3D buildings view, and more. Google Earth Plus and Google Earth Pro are not yet available for Mac.”
The Google Earth download page carries the following message:
“Download Google Earth – Mac or PC
Google Earth is a broadband, 3D application that not all computers can run.
Desktop computers older than 4 years old may not be able to run it.
Notebook computers older than 2 years old may not be able to run it.”
Google’s Bigdaddy update goes on and off – the upcoming update is being tested repeatedly
Google’s upcoming update, named “Bigdaddy” by Matt Cutts of Google, is being tested. The test datacenter http://66.249.93.104/ comes online and goes offline repeatedly. As Matt says, these results are expected to go live in another two weeks, so check your rankings in this DC.
Matt Cutts asking for feedback on various issues related to Google
Matt Cutts of Google has asked for feedback on various topics related to Google.
He asks for feedback on the following issues:
Feedback: Webspam in 2006?
Feedback: Search quality in 2006?
Feedback: Products/features in 2006?
Feedback: Webmaster services in 2006?
Feedback: Communication/Goodwill in 2006?
Feedback: What did I miss?
Give your feedback on Matt’s blog:
mattcutts.com/blog/
Review of the new Google update – Big Daddy – New Year special
Review of the new Google update, Big Daddy:
Matt Cutts, Google’s senior engineer, has asked for feedback on the new results that went live on test datacenters 64.233.179.104, 64.233.179.99, etc.
Matt Cutts has asked semi-officially for feedback on the new index. Currently the new results are live on http://66.249.93.104, http://66.249.93.99, etc. The 66.249 range is steady and has been showing the new results since January 1, 2006. Although Matt said 64.233.179.104 would also be a test DC, that datacenter does not look steady; test results come and go there.
Looking at the results on the test DC, we see many issues fixed:
1. Better logical arrangement of pages in site: and domain-only searches
Whether deliberately or by mistake, Google was unable to logically arrange the results of a site: search or a domain-only search. The homepage is the most important page for many sites, yet Google buried it when doing a site:domain.com search or a domain-only search. In the new datacenter we see a more logical arrangement: the pages of a site are listed in proper order compared to the other datacenters. Yahoo has been excellent at arranging site: searches and still shows the best URL ordering for a site. Google is known to hide a lot of useful information from end users, especially in the link: search for a site; Google deliberately shows less than 5% of links through link: search, confusing many webmasters, so they probably do the same with site: search. At least the homepage issue has a fix in the new Big Daddy datacenters.
2. Removal of URL-only entries / URL-only supplemental results
For a very long time Google has shown URL-only listings in the SERPs (search engine result pages) for some sites. The most likely causes of URL-only listings / supplemental results are: duplicate pages within a site or across sites; pages with no links pointing to them; pages that were once crawlable but later were not; pages that no longer exist; pages Google has not crawled for a very long time, possibly because of some sort of automated penalty; pages that were once crawled well but became uncrawlable due to web host problems; redirect URLs; and so on. It seems Google has fixed this major issue. URL-only listings add no value to Google’s index, so it is good that they have been removed from the search results. We still see some supplemental pages hanging around here and there, but most of them appear to be caused by duplicate content across sites.
Recently Google has been very severe on duplicate content, especially syndicated articles: if many copies of the same article are published across sites, Google turns some of the duplicates into supplemental results. Only those results seem to stick in the new DC; other than that, no supplemental results are found.
For example, compare the result from the main www.google.com DC with the result from the updated datacenter, 66.249.93.104 (verified).
Better page counts for a site:
Recently Google has been showing vague page counts for the sites it indexes. For one site we work on, Google has been showing 16,000 pages indexed, whereas the site itself has no more than 1,000 pages. The new DC shows only 550 pages indexed, which is a good sign: all 550 are unique pages without supplementals, and we can expect them to rank soon. This is a very good improvement; we have seen the same accuracy across many sites we monitor, and the new DC gives much better page counts.
Better URL canonicalization:
We see a big improvement in Google’s understanding of the www and non-www versions of a URL, and we have checked it across a lot of sites. For example, for the searches dmoz.org and www.dmoz.org, Google now lists only one version of the site, dmoz.org; before, it showed one version as URL-only and the other as a real listing, which caused a lot of duplicate issues. We see the same fix across the sites we monitor. Most of the results are now very good because Google handles these redirects well, which is a very good sign for a lot of sites. The biggest beneficiaries are IIS-hosted sites, which don’t have .htaccess: many such sites were unable to 301 redirect to a single version of the URL, either www or non-www. Now they don’t have to, since Google can understand that both URLs belong to the same site.
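If you want to verify what your own server does for the two hostnames, the short Python sketch below (not from the original post; example.com is a placeholder domain) requests both versions and prints the status code and Location header, so you can see whether one version already 301s to the other.

```python
# Minimal sketch: see whether the www and non-www hostnames of a site
# answer directly or redirect to one canonical version.
# "example.com" is a placeholder domain, not a site from this post.
import http.client

def check(host):
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("GET", "/")
    resp = conn.getresponse()
    # A 301 whose Location header points at the other hostname means the
    # site already redirects to a single canonical version.
    print(host, "->", resp.status, resp.reason, resp.getheader("Location"))
    conn.close()

for host in ("example.com", "www.example.com"):
    check(host)
```

If neither hostname redirects, adding a single 301 from one version to the other is still the cleanest fix where the server allows it; the point above is that Bigdaddy appears to cope better when that is not possible.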
Better handling of 302 redirects:
The new datacenter is doing well at handling sneaky 302 redirects, which used to confuse Googlebot a lot. There have been numerous discussions of this 302 redirect issue; Google has been monitoring the problem closely and finally came out with a good fix in the new Big Daddy update DCs.
For example, check here:
http://www.google.com/search?hl=en&q=nigerianspam
Here you can see a page from anybrowser.com carrying the same title as the nigerianspam homepage.
http://66.249.93.104/search?hl=en&lr=&q=nigerianspam&btnG=Search
Here the anybrowser.com page carrying the same title as the nigerianspam homepage has been removed / is missing.
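To check what a suspect URL really does, you can walk its redirect chain one hop at a time. The Python sketch below is only a generic illustration (the jump-script URL is a placeholder, not one of the examples above): it prints the status code of every hop, so a sneaky 302 pointing at someone else’s page stands out immediately.

```python
# Minimal sketch: follow a redirect chain one hop at a time and print each
# status code, to spot jump scripts that 302 to another site.
# The starting URL is a placeholder, not a URL from this post.
import http.client
from urllib.parse import urlsplit, urljoin

url = "http://example.com/jump.php?id=1"
for _ in range(5):  # follow at most five hops
    parts = urlsplit(url)
    conn_class = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
    conn = conn_class(parts.netloc, timeout=10)
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    conn.request("GET", path)
    resp = conn.getresponse()
    location = resp.getheader("Location")
    print(resp.status, url)
    conn.close()
    if resp.status not in (301, 302, 303, 307) or not location:
        break
    url = urljoin(url, location)  # the Location header may be relative
```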
PageRank of canonical-problem URLs not fixed:
Google has not fixed the PageRank of sites with canonicalization problems. We hope it will be fixed in the coming days.
Some jump scripts that use 302 redirects are not fixed:
We hope this will improve soon. Example:
http://66.249.93.104/search?q=inurl:www.searchenginegenie.com+-site:www.searchenginegenie.com&hl=en&lr=&start=10&sa=N
Search relevancy not accurate in either the current results or the new DC:
When we searched for DVD (digital versatile disc) in Google, we found google.co.uk (Google’s UK regional domain) ranking in the top 5. That is poor relevancy, since google.co.uk has nothing to do with DVDs. We hope it will be fixed soon.
http://www.google.com/search?hl=en&lr=&q=dvd
http://66.249.93.104/search?hl=en&lr=&q=dvd
Possible duplication caused by Google’s new, improved page indexing:
Google used to have a 101 KB indexing limit per page; that limit has now been relaxed, and Google can crawl more than 500 KB. But it seems the old 101 KB index is still around and often shows up in search. We see it across our client sites; here is an example on a Wikipedia page:
http://www.google.com/search?hl=en&lr=&q=%22en.wikipedia.org%2Fwiki%2F2004_Indian_Ocean_earthquake
It is not fixed in the new datacenter either:
http://66.249.93.104/search?hl=en&lr=&q=%22en.wikipedia.org%2Fwiki%2F2004_Indian_Ocean_earthquake
That is all for this review; more will follow soon.
SEO Blog Team,
Matt Cutts AKA GoogleGuy – who is Matt Cutts?
Matt Cutts is no new name in the online search engine marketing industry, and his name is not limited to one specific field. His credits include a respectable job at Google, a blog that attracts all kinds of readers, and a commendable fondness for insects!
Matt Cutts joined Google in January 2000 as a software engineer and got a fabulous chance to implement the first test of the AdWords user interface. Matt has spent most of his time in Google’s quality group and eventually implemented Google’s SafeSearch, the family filter.
Matt has an M.S. from UNC-Chapel Hill and a double degree in Mathematics and Computer Science.
The blog Matt maintains has become a great resource for those fascinated by search engine news, and it also provides a channel of communication with webmasters.
Matt used to post under the nickname GoogleGuy on SearchEngineWatch and WebmasterWorld, and he continues to post with that ID even now.
Do visitors from Google Images convert? A very interesting thread discussion on webmasterworld.com
There is an interesting thread on webmasterworld.com that discusses whether visitors from Google Images convert. In my view, it all depends on the site you run: if the site is built around images and art, it is best to feature in Google Image search.
Here is an interesting posting from the same thread:
My site was basically built around a large photo gallery, so image search is important to me, and generates a big share of my traffic. But like a lot of image galleries, the visitor may not be actually looking to buy anything. Sometimes they are actually looking for images, prints, posters, etc, none of which I sell. But I do have ads on the site and a modest number of visitors click the ads on any given image page.

What has worked well for me is providing links to pages about stuff some percentage of visitors coming in through image searches might be interested in, and putting ads on those pages that they are more likely to click.

As a side note, because my images get “borrowed” at a fairly high rate, I decided to label most of them with my url. Those “borrowed” images appear to generate a pretty decent level of fairly targeted traffic. Basically I turned the images themselves into ads for my site. That might be harder to do with an image that comes from a product description page, but it might be worth giving some thought as to how to make it work in that situation.
How to fix/remove supplemental results from Google
Steveb of WebmasterWorld has an excellent posting on how to remove supplemental results. I agree 100% with what he says, and I recommend his posting to everyone who has supplemental results in Google and wants to remove them. Supplemental results mostly appear when a page of a site once existed and was later removed by the site owner, or was dropped because of some other problem. They also appear when a page that was once crawled had links pointing to it and those links later dropped off completely.
Here is his posting (a small script sketch for one of its steps follows the quote):
“Google’s ill-advised Supplemental index is polluting their search results in many ways, but the most obviously stupid one is in refusing to EVER forget a page that has been long deleted from a domain. There are other types of Supplementals in existence, but this post deals specifically with Supplemental listings for pages that have not existed for quite some time.
The current situation: Google refuses to recognize a 301 of a Supplemental listing. Google refuses to delete a Supplemental listing that is now a nonexistent 404 (not a custom 404 page, a literal nothing there) no matter if it is linked to from dozens of pages. In both the above situations, even if Google crawls through links every day for six months, it will not remove the Supplemental listing or obey a 301. Google refuses to obey its own URL removal tool for Supplementals. It only “hides” the supplementals for six months, and then returns them to the index.
As of the past couple days, I have succeeded (using the below tactics) to get some Supplementals removed from about 15% of the datacenters. On the other 85% they have returned to being Supplemental however.
Some folks have hundreds or thousands of this type of Supplemental, which would make this strategy nearly impossible, but if you have less than twenty or so…
1) Place a new, nearly blank page on old/supplemental URL.
2) Put no actual words on it (that it could ever rank for in the future). Only put “PageHasMoved” text plus link text like “MySiteMap” or “GoToNewPage” to appropriate pages on your site for a human should they stumble onto this page.
3) If you have twenty supplementals put links on all of them to all twenty of these new pages. In other words, interlink all the new pages so they all have quite a few links to them.
4) Create a new master “Removed” page which will serve as a permanent sitemap for your problem/supplemental URLs. Link to this page from your main page. (In a month or so you can get rid of the front page link, but continue to link to this Removed page from your site map or other pages, so Google will continually crawl it and be continually reminded that the Supplementals are gone.)
5) Also link from your main page (and others if you want) to some of the other Supplementals, so these new pages and the links on them get crawled daily (or as often as you get crawled).
6) If you are crawled daily, wait ten days.
7) After ten days the old Supplemental pages should show their new “PageHasMoved” caches. If you search for that text restricted to your domain, those pages will show in the results, BUT they will still ALSO continue to show for searches for the text on the ancient Supplemental caches.
8) Now put 301s on all the Supplemental URLs. Redirect them to either the page with the content that used to be on the Supplemental, or to some page you don’t care about ranking, like an “About Us” page.
9) Link to some or all of the 301ed Supplementals from your main page, your Removed page and perhaps a few others. In other words, make very sure Google sees these new 301s every day.
10) Wait about ten more days, longer if you aren’t crawled much. At that point the 15% datacenters should first show no cache for the 301ed pages, and then hours later the listings will be removed. The 85% datacenters will however simply revert to showing the old Supplemental caches and old Supplemental listings, as if nothing happened.
11) Acting on faith that the 15% datacenters will be what Google chooses in the long run, now use the URL removal tool to remove/hide the Supplementals from the 85% datacenters.
Will the above accomplish anything? Probably not. The 85% of the datacenters may just be reflecting the fact that Google will never under any circumstances allow a Supplemental to be permanently removed. However, the 15% do offer hope that Google might actually obey a 301 if brute forced.
Then, from now on, whenever you remove a page be sure to 301 the old URL to another one, even if just to an “About Us” page. Then add the old URL to your “Removed” page where it will regularly be seen and crawled. An extra safe step could be to first make the old page a “PageHasMoved” page before you redirect it, so if it ever does come back as a Supplemental, at least it will come back with no searchable keywords on the page.
Examples of 15% datacenters: 216.239.59.104, 216.239.57.99, 64.233.183.99. Examples of 85% datacenters: 216.239.39.104, 64.233.161.99, 64.233.161.105.”
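Step 4 of the posting, the permanent “Removed” sitemap for problem URLs, is easy to script. Here is a minimal Python sketch; it is not part of steveb’s method itself, and the file names old-urls.txt and removed.html are placeholders. It reads a plain-text list of old/supplemental URLs and writes a simple page linking to each one with neutral “PageHasMoved” anchor text, in the spirit of step 2, so the generated page never ranks for real keywords.

```python
# Minimal sketch (not part of steveb's post): build the "Removed" master page
# from a plain-text list of old/supplemental URLs, one URL per line.
# "old-urls.txt" and "removed.html" are placeholder file names.
rows = []
with open("old-urls.txt") as handle:
    for line in handle:
        url = line.strip()
        if url:
            # Neutral "PageHasMoved" anchor text, so this page never ranks
            # for real keywords (step 2 of the posting above).
            rows.append('<li><a href="%s">PageHasMoved</a></li>' % url)

page = (
    "<html><head><title>Removed pages</title></head><body>\n"
    "<p>PageHasMoved</p>\n<ul>\n" + "\n".join(rows) + "\n</ul>\n</body></html>\n"
)

with open("removed.html", "w") as handle:
    handle.write(page)
```

Link to the generated page from your sitemap or main page, as steveb describes, so Google keeps re-crawling the old URLs and seeing their 301s.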
Nichebot.com Google rank checker no longer working for multiple-phrase queries
Nichebot.com hosts a Google position checker that searches Google’s top 1,000 results to work out rankings. It seems the nichebot.com/ranking rank checker is no longer working. Did Google ban their IP, or are they having internal coding errors? It is unfortunate that a good tool has stopped working; we hope they fix the error pretty soon.
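For readers curious how such position checkers work, here is a rough Python sketch of the general approach, not Nichebot’s actual code: page through Google’s result pages (using the q, num, and start parameters) and report the first position at which the target domain appears. The query and domain are placeholders, the link extraction is deliberately naive, and Google may block or disallow automated queries, so treat this purely as an illustration of the idea.

```python
# Rough sketch of how a position checker works: walk through Google's result
# pages and report the first rank at which the target domain appears.
# Placeholder query/domain; Google may block or disallow automated queries,
# and its HTML changes often, so this is only an illustration of the idea.
import re
import urllib.parse
import urllib.request

QUERY = "blue widgets"          # placeholder query
TARGET = "example.com"          # placeholder domain to look for
PER_PAGE = 100                  # Google's num= parameter

def result_urls(html):
    # Very naive extraction of outbound links from the raw result HTML.
    return re.findall(r'href="(https?://[^"]+)"', html)

rank = 0
for start in range(0, 1000, PER_PAGE):
    params = urllib.parse.urlencode({"q": QUERY, "num": PER_PAGE, "start": start})
    req = urllib.request.Request(
        "https://www.google.com/search?" + params,
        headers={"User-Agent": "Mozilla/5.0"},
    )
    html = urllib.request.urlopen(req, timeout=10).read().decode("utf-8", "replace")
    for url in result_urls(html):
        if "google." in urllib.parse.urlsplit(url).netloc:
            continue            # skip Google's own navigation links
        rank += 1
        if TARGET in url:
            print(QUERY, "->", TARGET, "found at position", rank)
            raise SystemExit
print(TARGET, "not found in the top 1000 results")
```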
An overview of Google Analytics
Google has recast Urchin and launched Google Analytics. “It tells you everything you want to know about how your visitors found you and how they interact with your site. It does a great job to focus your marketing resources on campaigns and initiatives that deliver ROI. It automatically tags keyword destination URLs and improve your site to convert more visitors.”
The best thing is that it is now free and integrates seamlessly for those running an AdWords campaign. Google Analytics is the only product that can automatically provide AdWords ROI metrics without you having to import cost data or add tracking information to keywords. Google Analytics tracks all of your non-AdWords initiatives as well.
The move could present a significant threat to rivals in the Web analytics sector, who may be forced to change their pricing strategies in response to Google’s free service. The service is currently available in 16 international languages.