MSN Live Search
We have always had the habit of debating Google vs. Yahoo; for a change, let us debate Yahoo vs. MSN, or rather Yahoo vs. the new Microsoft Bing search engine.
Most of us will simply conclude that Yahoo is much better than MSN. In most cases I agree with that: the image linked below shows a clear trend, with Yahoo well ahead of MSN.
Yahoo was once very dominant, getting about 30% of search engine traffic. But Google grew to become dominant and much stronger; in some places in the US, Google's market share is well above 90%. Three years back, Yahoo's results were powered by Google, so virtually Yahoo's market share was Google's market share. Now Yahoo is an independent search engine, and it powers a lot of other search engines, so I am sure Yahoo is a search engine that is here to stay.
As an SEO company we monitor a lot of our clients' logs. In most cases we see consistent performance from Yahoo when rankings are similar across all top 3 search engines. So I see no point in comparing whether Yahoo or MSN is better. But wait, it's not all over. After years of being an underdog, Microsoft can't wait any longer to win back search engine market share. They came out with a bang with a search engine called Bing. So did Bing work? Indeed, yes. The heavy advertising of Bing and the improved quality of results made it instantly famous and already a hit. As far as I have tested, Bing has much better quality results than MSN Live Search and MSN global search. I am sure Bing is here to stay; we need to wait a few more months to see whether they can steal any percentage of market share from Google. It's just 15 days since launch and too early to say whether they will be dominant, but there are early signs it will be a success story.
The Wall Street Journal reports:
“The number of times people clicked on ads listed next to Microsoft Corp. (MSFT) search results jumped about 8% in the week since the software giant released its newly revamped search engine, dubbed Bing, the world’s largest search engine marketing firm said Thursday.”
“Microsoft’s share rose to 11.1 percent in the June 2-6 period, Bing’s first week in operation, from 9.1 percent a week prior, ComScore said on its Web site. Average daily penetration among searchers, a measure of how many people are being reached by the product, rose to 15.5 percent from 13.8 percent.”
We never know what the future holds for search engines. I am sure there will be some real tough competitors for Google.
Microsoft has asked Google and Yahoo to cut the time they keep users' search-engine records and to agree to the European demand. In a telephone interview on Sunday, Brendon Lynch, the company's director of privacy strategy, said Microsoft is able to meet the requirements but wants to wait until its larger search rivals get on board.
A group of European Union officials, dubbed the Article 29 Data Protection Working Party, has asked search engines to purge their user records after six months.
Brendon Lynch said the proposals are feasible, but Microsoft wants them adopted industry-wide. If Microsoft alone adopted the Article 29 Working Party's rules, it would not have a broad impact on users in Europe, because Microsoft's market share there is very small.
Cutting the length of time that search engines keep such records could cut into advertising revenue, the core source of sales for Google and Yahoo. The companies rely on users' queries to target advertising, which has raised privacy questions, since the search engines track where customers go online and what they read and buy.
In April, the Article 29 group said search engine providers break EU privacy law by retaining online search data for more than six months. The group is made up of data-protection officials from the 27 EU nations and from three non-EU countries, including Norway.
Google has decided to cut the time it keeps data to nine months. In July 2007, Microsoft said it would remove identifiers from individual search data after 18 months. The same month, Yahoo said it would adopt a 13-month cutoff. Both Microsoft and Yahoo trail Google in European internet search traffic: according to the research company ComScore, Google has almost 80 percent of the market, while Microsoft and Yahoo together hold about 4 percent.
Many people ask me how to check the backlinks, or links coming into their site, in various search engines. Well, I already wrote an article on that here: http://www.searchenginegenie.com/backlink-strategies.htm. That article is a bit old but still works great. Today the top 3 search engines are more friendly to webmasters and are willing to share a percentage of what they know about your backlinks.
1. Google: Traditionally Google used to show most of the backlinks to a site via the link: operator, but way back in 2002 they broke that command and started showing only backlinks from pages with PR 4 and above. Later, in 2005, they broke that too and started showing very few, sometimes less than 2% of the backlinks a site really possesses. This had been the case for more than 2 years. But in 2006 they started a massive webmaster communication programme. They opened up something called Google Sitemaps (now called Google Webmaster Central). They capitalized on the massive support they got from webmasters, and now the webmaster tools show a lot of data that is very useful for webmasters. One of those is the backlinks to a site, to a page, and to inner pages. To check the backlinks Google shows, you first need to verify your website to prove you are the owner. Then you can check backlinks by logging in to Google Webmaster Tools here.
Once logged in and site verified:
Go to Dashboard >> Links to check the backlinks Google shows.
Remember, even this is not fully accurate; Google shows at least 25% of the backlinks here, so you can take what they show as somewhat representative.
Yahoo: Yahoo is the only search engine that never hesitated to share its backlink data. In Yahoo you can just use link:http:// to check a single page, or linkdomain: to check an entire site. There are lots of ways to check backlinks in Yahoo, especially by filtering out sites. Please check those here.
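For illustration, here is what those Yahoo queries look like (example.com stands in for your own domain):

```
link:http://www.example.com/somepage.html
linkdomain:example.com -site:example.com
```

The first returns backlinks to a single page; the second returns backlinks to the whole domain while filtering out your own internal links with the -site: operator.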
MSN: MSN was showing backlinks via the link: command until about a year ago, but they broke the command and stopped showing all backlinks. Now they have opened up communication and started http://webmaster.live.com/, where you can verify your site, like with Google, and check backlinks.
It's good to see search engines sharing more with webmasters these days; I hope we see more from them in the future.
Webmaster Center launched new data on August 6th called crawl error & backlink reports. The information below describes how site owners can use it.
Last fall, when Microsoft launched the Live Search Webmaster Center in beta, the goal was to establish a long-term relationship with webmasters and help them achieve their goals by addressing the most common questions, and to help them understand how Live Search sees their site. In an effort to improve on those goals, they have now launched a significant update to the Webmaster Center and brought it out of beta. This update includes several new features that give webmasters more information about how Live Search is crawling and indexing their sites, as well as a few features to make the data more actionable.
Crawl issues & reports:
The new “Crawl Issues” feature allows webmasters to find four types of issues:
File Not Found (404)
Blocked by REP
Long Dynamic URLs
Unsupported Content Types
For each issue, Webmaster Center returns the URL and the date it was encountered.
-File Not Found: It lists all the pages that MSNbot tried to crawl and received an HTTP 404 response for. Generally, URLs listed here come from typos in links from other sites. You often can't fix the link itself, but you can 301 redirect the typo URL to the correct page (for both a better user experience and reclaimed backlinks).
-Blocked By REP: It lists all pages that MSNbot tried to crawl but didn't, because they were blocked by the site's robots.txt file or robots meta tag. Review this list and make sure you aren't accidentally blocking access to pages you want indexed.
-Long Dynamic URLs: It lists all pages that have been flagged as having “exceptionally long query strings.” Microsoft says these URLs could lead MSNbot into an infinite loop as it tries to crawl all variations of potential parameter combinations and recommends webmasters find ways to shorten these dynamic URLs.
-Unsupported Content Types: It lists all pages that are classified with content types that Live Search doesn't index.
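As a sketch of the 301 fix suggested for the File Not Found report, an Apache .htaccess rule could look like the following (the filenames are hypothetical, just to show the shape of the directive):

```
# .htaccess: permanently redirect a misspelled URL reported in the 404 list
# /serivces.html is a made-up typo of the real /services.html page
Redirect 301 /serivces.html http://www.example.com/services.html
```

With this in place, both visitors and crawlers following the typo link land on the correct page, and the backlink's value is reclaimed.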
The crawl issue reports and download functionality join the existing feature set, which includes:
-Outbound linking data
In the beta of the Live Search Webmaster Center, they offered a limited look into backlink data. They've significantly enhanced this tool, giving webmasters access to more data about their referring links. The new backlinks feature shows the total count of backlinks to a site. You can view a list of the top URLs in the tool or download up to 1,000.
Making data more actionable:
Webmasters are analytical and rarely work alone. They often need to grab as much data as they can and take it offline into Excel or some type of database for analysis and collaboration with a client, marketing, or engineering partner. To enable that, they've built a few new features into all the reports, both new and old.
-Advanced filtering: This way one can quickly scope the results to zoom into the data they need, without having to sift through all the results.
-Downloading data: For times when webmasters want to view a lot of results, they also provide a download option that can give access to the first 1,000 results in a CSV file that can be easily opened with Microsoft Excel or imported into a custom reporting tool. This can help a webmaster analyze the results and share them with colleagues.
-More than just a set of tools: When they launched Webmaster Center, they also launched a few resources to help site owners engage with them.
Google, Yahoo, and MSN now have a common protocol for sitemaps, and they also follow the same rules for interpreting the robots.txt file.
As per the Live Search blog:
“We at Live Search are pleased to announce another collaboration with Yahoo and Google aimed at making webmasters’ lives easier. Webmasters have long used the Robots Exclusion Protocol (REP) to control how search engines access and display their content. The REP offers an easy and efficient way to communicate with search engines, and is currently used by millions of publishers worldwide.
Over the past few years, we have been working with Yahoo and Google to agree on common ways for webmasters to communicate with search engines. Our previous efforts include support for the Sitemaps protocol. While most search engines already comply with the REP, this is the first time the three major search engines have come together to detail how we actually implement the protocol. This effort makes it easier for webmasters to know how REP directives will be handled by search providers.
You can view the details of how we implement the REP at Documentation for the Robots Exclusion Protocol.
The MSN Live Search webmaster blog recently reported how the search engines have joined hands on the Robots Exclusion Protocol, which webmasters use not only to block crawlers but also to guide them. I am sure this is not new, but some of the tags described there are definitely honored by Google yet not commonly used by other search engine crawlers.
One tag that immediately comes to mind is the NOODP tag. When used in a meta tag, it tells the search engine not to show the title and description from a DMOZ listing. We at Search Engine Genie use this tag; it was first introduced by Google around 30 months ago. Good to know that MSN and Yahoo understand this tag too.
NOODP META Tag
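For reference, the tag sits in the page's head section like any other robots meta directive:

```html
<head>
  <!-- ask crawlers not to replace this page's title/description with its DMOZ listing -->
  <meta name="robots" content="noodp">
</head>
```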
Then we have the crawl-delay setting in robots.txt, first introduced by Yahoo. When we set a crawl delay, we tell the crawler to fetch pages from the server at the specified interval.
According to the MSN Live Search blog, Microsoft now honors crawl-delay, so you can set it and expect both Yahoo and MSN to follow it. Google still doesn't want to support crawl-delay. That is understandable from Google's point of view, since a crawl delay can seriously slow down how thoroughly a site gets crawled.
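A minimal robots.txt sketch setting a crawl delay (the 10-second value is just an example; Google ignores this directive, as noted above):

```
# Ask supporting crawlers (Yahoo's Slurp, MSNbot) to wait 10 seconds between fetches
User-agent: *
Crawl-delay: 10
```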
For detailed reading, follow the link above to the official MSN Live Search blog.
As an SEO company we regularly monitor the top search engines and the quality of the search results they provide. Of the top 3 search engines (Google, Yahoo, MSN), MSN has the most ridiculous search results. Believe it or not, their search results have never improved in quality over the past 2 years. We say this not because our sites and client sites are not ranking in their search; in fact, we have much better rankings in MSN than in Yahoo and Google. Our frustration is with sites that rank in places where they never should.
There is still a lot of spam in MSN results: sites using excessive cross-linking, excessive paid links, and keyword stuffing all work well in MSN. Yahoo, especially with the latest update, has results almost competing with Google's. Their relevancy has improved a lot, but MSN is still lagging very badly.
Their geo-targeting goes crazy when I search for generic phrases like car transport, auto transport, etc. When I search from our Indian office, I see sites from Australia, the UK, India, and the US all mixed together. I feel this is poor geo-targeting.
I seriously want MSN to improve their search results; they are nowhere near the quality of Google. OK, enough ranting, back to the job.
So some of you have problems getting indexed in MSN, which is why you are here. A few posts back we wrote about MSN Live bot hitting us hard, which is actually good news for us. So how did we do it? Here is a simple checklist you can follow to get the same love from MSN Live bot, MSN's search engine crawler.
1. As always we say, good backlinks are the closest thing to guaranteed inclusion in any search engine. MSN is no exception: get good links, get listed in leading directories like DMOZ and the Yahoo directory, and your site is sure to be indexed.
2. Submit your URL through webmaster.live.com, a platform like Google Webmaster Tools where you can validate your site and submit your URL.
3. Create a good HTML sitemap and link it from your homepage. Most search engine crawlers, including MSN Live bot, don't like to crawl a site too deep unless it is a very good site, so make sure most of your pages are accessible from top-level important pages like the homepage.
4. Get listed in DMOZ. Like Google, MSN loves DMOZ; a DMOZ listing will practically guarantee a listing in MSN search.
5. Sometimes MSN Live bot has problems reading your robots.txt. Make sure you don't mess too much with your robots.txt file; it could cause problems for MSN Live bot when crawling your site.
6. Check for canonical issues. MSN Live bot still has some problems with canonical issues. Make sure you use just one version of the site, either www or non-www; you can redirect either one to the other.
7. MSN has problems with affiliate links, so if you are part of an affiliate program or offer affiliate services, make sure your incoming and outgoing links don't cause any problems for the MSN Live bot crawler.
8. Also, if you made major changes to your URLs, such as changing shopping carts, you need to wait, since it takes time for MSNbot to pick up new URLs. If you don't see good crawler activity, try submitting a sitemap.
9. Make sure your site is not banned or penalized by MSN search. A penalty will definitely affect how the MSN search crawler treats your website.
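For point 6 of the checklist above, a common Apache fix (assuming mod_rewrite is enabled; example.com is a placeholder domain) is to 301 redirect the non-www host to the www one:

```
# .htaccess: collapse example.com onto www.example.com with a permanent redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With this rule, crawlers only ever see one canonical version of each page, so link value isn't split across duplicate URLs.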
Search Engine Genie Blog Team.
Recently the MSN Live Search crawler has been hitting us hard. It is most probably because we are working hard on unique content on our site, as well as the tools, blogs, etc. We are getting a lot of new natural backlinks, and this could be an important reason we are seeing increased crawler activity from all the crawlers.
Also, our SEO forum has started to pick up; we have seen some new posters become regulars, and they have posted some interesting threads. We have also started 5 new blogs: a web design blog, a PPC blog, a link building blog, a programming blog, and a personal company blog.
Also, we have a new search engine experts directory where listing your site is totally free; you can even include your phone numbers and email IDs. Only adding active URLs is chargeable. It has just launched, with about 60 profiles of existing search engine experts.
Our Chief Technical Officer, who goes by the name Tara, has started a blog of her own, and she will be sharing her personal thoughts from tomorrow.
Also, we are launching our newsletter this month and accepting subscriptions from tomorrow; there are a ton of new things coming to our website. This is just an effort to catch up with our competitors.
Keep visiting our site to see the great new and innovative things coming up.
Search Engine Genie Blog Team,
The MSN Live Search team has upgraded their web crawler, which crawls millions of pages per day. They have included some interesting new features which reduce the load placed on webmasters' servers.
Two important features, as described by the MSN Search blog:
HTTP Compression: HTTP compression allows faster transmission time by compressing static files and application responses, reducing network load between your servers and our crawler. We support the most common compression methods: gzip and deflate as defined by RFC 2616 (see sections 14.11 and 14.39). Compression is currently supported by all major browsers and search engines. Use this online tool to check your server for HTTP compression support.
The following links provide configuration information for IIS, and Apache.
Configure Compression in IIS
Configure Apache using GZIP or using deflate
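To see what HTTP compression buys both the crawler and the server, here is a small Python sketch (not MSNbot's actual code, just the gzip round trip that the feature relies on; the sample HTML is made up):

```python
import gzip

# What the server would send over the wire vs. what the crawler parses.
html = b"<html><body>" + b"<p>repetitive page content</p>" * 200 + b"</body></html>"

compressed = gzip.compress(html)        # body as transmitted with Content-Encoding: gzip
restored = gzip.decompress(compressed)  # body after the crawler inflates it

assert restored == html                 # compression is lossless
print(f"{len(html)} bytes raw, {len(compressed)} bytes compressed")
```

On repetitive HTML like this, the compressed body is a small fraction of the original, which is exactly the network-load saving the feature is after.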
Conditional Get: We support conditional get as defined by RFC 2616 (Section 14.25); generally we will not download a page unless it has changed since the last time we crawled it. As per the standard, our crawler will include the “If-Modified-Since” header and the time of last download in the GET request, and when available, our crawler will include the “If-None-Match” header and the ETag value in the GET request. If the content hasn't changed, the web server will respond with a 304 HTTP response.
To check if your site already supports the “If-Modified-Since” HTTP header, you can use this online tool to check your server for HTTP Conditional Get support. Alternatively, you can check using Fiddler for Internet Explorer, or Live Headers for Firefox. Each of these tools allows you to create a custom GET request and send it to your server. You’ll want to make sure that your request includes the “If-Modified-Since” header like the following simplified sample:
GET /sa/3_12_0_163076/webmaster/webmaster_layout.css HTTP/1.1
If-Modified-Since: Tue, 22 Jan 2008 01:28:49 GMT
You should receive a server response similar to the following simplified sample:
HTTP/1.x 304 Not Modified
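The same handshake can be sketched in a few lines of Python (the dates and the server function are illustrative only; real servers also compare ETags via If-None-Match):

```python
from email.utils import format_datetime, parsedate_to_datetime
from datetime import datetime, timezone

# The crawler remembers when it last fetched the page and sends that timestamp.
last_crawl = datetime(2008, 1, 22, 1, 28, 49, tzinfo=timezone.utc)
request_headers = {"If-Modified-Since": format_datetime(last_crawl, usegmt=True)}

def server_status(page_modified, headers):
    """Return 304 if the page hasn't changed since the crawler's copy, else 200."""
    since = parsedate_to_datetime(headers["If-Modified-Since"])
    return 304 if page_modified <= since else 200

# Page untouched since the last crawl -> server skips sending the body.
print(server_status(datetime(2008, 1, 20, tzinfo=timezone.utc), request_headers))  # 304
# Page edited after the last crawl -> full 200 response with fresh content.
print(server_status(datetime(2008, 2, 1, tzinfo=timezone.utc), request_headers))   # 200
```

The 304 path is the whole point: the server answers with headers only, so neither side wastes bandwidth re-transferring an unchanged page.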
Check out MSDN for more information on using these features.