Search engines allowing promotion of sex selection in India
Google, Yahoo and MSN have been accused of allowing sex-selection ads in their sponsored results. Dr. Sabu Mathew George filed a writ petition highlighting the websites' violation of the Preconception and Prenatal Diagnostic Techniques Act. In particular, Yahoo, Google and MSN were accused of allowing the ads to run despite repeated warning notices sent to them.
As per the article
“The Supreme Court on Wednesday issued notice to the Centre, Google India, Yahoo India and Microsoft Corporation on a petition seeking a ban on popular online search engines promoting sex selection techniques.
A three-Judge Bench of Chief Justice K.G. Balakrishnan and Justices P. Sathasivam and J.M. Panchal issued notice on a writ petition filed by Dr. Sabu Mathew George highlighting the violation of Preconception and Prenatal Diagnostic Techniques Act by the websites.
Counsel Sanjay Parikh submitted that despite bringing the websites to the notice of the departments concerned, no steps were taken to block them. He said the petition was filed for full and effective implementation of the Act.
He sought a direction to the Centre to block all websites, including those of Google, Yahoo and Microsoft, that violated the Act.
Dr. George wanted a direction to the Centre to take punitive and deterrent action against these three companies.”
source: hindu.com/2008/08/14/stories/2008081459841300.htm
Yahoo Answers a threat to content publishers –
A WebmasterWorld member complains about the dominance of Yahoo Answers. Yahoo Answers is a platform where users can ask questions and others knowledgeable in the same field can answer them. The person who asks the question can decide which answer is the most relevant, based on votes or on the experience of the person answering.
Today Yahoo Answers is the no. 1 site for getting solutions from experts. We have forums for all topics, but Yahoo Answers has provided a clean way to get problems solved.
Sandy of webmasterworld asks
“Since last few months we had been observing this.. hopefully others can clarify more. We used to get decent results from yahoo search engine to some topics on our site but lately yahoo answers is ranking on top results for same phrases and keywords today.
No way to get on top ahead of yahoo answers now for those phrases”
Well, as with any search engine, the quality of the domain and its internal linking play an important role in rankings. Yahoo Answers has excellent internal linking, and its pages are bound to dominate the results if their content is unique. I feel that is what he is seeing, and there is not much he can do to fight Yahoo Answers directly. One thing he can try is to improve the content quality of his site and earn stronger links. Yahoo Answers threads tend to lose value over time, and I am sure his site will be back on top once the power of the Yahoo Answers topic fades.
SEG
How to start a multilingual site: help from Google to make a Google-friendly multilingual site
This post is about how to start a multilingual site and the various benefits of having one. A multilingual site is a site that serves its content in more than one language. The first thing you'll want to consider is whether it makes sense for you to acquire country-specific top-level domains (TLDs) for all the countries you plan to serve. This option is beneficial if you want to target the countries each TLD is associated with, a method known as geotargeting. Geotargeting is different from language targeting. Geotargeting aims a site (or part of it) at a particular region of the world, and it lets you set different geographic targets for different subdirectories or subdomains (e.g., /de/ for Germany). Language targeting, by contrast, aims to reach all speakers of a particular language around the world, in which case you probably don't want to limit yourself to a specific geographic location and shouldn't use the geographic target tool. Since it is difficult to maintain and update multiple domains, it is better to buy one non-country-specific domain that hosts all the different versions of your website. In that case, two options are recommended:
The first option is to place the content for each language in a different subdomain. For our example, you would have en.example.com, de.example.com, and es.example.com.
The second option is to place the content for each language in a different subdirectory. This is easier to handle when updating and maintaining your site. For our example, you would have example.com/en/, example.com/de/, and example.com/es/.
Some may wonder whether posting the same content in different languages will count as duplicate content. It definitely will not, but you should make sure your site is well organized. Always avoid mixing languages on a single page, as this can confuse Googlebot as well as your users; it is best to keep the navigation and content on each page in the same language. You can also find out how many of your pages are recognized as being in a certain language by performing a language-specific site search. A multilingual site benefits both the owner and the visitors, since visitors get information in their own language. For example, when a person wants to find fashion design institutions in London, he may type that query into a search engine along with the language he wants the page displayed in, and he is far more comfortable when he gets the information in a language he knows and understands.
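As an illustration of the subdirectory option (not any official Google tool), here is a minimal Python sketch that routes a visitor's Accept-Language header to one of the language subdirectories described above; the supported languages and the default are hypothetical examples:

```python
# Minimal sketch: pick a language subdirectory from an Accept-Language
# header. The supported set and default below are hypothetical.
SUPPORTED = {"en", "de", "es"}
DEFAULT = "en"

def pick_subdirectory(accept_language: str) -> str:
    """Return a subdirectory like '/de/' for the first supported language."""
    for part in accept_language.split(","):
        # Each part looks like 'de-DE;q=0.8'; keep only the primary subtag.
        code = part.split(";")[0].strip().split("-")[0].lower()
        if code in SUPPORTED:
            return f"/{code}/"
    return f"/{DEFAULT}/"

print(pick_subdirectory("de-DE,de;q=0.9,en;q=0.8"))  # /de/
```

A real site would still let users switch languages manually, since browser headers do not always reflect the reader's preference.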
Official post http://googlewebmastercentral.blogspot.com/2008/08/how-to-start-multilingual-site.html
LIVE SEARCH WEBMASTER CENTER GAINS CRAWL ERROR & BACKLINK REPORTS:
The Live Search Webmaster Center launched new reports on August 6th: crawl error and backlink reports. The information below explains how site owners can use this new data.
Last fall, when Microsoft launched the Live Search Webmaster Center in beta, the goal was to establish a long-term relationship with webmasters, help them achieve their goals by addressing the most common questions, and help them understand how Live Search sees their sites. Building on those goals, they have now launched a significant update to the Webmaster Center and brought it out of beta. This update includes several new features that give webmasters more information about how Live Search is crawling and indexing their sites, as well as a few features that make the data more actionable.
Crawl issues & reports:
The new “Crawl Issues” feature lets webmasters find four types of issues:
File Not Found (404)
Blocked by REP
Long Dynamic URLs
Unsupported Content-Types
For each issue, the Webmaster Center returns the URL and the date it was encountered.
-File Not Found: lists all the pages that MSNbot tried to crawl but that returned an HTTP 404 response. URLs listed here often come from typos in links on other sites. You usually can't fix the link itself, but you can 301 redirect the typo URL to the correct page (for both a better user experience and reclaimed backlinks).
-Blocked by REP: lists all pages that MSNbot tried to crawl but didn't, because they were blocked by the site's robots.txt file or a robots meta tag. You should review this list and make sure you aren't accidentally blocking access to pages you want indexed.
-Long Dynamic URLs: lists all pages flagged as having “exceptionally long query strings.” Microsoft says these URLs could lead MSNbot into an infinite loop as it tries to crawl all variations of the potential parameter combinations, and recommends that webmasters find ways to shorten these dynamic URLs.
-Unsupported Content-Types: lists all pages classified with content types that Live Search doesn't index.
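To double-check a “Blocked by REP” report against your own robots.txt, Python's standard library includes a robots-exclusion parser. A minimal sketch, with hypothetical rules and URLs:

```python
# Minimal sketch: check whether a URL would be blocked by robots.txt
# rules, using Python's standard-library parser. The rules and URLs
# below are hypothetical examples.
from urllib import robotparser

rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# msnbot falls under "User-agent: *" here, so /private/ is blocked.
blocked = not rp.can_fetch("msnbot", "http://example.com/private/page.html")
allowed = rp.can_fetch("msnbot", "http://example.com/public/page.html")
```

Running the report URLs through a check like this quickly shows which blocks are intentional and which are accidents.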
The crawl issue reports & download functionality features join the existing set which includes:
-indexing details
-penalty information
-robots.txt validator
-outbound link data
-sitemap submission
Backlinks data:
In the beta of the Live Search Webmaster Center they offered only a limited look into backlink data. They've significantly enhanced this tool, giving webmasters access to more data about their referring links. The new backlinks feature shows the total count of backlinks to a site; you can view a list of the top URLs in the tool, or download up to 1,000 of them.
Making data more actionable:
Webmasters are analytical and rarely work alone. They often need to be able to grab as much data as they can, and take it offline into Excel or some type of database for analysis and collaboration with a client, marketing or engineering partner. To enable that, they’ve built a few new features into all our reports, both the new ones and the old ones.
-Advanced filtering: This way one can quickly scope the results to zoom into the data they need, without having to sift through all the results.
-Downloading data: For times when webmasters want to view a lot of results, they also provide a download option that can give access to the first 1,000 results in a CSV file that can be easily opened with Microsoft Excel or imported into a custom reporting tool. This can help a webmaster analyze the results and share them with colleagues.
-More than just a set of tools: when the Webmaster Center launched, a few resources were also introduced to help site owners engage with the Live Search team.
SEO widget – SEO statistics widget launched.
We recently developed a neat tool that will display your Alexa rank, Google data, Yahoo data, PageRank, and other search engine data on your website. Please check out the new widget here: http://www.searchenginegenie.com/widget/seo_statistics_widget.php
This widget is a must-have for your website: in addition to the statistics above, it also shows who is online on your site.
Publicis to buy performics from Google
Performics, a search marketing firm and formerly part of DoubleClick, is being sold to Publicis for an undisclosed amount. Industry experts had been debating a potential buyout of Performics because of the conflict of interest with Google's policies: Google's motto is "Don't be evil," and if it ran Performics, many other companies would complain that Performics customers get a boost from Google.
Search engine experts also believe that if Google ran Performics, some Performics employees would learn about the Google search ranking algorithm, making it easier for them to rank their client sites.
Washington post reports
“Publicis, which aims to generate 25 percent of its sales from the Internet by 2010, said it would acquire Chicago-based Performics Search Marketing from Google for an undisclosed amount.
The unit will boost Publicis's strategic entity unveiled in June called Vivaki, aimed at spurring growth at its digital advertising units such as Digitas and Zenith-Optimedia.
In a statement, Publicis quoted the research house Jupiter Media saying the global search market was worth an estimated $9.9 billion in 2008 and is projected to grow at a 12 percent compound annual growth rate through 2012.”
Google bans WebPosition Gold position checker
Google has always warned against automated position checkers, which use a lot of its resources. Now Google has taken a stronger stand and blocked the WebPosition Gold software from performing automated ranking requests on Google. Automated rank requests create a lot of junk queries and consume a lot of Google's server resources. Google had been issuing warnings not to use WebPosition Gold, but people continued to use it; now Google has taken action and blocked all WebPosition Gold queries. WebPosition Gold has a unique way of sending queries to Google, and it seems Google was able to detect it with their bot-filter software.
We at Search Engine Genie never use bulk keyword rank checkers. Our rank checkers are search engine friendly and allow only a limited number of queries per day.
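A daily query quota of the kind mentioned above can be sketched in a few lines of Python; this is an illustration with hypothetical limits, not the code of any actual tool:

```python
# Minimal sketch: a per-day query quota for a polite rank checker.
# The limit of 100 queries per day is a hypothetical example.
import time

class DailyQuota:
    def __init__(self, max_queries_per_day=100):
        self.max = max_queries_per_day
        self.count = 0
        self.day = time.strftime("%Y-%m-%d")

    def allow(self):
        """Return True if another query may be sent today."""
        today = time.strftime("%Y-%m-%d")
        if today != self.day:          # a new day: reset the counter
            self.day, self.count = today, 0
        if self.count < self.max:
            self.count += 1
            return True
        return False
```

Pairing a cap like this with a delay between queries keeps the load on the search engine negligible.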
Yahoo search index and algorithm update
Yahoo recently updated its ranking algorithm. People in leading forums noticed the change first, and now it is live in Yahoo search. Yahoo previously announced that there would be constant updates to its algorithm, and it seems that is now happening.
Yahoo calls such an update a "weather report," and it recently released a weather report about this search algorithm update.
According to yahoo search blog
“We’ll be rolling out some changes to our crawling, indexing and ranking algorithms over the next few days. As you know, throughout this process you may see some ranking changes and page shuffling in the index, but expect the update will be completed soon.
Please visit the Site Explorer Suggestion Board to share your thoughts or check in with other Yahoo! Search users.”
Another company wants a piece of Google pie – Mediaset
First we had Viacom, then the Belgian newspaper group, and now another company is suing Google. Mediaset, an Italian media company, is suing Google and YouTube for using copyrighted material on the site.
According to reuters
“Mediaset, controlled by Prime Minister Silvio Berlusconi, joins others broadcasters seeking compensation from YouTube, a video-sharing website, for copyright infringement.
Mediaset filed suit in a Rome court, the company said in a statement on Wednesday. A YouTube spokeswoman said it did not see the need for the legal case.
“YouTube respects copyright holders and takes copyright issues very seriously,” the spokeswoman said in London. Google bought YouTube in 2006.
“There is no need for legal action … We prohibit users from uploading infringing material and we cooperate with all copyright holders to identify and promptly remove infringing content as soon as we are officially notified,” Google said in a separate statement.
Lawsuits and trials in Italy are often lengthy and it is hard to forecast the outcome.
Mediaset said a sample analysis of YouTube at June 10 found “at least 4,643 videos and clips owned by us, equivalent to more than 325 hours of transmission without having rights”.
Mediaset said this was equal to the loss of 315,672 days of broadcasting by its three TV channels.”
Well, I have always said a lawsuit against YouTube is not the best idea, since YouTube is a public resource and should not be threatened. We will lose some of the freedom of the internet if YouTube loses its way because of lawsuits.
Google knows the web is big – an informative post on the Google blog
Google is one of the biggest websites, and Google has known for a long time that the web is big. The first Google index in 1998 already had 26 million pages, and by 2000 the Google index reached the one billion mark. Over the last eight years they've seen a lot of big numbers about how much content is really out there. Recently, even their search engineers stopped in awe at just how big the web is these days, when the systems that process links on the web to find new content hit a milestone: 1 trillion unique URLs on the web at once! So how many unique pages does the web really contain? No one knows, and the number of possible pages out there is effectively infinite. Google doesn't index every one of those trillion pages; many of them are similar to each other or represent auto-generated content. But Google is proud to have the most comprehensive index of any search engine, and their goal has always been to index all of the world's data. To keep up with this volume of information, their systems have come a long way since the first set of web data Google processed to answer queries. Back then they did everything in batches: one workstation could compute the PageRank graph on 26 million pages in a couple of hours, and that set of pages would be used as Google's index for a fixed period of time. Today, Google downloads the web continuously, collecting updated page information and re-processing the entire web-link graph several times per day. This graph of one trillion URLs is similar to a map made up of one trillion intersections, so multiple times every day they do the computational equivalent of fully exploring every intersection of every road in the United States. Google's distributed infrastructure allows applications to efficiently traverse a link graph with many trillions of connections, or quickly sort petabytes of data, just to prepare to answer the most important question: your next Google search.
http://googleblog.blogspot.com/2008/07/we-knew-web-was-big.html
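The batch PageRank computation mentioned in the post can be illustrated with the textbook power-iteration method on a tiny hypothetical four-page link graph; this is a teaching sketch, not Google's actual implementation:

```python
# Minimal sketch: textbook PageRank by power iteration on a tiny
# hypothetical link graph (not Google's production algorithm).
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page gets a base share, plus a damped share of the
        # rank of each page that links to it.
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)
            for target in outgoing:
                new[target] += damping * share
        rank = new
    return rank

graph = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
ranks = pagerank(graph)  # "c" ends up highest: three pages link to it
```

On four pages this converges almost instantly; the scale described in the post (a trillion-URL graph, re-processed several times a day) is what forces the distributed infrastructure the post talks about.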