Google Plus Author Rank and its necessity

Google has introduced a significant change for Internet marketers. Author Rank is a new concept that boosts an author's Google Plus profile and, in turn, generates more traffic for the website carrying that author's quality content. Google has long been striving to build a comprehensive algorithm for ranking websites on several parameters, and Author Rank is the result: it analyzes content posted across millions of websites, scrutinizes its quality deeply, and assigns a rank based on relevance, how informative the content is, and the other services the website provides.

To put it simply, Google has given Internet marketers a strong incentive to make optimal use of its platform to drive more traffic to their websites.

The Concept of Author Rank and its Major Role:

The author's Google+ profile can rank in Google's personalized search results, provided it is relevant to the web user's query. In other words, when the search phrase entered by the viewer matches a personalized search, the author's Google+ profile page can appear in the rankings.

The Author Rank SEO technique makes your content stand out on search result pages, indirectly generating more traffic for your website. It is quite easy to understand and implement. Quality content by a particular author is now targeted, and the author's profile page gets a boost in PageRank along with the page where the content is posted. Since the content typically contains links back to the website, the content's increased visibility indirectly exposes the site and drives traffic, which is good for business. Increasing the visibility of content and websites has long been one of the major concerns of SEO experts.

Top reasons to implement the rel=author attribute on your content:

  1.     Makes your quality content more prominent in search results.
  2.     Improves credibility and exposure.
  3.     Increases click-through rate.
  4.     Provides comprehensive and versatile metrics in Webmaster Tools.

These metrics provide a better assessment of how your website is performing in terms of generating business, and where there is scope for improvement. Once authorship is up and running, your Webmaster Tools account gains an additional metric, Author Stats, showing how many times your content appeared on search result pages, how many times its link was clicked, and so on.

Ultimately, your website becomes more visible among millions of similar websites, inviting viewers to take a look at it and its services. The authorship program is easy to implement; all you need to do is follow Google's setup directions. You need a Google Plus profile that links to the pages you author, and a link from your website's homepage to your Google profile page.

And every time you publish content, link your author name in the byline to your Google Plus profile page (or your author profile page) using the rel=author attribute. The more quality content you publish under that author name, the more the author rank of that profile improves.
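As a sketch of what such a byline might look like (the Google+ profile ID and author name below are placeholders, not real values):

```html
<!-- Byline on an article page linking to the author's Google+ profile.
     The profile ID (112233445566) and name are placeholders. -->
<p>Written by
  <a href="https://plus.google.com/112233445566" rel="author">Jane Doe</a>
</p>
```

For authorship to be verified, the Google Plus profile should also link back to the site, typically from the profile's "Contributor to" section, so the two links confirm each other.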

Author Rank is now highly relevant for search engine optimization and, in turn, for business.


Crawl Errors Update from Google Webmaster Tools

One of the most important things a webmaster can do is register and verify their site in Google Webmaster Tools.

Google Webmaster Tools provides a variety of information: site configuration details (sitemaps, sitelinks, crawler access, etc.) and, in the "Your site on the web" section, details about your links, keywords, and search queries. The Diagnostics section provides vital information such as HTML suggestions, malware, crawl errors, and crawl stats. The Labs section shares interesting details such as site performance and instant previews.

Checking Google Webmaster Tools at least once a day is on the schedule of most webmasters, and one vital section to check is crawl errors. The Google Webmaster Central blog has now posted an update about crawl errors.

The post states that enhancements have been made to crawl errors, which are now divided into two sections:

    Site errors
    URL errors

Site Errors

These categories were created to make errors easier to understand. Errors such as DNS resolution failures, connectivity issues, and problems fetching the robots.txt file were previously reported as URL errors; henceforth they will be reported under Site Errors, as they are not URL-specific. If the frequency of these errors is high, alerts will be sent to you. If your site is free of these errors, as most sites are, you will simply see friendly check marks indicating your website has no such issues.

URL Errors

If Webmaster Tools reports URL errors, it means "it was able to resolve your DNS, connect to your server, fetch and read your robots.txt file, and then request this URL, but something went wrong after that". Separate categories are displayed for Google News and mobile (CHTML/XHTML) errors.

Errors Presentation

Previously, up to 100,000 errors of each type were shown, with no way to tell which ones mattered most. With this in mind, Webmaster Tools will now present the 1,000 most important errors for each category; the webmaster can view details about each of them and mark them fixed.

Sites with more than 1,000 errors will still see the total error count for each category, and for those who need full details, Google is considering adding an API to download all errors.

One more important update: "pages blocked by robots.txt" errors have been removed, since Webmaster Tools assumes pages blocked via robots.txt were blocked intentionally by the webmaster. These will soon be reported under Crawler Access in Site Configuration instead.
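For reference, this is the kind of deliberate block the change refers to: a robots.txt rule the webmaster put in place on purpose (the path below is a placeholder):

```
# robots.txt at the site root — intentionally keep all crawlers
# out of a private section; no longer reported as a crawl error
User-agent: *
Disallow: /private/
```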

User Friendly Error Details

Clicking an error URL opens an additional pane with useful information, such as when the URL was last crawled, when the problem first occurred, and a brief description of the error. From the details pane you can visit the URL by clicking it. There are also options to mark the error as fixed, see other pages that link to the URL, and more. There is even an option to have Googlebot fetch the URL to double-check whether the error is fixed.
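As a rough illustration of the kind of check involved, here is a minimal Python sketch (not a Google tool) for fetching a URL yourself and bucketing the HTTP status code roughly the way crawl-error categories do:

```python
import urllib.request
import urllib.error

def classify_status(code):
    """Bucket an HTTP status code roughly the way crawl-error reports do."""
    if 200 <= code < 300:
        return "ok"
    if code in (301, 302, 303, 307, 308):
        return "redirect"
    if code == 404:
        return "not found"
    if code == 410:
        return "gone"
    if 400 <= code < 500:
        return "client error"
    if 500 <= code < 600:
        return "server error"
    return "other"

def check_url(url):
    """Fetch a URL and return its (status_code, category)."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status, classify_status(resp.status)
    except urllib.error.HTTPError as e:
        # urllib raises for 4xx/5xx; the error object carries the code
        return e.code, classify_status(e.code)
```

Checking a URL yourself like this before marking an error as fixed saves waiting for Googlebot's next crawl.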

Errors are prioritized for fixes such as "fixing broken links on your own site, fixing bugs in your server software, updating your Sitemaps to prune dead URLs, or adding a 301 redirect to get users to the 'real' page." Priority is decided based on a number of factors, such as whether the URL is included in a sitemap, how many internal links it has, and so on. If you are a user with full permissions, you can mark an error as fixed and it will be removed from your top-errors list, unless Googlebot encounters the same error again.
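For the 301-redirect fix mentioned in that list, a minimal sketch on an Apache server using .htaccess (both paths are placeholders):

```
# .htaccess — permanently redirect a dead URL to its replacement,
# so users and crawlers reach the "real" page
Redirect 301 /old-page.html /new-page.html
```

Other servers have equivalents (for example, nginx uses a `return 301` directive in its configuration).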

Hopefully these changes will help webmasters get their site errors corrected sooner.


Google Plus: SEO Influence Debated

The advent of Google Plus has influenced SEO considerably, and most people in the SEO industry will be familiar with the search suggestions that appear when a query starts with the + sign. For instance, the query "+a" gives suggestions like +Allstate Insurance, +Audi USA, and +ASOS, and clicking one of these lands you on the company's Google Plus page. Even though this seems beneficial for SEO, it is doubtful how many ordinary web users will start their queries with the + sign. Some would see this as a Google service influencing the organic search results.

At Business Insider I came across an interesting write-up on the influence of Google Plus, with opinions from experts. The first opinion came from the company that matters most: Google employee Daniel Dulitz.

His opinion was that surfacing the best content, even without SEO, is Google's goal; to that end, Google focuses on a hundred or more quality signals, and he believed they still have a long way to go. He also stated that the Google Plus button helps build reputation online and that, with changing times, we need to update ourselves.

From his statements, it is clear that business owners should put Google Plus to good use by building social network pages and connecting with their fans. Getting content with rich snippets and having people recommend content via +1 are also interesting strategies, as is converting customers into fans with the page badge.

Sean Carlos shares his view that since Google launched Direct Connect, search results can take you straight to a company's Google Plus page, and if you search while logged into Google Plus, you are invited to add the company to your circles.

Other experts also participated and shared some interesting data: 57% of Fortune 100 companies have a Google+ page, but 98% have not enabled Google Direct Connect. Google's support pages explain how to enable Direct Connect by adding a snippet of code to the site. Also, only 3% of Fortune 100 companies have the Google Plus badge, and only 11% have +1 buttons.
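The snippet in question is a single link element on the site's homepage pointing at the company's Google+ page. A sketch, with a placeholder page ID:

```html
<!-- In the site's <head>; the Google+ page ID is a placeholder -->
<link href="https://plus.google.com/110000000000000000000" rel="publisher" />
```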

Even though one shouldn't read too much into this data, the message is clear: awareness is sorely lacking, especially among small businesses. Brand building and online reputation management deserve more focus, and it's essential that these businesses take steps to get themselves on Google Plus as soon as possible.

The need of the hour also calls for these companies to hire social media experts to promote their websites online. Facebook and Twitter are great social networks for promoting businesses, and Google Plus follows.

Social media optimization could play a major role in future rankings. For now, links play the major part, but this may well be influenced later by social media presence. Promoting websites on social networks involves an in-depth understanding of each network's features. On Google Plus, we need pages and profiles optimized using those features: to start with, enable Direct Connect; become familiar with Google Plus badges and buttons; share photos and videos; connect with people; establish a strong link between your Google Plus page and your website; post using hashtags; cross-promote; and much more.

The influence of Google Plus on SEO will be debated for a long time, but what small business owners need to do is pull up their socks and get their company active on Google Plus. It is an opportunity to promote your site through an authentic Google service, and one you will deeply regret letting slip through your hands. So +1 this post and get started with +1s for your company.


Google Privacy Policy – A breach of privacy?

The search giant Google is taking steps to stay in step with law enforcement and has announced a new privacy policy for its products. Popular Google products like YouTube and Gmail have not been spared. On the Google privacy policy page, you can find details of the data Google is going to collect from users.

It says they will collect personal information such as name, email address, telephone number, and credit card details. Google will also track device data such as operating system and unique device identifiers, and associate this information with your Google account. Google will also collect the search queries of logged-in users, telephone log information, and Internet protocol addresses, and will use cookies to identify your browser or your Google account.

Location data will also be collected when you use location-enabled Google services: GPS signals from mobile devices and sensor data about nearby Wi-Fi access points and cell towers. Other details they will collect include local storage data, such as browser web storage and application data caches. Finally, they mention tracking cookie data from other sites that use Google features.

Google further explains that the information they collect is to help them to provide better services to their users.

No amount of explanation seems to satisfy most countries, many of which have asked Google to delay the privacy update. Even though Google claims the change is meant to improve its services, the common man can see it as a breach of privacy.

How can anyone be comfortable with personal data being tracked: search queries involving personal issues, sexual orientation, and other confidential details? A great number of countries have opposed the move, and recently the New York Times ran the headline "France Says Google Privacy Plan Likely Violates European Law". The article quoted the French privacy agency, CNIL: "Our preliminary investigation shows that it is extremely difficult to know exactly which data is combined between which services for which purposes, even for trained privacy professionals."

The French privacy agency has the power to fine companies up to $400,000 for privacy breaches.

Users of Android-powered smartphones may have no option but to ditch their phones to escape the new policy, as it covers mobile OS tracking data too.

Privacy advocates have slammed Google for forcing users to share data they would not share if given a choice. Many also see the move as a trick by Google to target ads at specific users based on their tracked data. While promoting one's business is understandable, promotion that comes at the cost of user privacy really needs to be reconsidered.

While the web is awash with updates requesting that Google delay its privacy change, and governments are asking for time to review it in detail, there seems to be no response from the other side.

One must also note that the French are acting on the European Commission's instruction, having been assigned the initial review. This means the matter could go against Google across the whole of Europe and in many other countries around the world.

In addition, a survey found that most Google users were not aware of the privacy policy, despite Google promoting it in recent weeks. The survey also yields the startling statistic that only 1 in 8 users has actually read Google's privacy policy; the remaining 7 are still unaware of the change and continue using Google services.

Big Brother Watch, a British civil liberties and privacy pressure group, has called for an inquiry into the Google privacy policy and how it complies with British data protection law.

At this point, we would like to guide users on how to delete their web history before Google starts tracking it.

Step 1: Log in to your Gmail account.

Step 2: At the top right corner of the screen, you can see your email ID; click on it.

Step 3: A dropdown will be displayed; click Privacy if you would like to know the details of Google's privacy policy, or click Account Settings.

Step 4: In Account Settings there is a section called Services with the text "View, enable, or disable web history"; next to it is the link "Go to web history".

Step 5: Clicking that link gives you the option to remove all web history.

There is also another face to this privacy debate: Google's own data shows that many countries have requested that user data be handed over, and have also asked Google to block services or remove content they found to be harmful to their country.

January – June 2011

China
Three requests to remove 121 items from services. Google removed ads in response to two of those requests.

Cook Islands
Google received content removal requests

France
User data requests recorded an increase of 29% compared with the previous reporting period of Google

Germany
User data requests recorded an increase of 39% compared with the previous reporting period of Google

Russia
User data requests reached the reporting threshold at Google

South Korea
User data requests recorded an increase of 36% compared with the previous reporting period of Google

Spain
User data requests recorded an increase of 28% compared with the previous reporting period of Google

USA
User data requests recorded an increase of 29% compared with the previous reporting period of Google

You can find detailed reports for previous years too on the Google Transparency Report page.

This data is sparking interesting debates: if countries oppose user data collection by Google yet later request that data from Google, the whole process will need to be reconsidered by individual governments.

Even as many feel this may hurt their online privacy, the official Google blog states that Google remains committed to data liberation, and that it does not sell your information or share it externally without the user's permission, except in exceptional circumstances such as a court order.

Given all these facts and details, how Google copes with national privacy laws, and how users react, will be one of the modern-day dramas to unfold online.


SOPA Act and Its potential effects on Search Engines

SOPA is an abbreviation for the Stop Online Piracy Act, a bill introduced in the US House of Representatives on October 26, 2011 to protect online copyrights. The bill would allow the US government to fight online trafficking in copyrighted intellectual property and counterfeit goods. It builds on the earlier PROTECT IP Act and allows copyright holders to seek court orders against websites accused of copyright infringement. If an accusation holds, the actions include barring online advertising networks and payment facilitators such as PayPal from doing business with the accused site, barring search engines from linking to such websites, and, in general, requiring Internet service providers to block the accused website from being viewed.

The bill also makes unauthorized streaming of copyrighted content a felony, which means both the viewer and the website owner could face legal ramifications. The bill amounts to Internet censorship: it gives immunity to Internet services that voluntarily take action against infringing websites, which makes it all the more profitable for copyright holders.

SOPA is actually quite different from the PROTECT IP Act; it is more of a companion bill, aimed at websites or web companies hosting unauthorized movies, songs, or software. Many copyright holders have long fought such websites, which cost them jobs and profits. Movie makers invest heavily in their films, and when websites release torrent files while a movie is still running in theaters, people prefer to download and watch it without paying, causing the movie makers tremendous losses. It is primarily this piracy concern that triggers the tension between the website industry and other industries.

Website owners, on the other hand, believe that SOPA could break the Internet itself, which is quite alarming, and that there would be legal ramifications against almost every website. They believe the act could be stringent enough to strangle the entire web industry.

SOPA is not just about Internet piracy. The act asserts that online infringement has become epidemic, and copyright owners feel extreme measures must be taken to combat it. For now, supporters say only "egregious" violators will be dealt with severely; but you know politicians, and that may simply be a promise made to get the bill passed safely. Those in favor suggest that only rogue sites that deliberately steal content and engage in illegal distribution would be targeted. Yet the bill clearly states that every site that uses song clips or trailers, or creates GIFs from scenes of copyrighted movies, could be forced to remove the content and shut down permanently. This would tremendously cripple the Internet and put every site in danger of violating SOPA.

SOPA could make websites disappear:

China already has censorship issues with the Google search engine, and opponents of SOPA can show in many ways that the censorship powers proposed in the bill aren't much different.

To hinder foreign websites from stealing content or streaming movies that belong to a specific owner, SOPA requires Internet service providers to ban, or make disappear, websites responsible for copyright violations. In effect, it says it is acceptable to endanger Internet security and censor sites, as long as it is in the name of IP enforcement.

It complicates matters further by blurring the distinction between a site's host and the members who post content on it, eliminating the Internet safe harbors for shared content established by earlier legislation. A site owner could be dealt with severely, and legally, for not monitoring the site or for not taking sufficient action against members who post copyrighted content.

If SOPA starts blacklisting domains, thousands of websites associated with an offender could also get banned, even if none of them apart from that one website violated any laws. What happened to WikiLeaks could now happen systematically to many other websites, in a streamlined and linked fashion, whenever someone believes IP rights have been violated.

This alone is enough to threaten the entire Internet: if the bill passes, virtually every website will be affected. It would also make the task of search engines far more difficult and complicated, as they would have to identify, mark, and ban such websites, and under such policies search engines would become very stringent, banning sites over the slightest issue. Eventually, search engines could find themselves with nothing left to index.

SOPA also creates an online monopoly. It puts barricades before advertising networks and payment services such as PayPal, giving copyright owners the authority to tell these organizations to stop funding or serving accused websites. These organizations would have to cut off all services from the moment a site is accused, before any court passes a verdict or the website is proven guilty. This financially chokes the website to death, inflicting heavy losses. It creates a monopoly by allowing copyright owners to dictate terms, burdening or strangling many other businesses.

It also threatens popular search engines such as Google and Yahoo, which fall under the bill: even one small piece of infringing content on their result pages could expose them to legal ramifications. The solutions outlined in the SOPA act are draconian, forcing URLs to be removed and domains to be banned, causing major trauma to the many businesses that market through the Internet, including the search engines themselves. All in the name of Internet censorship; and the irony is, where will the Internet be if the bill passes and there is nothing left to censor?

Fundamentally, this has become about politics and jobs. SOPA might aid copyright owners for the time being, but the future looks bleak, as it poses a great threat to the Internet itself and to many other businesses. It might bring returns to copyright owners now, but it would drastically change our world, shrinking and strangling many industries, cutting jobs, and pushing sectors toward widespread unemployment, something like the Great Depression. Popular sharing sites such as YouTube, which act as great marketers for those very copyright owners, would no longer exist. People often build enthusiasm around copyrighted content online and then watch the movies in theaters as well, which brings profits to the copyright owners; once that is gone, copyright owners might thrive for a time, but in the long run even they could run out of business.

Many people and business organizations have already taken a stand against the bill and demonstrated how drastic its consequences could be, as the US government considers reforming the bill before making its judgment.

To summarize, one can simply state that SOPA would damage our Internet, search engines included. Rather than monitoring and protecting IP in the name of copyright, it would strangle businesses, threatening many sectors and their financial inflows and making the world economy look bleak. And without the Internet, ease of access to information, and even the First Amendment right to free speech, would lose much of their meaning: after a point there would be no websites left to share information or market organizations.


Fake popularity – Fake Buzz

Rumors spread like a virus and victimize millions, especially when seemingly proper evidence for the story is provided. Creating a buzz has become quite a sensation among millions of web users as well, and every year people keep falling for rumors that websites create in the form of content.

It is a simple fact that a lie told several times and pressed upon eventually seems like the truth. And there is no need for pressing at all when the lie comes with several pieces of evidence that substantiate the rumor.

A good example of such pranks can be found at famous search engines such as Google. Every April Fools' Day they come up with one topic or another, complete with substantiating evidence, to fool viewers. Once they even announced the concept of a station on the far side of the moon, with bewildering evidence: photos, a thesis, research and testing proofs, and theory backed by supposed practical outcomes. They even published a ten-page article about the story and the revolution they had achieved, to convince viewers that life on the moon was entirely possible, and people eagerly looked forward to that day becoming reality. A day later, Google revealed it was a prank pulled by the organization's experts as an April Fools' joke.

Another good example of Google's involvement came this year on April Fools' Day: the Google teleportation service. The pitch was to let Google take you through time and space to whatever moment and place you most wanted to reach, to perceive anything you wished, rather like a fortune cookie revealing your future for whatever search phrase you enter. It was a funny hoax, and it fooled a lot of people: when they clicked the search button, they landed on a list of websites that fooled them in a sarcastic way. It was a harmless prank pulled by Google's experts, and many web users were fooled and thoroughly enjoyed it.

Fake Popularity or Fake buzz with wrong intentions:

It is understandable when a buzz or a rumor is created with harmless intentions, merely to fool millions of web users, as with the ones created by Google and other search engines.

However, when a rumor or buzz is created as a search engine optimization strategy, it spoils the whole purpose of websites. The intent is not to fool web users but to develop business for one's own website. The popularity of a website largely depends on the traffic it draws; based on that traffic, backlinks, and relevant keywords, search engines rank it accordingly. Of course, search engines also weigh the relevance of the site's content and whether it provides credible information to web users before ranking it.

Some websites create fake stories or fake buzz just to draw that traffic and the backlinks it brings, improving their rankings enough to be listed in the top 10 results for a keyword or search phrase. When the information such sites provide serves no purpose and is useless to web users, they undermine the search engine's efficiency in delivering valuable results. As an expert, I have come across six such websites listed among the top 10 results for a keyword or search phrase.

As an example, I came across a reputable news website creating a fake story about a bomb blast, asking people to be more cautious; it spread like wildfire through both the media and the community. The moment it was published, many other news websites took points from the article and republished it, citing this website as the source. As a result, the site got several backlinks from other news websites, which boosted its ranking in search results for that week alone.

Of course, the hoax was identified later by the search engines, which eventually pulled the website's rank down. Backlinks play a crucial role in deciding a website's fate and rank, but creating fake stories to generate fake popularity and traffic is a punishable offense: it manipulates both the search engines and millions of web users, and such a prank can be genuinely harmful. Website owners must not indulge in such activities to boost their business. If the practice is confirmed, it will lead to a ban on your website; search engines take such pranks and other spam so seriously that the moment they find them on your site, they will remove it without explanation or further notice. It is also a black mark on your business reputation that will hinder your growth, as trust within the community becomes a major issue and a serious concern for your organization.

Are links in footers treated differently than paragraph links?

Andres from Boston, MA asks, “Does Google treat links in footers differently than links surrounded by text (e.g. in a paragraph)?”

Well, if you go back and read the original PageRank paper, it said that PageRank was distributed completely uniformly — without any regard to whether the link was at the top of the page, the bottom of the page, in the footer, or in the text. In general, our link analysis continues to get more and more sophisticated, to the point where what we compute today is still called PageRank and still bears a resemblance to the original, but it's much more sophisticated than the original PageRank used to be. So we do reserve the right to treat links in footers a little differently. For example, if something is in the footer, it might not carry the same editorial weight: it might be a single link someone set up that appears across the entire site, whereas something in an actual paragraph of text is a little more likely to be an editorial link. So we do reserve the right to treat those links differently in terms of how we consider them for relevance, how we consider them for reputation, how much we trust them, and all those sorts of things.
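For reference, the uniform distribution the answer alludes to is the familiar formula from the original PageRank paper (a standard statement of the original model, not a description of Google's current, more sophisticated algorithm):

```latex
% Original PageRank: each page q splits its rank evenly
% among its L(q) outbound links, regardless of where on
% the page each link appears (header, body, or footer).
PR(p) = \frac{1 - d}{N} + d \sum_{q \in B(p)} \frac{PR(q)}{L(q)}
```

Here \(d\) is the damping factor (0.85 in the original paper), \(N\) is the total number of pages, \(B(p)\) is the set of pages linking to \(p\), and \(L(q)\) is the number of outbound links on \(q\). The key point for this question is the term \(PR(q)/L(q)\): every link on a page carries the same share of rank, which is exactly the uniformity that Google's later, more sophisticated link analysis no longer guarantees for footer links.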

When will Rich Snippets become widely available?

Adeel from Manchester, UK asks, “When do you reckon Rich Snippets will be made widely available? Can I suggest a tool in Google Webmaster Tools that lets you view (or preview) Rich Snippets from your site?”

It is a good suggestion. The fact is, we are going to take it slowly at first to make sure that people are doing good things with rich snippets. For example, you can imagine going crazy with rich snippets, but it can actually cost you clicks if users don't like them. So we've been relatively cautious. My expectation is that we are going to check out sites and sort of whitelist them — this site is the next one to receive rich snippets, and this is the next one after that. But as we grow more confident that it doesn't hurt the user experience and that users like it, expect us to continue to roll it out more broadly. There is documentation, so you can go ahead and read up on it, and a lot of rich snippet types are pretty understandable — for example, the number of stars on a review. You can usually get a pretty good idea of what your snippets are going to look like just by looking at what snippets from other sites already look like, and that will give you an idea of what it will look like when it goes live for your site.

Why are the UK SERPS still really poor with irrelevant non UK sites (US/Aus/NZ) ranking very high Google.co.uk since early June?

Guavarian from the UK asks, “Why are the UK SERPS still really poor with irrelevant non UK sites (US/Aus/NZ) ranking very high Google.co.uk since early June?”

It's absolutely true: if you do a search for car insurance on google.co.uk, you are likely to see, for example, tescofinance.com or churchill.com — sites that are definitely UK-focused (they even mention the UK in the title) but are not necessarily .co.uk. The short answer is that we have gotten better at, and more willing to, show .coms if we think they are relevant to a given country. I think everybody got used to the idea that if you search on google.co.uk you are only going to get .uk sites, and that's not really the right attitude, because if the best result for a British searcher is something that ends in .com, we still want to show it to the British searcher. So that is probably a change we are not going to revert. If you see an irrelevant .com — one that has nothing at all to do with the UK, or Australia, or New Zealand, or whatever your country market is — then we'd definitely be interested in hearing about that. But as we continue to learn more about which websites are associated with which countries, I do expect we'll start showing .coms a little more often in different countries. That's just something we are getting a little better at detecting: tescofinance.com is still about the UK market and is still really useful to a British searcher, so we are willing to show .coms a little more to people over time.

Which search media does return the more reliable information: Google or Twitter?

Martino in Trento, Italy asks, “Which search media does return the more reliable information: Google or Twitter?”

Now, don't take this as an invitation to pit Google and Twitter against each other. Twitter has many, many great uses: it's great for breaking real-time news, and it's fantastic for asking your friends questions. Google, on the other hand, tries to return really reliable, really reputable information. So if you are sorting by date, Twitter is fantastic; if you want an answer to a question that's been around for a while, Google is great for that. Try both in different situations. If you don't have many friends, you may not be able to get the questions you want answered on Twitter. And I wouldn't be surprised if spammers eventually see the traffic on Twitter and pounce, because if results are only sorted by date, spam will try to jump onto any new fad that comes along. So they are different, and they are good for different things. Use whatever works best for you.
