PageRank

Does Google use data from social sites in ranking?

A query came from Web SEO Analytics. They asked, “Hello, Matt. A recent article of Danny Sullivan’s suggests that Google uses Twitter and Facebook links as a ranking signal. Can you confirm this? Can you elaborate a little bit more on this?”

Matt Cutts clarifies:

We do use Twitter and Facebook links in our web search rankings, as we always have. But in addition, we’re also trying to figure out a little bit about the reputation of an author or a creator on Twitter or Facebook. He made this clear with an example: “I filmed a video back in May 2010, where I said that we didn’t use that as a signal. And at the time, we did not use that as a signal. But now, we’re taping this in December 2010, and we are using that as a signal.” So the exhaustive place, if you really want comprehensive information, is to go look up Danny Sullivan’s article, and we can leave that as a link in the description of the video. But essentially, the web search quality team has a lot of different groups and a lot of different offices. So people, including the original blog search team and people who worked on real-time search, have been working on using these sorts of things as a signal.

Primarily, it has been used a little bit more in real-time search, where you might see individual tweets or other links showing up and streaming onto the page. We’re studying how much sense it makes to use it a little more widely within our web search rankings.

Now, there are a few things to remember. Number one is, if we can’t crawl a page, if we can’t see a page, then we can’t really assign PageRank to it, and it doesn’t really count. So if we’re able to obtain the data, then we can use it. But if, for some reason, a page is forbidden for us to crawl, or we’re not able to obtain it somehow, then we wouldn’t be able to use it within our rankings. This is something that is used relatively lightly for now, and we’ll see how much we use it over time, depending on how useful it is and how robust it ends up being. The one thing I would caution people about is: don’t say to yourself, “Aha! Now I’m going to go out and get reciprocal follows, and I’m going to get a ton of followers, just like people used to get a ton of links.” In the same way that PageRank depends on not just the number of links but the quality of those links, you have to think about which followers actually represent quality. Who are the people who are real, not just bots or some software program? So it is a signal that we’re starting to use a little bit more. You’ll see it most within our real-time search as results stream through, but we’re looking at it more broadly within web search as well.
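Matt’s crawlability point is easy to verify for your own pages. Below is a minimal sketch, using only Python’s standard library, that asks whether robots.txt permits Googlebot to fetch a URL; if the answer is no, no link data can be collected from that page. The example URL is hypothetical.

```python
# A minimal sketch of the crawlability check above: if robots.txt
# forbids a URL, a crawler cannot fetch it, so no PageRank or link
# signal can flow through it. Standard library only.
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

def is_crawlable(url: str, user_agent: str = "Googlebot") -> bool:
    """Return True if the site's robots.txt lets user_agent fetch url."""
    parts = urlparse(url)
    parser = RobotFileParser()
    parser.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
    parser.read()  # downloads and parses the live robots.txt file
    return parser.can_fetch(user_agent, url)

# Hypothetical example URL: a page disallowed for Googlebot returns
# False, meaning no link data could be collected from it.
print(is_crawlable("https://example.com/private/page.html"))
```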

PageRank update after a long wait:

It seems Google has started to move away from PageRank, or at least they are trying their best to make people move on from their PageRank obsession. From the day it came into existence, PageRank has been the talk of the search engine world. Almost every webmaster knows about PageRank and its importance. Many go a step further, learning how PageRank works and hunting for shortcuts to a high PageRank.

Google is well aware of this PageRank obsession and is doing its best to curb it. Recently, Google removed PageRank from Webmaster Tools. It used to show which pages were the most popular on your site every month; now that report is gone. Just do a Google search ( http://www.google.com/search?hl=en&source=hp&q=pagerank+removed&rlz=1R2GGLL_en&aq=f&oq=&aqi= ) and you will see how many blogs are talking about this news. Such a small change, and people discuss it endlessly. It just illustrates how much of a hold PageRank has on people’s minds.

Now, after a long time, Google has updated its toolbar PageRank. As you can see from this list ( http://www.searchenginegenie.com/google-toolbar-PageRank-update-list.html ), the last PageRank update was at the end of June, and this one has come after almost four months. That is effectively just three updates a year.
We used to have updates every 20 days or so, and people would wait eagerly to see their green bar grow on the Google toolbar. That anticipation has faded now that every site has to wait at least four months to get PageRank or see any update, upgrade, or downgrade. I feel Google is doing the right thing by weaning people off the PageRank obsession, and I hope it maintains the current trend.

2 non-US government PageRank 10 sites

As you might already be aware, we maintain the only updated and comprehensive PageRank 10 list on the internet today. PageRank is one of the most popular crazes among webmasters. Recently we scanned all country-level government sites for any new PageRank 10 sites.

We found one that was not already on the list: the Chinese government website ( http://english.gov.cn/ ). After a lot of research, we found only two country government sites with PageRank 10 apart from the US government website ( USA.gov ): english.gov.cn and india.gov.in. I wonder why, out of so many countries, only two government websites have PageRank 10. Does it signify something? I cannot guess 🙂

Alexa and subdomain rankings:

Alexa has improved a lot at ranking sites compared with before. Previously, Alexa ranked websites based purely on Alexa toolbar users. Now the ranking criteria have changed: Alexa has tied up with other ranking companies and combines its own toolbar data with their data to decide the final rankings. Still, I feel Alexa rankings are skewed and influenced too heavily by toolbar-related factors.

The question here is whether Alexa treats a subdomain as a separate entity when ranking. In most cases, no: Alexa rarely treats subdomains as distinct entities. I have seen BlogSpot subdomains showing an Alexa rank of 500, but that 500 is not for the individual subdomain; it reflects the usage of blogspot.com as a whole.

A word from the official Alexa blog:

Alexa’s traffic rankings are for top level domains only (e.g. domain.com). We do not provide separate rankings for subpages within a domain (e.g. domain.com/subpage.html) or sub domains (e.g. subdomain.domain.com) unless we are able to automatically identify them as personal home pages or blogs, like those hosted on Geocities and Tripod. If a site is identified as a personal home page or blog, its traffic ranking will have an asterisk (*) next to it: Personal Page Avg. Traffic Rank: 3,456*. Personal pages are ranked on the same scale as a regular domain, so a personal page ranked 3,456* is the 3,456th most popular page among Alexa users.

So they don’t separate subdomains unless they identify them automatically, and that automated detection is weak: it rarely recognizes genuine subdomains. I feel Alexa needs to improve its subdomain handling, because subdomains are often entirely different websites.
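To see why a BlogSpot subdomain inherits blogspot.com’s rank, here is a toy sketch of how a ranking service might collapse a hostname to its registered domain. This is not Alexa’s actual code; the tiny suffix set is a stand-in for the full Public Suffix List, which real code would consult (for instance via the tldextract package).

```python
# A toy sketch (not Alexa's actual code) of why myblog.blogspot.com and
# blogspot.com share one rank: traffic is keyed on the registered domain.
# KNOWN_TWO_LEVEL_SUFFIXES is a deliberate simplification of the full
# Public Suffix List.
KNOWN_TWO_LEVEL_SUFFIXES = {"co.uk", "gov.cn", "gov.in", "com.au"}

def registered_domain(host: str) -> str:
    """Collapse a hostname to the domain a ranking would be keyed on."""
    parts = host.lower().rstrip(".").split(".")
    levels = 2 if ".".join(parts[-2:]) in KNOWN_TWO_LEVEL_SUFFIXES else 1
    return ".".join(parts[-(levels + 1):])

for host in ["myblog.blogspot.com", "blogspot.com", "news.bbc.co.uk"]:
    print(host, "->", registered_domain(host))
# myblog.blogspot.com -> blogspot.com
# blogspot.com -> blogspot.com
# news.bbc.co.uk -> bbc.co.uk
```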

CNN is PR 10 now:

After a long stay at PageRank 9, CNN has finally got a full PageRank 10 from Google. We have the most up-to-date list of PageRank 10 sites available on the internet: http://www.searchenginegenie.com/pagerank-10-sites.htm . CNN was PR 10 a year ago, and now it has regained that status. Though PageRank is just a number, it is the ultimate symbol of quality from Google’s point of view. Only a handful of sites have PageRank 10, and CNN is one of them. The Adobe Flash Player page and the Acrobat page also got PageRank 10, which is again good news.

The PageRank obsession has existed for more than eight years. From the day Google came into existence, PageRank fever has gripped search engine lovers and followers.

PageRank update – April 2009

Google is updating its toolbar PageRank data. It is confirmed: the PR data for almost all of our websites has changed. You can check your updated PageRank here:

http://www.searchenginegenie.com/future-pagerank-checking.html

Does Google rank a page or the whole website?

Google ranks pages, not sites; we know that. But the real question is: does Google rank a page based on keyword relevance on that page alone, or on keyword relevance throughout the website? Based on my admittedly very targeted sites, I have to say Google is now sophisticated enough to analyze the whole site for relevance, not just the ranking page. Yes, individual pages and their words are analyzed and targeted. But Google also seems to understand a site’s overall “keywords” and rewards it with better SERPs when those are searched.

I look at it like this. I have a site about widgets: 1,000 pages primarily aimed at widgets. Each page covers a particular aspect of widgets: green ones, round ones, making widgets, etc. A couple of pages on that site are about an apparently disconnected subject, e.g. plankton. Those plankton pages are there because plankton is the source of the material used to make widgets.

I have loads of these “disconnected” pages, all well written and in the same style as the base widget pages. But they don’t rank for plankton. My guess is that Google can’t connect plankton and widgets, and therefore the plankton pages are considered less valuable.
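As a rough illustration of this page-versus-site relevance idea (and emphatically not Google’s algorithm), the toy sketch below scores a page against a site-wide term profile. An off-topic “plankton” page scores zero against the widget site’s profile even when the page itself is well written. All names and texts here are invented.

```python
# A toy sketch of the intuition above: score a page by how much its
# terms overlap the site's overall vocabulary. Not Google's algorithm.
from collections import Counter

def term_profile(texts):
    """Aggregate term frequencies across all pages of a site."""
    counts = Counter()
    for text in texts:
        counts.update(text.lower().split())
    total = sum(counts.values())
    return {term: n / total for term, n in counts.items()}

def topical_fit(page_text, site_profile):
    """Average site-wide weight of the page's terms (0 = off-topic)."""
    terms = page_text.lower().split()
    return sum(site_profile.get(t, 0.0) for t in terms) / len(terms)

site = term_profile(["green widgets", "round widgets", "making widgets cheaply"])
print(topical_fit("buy green widgets", site))           # ~0.19: on-topic
print(topical_fit("plankton harvesting basics", site))  # 0.0: off-topic
```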

An increase in traffic and rankings can trigger a manual review:

We have seen this across some major sites. If your site is doing very well and starts appearing for rare, competitive keywords, it will very likely be subject to manual review by search engine specialists.

Search engines are very careful about the quality of their SERPs. We know from some internal knowledge that Google uses people to manually review its search engine results. One main flag: if a search engine detects a big boost in backlinks from high-PageRank websites, you can be sure a red flag is raised. A manual review is not always bad, but it depends on the reviewer. A reviewer looks at things from Google’s point of view and is pretty strict about the guidelines; even a site going slightly out of bounds might raise questions in the reviewer’s mind. We should be careful, especially about gaining a lot of backlinks in a short time, since Google is always watching.
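As a purely hypothetical illustration of the kind of backlink-spike check described above, the sketch below flags a site when its latest weekly new-link count jumps far beyond its trailing average. The window and factor are invented for illustration and do not reflect anything Google has published.

```python
# Hypothetical sketch: flag a sudden spike in weekly new backlinks
# relative to the trailing mean. Thresholds are illustrative only.
def flag_backlink_spike(weekly_new_links, window=8, factor=5.0):
    """Return True if the latest week exceeds factor x the trailing mean."""
    if len(weekly_new_links) <= window:
        return False  # not enough history to establish a baseline
    history = weekly_new_links[-(window + 1):-1]
    baseline = sum(history) / len(history)
    return weekly_new_links[-1] > factor * max(baseline, 1.0)

steady = [12, 15, 11, 14, 13, 12, 16, 14, 15]
spike = [12, 15, 11, 14, 13, 12, 16, 14, 400]
print(flag_backlink_spike(steady))  # False: normal growth
print(flag_backlink_spike(spike))   # True: candidate for manual review
```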

Does a website’s PageRank affect its crawl rate?

PageRank is just a value assigned to a page based on the PageRank of the other pages linking to it. A site’s crawl rate depends on various other factors; it is definitely not PageRank alone. Martin Buster of WebmasterWorld gives a good explanation of the myths behind PageRank and crawling (a toy sketch of the PageRank computation follows his quote):

“I can’t be more emphatic about the falseness of this emphasis on PageRank 4. It has to die. If you are going to get ahead you must walk away from this myth. It’s a number that was arrived at in relation to backlink searches many years ago. The situation that gave rise to the myth went away, it ended, but the myth endured. I’ll explain.

History lesson
Many years ago Google used to show the backlinks of sites with a PR of 4 or more. This caused webmasters to make the erroneous assumption that PR 4 is the threshold between a good ranking and a bad ranking, that Google did not count links from -PR4 sites. Otherwise, why didn’t they show them in the backlink searches? It could be said to have been a reasonable assumption but at the time the Googlers were saying this wasn’t the case.

To the webmasters, because Google didn’t show links from sites with less than PR 4, they assumed that -PR4 meant you were crawled less, had less authority, etc. Over PR 4 meant your site had finally arrived.

Then during a London Pubcon DaveN suggested to Matt Cutts that this scheme was inaccurate and Matt Cutts agreed. Not long after he arrived back at the Googleplex their search engine began showing a sample of backlinks across a range of PR.

Stop and examine the facts

Anyone who has ever ranked a site with an under PR 4 site knows that the assumption that -PR 4 is less worthy is an assumption without foundation. Anyone who has watched their rankings jump with – PR4 backlinks understands that the PR 4 threshold is absolutely false.

The superstition continues

So even though Google began showing PR 4 backlinks, to this very day many webmasters still cling to the mistaken notion that PR 4 is a significant threshold. It is not. This belief in the superiority or meaningfulness of PageRank 4 meets the definition of superstition: “A belief in something not justified by reason or evidence.” It’s a myth. The healing powers of PR 4 is a superstition.

So what determines crawling?

What determines crawling is the amount of links you have. Each link is a new door, so to speak, for a bot to find you. One can have thousands and thousands of links and still rank under 4, yet be better and deeper crawled than a PR 4 with less inbound links.”
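To make the distinction concrete, here is a minimal power-iteration sketch of the classic PageRank computation on a toy three-page link graph. A page’s score is built entirely from the scores of the pages linking to it; crawl rate appears nowhere in the formula. The graph and parameters are illustrative.

```python
# Minimal power-iteration PageRank on a toy link graph.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small "teleport" share, then receives an
        # equal slice of each inbound linker's damped rank.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dangling pages are ignored in this toy version
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
for page, score in sorted(pagerank(toy_web).items()):
    print(page, round(score, 3))  # C collects the most rank here
```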

Why do Google rankings drop?

Site rankings were quite consistent for some time, but today they are down across the board. What are the potential causes? Is this manual or automated?

Most ranking drops start as manual actions and then move into an automated system. The most common cause of an across-the-board ranking drop is linking that looks manipulative or involves a “bad neighborhood.” Undetected server hacks that cloak content for Googlebot are another frequent culprit these days.

Use of iframes sometimes triggers a penalty, but that is Google’s headache, because there is no restriction on using iframes as long as the frame is legitimate. The ideas listed here are just possibilities. If your site’s rankings drop, wait a few days before taking action; it may be a Google bug that clears up on its own. But do allot some time to scrutinize your site for technical issues, undetected hacks, and so on.
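One simple self-check for the cloaking hacks mentioned above: fetch the same URL with a normal browser User-Agent and with a Googlebot User-Agent, then compare the responses. This is a crude sketch only; dynamic pages differ legitimately between fetches, and real cloaking often keys on Googlebot’s IP ranges rather than the header, so treat a mismatch as a hint to investigate. The example URL is hypothetical.

```python
# Crude cloaking hint: compare the same URL fetched as a browser and as
# Googlebot. A hacked server cloaking for Googlebot often returns
# different content. Standard library only.
import urllib.request

def fetch(url: str, user_agent: str) -> bytes:
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

def looks_cloaked(url: str) -> bool:
    browser = fetch(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
    bot = fetch(url, "Mozilla/5.0 (compatible; Googlebot/2.1; "
                     "+http://www.google.com/bot.html)")
    return browser != bot  # mismatch is a hint, not proof, of cloaking

print(looks_cloaked("https://example.com/"))  # hypothetical example URL
```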

If your rankings don’t come back quickly, send a reconsideration request. In that case, it is always good to have a few things to say about the clean-up efforts you have made.
