Does the position of keywords in the URL affect ranking?
Interesting question from Adeel in Manchester, UK: “Does the position of keywords in the URL have a significant impact? For example, is example.com/keyword/London better than example.com/London/keyword?”
Truthfully, I wouldn’t obsess about it at that level of detail. It does help a little bit to have keywords in the URL, but it doesn’t help so much that you should go stuffing a ton of keywords into your URL. If there is a convenient way that is good for users, where you have four or five keywords, that might be worthwhile. But I wouldn’t obsess about how deep the URL sits in the path or how you combine the words. For example, on my blog, when I write a post I’ll take the first few words related to that post and use them as the URL. You don’t need to use 7, 8, 10, or 20 words, because that just looks like spam to users, and people will probably not click through as much in the first place. So the position of keywords in the URL is a very, very second-order kind of thing. I would not worry about that nearly as much as having great content that people want to link to and want to find out about.
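The slug approach described above — using just the first few words of a post title as the URL — can be sketched in a few lines of Python. This is a hypothetical helper, not how any particular blog platform actually does it:

```python
import re

def slugify(title, max_words=5):
    """Build a short URL slug from the first few words of a post title."""
    # Lowercase, drop punctuation, keep letters/digits/spaces/hyphens.
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    return "-".join(words[:max_words])

print(slugify("Does the position of keywords in the URL affect ranking?"))
# -> does-the-position-of-keywords
```

The `max_words` cap is the whole point: a handful of relevant words, not the entire headline.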
Is redirecting a large number of domains suspicious?
Cweave from Dallas asks a really interesting question: “When permanently redirecting (301) a large number of domains (read: more than 10) to one domain, does Google flag this as suspicious? What considerations does Google look at? For the purposes of this question, let’s assume this is a consolidation move.”
I think there are plenty of valid reasons why somebody might do this. Take Google, for example: a ton of people have registered Google typos, and we try to acquire those because we don’t want people to get confused or end up with malware. So we’ve ended up with a portfolio of lots of Google-related domains, even things like Google sex and Google porn. I think it’s perfectly logical to have misspellings of Google and all that stuff just 301 to Google’s home page. That’s what I think Cweave was talking about with a consolidation move. At the same time, if we see a ton of 301s all going to one domain, we might take a look at that; you could certainly imagine someone trying to abuse it or spam with it, so we might give it a second look or scrutinize it. But if all you are doing is consolidating misspellings, or a bunch of brand domains — domains you have registered that are very similar to your main domain, where you really only have that one site — I don’t foresee that being a problem. If someone reported it as spam and we took a look, we’d just see: oh yeah, they are consolidating their brand. So Google might take a look, but I don’t consider that to be a large problem.
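In practice this kind of consolidation lives in web-server or load-balancer configuration, but the logic is simple enough to sketch in Python. The domain names here are made up for illustration:

```python
# Hypothetical typo/brand domains being consolidated onto one canonical site.
TYPO_DOMAINS = {"gooogle-widgets.example", "googel-widgets.example"}
CANONICAL = "https://www.google-widgets.example"

def redirect_for(host, path):
    """Return a (status, Location) pair for a 301, or None if no redirect applies."""
    if host.lower() in TYPO_DOMAINS:
        return 301, CANONICAL + path
    return None

print(redirect_for("Googel-widgets.example", "/pricing"))
# -> (301, 'https://www.google-widgets.example/pricing')
```

The key detail is the 301 (permanent) status, which signals to search engines that the typo domain should be consolidated into the canonical one.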
Will Google use non-link references as a signal?
This question comes from Boston, MA; Eric Enge asks: “Do you think web search will ever make use of references (web site mentions that are not links) as a ranking signal?”
So there are two answers. The first is that I never want to take a ranking signal off the table; I’ve joked that if the phase of the moon could help us rank the search results better, I’d be willing to use the phase of the moon. At the same time, think about how people would attack the use of references. Right now a lot of people rely on getting links. If all they had to do was drop example.com as plain text, they could leave it in comments all over the web — on blogs, on forums, almost anywhere you can stamp user-generated content, people would be leaving those references. That’s the reason to be skeptical that we’d use this sort of signal: people could abuse it by leaving mentions of their URLs even where they can’t generate links. But I’ll say we are willing to look at it; we would run the analysis and ask whether there is a way to pull a useful signal out of that noisy data in a way that improves results. It would definitely be the sort of thing that people would try to abuse, though.
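To see why plain-text references are such noisy data, here is a rough Python sketch of pulling unlinked domain mentions out of user-generated text. This is a deliberately naive pattern for illustration, not anything Google actually uses:

```python
import re

# Naive pattern for bare domain mentions in plain text (illustrative only):
# optional scheme, optional www., then a domain on a few common TLDs.
MENTION = re.compile(r"\b(?:https?://)?(?:www\.)?([a-z0-9-]+\.(?:com|org|net))\b", re.I)

def find_mentions(text):
    """Return the domains 'mentioned' in a blob of text, linked or not."""
    return [m.lower() for m in MENTION.findall(text)]

comment = "Nice post! Check out example.com and also www.example.org for more."
print(find_mentions(comment))
# -> ['example.com', 'example.org']
```

Anyone can type a bare domain into a comment box, which is exactly the abuse scenario described above: a mention is far cheaper to plant than an editorially given link.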
Should large corporations use rel=canonical?
Terry Cox from Orlando, Florida asks: “In regards to the new canonicalization tag, does it make sense for large corporations to consider placing that tag on every page, due to marketing tracking codes and large numbers of duplicate URLs from things like faceted pages and load-balanced servers?”
So this is a great question: should you put the canonical tag on every single page? Well, there is a short-term answer and a long-term answer. The short-term answer is: probably not right now. Take a little bit of time, study your site architecture, think about URL normalization — beautification, whatever you want to call it — think about the structure of URLs you want to have, and take a few weeks or a couple of months to assess where you want to go. I don’t think you should throw the canonical tag onto every single page of your site immediately and then move it around, because it is a powerful tool and people do have the ability to shoot themselves in the foot.

On the plus side, we’ve seen a quarter of a million pages show up within just a few days where people are using this canonicalization tag, which is fantastic. It is good to see the traction and the adoption move so quickly. On the down side, we have seen one very large computer company — I won’t call them out by name — whose home page was doing a redirect and also had a canonical tag, and the canonical tag pointed to a page that we hadn’t crawled at all. Cases like that can make it very difficult to do the right thing, and we do do the right thing, but it can take us a couple of days to sort it out or to go find that URL and crawl it. So I wouldn’t just jump into the deep end of the pool without doing some planning.

The longer-term answer is that it doesn’t hurt to have this on every single page of your site. Ideally you’d find other ways to solve the canonicalization, but it doesn’t hurt to say, on every single page, “this page maps to this canonical, very pretty, very preferred version of this URL.” What you want to make sure of is that it uses absolute URLs, that it goes in the head of the page, and that it’s a logical system you designed rather than something you jumped into and started playing around with.
I don’t see any harm in having that sort of thing, because we’ll just follow those — we think of them almost as mini 301 redirects within the site — and we’ll try to canonicalize according to those suggestions. We don’t guarantee that we’ll do it, but it should work just fine with no problems. So feel free to do that, but take some time and plan it out a little bit.
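Since the tag is easy to get wrong, a quick self-check helps. Here is a small stdlib-only Python sketch that extracts rel=canonical targets from a page and verifies they match the advice above — exactly one canonical, using an absolute URL. The sample HTML and URL are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class CanonicalFinder(HTMLParser):
    """Collect the href of every <link rel="canonical"> on a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonicals.append(a.get("href"))

def check_canonical(html):
    """Return the canonical URL if the page has exactly one absolute canonical."""
    parser = CanonicalFinder()
    parser.feed(html)
    assert len(parser.canonicals) == 1, "expected exactly one canonical tag"
    assert urlparse(parser.canonicals[0]).scheme in ("http", "https"), \
        "canonical should be an absolute URL"
    return parser.canonicals[0]

page = '<html><head><link rel="canonical" href="https://www.example.com/shoes"></head><body>...</body></html>'
print(check_canonical(page))
# -> https://www.example.com/shoes
```

Running a check like this across a site before rollout is one way to do the planning step rather than “jumping into the deep end of the pool.”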
Will Google provide a rank-checking service?
Mark Lykle from Oslo, Norway asks: “When will Google create software similar to WebPosition so that SEOs, spam fighters, and regular webmasters can check rankings etc. without violating the guidelines? Why not make a better product instead of going to war against these programs?”
Well, I wouldn’t call it going to war; our guidelines have said the same thing for five, six, or seven years, which is essentially: please don’t hit us with automated queries. The reason we’ve said that is because people do hit us with automated queries, and that takes up server capacity. So if someone is scraping Google and we know who that person is, we may write to them and say: hey, please stop scraping; it violates our guidelines, it takes server capacity, and we’d appreciate it if you wouldn’t scrape us. We also have automated systems to protect ourselves against denial-of-service attacks and scrapers; there are viruses, Trojans, and malware that try to spread themselves by doing Google searches for vulnerable software, and we try to find those things and block them. So if something is taking a sizable amount of our server resources, we do have automated systems that attempt to stop it.

That said, we do have tools. For example, in the webmaster tools console at google.com/webmasters you can sign in and see the sorts of words that you are ranking for, and the sorts of words that people click through on to reach your site. Our philosophy is that it doesn’t do you as much good to pay a ton of attention to ranking reports. It’s much better to look at your server logs to see which queries people are really arriving on, and maybe find queries where you rank at number 4 or 5 that you could push to number 1, 2, or 3, or queries where you rank on the second page that could be moved to the first page. You can also look at those queries and try to improve your ROI: if one percent of the people who land on your site convert into people who subscribe to your newsletter or buy your products, and you can improve that so more people convert, that’s a much faster way to improve your bottom line than trying to rank for everything, even when it isn’t necessarily relevant.
So I think it’s a bit of a philosophy that we don’t want to encourage people to get obsessed with their rankings when they should be paying attention to what they already have in their server logs, and thinking about how to convert better, rather than fixating on rankings. That said, I would support giving people more ability to see what they rank for in Google’s webmaster tools console; it’s just a question of resources. Is it better to have an engineer work on something like the canonical link tag, or on ranking reports? At least historically we have said: let’s build all these newer features — let’s show all of your backlinks, let’s show you what your latency looks like when Googlebot fetches your page — and not concentrate or obsess on ranking reports. That’s a little bit of background on how we feel about it.
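The server-log approach recommended above can be sketched in Python. The referrer strings here are invented, and note that search engines today generally strip the query from the referrer, so this reflects how access logs looked at the time this was recorded:

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Hypothetical Referer values as they might appear in a 2009-era access log.
referrers = [
    "https://www.google.com/search?q=blue+widgets",
    "https://www.google.com/search?q=cheap+blue+widgets",
    "https://www.google.com/search?q=blue+widgets",
]

def count_search_queries(refs):
    """Tally the search queries that sent visitors to the site."""
    queries = Counter()
    for ref in refs:
        parsed = urlparse(ref)
        if "google." in parsed.netloc:
            q = parse_qs(parsed.query).get("q")
            if q:
                queries[q[0]] += 1
    return queries

print(count_search_queries(referrers).most_common())
# -> [('blue widgets', 2), ('cheap blue widgets', 1)]
```

A tally like this tells you which real queries bring visitors — the raw material for spotting terms that sit just off the first page.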
Two questions about nofollow
Let’s talk a little bit about nofollow. Here are a few questions regarding this: Vince Samios from UK asks “Do you feel the widespread and blanket use of nofollow tags is devaluing Google’s search algorithms?”
Let me interject before I finish the question: even though SEOs may feel like nofollow is everywhere on the web, if you look at the percentage of links that carry nofollow, it’s actually pretty minuscule. So nofollowed links aren’t nearly as common on the web as the perception might suggest.
(Continues with the question) “Examples such as Wikipedia, where all external links are nofollow. Does Wikipedia mean nothing to Google’s algorithms?”
And Jonaths from Brighton, UK asks “Do Google take into account quality factors from nofollowed links when the links come from well established authority websites, such as Wikipedia?”
We are not trusting or taking into account links from Wikipedia, because they are nofollowed. So don’t bother spamming Wikipedia; it’s not going to make any difference in search engine rankings, because any link you get will be nofollowed. If you have a great resource and people find it via Wikipedia — if it’s fantastic and people link to it because of that, or you get traffic from the link in terms of direct visitors — then that might benefit your site. But you are not going to get any search engine ranking boost just because Wikipedia links to you with those nofollow links.

Now let me take one slight detour and mention that if a particular site does have trust in the person making the link, there are plenty of good reasons to make that link flow PageRank and take the nofollow off. For example, Wikipedia has experimented with all kinds of ways to improve their process — maybe anonymous edits have to be approved before they go live. So you could certainly imagine a scenario in which a very trusted Wikipedia editor — one who has made a ton of edits without them ever being reverted, or who meets whatever other criteria they use to define trust — might have the nofollow taken off their links. The simple move when you are under attack from spammers is to add that nofollow attribute, and then it doesn’t benefit spammers anymore. But if you run a blog or a forum or a wiki or whatever, and you can come up with a good metric to say, OK, these are the links we trust, that we think are editorially given and valuable for users, then there are plenty of good reasons to make those links flow PageRank. In general, though, nofollowed links are a relatively small percentage of the web, and nofollow does prevent a lot of sites from getting spammed.
We don’t use those links from Wikipedia currently, but if Wikipedia wanted to put new trust policies like that in place, I would definitely support it.
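For reference, a nofollow link is just an anchor carrying rel="nofollow". Here is a small stdlib Python sketch of splitting a page’s links into followed vs. nofollowed buckets, the kind of distinction a crawler makes; the sample HTML is hypothetical:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Split a page's anchors into followed and nofollowed link targets."""
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and a.get("href"):
            # rel is a space-separated token list, e.g. rel="nofollow noopener".
            rels = (a.get("rel") or "").lower().split()
            bucket = self.nofollowed if "nofollow" in rels else self.followed
            bucket.append(a["href"])

audit = LinkAudit()
audit.feed('<a href="/about">About</a> '
           '<a rel="nofollow" href="http://spammy.example/">suspect</a>')
print(audit.followed)    # ['/about']
print(audit.nofollowed)  # ['http://spammy.example/']
```

The editorial-trust idea above maps onto this directly: a site decides which bucket each outbound link lands in, link by link.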
Does anchor text carry through 301 redirects?
Mharris from NY asks, “Does anchor text carry through all 301 redirects? Will there be a penalty for sites that do this as their sole way of link building?”
Typically, anchor text does flow through 301 redirects, but I don’t promise that it will always happen. So, does it carry through all of them? Not necessarily; we reserve the right to score not only links — how we determine the weight and trust of links — but also the trust that we have in redirects. I can tell you that if your sole method of link building is trying to get 301 redirects, that’s going to be pretty conspicuous, because we log all the redirects we see, just like we log all the links we see. If all of your incoming anchor text arrives through 301 redirects, that’s going to look pretty strange — when we go looking with our tools, that would be a very abnormal pattern. So my advice is: make a great site that attracts links naturally because it’s a fantastic resource, and don’t worry about trying to get PageRank or anchor text in some way that search engines might not catch or other people might not be able to follow. The organic, long-term links — the ones given freely because you have a great resource — are the links that typically last the longest and have the most impact.
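Conceptually, a crawler resolving 301s follows a chain of Location headers until it reaches a URL that is not redirected. A toy Python sketch with a made-up redirect table (a real crawler fetches each hop over HTTP, of course):

```python
# Made-up redirect table standing in for the 301 Location headers a crawler sees.
REDIRECTS = {
    "http://old-brand.example/": "http://new-brand.example/",
    "http://new-brand.example/": "https://www.new-brand.example/",
}

def resolve(url, max_hops=5):
    """Follow the redirect chain to a final URL, guarding against loops."""
    hops = 0
    seen = set()
    while url in REDIRECTS and url not in seen and hops < max_hops:
        seen.add(url)
        url = REDIRECTS[url]
        hops += 1
    return url

print(resolve("http://old-brand.example/"))
# -> https://www.new-brand.example/
```

Because every hop in a chain like this is logged, a link profile built entirely out of redirects stands out — which is the point made above.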
Is Google putting more weight on brands in rankings?
The first question: “Can you verify that Google is putting more weight on ‘brands’ in search engine rankings? If the answer is yes, what is Google’s definition of a brand? Inspired by Aaron Wall’s post: http://www.seobook.com/google-branding.” That comes from Monica in Madison, WI.
So I’ll try to give a pretty complete answer to this; I was planning on talking about it a little more at PubCon in Austin in a couple of weeks. Inside Google, at least within the search ranking team, we don’t really think about brands. We think about words like trust, authority, reputation, PageRank, and high quality, and the Google philosophy on search results has been the same pretty much forever: if somebody comes to Google and types “x”, we want to return high-quality information about x. Sometimes that’s a brand search, sometimes it’s an informational search, sometimes it’s navigational, sometimes it’s transactional — there are all sorts of different information needs that people have.

First off: yes, Google has made a change in our rankings. It’s one of the 300 to 400 changes that we make every year, so I wouldn’t call it an update; I would call it just a simple change. If we have to refer to it, one of the people who did a lot of work on it is named Vince, so inside Google we talk about this particular change as Vince’s change. It doesn’t affect the vast majority of queries, and most people haven’t even noticed it; Aaron talked about it, and I think even before that people in webmaster forums were talking about it. But it affects a relatively small number of queries — it’s not like it affects a ton of long-tail queries or anything like that. I don’t think of it as putting more weight on brands; we don’t really think about “brands” in search quality that much. For example, take the query eclipse: if Google were really focused on brands, we might return the Mitsubishi Eclipse at number one or something like that.
And if you actually go to Google and type in eclipse, we’ve got eclipse.org because it’s a development environment, we’ve got NASA’s eclipse website, and there are some commercial results — Eclipse is the name of a book in the Twilight series, so we’ve got a page from Amazon. It’s not like we always try to return brands; we try to return whatever we think the best results are for users. So the net upshot of this change is pretty simple: we try to return high-quality results, we think a lot about trust, reputation, authority, and PageRank, and what you should be doing doesn’t change. Try to make a great site — a site so fantastic that you become known as an authority in your niche. It doesn’t have to be a big niche or a huge well-known keyword; it can be a smaller niche, and if you are the expert — the sort of site that people want to link to and talk about, the sort of site that people really enjoy — those are the experts that we want to bring back.
Why are “unoptimized” sites ranking so well?
Today’s webmaster video question comes from Polyana in São Paulo, Brazil. Polyana asks: “When analyzing rankings for highly competitive keywords in our industry, we have found sites that are not as optimized as ours (on-page), and that have few links and little content, are still ahead of us. What gives? Why are ‘unoptimized’ sites ranking so well?”
Well, the thing I want to avoid is the impression that it’s only the optimization that makes you rank. In other words, there are lots of different factors that make a site rank well. Fundamentally, we look at on-page content as well as off-domain links, and it’s not the case that a site that has done optimization is automatically better than a site that hasn’t. There are lots of sites from schools and students and people who hand-write their HTML; they might not get every single thing optimized, but that doesn’t mean they aren’t good resources.

Another thing is that we typically do not show all of a site’s backlinks to its competitors. If you log into Google webmaster tools, we give you a very exhaustive list for your own site; but if you go to Yahoo Site Explorer or anywhere else, you are going to get only a subset or a different sampling of the links that point to a particular competitor’s site. The reason is that when the link: operator was introduced, we did not have the storage space to return all the backlinks, and over time that became a tradition. So there might very well be links from very high-PageRank or very reputable sites pointing to that other page that allow it to rank. It’s always tough when you’re judging other sites in your industry; we always want to say, that’s not as good a page or as good a site as mine. But bear in mind that a competing site can absolutely have links you don’t know about, just as your own site can have links your competitors don’t know about. And we try not to put so much emphasis on SEO that you have to do it to rank — we want sites to be able to rank well on the basis of merit. If they are good, they should show up in search results; that’s our basic philosophy.
Interesting Quote Request
We are currently using RomanCart, which quite frankly is a fairly good shopping cart. But like all rented/leased carts, we run into limitations when we need something that they do not have on their schedule to add — sometimes something that is (at least to us) fairly obviously and badly needed. For example, you cannot override the ground shipping and make it free. Now, before you get excited and start making obvious assumptions: we can make it free by entering zero lbs, and that works, but then if a customer wants to select 2- or 3-day shipping, they cannot.
We are looking for a new, very secure shopping cart and have looked at several, free and paid: osCommerce, which is actually fairly good once we made some changes like one-page checkout; XCart, which does not integrate with Google Checkout and PayPal; and PdAdmin Pro (which we bought), which does not integrate with Google Checkout or PayPal either. We know we can add this functionality ourselves, but we would rather find a cart that has it, so we can focus on our business. In addition to the basic stuff, we would also like to see the following:
Source code is almost a must. We are tired of having our hands tied by some developer who writes for themselves. We had one jerk actually make a change, without any notice, that prevented customers from using the cart for 5 freaking days.
Integration with Google Checkout and PayPal — i.e., not just a link to them. We want communication between them and the cart, so that the shopping cart has a complete copy of the order just like the regular orders; i.e., we want our metrics all in one place. RomanCart does this. If we could purchase the source for RomanCart, we would be very happy; if I could just get them to make changes once in a while, I would be happy with them. We also tested Ecwid, which had some impressive Ajax code, but again failed on the Google and PayPal integration.
Live order tracking, so customers can track where their order is. Funny, but most carts fail on this obviously needed functionality. EasyCart handles this very well.
Works with a pre-existing website — i.e., we call the cart with a product ID, instead of the cart building the website.
A really valuable feature would be PCI compliance. For example, delete all credit card details after products ship.
Most carts ship all products at once, but RomanCart allows you to ship partial orders. Very nice feature. Too bad you cannot get the source.
Note: it appears that carts in PHP or ASP are better than Java ones, but I would think that the way we use ours now would make C#, C, or C++ more secure. Maybe you can shed some light on this, offer us a solution, or point us in a possible direction to investigate.
Also, do you have a demo of the control panel — the part for updating statuses and adding the tracking numbers?