Matt Cutts Video Transcript

Should I tweak my titles and descriptions to improve my CTR?

A question from Bangalore, Amalve asks “Are title and description tags helpful to increase the organic CTR – clicks generated from organic (unpaid) search – which in turn will help in better ranking with a personalized search perspective?”

Great question! So many people think about rankings and then stop right there, and that's not the right way to think about things. You want to think about rankings, and then you want to think about maximizing your click-through, which means making your title and your snippet very, very compelling: not deceptive, but something that invites users to click because they know they'll find what they want. And then you want to think about conversion rates, and you want good ROI. So title and description tags absolutely can increase your organic click-through rates; try to optimize for that. Because it doesn't really matter how often you show up; it matters how often you get clicked on, and then how often you take those clicked-on visits and convert them to whatever you are really trying to optimize for: sales, purchases, subscriptions, whatever it is. So I wouldn't think about it in terms of "oh, I get more visitors and it helps me in terms of personalized search"; just think about it in terms of getting more visitors who convert better. So do spend some time looking at your title, your URL, and the snippet that Google generates, and see if you can find ways to improve them and make them better for users, because then they are more likely to click. You'll get more visitors and a better return on your investment.
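
The point above is that rankings only matter through the clicks they produce. Click-through rate is simply clicks divided by impressions; here is a minimal sketch (the page numbers are invented for illustration):

```python
def ctr(clicks, impressions):
    """Organic click-through rate: the fraction of impressions that became clicks."""
    if impressions == 0:
        return 0.0
    return clicks / impressions

# Hypothetical before/after numbers for one page whose title and
# description were rewritten to be more compelling.
before = ctr(clicks=30, impressions=1000)   # 3% of searchers clicked
after = ctr(clicks=55, impressions=1000)    # 5.5% after the rewrite
```

The same impressions with a higher CTR means more visitors, which is exactly the lever the answer describes.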

What’s a preferred site structure?

Here’s a question from London. Katy Bairstow asks “There seems to be very little impact on human visitors of where in the site’s structure a given page is, so: Is it better to keep key content pages close to the root, or have them deep within a topical funnel structure, e.g. food/fast-food/burgers/hamburgers.php?”

Well, this is not SEO advice; this is just behavioral advice. If a page is a fewer number of clicks from the root page, visitors are more likely to find it. If somebody has to click eight times to find the page to register for your conference, compared to registering right on the main root page, fewer people are going to find it all those clicks away. So it doesn't really matter where it is in the path, whether it's at the root level or eight levels deep; it might matter for other search engines, but at least for Google I would think about whether your visitors can find it. And that's not search engine ranking advice; that's just general advice on how to improve your ROI.
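
The "clicks from the root" idea can be measured directly: treat the site as a link graph and compute each page's shortest click path from the home page with a breadth-first search. A minimal sketch (the link graph here is invented for illustration):

```python
from collections import deque

def click_depth(links, root):
    """BFS over a page->outlinks map: minimum clicks from `root` to each page."""
    depth = {root: 0}
    queue = deque([root])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical site structure: the burgers page is three clicks deep.
links = {
    "/": ["/food/", "/about/"],
    "/food/": ["/food/fast-food/"],
    "/food/fast-food/": ["/food/fast-food/burgers.php"],
}
depths = click_depth(links, "/")
```

Pages with a large depth value are the ones fewer visitors will ever reach, regardless of where they sit in the URL path.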

How can I optimize for “deep web” crawling?

We have a question from Brighton, Danny asks “What are Google’s plans for indexing the deep web? Are there best practices for form construction to optimize for this?”

Great question! We recently published a paper at VLDB, which stands for Very Large Data Bases, that talks exactly about our crawling of the deep web and all the ways we try to do it safely, so that if there are people who don't want their forms to be crawled, we won't crawl them. And there are various simple things that you can do. Rather than having text that has to be filled out, like a zip code, make it a drop-down if you can; that's much more helpful. If you can make it so that it's not a huge form with 20 things to fill out, but more like one or two drop-downs, that's going to be a lot easier as well. I definitely encourage you to go read the paper; there's nothing super confidential in it. And of course, if you can arrange things so that you are not part of the deep web at all, do that: take those pages in your database and build an HTML site map, so that people can reach all the different pages on your site by crawling through categories or geographic areas, and then we don't have to fill out forms. Google is a pretty good company at being able to index the deep web through forms, but not every search engine does that. So if you can expose that database somewhere people can get to all the pages on your site just by clicking, not by submitting a form, then you are going to open yourself up to an even wider audience. If you can do that, that's what I recommend. But if you can't, then check out the paper from the VLDB conference, where the team talked about it in more detail.
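
The "expose your database with an HTML site map" suggestion can be sketched very simply: render the database rows as a flat page of plain links, so a crawler can reach every record without submitting a form. The records and URL scheme below are hypothetical:

```python
def html_sitemap(records):
    """Render database rows as a page of plain <a> links a crawler can follow."""
    items = [
        '<li><a href="/burgers/{slug}.html">{name}</a></li>'.format(**r)
        for r in records
    ]
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

# Hypothetical rows pulled from the site's database.
rows = [
    {"slug": "hamburger", "name": "Hamburger"},
    {"slug": "cheeseburger", "name": "Cheeseburger"},
]
page = html_sitemap(rows)
```

Every record now has a crawlable URL, so no search engine has to guess form inputs to discover it.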

Star Wars or Star Trek?

Barbeta from Buenos Aires asks “Star Wars or Star Trek?”

Star Wars! Sorry, Star Trek folks; admittedly, I can see both sides. I've always been a Star Wars fan, and I don't even know all the Star Trek lore, but there are good points on both sides.

Can I tell Google not to use the posting date in my snippet?

Here’s an interesting question from Brazil, Fabio Ricotta asks “In some queries I can see the date of the post/article in the description snippet (at Google search). Why? Can I tell Google not to use it? If yes, how?”

Right now I don't think there's a way to say "please don't do this." Our snippets team is always trying to show really helpful descriptions, or what we call snippets, for our search results. If you are on a forum, maybe we can show "oh, there have been 4 replies," or if you are on a blog, maybe there have been 30 comments on the post. So we are always trying to think about new ways to show helpful descriptions or helpful snippets, and one of those is to highlight the date on which your blog post or forum thread appeared, because if you know that something was recent, that might be really useful to you as a user. So we do reserve the right to show the snippet that we think is best for users. Sometimes we provide a way to turn something off; noodp, for example, is a meta tag that tells us not to use the Open Directory Project's descriptions. But in general, whether we show part of a page, or highlight the date a particular post went live, those sorts of decisions we do reserve the right to make, because we want to return the best results for users.
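
For reference, the noodp directive mentioned above is delivered through the standard robots meta tag. A small sketch that builds such a tag as a string (the helper function is my own, not part of any library):

```python
def robots_meta(directives):
    """Build a robots meta tag string from a list of directives, e.g. ["noodp"]."""
    return '<meta name="robots" content="{}">'.format(",".join(directives))

# The noodp directive tells engines not to use ODP directory descriptions.
tag = robots_meta(["noodp"])
```

This goes in the page's head; combining directives (e.g. `["noodp", "noarchive"]`) works the same way, comma-separated in the content attribute.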

Can rel=”canonical” index my hostname and not my IP address?

A smart question from Sweden, Anders has asked “Will the new canonical tag help with issues where you by accident (stupid editors linking to wrong addresses) have indexed sites by IP address rather than hostname?”

I'll have to double-check, but that's the sort of thing you'd like to be able to do: take that IP address and canonicalize it over to the hostname. Now that I'm thinking aloud, we might consider the IP address different from the hostname, so we'll have to confirm that. But I don't think it would hurt to go ahead and have the tag there. And ideally, that is the sort of thing where you don't want your IP address to show up; you want your hostname or domain name to show up instead. So I think that would be a nice thing to do. I'm not sure whether we support it for IP addresses yet, but I'll ask Yalcom, the guy who wrote and did the heavy lifting on this code, and see what happens.

What do I do after being hacked?

Question from Laura Thieme from Columbus, OH. “I have a client who was hacked. The SEO consultant said the things were cleaned up, but they weren’t cleaned up correctly. All 30,000 Viagra/Cialis-type pages and paid links have been removed, but no improvement in SERPs. We sent a reconsideration request. What do we do now?”

I would send another reconsideration request. I would also do a site: search, for example site:example.com viagra, and likewise for cialis, porn, free sex, any nasty spammy terms you can think of, just to make sure all those pages are gone. And I would look at the keywords in the Webmaster Tools console to see which keywords you are showing up for, and whether any of them look like spam or porn or anything like that. Do a fresh look. You might also invite someone on the webmaster help forum to take a look and say, "hey, is anything wrong with my site?", because sometimes people can spot things there. And make sure you have the current patched version of your software; if you are running WordPress, make sure you update your WordPress installation, because sometimes you clean a site up and it just gets hacked again. If you search the Google Webmaster blog for "hacked," there are two or three posts we've done, and you can read more about it there. And if you really think it's all completely cleaned up, do another reconsideration request and we'll hopefully get that back in.
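
The manual site: searches above can be complemented with a quick local scan of your own pages for leftover spam terms. A minimal sketch (the term list and page text are illustrative, not exhaustive):

```python
# A short, illustrative list of terms to sweep for after a hack cleanup.
SPAM_TERMS = ["viagra", "cialis", "porn", "free sex"]

def find_spam_terms(page_text, terms=SPAM_TERMS):
    """Return the spammy terms that still appear in a page, case-insensitively."""
    lowered = page_text.lower()
    return [t for t in terms if t in lowered]

clean = find_spam_terms("Welcome to our conference registration page.")
dirty = find_spam_terms("Buy cheap VIAGRA here!")
```

Running a check like this over every page in your export catches injected content that a spot check of a few URLs would miss.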

Is eating the same sandwich every day duplicate content?

Here’s a question from Canada. Quentin from Vancouver says “Hi Matt, I have the same sandwich for lunch every day. Will I be punished by Google for duplicate content?” No! “Can the canonical tag help me here?” Not really! It doesn’t work in meatspace yet, or sandwich space; it only works in web space. “I just can’t get enough Reuben sandwiches!”

Power to you! Although Reubens are a little bad for you; you might consider turkey, or, you know, a nice thin BLT. So tasty. Anyway, don't worry about that. The canonical tag is helpful for specifying canonicalization so that you can clean up the architecture of your site; don't worry about having the same sandwich for lunch.

Should I use underscores or hyphens in URLs?

A question from Ontario, Canada. Tripstar says “Underscores vs hyphens in URLs, does it make a difference? my-page vs. my_page?”

It does make a difference; I would go with dashes, or hyphens, if you can. If you have underscores and things are working fine for you, I wouldn't worry about changing your architecture. A while ago I said we were looking at treating underscores as separators, and the reason we typically never talk about future stuff is that it gives us the freedom to change our minds. In fact, the people working on that project ended up working on something slightly different for scoring in the URL that was higher impact and a much bigger win. We might still get around to it; thanks for the ping, and I'll try to ask some folks on our quality triage team, "hey, can we take a fresh look at this?" But for the time being, dashes, or hyphens, are treated as separators, and underscores are not. That might change in the future, but that's the way it stands right now.
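
Since hyphens are treated as word separators and (at the time of this answer) underscores are not, a new URL scheme can normalize slugs up front. A small sketch, using only the standard library:

```python
import re

def slugify(title):
    """Lowercase a title and join its words with hyphens, which are
    treated as word separators; underscores are split out as well."""
    words = re.findall(r"[A-Za-z0-9]+", title)  # underscores are not in this class
    return "-".join(w.lower() for w in words)

slug = slugify("My Page About Reuben Sandwiches")
```

Note the advice in the answer still applies: if an existing site already uses underscores and is doing fine, a restructure is not worth it; this is for URLs you are creating fresh.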

Does rel=”canonical” make it safe to use tracking parameters?

Here’s a perfect question from Nick in Chicago. “Does the new canonicalization tag make it safe to add tracking arguments to some of my internal links without fear that Google will split the quality signals between the two addresses?”

So I believe you can do this. I would try it out on just one directory, or a small set of URLs, at first to make sure it's completely safe. That said, if you can fix it upstream, like if you can do something with your cookies or analytics package where you can say, "oh, I'm getting to this point of my page, so I'll track that event," that's just a little bit better, because then there's no extra parameter at all: suppose someone copies and pastes a URL; they might copy and paste it differently, maybe that URL goes away, or the tracking code changes. So if you can make the URLs unified, that's still better. But I believe this sort of thing can work totally fine with the new canonicalization tag. Again, just start out cautiously; make sure it works for you and that there are no problems. But this is the sort of thing you can do: two conceptually identical pages, maybe one where I came in from the front page and one where I came in from the help pages, so you have a slightly different breadcrumb parameter or something like that. You can use the canonicalization tag to say these two really are the same page, the same page without that breadcrumb parameter.
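
Fixing it "upstream" can mean stripping the tracking argument before the URL is ever used as a canonical target. A hedged sketch with the standard library; the parameter names in `TRACKING_PARAMS` are made up for illustration:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical tracking keys this site appends to internal links.
TRACKING_PARAMS = {"src", "utm_source", "utm_medium"}

def canonical_url(url):
    """Drop known tracking parameters so every variant of a URL
    maps to one canonical address, with no fragment."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

# Two entry points, one canonical page.
a = canonical_url("https://example.com/help?src=frontpage")
b = canonical_url("https://example.com/help?src=helppages")
```

The resulting clean URL is what you would point the rel="canonical" tag at on both variants, so quality signals are not split between them.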
