How to structure a site? – Matt Cutts Video Transcript
OK, as you can see, this is the closest I can get to a world map. Did you know there are about 5,000 languages spoken across the globe? How many does Google support? Only about 100, so there's still a long way to go.
Alright, let's do some more questions. Todd writes in and says, "Matt, I have a question. One of my clients is about to acquire a domain name very related to their business, and it has a lot of links going to it. He basically wants to 301 redirect it to the final website after the acquisition. The question is: will Google ban us or impose a penalty for doing this 301 redirect?" In general, probably not; you should be OK, because you specify that it's closely related. Any time there is an actual merger of two businesses, where two domains very close to each other do a 301 redirect and merge together, it's not a problem. However, if you are a music site and you are suddenly getting links from debt consolidation and cheap online whatever, that could be a problem; but what you have planned to do is fine, and you should be OK.
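For reference, the kind of domain-merger 301 redirect Todd describes is usually a one-line server rule. A minimal Apache sketch, assuming the acquired domain runs Apache with mod_alias and `newsite.example.com` stands in for the final website:

```apache
# .htaccess on the acquired domain: permanently redirect every path
# to the same path on the destination site.
Redirect 301 / https://newsite.example.com/
```

Because the redirect is permanent (301) and path-preserving, each old URL points at its natural counterpart on the merged site rather than dumping every visitor on the home page.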
Barry writes in: "What's the right way to theme a site using directories? Do you put the main keyword in the directory or on the index page? If you are using directories, do you use a directory for each set of keywords?"
This is a good question. I think you are thinking too much about your keywords and not enough about your site. This is just me, but I prefer a tree-like architecture, so everything branches out evenly, and it's also good if you break things down by topic. So if you are selling clothes, you might have sweaters as one directory and shoes as another directory, something like that. If you do that, your keywords naturally end up in the directory names. As far as directories versus the actual HTML file name, it doesn't matter; Google doesn't get confused by either. So I think if you break your site down by topic, and make sure each topic is broken down by its keywords, then when users type those keywords they will find your page, and you are in pretty good shape.
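As an illustration, the topic-first, tree-like layout described above might look like this for the clothing example (all file and directory names are hypothetical):

```
example.com/
    sweaters/
        index.html
        wool-sweater.html
    shoes/
        index.html
        running-shoes.html
```

Note that topical keywords such as "sweaters" and "shoes" end up in the URLs as a side effect of organizing by topic, which is the point Matt is making.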
Alright, Joe writes in: "If an e-commerce site has too many parameters, say it has punctuation marks, dots, etc., and it's unindexable, is it within Google's guidelines to serve static HTML pages to Googlebot to index instead?" This is something I would be very careful about, because if you end up messing it up, you will be doing something called cloaking, which is showing different content to users than to Googlebot; you need to show the exact same content to both. So my advice goes back to the earlier question: make the parameters in your URLs indexable and unified, so that both users and Googlebot see the same URLs. And if you are going to do something like that, make sure that whatever HTML page you show to Googlebot is the page users see too; if users go to that page and stay on the same page, without being redirected or sent to another page, then you are fine. They need to see the exact same page that Googlebot saw. That's the main criterion, and you've got to be careful about it.
John writes in and says, "I would like to use A/B split testing on static HTML pages. Will Google understand my PHP redirect for what it is, or will Google penalize my site on the assumption of cloaking? If there is a problem, is there a better way to split test?" That's a good question. If you can, I would recommend split testing in an area that search engines aren't going to index, because when we go to a page, reload it, and see different content, it does look a bit strange. So if you can, use robots.txt or an .htaccess file or something so that Googlebot doesn't index it. That said, I wouldn't do a PHP redirect; I would configure the server to serve the two different pages in parallel. One thing to be careful about, and I touched on this in a previous session: don't do anything special for Googlebot. Just treat it like a regular user; that's the safest thing in terms of not being treated as cloaking.
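One way to keep split-test pages out of the index, as suggested above, is a robots.txt disallow rule; the `/ab-test/` path below is a hypothetical example of a directory holding test variants:

```
User-agent: *
Disallow: /ab-test/
```

This blocks all well-behaved crawlers from the test directory, which fits Matt's advice to run the experiment somewhere search engines won't index rather than redirecting Googlebot around it.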
And let's wrap up. Todd asks another question: "Hi Matt, here is the real question: Ginger or Mary Ann?" I am going to go with Mary Ann.
Why we prepared this video transcript
We know this video is more than a year old, but there are still people who have questions about their sites and want to hear from a search engine expert. There are also millions of non-English speakers who want to know what is in this video, and a transcript is something that can easily be translated into other languages. We also know there are people with hearing disabilities who browse our site; this is a friendly version for them, where they can read and understand what is in the video.
This transcript is copyright Search Engine Genie. Feel free to translate it, but make sure proper credit is given to Search Engine Genie.
Qualities of a good site – Matt Cutts Video Transcript
Hello again, let's deal with a few more questions. I hope this works; let's give it a shot. Raf writes in with some comments on Google Sitemaps. He asks, "Do updates in Sitemaps depend on the page views of the site?" I feel that's not the case; page views are not a factor in how things are updated in Sitemaps. There are different pieces of data in Sitemaps; imagine a file with different sets of data. They could all be updated at different times and at different frequencies, and typically they should be updated within days, or worst case within weeks, but as far as I know it doesn't depend on page views.
Let's deal with another one: "What are your basic ideas and recommendations for increasing a site's ranking and visibility in Google?" OK, this is a meaty topic, definitely a longer issue, so let's go ahead and dive into it. The number one mistake most people make in SEO is that they don't make their site crawlable. I want you to look at your site the way a search engine sees it: use a text browser, go back to 1994 and use Lynx or something like that. If you can get through your whole site in a text-only browser, you are going to be in pretty good shape, because most people don't even think about crawlability. You also want to think about putting a sitemap on your site, and you can use our Sitemaps tools in addition to that. Then you need content: content that is good, interesting, reasonable, and attractive, content that will make someone actually link to you. Once your site is crawlable and has that content, then you can go about promoting, marketing, and optimizing your website. The main thing I would advise is to think about the people who are relevant to your niche and make sure they are attracted. If you are connected to a doctor, since you run a medical-type website, make sure that doctor knows about the website; if he knows about your site, it might be appropriate for him to link to it.
You should also be thinking about a hook, something that holds your visitors. It could be really good content, newsletters, or tutorials. I was trying to set up all this video stuff, trying to make it look semi-professional, and there is a tutorial by a company called Photoflex or something; they explain things like key lights, throws, and so on, and by the way, they say you need to buy their equipment to do it. That's really, really smart. In fact, another photography site I went to had syndicated those tutorials onto their own website. That can be a great way to get links. You should also think about places like Reddit, Digg, and Slashdot, and social networking sites like MySpace, that sort of stuff. Fundamentally, you need something that sets you apart from the pack; once you have that, you are going to be in very good shape as far as promoting your site is concerned. But the biggest step is making sure your site is crawlable; after that, making sure you have good content; and finally, making sure you have a hook that makes people really love your site, return to it, and bookmark it.
Alright, let's do another one; this one is a good one. Laura McKenzie asks, "Does Google favor bold or strong tags?" In general, we probably favor bold a little bit more, but to be clear, the difference is so slight that I wouldn't really worry about it. Do whatever is best for your users; I don't think either one is going to give you more than a tiny boost in Google or anything like that. Like I said, it's relatively small, so I recommend you do whatever is best for users and whatever is best for your site, and not worry about it much after that.
I think that’s it,
Thank you,
Matt Cutts Discusses the Importance of alt Tags – Matt Cutts Video Transcript
But the general problem of detecting what an image is and being able to describe it is really, really hard, so you shouldn't count on a computer being able to do that; instead, you can help Google. Now let's see what the markup for an image might look like. A typical image source might be <img src="DSC00042.JPG">: you've got your image tag, and you describe what the source is. It's "DSC" because it came from a digital camera, blah blah blah, 42.JPG. That doesn't give us a lot of information, right? We can't tell that this is a cat with a ball of yarn; it's a number that gives us virtually zero information. If you go down a little bit, here is the sort of information we do want to show up: "Matt's cat, Emmy Cutts, with some yarn." That's not a lot of words, but it adequately describes the scene; it gives you a very clear picture of what's going on. It includes a word like yarn and a name like Emmy Cutts, which are completely relevant to that image, and it isn't stuffed with tons of words like cat, cat, cat, feline, lots of cats, cat breeding, cat fur, and all sorts of stuff. So you want a very simple description included with that image. How do you do that? If you write <img src="DSC00042.JPG" alt="Matt's cat, Emmy Cutts, with some yarn">, you have the image tag, the image source, and an alt attribute, which stands for alternative text. If somebody is using a screen reader, or they can't load the image for some reason, the browser can show them this alternative text, and it is very helpful for Google: now we can see what's going on, and people who are interested in accessibility also get a good description of the image. And you are not spamming; this is a total of seven words. You don't need 200 words of alt text, because seven is enough to describe the scene pretty well. Even 20 or 25 is getting a little bit out there; seven is perfectly fine, because you are just describing what's going on in the picture itself.
You can also look at attributes like title and things like that, but this is enough to help Google know what's going on in the image. You could go further and think about naming your image something like "cat-and-yarn.jpg", but we are looking for something lightweight and easy to do. Adding alt text is very easy, and you should pretty much do it on all of your images; it helps your accessibility, and it can help us (Google) understand what's going on in your image.
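The before-and-after markup described in this transcript can be written out as an HTML fragment; the filename and wording come from the example in the video:

```html
<!-- Tells Google almost nothing about the image: -->
<img src="DSC00042.JPG">

<!-- A short, descriptive alt text; about seven words is plenty: -->
<img src="DSC00042.JPG" alt="Matt's cat, Emmy Cutts, with some yarn">
```

The second form is what screen readers announce and what browsers display if the image fails to load, which is why it helps both accessibility and search engines.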
Why we prepared this Video transcript?
We know this video is more than a year old but still there are people who have questions about their site and want to listen from a Search Engine Expert. Also there are millions of Non-English people who want to know what’s there in this video so a transcript is something that can be easily translated to be read in other languages. We know there are people with hearing disability who browse our site this is a friendly version for them where they can read and understand what’s there in this video.
This transcript is copyright – Search Engine Genie.
Feel free to translate them but make sure proper credit is given to Search Engine Genie
Some SEO Myths – Matt Cutts Video Transcript
Alright, I am trying to upload the last clip to Google Video, so we will see how it looks. While I am waiting, I think I can do a few more questions and see if we can knock a few out. I am realizing that with this video camera I have, I can record about 8 minutes of video before I hit the 100-megabyte limit and have to use the client uploader, so I'll probably make these into chunks of 5 to 8 minutes each.
So Ryan writes in. He asks whether I can clear up some myths about having too many sites on the same server, having sites on IP addresses that look similar to each other, or having them include the same JavaScript from a different site. In general, if you are an average webmaster, this is something you shouldn't have to worry about. Now I have to tell a story about Tim Mayer and me on a penalty panel. Someone said, "Hey, you took all my sites out," and that both Google and Yahoo did, "and I didn't really have that many." So Tim asked the guy, "How many sites did you have?"
The guy looked a little sheepish for a minute and then said, "Well, I had about 2,000 sites." So there is a range, right? If you have four or five sites and they all have different themes or different content, you are not in a place where you really need to worry. But if you have 2,000 sites, ask yourself whether you have enough value-added content to support 2,000 sites; the answer is probably not. If you are an average guy, I wouldn't worry about being on the same IP address, and I definitely wouldn't worry about being on the same server; that is something everyone does. As for the last thing Ryan asked about, JavaScript: a lot of sites do this. Google AdSense is included via JavaScript, and this sort of thing is common on the web, so I wouldn't worry about it at all. Then again, if you have 5,000 sites and you are including JavaScript that does some sneaky redirect, you need to worry; but if it's something you do on a few sites and it's an entirely logical use of JavaScript, I wouldn't worry at all.
Alright, Aaron writes in with kind of an interesting question: "I am having a hard time understanding the problems we face when we launch in a new country. Typically we launch a new country with millions of new pages at the same time; additionally, thanks to our enthusiastic PR team, we get tons of backlinks and press coverage during every launch." They say the last time they did this, launching a site for Australia, they didn't do very well at all.
Aaron, this is a good question, primarily because the answer has changed somewhat since the last time we talked about it. Someone asked this question when we were at a conference in New York, and I said just go ahead and launch it; you don't have to worry, it may look a bit weird, but it will be just fine. But I think if you are launching a site with millions of web pages, you've got to be a little more cautious if you can. In general, with that many pages it's probably better to launch a little more softly: start with a few thousand pages, then add a few thousand more, and so on. Millions of pages is a lot of pages; Wikipedia is what, 5 or 10 million pages? So if you are launching that many pages, make sure you scrutinize them and make sure they are all good pages, or you might find yourself not doing as well as you hoped.
Alright, a quick question. Classic Nation writes in and asks, "What's the status of Google Images, and will we hear about its indexing technology in the future?"
Actually, there was a thread about this on WebmasterWorld. We just did an index update for Google Images, I think last weekend. I was talking to someone on the Google Images team, and they are always working hard. There may be new updates in the future where we bring in new images that the main index has, and that sort of thing; they are always working on making the Google Images index better.
Static vs. Dynamic URLs – Matt Cutts Video Transcript
Hi everyone, here we go again. I am learning something every time I do one of these; for example, it's probably smart to mention that today is Sunday, July 30th, 2006.
Alright, Gerby writes in: "Does Googlebot treat dynamic pages differently than static pages? My company writes Perl, and there are query strings in our URLs," and so on.
That’s a good question, my first opinion we do treat static and dynamic pages equally so let me explain that in a little bit more detail. Pagerank flows in dynamic URLs the same way they flow in static URLs so if you got nytimes linking to a dynamic URL you will get the pagerank benefit and will still flow the Pagerank benefit. There are other search engines in past who said ok we go one level deep from static URLs so we are going to crawl a dynamic URL but we are going to go one level in dynamic URL, so the short answer is pagerank still flows the same between a static and a dynamic URL, lets go into a more detailed answer. The example you gave has like 5 parameters and one of them is like a product ID 2725 and you definitely cant use too many parameters I would recommend 2 or 3 at the most if you opt for using them , not to go for too long numbers because we might confuse them with session IDs any extra parameters you can get rid of its always a good idea. And remember google is not the only search engine out there so say if you have the ability to do a little of Mod_Rewrite I am going to say make it look like a static URL and I am going to say this is a very good way to tackle a problem. But pagerank still flows but experiment if you see any URLs that has the same structure and same number of parameters as you will think of doing its probably better to cut short some number or parameters or shorten them in URLs, or try to use Mod rewrite. Alright Mark writes in this is an interesting question he has a friend who’s site was hacked he did not know about for couple of months because of they had taken it out or something like that. So he asks can google notify the webmaster of the site basically when its hacked within sitemaps and inform them maybe say that inappropriate pages were crawled. 
That’s a great question my guess is we don’t have the resources to have something like that right now in general if somebody is hacked if they have a small number of sites they monitor, they will get to know about it really quickly, the web host will alert them about it. So webmaster console team is really going to work on new things but my guess is this is really not right now in the priority list.
OK, James says, "Hey Matt, in the fullness of time I am going to use geo-targeting software, one that will show different messages to different people in different parts of the world, for example different pricing structures. Are we safe to use this type of geo-targeting software? Clearly we want to avoid any suspicion of cloaking." That's a very interesting question, so let's talk about it a little. Google's webmaster guidelines very clearly prohibit showing search engines different content than what you show users. Geo-targeting by itself is not cloaking under Google's guidelines, because what you are doing is taking an IP address and saying, "Hey, you are from Canada, we will show you this particular page," or "You are from Germany, we will show you that page." The thing that will get you in trouble is treating Googlebot as a special guest and doing something special for it; geo-targeting for Googlebot, as if it came from some country called Googlebotistan, is bad. Instead, just treat Googlebot as a regular user. If you are targeting by country and Googlebot is coming from the United States, show it what people in the United States would see. Google itself does geo-targeting, for example, and we don't think that's cloaking. Again, cloaking is showing different content to users and different content to search engines. In this case, just treat Googlebot like any other visitor with that particular IP address, and you should be totally fine.
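A sketch of the "treat Googlebot like any other visitor" rule: the page served depends only on the visitor's IP address, with no user-agent special-casing anywhere. The IP prefixes (documentation/test ranges) and page paths below are made-up placeholders for a real geolocation database:

```python
# Toy geo-targeting that is safe under the guidelines described above:
# routing is purely IP-based, so Googlebot gets the same page as any
# human visitor crawling from the same place.

GEO_PREFIXES = {
    "203.0.113.": "AU",   # placeholder prefix standing in for real geo data
    "198.51.100.": "DE",
}

PAGES = {
    "AU": "/au/pricing.html",
    "DE": "/de/pricing.html",
}
DEFAULT_PAGE = "/pricing.html"


def country_for_ip(ip):
    """Toy IP-to-country lookup by longest-known prefix match."""
    for prefix, country in GEO_PREFIXES.items():
        if ip.startswith(prefix):
            return country
    return None


def page_for_visitor(ip):
    # Note: the User-Agent header is never consulted. Googlebot crawling
    # from a given IP sees exactly what a human from that IP sees, which
    # is what keeps geo-targeting from turning into cloaking.
    return PAGES.get(country_for_ip(ip), DEFAULT_PAGE)
```

The design choice worth noticing is what is absent: the moment a branch like `if "Googlebot" in user_agent:` appears, the setup crosses into the special-guest treatment Matt warns against.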
Alright, it's time for another break.
Google update April 2008: the "Dewey" update, named by Matt Cutts of Google
Google’s web spam head Mr.Matt Cutts has asked for feedback on new changes being rolled out in Google. Google has done a major update to its index and it seems lots of sites are affected. I personally see none of our client sites affected in any of the ips mentioned . One DC to look out for is DC 64.233.167.104 , When I checked I dont see much difference does anyone figure out what’s happening.
Here is Matt’s post
“Hey all, I asked a few people to look into this and they weren’t seeing many large differences in rankings between these datacenters. The issue with discussing on this thread is that specific urls/queries aren’t allowed. If anyone wants to mention a search where they see large-scale differences, feel free to send feedback to Google in the usual way. I’m going to pick a random-but-pretty-unique keyword so that I can look up reports. Let’s use “dewey” as the word. So if you want to mention a search where you think the results are very different at one data center compared to other data centers, use the spam report form at http://www.google.com/contact/spamreport.html and make sure to include the word “dewey” in the “Additional details” text area. Or feel free to point out differences in other ways: do a blog post, leave specifics on the Google webmaster help group, or whatever way you want to point out specific searches that look different to you.
The usual rules of thumb apply: you probably won’t get a personal reply, but I’ll try to get someone to check out reports that get sent in. There shouldn’t be much difference between data centers, so I’m curious to find out what queries people seem to be seeing different results on. “
Dewey, what a weird name for an update. I hope people never forget the great updates like the Google Florida update, the Austin update, Jagger, and more.
Forum discussion and ramblings, as always, at webmasterworld.com/google/3615693-3-30.htm.
Google Webmaster Central full audio transcript, raw unedited version
Hello Everyone,
At last, we are done with the audio transcript of the Google Webmaster Central live chat, which was recorded last Friday when webmasters around the world had a live chat session with Google. We have posted it here; it runs to more than 11,000 words, the equivalent of over twenty 500-word articles. Google did a great job, and we were able to record the full audio (transcript here) as well as the full Q&A session and the full chat log. The audio quality came out very poor, since it was recorded with a mobile phone placed near the laptop through which we were monitoring and listening to the session; the laptop's speakers messed up the audio, but we were still able to produce the transcript, which you can read here. Please read the disclaimer and note before reading the transcript.
Feel free to link to the document, but please don't copy it.
Search Engine Genie SEO Blog Team,
Matt Cutts confirms penalty to Traffic Power SEO company
Matt Cutts of Google has unofficially confirmed that traffic-power.com was banned from Google for using unethical SEO tactics to rank their client sites. There has been a lawsuit against Aaron Wall of SEOBook and trafficpowersucks.com for using words against Traffic Power.
Now those complaints have been confirmed: Matt Cutts, lead of the web spam team, confirmed on his blog at http://www.mattcutts.com/blog/confirming-a-penalty/ that Traffic Power was indeed banned from search engines.
Some previous articles on Traffic Power ( here ) and here.
SEO Advice: Spell-check your web site – a typical joke from Matt Cutts
Matt Cutts seems to be in a happy mood today: he blogged about a site that offers a 100% money-back guarantee. The site that posted this offer has a banner with spelling errors, so how do you get business when your own banner has typos? Nice that Matt noticed this.
See the spelling error in this message.
Matt Cutts asking for feedback on various issues related to Google
Matt Cutts of Google has asked for feedback on various topics related to Google.
He asks for feedback on the following issues:
Feedback: Webspam in 2006?
Feedback: Search quality in 2006?
Feedback: Products/features in 2006?
Feedback: Webmaster services in 2006?
Feedback: Communication/Goodwill in 2006?
Feedback: What did I miss?
Give your feedback on Matt's blog:
mattcutts.com/blog/