Keywords in URLs and domains – are they still useful?
I think most would agree that a keyword-friendly domain can be a major help for SEO in various ways. Domains and URL paths are, however, somewhat different considerations.
The URL itself is less of a signal these days, so thinking too hard about where keywords appear in the URL is probably too finicky.
However, as with any question that affects URLs, you need to make sure your URLs serve their major purpose: acting as a user's route into your site. Short, meaningful URLs should help SEO.
Generally, I would read a URL from right to left – going from most specific information to least specific. So, www.youtube.com/watch?v=gRzMhlFZz9I is not great. www.youtube.com/entertainment/very-funny would tell a user (and a search engine) that you had something very funny, filed under entertainment, on a site named youtube.
A quick test for evaluating the search-engine impact of keywords in URLs is to compare results for a keyword against results for the same search where the keyword does not appear in the URL. But rather than concentrating on this stuff, if you do what is best for your users I am sure you will do fine.
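As a sketch of the "short, meaningful URLs" advice above, here is a minimal slug generator in Python (the function name and the trimming rules are our own illustration, not part of any standard):

```python
import re

def slugify(title, max_words=5):
    """Turn a page title into a short, readable URL slug."""
    # Lowercase, strip punctuation, then join the first few words with hyphens
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    return "-".join(words[:max_words])

# A descriptive path reads right to left: most specific part last
print("/entertainment/" + slugify("Very Funny Cat Video!"))
# -> /entertainment/very-funny-cat-video
```

Capping the slug at a few words keeps the URL readable for users, which is the point the section above makes: the keyword placement matters far less than the URL being a clear route into the site.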
Best CMS for adapted copywriting and search engine performance:
A content management system is software that keeps track of every piece of content on your Web site, much like your local public library keeps track of books and stores them. Content can be simple text, photos, music, video, documents, or just about anything you can think of. A major advantage of using a CMS is that it requires almost no technical skill or knowledge to manage. Since the CMS manages all your content, you don't have to.
Joomla: Joomla is currently a leading content management system that will help you build great content-rich websites with ease. Joomla is open-source software, which means you get it totally for free from www.joomla.com. It is one of the easiest packages to use and has won various awards for best-performing CMS.
Drupal: Drupal is another quality CMS that lets you easily publish, manage, and organize a variety of content, including blogs and content sites. You can use Drupal to build:
- Community web portals
- Discussion sites
- Corporate web sites
- Intranet applications
- Personal web sites or blogs
- Aficionado sites
- E-commerce applications
- Resource directories
- Social Networking sites
Umbraco: If you use .NET hosting you can go for Umbraco, a friendly CMS which again is easy to use and comes with a load of great features.
Matt Cutts discusses PR sculpting:
Matt Cutts talks about the best ways to stop Google from crawling your content, and how to remove content from the Google index once it has been crawled.
Sebastian explains this topic pretty well:
As for password protected contents, are you sure that you don’t index those based on 3rd party signals like ODP listings or strong inbound links?
You totally forgot to mention the neat X-Robots-Tag that allows outputting REP tags like “noindex” even for non-HTML resources like PDFs or videos in the HTTP header. That’s an invention Google can be very proud of. 🙂
@Ian M
Actually, Google experiments with Noindex: in robots.txt, but that’s “improvable”.
Currently Google interprets Noindex: in robots.txt as (Disallow: + Noindex:). I think that’s completely wrong, because:
1. It’s not compliant with the Robots Exclusion Standard.
2. It confuses Webmasters because “noindex” in robots.txt means something completely different than “noindex” in meta tags or HTTP headers.
3. Mixing crawler directives and indexer directives this way is a plain weak point that will produce misunderstandings, resulting in traffic losses for Webmasters and less compelling content available to searchers. All indexer directives (noindex, nofollow, noarchive, noodp, unavailable_after etc.) require crawling when put elsewhere. I have done Webmaster support for ages and I assure you that Webmasters will not get it. If nobody understands it and adopts it, it’s as useless as Yahoo’s robots-nocontent class name, which only 500 sites on the whole Web make use of.
4. The REP’s “noindex” tag has an implicit “follow” that Google ignores in robots.txt for technical reasons (it’s impossible to follow links from uncrawled pages). When I put a robots meta tag with a “noindex” value, then Google rightly follows my links, passes PageRank and anchor text to those, and just doesn’t list the URL on the SERPs. When I do the same in robots.txt Google behaves totally different, for no apparent reason. (Of course there’s a reason but I want to keep this statement simple.)
Having said all that, I appreciate it very much that Google works on robots.txt evolvements. Kudos to Google! However, please don’t assign semantics of crawler directives to established indexer directives, that doesn’t work out. I see the PageRank problem, and I think I know a better procedure to solve that. If you’re interested, please read my “RFC” linked above. 😉
@all
Do not use experimental robots.txt directives unless you really know what you are doing, and that includes monitoring Google’s experiment very closely. If you have the programming skills, it’s better to use X-Robots-Tags to steer indexing and deindexing of your resources at the site level. X-Robots-Tags work with HTML content as well as with all other content types.
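To illustrate the X-Robots-Tag advice above, here is a minimal sketch of how a server might attach the header to non-HTML resources such as PDFs, which cannot carry a robots meta tag. The function and file-type rule are our own illustration; the header name and "noindex, follow" values are the real REP directives discussed above:

```python
def response_headers(path):
    """Build HTTP response headers, adding X-Robots-Tag for non-HTML
    resources (PDFs here) that cannot carry a robots meta tag."""
    headers = {"Content-Type": "text/html"}
    if path.endswith(".pdf"):
        headers["Content-Type"] = "application/pdf"
        # "noindex, follow": keep the file out of the index while still
        # letting link signals flow, mirroring a robots meta tag
        headers["X-Robots-Tag"] = "noindex, follow"
    return headers

print(response_headers("/whitepaper.pdf")["X-Robots-Tag"])
# -> noindex, follow
```

Unlike a Noindex: line in robots.txt, the page is still crawled, so its links can still be followed – which is exactly the distinction Sebastian draws in point 4 above.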
Link Building rap – funny
I found this when I was going through YouTube for some SEO-related videos. I thought this one was funny, though not very informative. It's worth watching just for fun:
Here are the lyrics / transcript for that video:
Chuck raps about link building.
You create a new site and its content heavy,
With the right amount of pictures you believe it’s ready,
So you launch it trying to put money in the bank,
But when you search and try to find yourself, you can’t,
So you thank until your mind goes blank,
Got titles and headers but no page rank,
Sooner or later it will show if I wait,
In the meantime make sure my code validate,
And it do,
Hmm, now what I’m supposed to do,
Add meta information and alt tags too,
Still don’t get listing,
Something must be missing,
Brad and Chuck recommended doing link building,
So you start hunting down sites like a predator,
Doing back links on all your competitors,
Whoever linking to them need to link to me,
Is it free, do we swap, or do I pay a fee,
Well take it from us, before you take that step,
Some things about the site that you might want to check,
Did they use a link farm or some dirty tactics,
Could have a bad effect on your site that’s drastic,
Could’ve link baited, look at what they created,
Compare it to yours, is it even related,
Take the time, go inspect and see,
Take advantage of paid directories,
If you follow all the steps with a little bit of patience,
Get links from relevant sites that are favorites,
Update your content on the regular basis,
I’m confident you’ll make it to first page placement
Criteria for quoting an SEO price:
1. Competition: Competition is one of the most important factors when deciding on a price for SEO. If the keyword competition is lower, you can usually achieve rankings in a shorter period, which results in a lower price.
2. Keywords: Many sub-factors are involved when it comes to keywords in determining the price of an SEO service. Keywords can be global, regional, tightly targeted, country-specific, and more. If you can get a clear picture of the keywords that will be targeted, it will be easier to give a price.
3. Niche: You need to analyze the website thoroughly to understand the niche it targets; it's an important factor before deciding on a price for SEO.
4. Time: SEO is a time-consuming process; it cannot be done in a day, a week, or a month. Some sites take years to rank, and some can be ranked in two months. Before quoting a price, it's important to know how much time it's going to take to rank the website.
5. Website quality: Some clients will come to you with a low-quality website and want you to rank it. It's much easier to rank quality websites with unique, crawlable content than sites that don't have sufficient information pages. Make sure you set this as a criterion when you give a price quote.
Deciding on the right price is very important: you can scare away customers by quoting too high, yet you can't afford to burn your hands by quoting too low. Make sure you have sufficient experience in giving price quotes. I have been giving price quotes for more than 6 years, and it hardly takes me 10 minutes to decide the SEO price for a website, but it won't be the same for beginners. Research more and I am sure you will get more ideas.
Bizarre places to get backlinks for a quality website:
There is no limit to where a high-quality site can get backlinks. A high-quality site like ours gets hundreds of backlinks from good sources as well as some bizarre ones. Here is a list of some of the bizarre links we found while monitoring our blogs.
1. From PDF and DOC files, which are not actually HTML or web-based files. PDF (Portable Document Format) is a format used to create important documents, theses, projects, invoices, and much more. We get links from PDF files as well as DOC files on some .edu sites and on other sites that use our website as a reference for their project or report.
2. From PPT: Google, Yahoo, and MSN – all top 3 search engines – crawl PowerPoint slides effectively. Some .gov sites use our website and tools as references in their PPT slideshows, and these turn up as backlinks for our website. As you know, .edu and .gov backlinks are some of the best backlinks you can get for your website.
3. From other-language sites: We get links from Japanese blogs, European-language blogs, Urdu blogs, Arabic blogs, and blogs in lots of other languages. Though we are an English-only site, people visit our website from all over the world.
4. From personal college project pages: We get links from students at major universities who use our tools as a testing base for their projects, and who use our articles and other good resources for their college work. We get some good quality links from this source.
We get many more bizarre links, which I will share on our blog in the future.
Does Google rank a page or the whole website?
Google ranks pages, not sites – we know that. But the real question is whether Google ranks a page based on keyword relevance on that page alone, or on keyword relevance throughout the website. Based on my, admittedly very targeted, sites, I have to say Google is now sophisticated enough to analyze the whole site for relevance, not just the ranking page. Pages, yes – words are analyzed and targeted. But Google also seems to understand a site's "keywords" and rewards it with better SERPs when those are entered.
I look at it like this. I have a site about widgets, 1,000 pages primarily aimed at widgets. Each page is about a particular aspect of widgets, green ones, round ones, making widgets, etc. A couple of my pages on that site are about an apparently disconnected subject, e.g. plankton. The reason that a couple of plankton pages are there is because they are the source of the material which is used to make widgets.
I have loads of these "disconnected" pages, all written well and in the same style as the base widget pages. But they don't rank for plankton. My guess is that Google can't connect plankton and widgets, and therefore the plankton pages are considered less valuable.
An increase in traffic and rankings can trigger a manual review:
We have seen this across some major sites. If your site is doing very well and starts appearing for some rare competitive keywords, I am sure it will be subject to manual review by search engine specialists.
Search engines are very careful about the quality of their SERPs, and we know from some internal knowledge that Google uses people to manually review its search results. One main flag area: if a search engine detects a big boost in backlinks from high-PageRank websites, you can be sure it raises a red flag. A manual review is not always bad, but it depends on the reviewer. A reviewer sees things from Google's point of view, and they are pretty strict about the guidelines; even a site going a little out of bounds might raise questions in a reviewer's mind. We should be careful, especially about gaining a lot of backlinks in a short time, since Google is always watching.
Websites targeting the ignorant with offers of grant money:
Some scam websites that advertise with Google or rank in Google's organic results are scamming people over the Obama stimulus package. According to those websites, you can buy a home or a car, pay credit card bills, invest in a business, run a business, or do anything with that money. To find these scam sites, just search for "government grants", "grants", "stimulus share", "stimulus package", "stimulus", etc.
Con artists are creating and running scam websites like federalgovernmentgrantsolutions.com and OfficialStimulusPayments.com, luring ignorant people into spending on crap services. Scammers create bogus blogs too, and redirect traffic to sites that offer tips on bogus grant offers.
“A company in Las Vegas called The Grant Instructor has generated even more complaints – 450 so far. The BBB says the company, which also has an “F” grade, runs at least two dozen sites with names such as: American Grant Club, Get My Grant, Grant Dollars, Grants Are Easy, Grant Resource Center and Your American Grant.
Christopher Gaffer of Mankato, Minn. stumbled onto one of their sites called “The Grant Search.” Gaffer is on the board of a non-profit group in Mankato that helps provide affordable housing. Part of their funding comes from grants. Gaffer went online to look for new funding opportunities.
The initial cost was just $1.95 for seven days access to the Grant Search database. Gaffer paid but never got his access code. Seven days later, he found a charge for $49.50 on his credit card for “a recurring monthly membership.” Gaffer tried to contact the company but could not find a phone number or e-mail address. “It was a nightmare,” he says.”
http://www.msnbc.msn.com/id/29643680/
Yahoo now makes money from the Yahoo Search BOSS developer network:
BOSS is currently offered to developers free of charge. In the near future (likely late Q2 2009), we plan to implement a fee structure for API use above a set threshold.
BOSS will continue to offer free use of the API below a set daily threshold – up to 10,000 search queries per day depending on the type of API call. In addition, we plan to implement a service level agreement.
When we implement fees, we also plan to offer developers the ability to make non-time sensitive API requests at a significantly reduced price. This “off-peak” option can be used for research and analysis efforts to support building search experiences and would allow developers to request queries to be fulfilled by BOSS when capacity permits.
How Will Fees Work?
We currently plan to implement a fee system with the following structure:
· Fees will be determined based on the number and type of API requests made per day
· We are no longer restricting developers from monetizing their products using third-party platforms
· Fees will be determined using a unit system
· Units for Web, News, and Image BOSS API calls will be incurred based on the last result requested. For example, an API call requesting results 91-100 is the same cost as one requesting results 1-100
· Developers may request up to 1000 results in a single API call
· Units will cost $0.10 each
· Developers will receive 30 units per day free of charge
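The fee structure above can be sketched as a small calculator. Note one loud assumption: Yahoo's announcement only says units are charged on the last result requested (results 91-100 cost the same as 1-100), so the 100-results-per-unit granularity below is our own hypothetical choice, not a published rate:

```python
import math

UNIT_PRICE = 0.10   # dollars per unit (from the announcement)
FREE_UNITS = 30     # free units per day (from the announcement)

def call_units(last_result):
    """Units for one API call, charged on the last result requested.
    One unit per 100 results is an assumed granularity."""
    return math.ceil(last_result / 100)

def daily_cost(calls):
    """Dollars owed for a day's calls (list of last-result indices),
    after subtracting the free daily allowance."""
    used = sum(call_units(last) for last in calls)
    return max(0, used - FREE_UNITS) * UNIT_PRICE

# 40 calls each asking for results 1-100: 40 units, 30 free -> $1.00
print(daily_cost([100] * 40))
```

Under this model a call for results 91-100 and a call for results 1-100 both cost one unit, matching the example in the announcement.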