Archive for July, 2008

Three Simple Steps to Double Your Site Traffic

Three simple steps that helped double my site traffic in a month.

1. First, I made the most of Technorati tags. I tagged every keyword in all my posts. Initially I did this by hand, but then I discovered a WordPress plugin called SimpleTags that made the work a whole lot easier. I found that by tagging my posts efficiently, they were getting a lot more attention than their untagged counterparts, and as an added advantage I was getting focused, quality traffic to the site!

2. I leveraged my existing site. I have been running my business site for a few years, and it was getting a modest level of traffic that was significant compared to my blog – so why not try to send some of that to my new blog! I placed a few FeedBurner title animator blocks on some of my most popular pages, and after a day or so I noticed a major increase in traffic for 5 minutes' worth of work on my part.

3. Finally, I made efficient use of trackback links to popular sites. If I commented on a post on another site, I would make sure that I set up the appropriate trackback for it. The results vary depending on the site and post you are linking to, but since I liked to comment and engage with the wider blogosphere anyway, it was free traffic!

For detailed info: http://www.problogger.net/archives/2005/12/30/three-simple-actions-that-doubled-my-website-traffic-in-30-days/
Monday, July 28th, 2008 Link Building No Comments

10 Ways Writing Articles Can Improve Your Business

Here are some SEO tips on how to write articles that can improve your business.

1. Submit them to e-zines and web sites for publishing. Put your resource box at the end of the article to get free advertising.

2. Combine your articles into a free e-book. You can place your business ad in the e-book and give it away to visitors, allowing them to do the same and multiply your advertising.

3. Create an article directory on your site. People will visit your web site to find the free information.

4. Submit your articles to print publications which pay for submissions. You can also make extra income getting paid as a freelance writer.

5. Combine a few articles together into a free report. You can give away the free report as a bonus for buying your main product or service.

6. Publish a book with all your articles and make additional money by selling the book from your web site.

7. Give people an instant article directory. Tell visitors they can instantly add a free article directory to their web site by linking to yours. All those links can add up to a large amount of traffic to your site.

8. Post your articles in related online communities. This gives you free advertising in newsgroups, forums, and e-mail discussion lists.

9. Allow people to incorporate your articles in their free e-books. Your article could end up in 20 to 30 e-books in no time. You don’t even have to promote the e-books.

10. Allow people to access your articles by autoresponder. Also include your full-page e-mail ad with the article.

Saturday, July 26th, 2008 Search Engine Optimization No Comments

Become an SEO Pro with 3 Simple Strategies

Google is all about linking. If you want to grab a top spot in Google’s search results, you must link like a pro. Linking is a lot easier today than it used to be. Experience has taught us a lot, and with the information here you won’t need to worry about endless trial and error. Let’s take a look at the linking strategies that Google gives you the most credit for…

Link Strategy 1: Harvest the Most Benefit from Anchor Text

You probably know that anchor text is the clickable wording that makes up a hyperlink. But what you may not yet fully grasp is how powerful anchor text is. Try a simple experiment…

Go to Google.com and search for “click here”. Did you get a link to Adobe Acrobat Reader in the number one spot? Why is that? Check the page. There is no mention of “click here” anywhere on it.

Why does it rank number 1 on Google for “click here”? It’s all down to anchor text. More specifically, it’s because of the countless pages that use “click here” as the anchor text of links to Adobe’s Acrobat Reader download page.

Did you notice how many competing pages there are for “click here” on Google? Almost two billion! Anchor text is that important. Here are a few rules to get the most from yours…

A. Use your three most important keywords for your anchor text. Specifically, your most important keyword 60% of the time, your second keyword 25%, and your third keyword 15%. Do that for every page you link to.

B. Use “long tail” keywords when suitable.

C. If your anchor text is part of a paragraph, like a signature block, make sure the surrounding text is optimized for the keyword you want, or close variations. And make sure that the text is varied. You need plenty of versions of the surrounding text block so Google doesn’t ding it as duplicate content.
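The 60/25/15 split from rule A can be sketched in a few lines of Python. The keyword list and helper name here are purely illustrative, not from the article:

```python
import random

# Hypothetical helper: pick anchor text for each new link using the
# 60/25/15 split suggested for your top three keywords.
def pick_anchor(keywords, rng=random):
    # keywords: [primary, secondary, tertiary]
    return rng.choices(keywords, weights=[60, 25, 15], k=1)[0]

rng = random.Random(42)  # seeded so the run is repeatable
picks = [pick_anchor(["chocolate chip cookies",
                      "cookie recipes",
                      "easy baking"], rng) for _ in range(1000)]
print(picks.count("chocolate chip cookies"))  # typically close to 600
```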

Link Strategy 2: Make Your Target URLs Laser Accurate

URL accuracy is extremely important. Be sure to use the same URL whenever you request a link. Though a URL beginning with “http://” or “www” may resolve to the same webpage, Google treats them as different destinations.

See for yourself: go to Google.com and enter “link:http://www.” followed by any domain name you want. This will show you the inbound links for that particular domain.

OK, now jot down the number of links. Try it again WITHOUT the “www.” and note the number of links. Then try a third time with the “www” but NOT the “http://”. Do you get different numbers? That’s because Google sees them as different link destinations.
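If you want to enforce one canonical form of your URL programmatically, here is a small Python sketch. The choice of “http” plus a “www.” host is an assumption for illustration; pick whichever form you use everywhere:

```python
from urllib.parse import urlsplit, urlunsplit

# Canonicalize a link URL so every link request uses the exact same
# form: lowercase host, forced "www." prefix, explicit path.
def canonical(url):
    parts = urlsplit(url if "//" in url else "http://" + url)
    host = parts.netloc.lower()
    if not host.startswith("www."):
        host = "www." + host
    path = parts.path or "/"
    return urlunsplit(("http", host, path, parts.query, ""))

print(canonical("example.com"))             # http://www.example.com/
print(canonical("http://EXAMPLE.com/page")) # http://www.example.com/page
```

Run every URL you hand out through one function like this and the “different destinations” problem disappears.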

Link Strategy 3: Go Beyond Reciprocal Linking

Reciprocal links aren’t enough anymore. Google is now discounting the importance of simple link swaps, so reciprocal links alone will not do the trick the way they used to. With a little time or money, or both, you can have the best links imaginable.
The two fastest and least expensive ways are submitting your site to directories or paying a link service, though NOT a reciprocal linking service. Let’s begin with directories…

Links from directories might not be all they used to be, but they still help your SEO efforts. There’s a page I like to check out that lists the top directories along with links to each site, their Google PR, cost and more: StrongestLinks.com

You can click on the column headings to sort by any column you want. This makes finding the top PR sites quick and easy. The site also seems to have some sort of paid membership available, but I use the free listing and it does all I need and more.

Important Note: There is also a risk that comes from link farms, which are sometimes very similar to certain link services. That risk is having too many inbound links coming from a single IP range (Internet Protocol address range). Google HATES this and will discount all these links, or worse.

As for reciprocal linking services, I suggest you avoid them. However, there is a good alternative called “3-way linking” that still gives you a set-it-and-forget-it option. Here’s a service I’ve had great results with…

3WayLinker.com does not link sites back to each other reciprocally. Instead it creates a series of one-way links that are counted purely as inbound links by Google. Better still, it helps eliminate duplicate content in your link text and makes sure all the inbound links come from a wide range of IP addresses.

Here’s how it works… Site “A” links to site “B”. Site “B” links to site “C”. And then site “C” links to site “A”. So each one is a true one-way link. This also gives the system more choice regarding which sites form a group. With reciprocal links, if both sites use the same hosting provider, there is a high chance that you will be linked within the same IP range. With three-way linking this problem can be eliminated.
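The A-to-B-to-C ring described above can be sketched in Python. The domains and IP addresses below are made up, and treating “same /24 prefix” as “same IP range” is just one simple approximation:

```python
# Toy sketch of three-way linking: each site links forward around a
# ring, so every link is a genuine one-way link.
def three_way_ring(sites):
    """sites: list of (domain, ip) tuples; returns one-way link pairs."""
    # Refuse to build a ring if two members share an IP /24 prefix,
    # since the whole point is links from different IP ranges.
    prefixes = [ip.rsplit(".", 1)[0] for _, ip in sites]
    if len(set(prefixes)) != len(sites):
        raise ValueError("two sites share an IP range")
    return [(sites[i][0], sites[(i + 1) % len(sites)][0])
            for i in range(len(sites))]

links = three_way_ring([("a.com", "10.0.1.5"),
                        ("b.com", "10.0.2.7"),
                        ("c.com", "10.0.3.9")])
print(links)  # [('a.com', 'b.com'), ('b.com', 'c.com'), ('c.com', 'a.com')]
```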

So to recap:
1. Get the most out of your anchor text;
2. Be very consistent with your link URL; and
3. Do more than reciprocal linking.
Now go and get that top spot you’ve been after!

Friday, July 25th, 2008 Search Engine Optimization No Comments

How Do Web Search Engines Work?

A search engine works in the following order:

  1. Web crawling
  2. Indexing
  3. Searching

Web search engines work by storing data about numerous web pages, which they retrieve from the WWW itself. These pages are retrieved by means of a Web crawler, an automated Web browser that follows every link it sees. Exclusions can be made by using robots.txt. The contents of each page are then analyzed to determine how it should be indexed. Information about web pages is stored in an index database for use in later queries. Some search engines, like Google, store all or a portion of the source page as well as data about the web pages, while others, such as AltaVista, store every word of every page they find. The cached page always holds the actual search text, since it is the version that was actually indexed, so it can be especially useful when the content of the current page has been updated and the search terms are no longer in it. This problem might be considered a mild form of linkrot, and Google’s handling of it increases usability by satisfying the user’s expectation that the search terms will be on the returned webpage. This satisfies the principle of least astonishment, since users generally expect the search terms to be on the returned pages. Increased search relevance makes these cached pages very helpful, beyond the fact that they may contain data that is no longer available elsewhere.

When a user enters a query using keywords, the search engine examines its index and provides a list of best-matching web pages according to its criteria, usually with a short summary containing the document’s title and sometimes parts of the text.
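The index-then-search stages can be illustrated with a toy inverted index in Python. The `pages` dict stands in for crawled content, and ranking pages by the count of matching query words is a deliberate simplification of what real engines do:

```python
from collections import defaultdict

pages = {  # stand-in for content a crawler would have fetched
    "/cookies": "chocolate chip cookie recipe with real chocolate",
    "/cakes":   "simple cake recipe",
}

# Indexing: build an inverted index mapping each word to the pages
# that contain it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.lower().split():
        index[word].add(url)

# Searching: rank pages by how many query words they contain.
def search(query):
    scores = defaultdict(int)
    for word in query.lower().split():
        for url in index.get(word, ()):
            scores[url] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(search("cookie recipe"))  # ['/cookies', '/cakes']
```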

Friday, July 25th, 2008 Link Building No Comments

Robots file

The robots.txt file is an ASCII text file that gives search engine robots specific instructions about content they are not permitted to index. These instructions determine how a search engine indexes your site’s pages. The standard address of the robots.txt file is www.domain.com/robots.txt. This is the first file a robot visits; it picks up instructions for indexing the site content and follows them. The file contains two text fields. Let’s study this example:

User-agent: *

Disallow:

The User-agent field specifies the robot name to which the access policy in the Disallow field applies. The Disallow field specifies URLs that the named robots may not access. An example:

User-agent: *

Disallow: /

Here “*” means all robots and “/” means all URLs, so this is read as: “No access for any search engine to any URL.” Since every URL path begins with “/”, a Disallow value of “/” on its own bans access to all URLs. If only partial access is to be blocked, just the banned URLs are listed in the Disallow field.
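You can check rules like these with Python’s standard-library robots.txt parser; the domain and paths below are placeholders:

```python
from urllib import robotparser

# Parse a robots.txt that blocks only the /private/ directory.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /private/"])

print(rp.can_fetch("*", "http://www.domain.com/index.html"))      # True
print(rp.can_fetch("*", "http://www.domain.com/private/x.html"))  # False
```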

For detailed information, see http://www.redalkemi.com/articles/robots-tutorial.php

Friday, July 25th, 2008 Search Engine Optimization No Comments

Link Building: Natural Linking Behavior

Effective link building is a complex affair and can be a tremendously time-consuming and difficult task. However, quality links are extremely powerful, and a steady flow of incoming links is necessary for any website hoping to compete in the search engine marketplace.

Deep Linking: Google will give more weight to a site with 500 links pointing to different pages across the site than to a site with 750 links all pointing to the homepage.

Varying Anchor Text: Many webmasters believe that the more keyword-rich anchor text links point to the target page, the better. As discussed, when sites acquire links naturally they have little control over which page an external site links to. The same goes for the anchor text of natural links, i.e. the webmaster has little control over the anchor text of inbound links.

It is highly likely that a website with compelling content will acquire links from a variety of these sources. It will have press releases announcing new content, articles reviewing the site, listings in trusted directories, and communities discussing the site in blogs and forums.
Thursday, July 24th, 2008 Link Building No Comments

Six Simple Ways to Dominate Google Rankings!

The reason Google is the most successful search engine in the world is that it provides the best search results: pages ranked by concrete value. That value is a combination of content and links, with links being the more important factor. Here are some tips that will help you take full advantage of Google’s love of linking…

1. Link deep and with significance

Google figured out that a link to a homepage is only good if that page has the information the visitor needs. If someone clicks a link for “chocolate chip cookie recipe” and ends up on a home page that doesn’t have it, Google considers it a wasted link. If the link leads to the page containing info on the “chocolate chip cookie recipe,” even five levels deep, the link has huge value to the visitor and to Google.

Want proof? Have you ever used Google’s AdWords pay-per-click service? They will not even accept PAID links to pages that are not relevant for their visitors, regardless of what you are willing to pay per click.

2. Utilize Absolute Links Internally

Absolute links are ones with a fixed, full URL. There’s a different kind, called “relative” links, that skip the domain part and stay “relative” to the file structure. Here’s the absolute link to the Google Ads page from Google’s homepage: “http://www.google.com/intl/en/ads/”

It might look like this as a relative link: “./intl/en/ads/”

Absolute links aid your SEO efforts and relative links don’t.
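Python’s urllib illustrates why relative links are fragile: the same relative href resolves to a different destination depending on which page it sits on, whereas an absolute link always points to one place:

```python
from urllib.parse import urljoin

# A relative link on the homepage resolves one way...
print(urljoin("http://www.google.com/", "./intl/en/ads/"))
# http://www.google.com/intl/en/ads/

# ...but the same relative link on a deeper page resolves elsewhere.
print(urljoin("http://www.google.com/a/b/", "./intl/en/ads/"))
# http://www.google.com/a/b/intl/en/ads/
```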

3. Employ Keywords in Anchor Text

Use related keywords in your link anchor text. Forget about the “Click here” you see on so many sites. Not only does it do nothing for your ranking, it also lowers the relevancy of your actual keywords, since Google believes that if a word is important enough, it will be used as part of a link to get the visitor where they want to go.

4. Pursue the 1% Solution

Make no more than 1% of your page text into links. And don’t overuse the same keyword text for links. If you have three mentions each of three different keywords, use each just once in a link.

Example: If “chocolate chip cookies” is your chief keyword you might use “chocolate chip cookies” as the anchor text for one link and “my favorite chocolate chip cookie recipe” for another link.

It’s also a good idea to use 10 links max per page, whether you have 1,000 words or more on that page.
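A rough way to audit a page against the 1%-of-text and 10-links guidelines is to count link words versus total words. This sketch uses Python’s stdlib HTML parser on a made-up snippet; a real audit would fetch and feed your actual page HTML:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts links, words inside links, and total words on a page."""
    def __init__(self):
        super().__init__()
        self.links = 0
        self.in_a = False
        self.link_words = 0
        self.total_words = 0
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += 1
            self.in_a = True
    def handle_endtag(self, tag):
        if tag == "a":
            self.in_a = False
    def handle_data(self, data):
        n = len(data.split())
        self.total_words += n
        if self.in_a:
            self.link_words += n

html = "<p>" + "word " * 99 + '<a href="/x">cookies</a></p>'
p = LinkCounter()
p.feed(html)
print(p.links, p.link_words, p.total_words)  # 1 1 100  -> exactly 1% links
```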

5. Add a Link Failsafe

This is simple, and almost nobody does it. Links get broken, sometimes because we moved a page and sometimes through no fault of our own. The solution is to make a custom 404 page that looks like any other page on your site and has a simple note like: “We’re sorry, we cannot find the page you are looking for. However, if you love cookies of all kinds, we think you’ll find exactly what you want by clicking on one of the following links…”

6. Get the Best Links Possible

This is very important but often overlooked, since it can be a difficult and time-consuming job. Finding the best possible inbound links is the most important thing you can do to take the number one spot on Google.

Here are three tips to minimize your time and effort while giving you results SEO experts charge an arm and a leg for.

A. Get listed in directories.
Submit your site to top directories like Jayde.com and DMOZ.org. If they link to your site, you will have great, relevant inbound links and instant credibility with Google.

Here are some great free directories starting with the best… dmoz.org, jayde.com, webworldindex.com, turnpike.net, and directoryvault.com. Yahoo is vital but charges $299 for commercial site inclusion.

B. Use “special commands” to do the legwork for you.
The best-linked sites can easily be found with a search command called “allinanchor:”. Go to Google and type “allinanchor:keyword goes here”. Now hit Enter and you’ll see the sites that have the highest relevancy for the keywords used in anchor text. Look for any competitors that outrank your site.

Now take their URL and use this command: “link:www.theirdomain.extension”. This will show you every site linking in, as well as internal pages linking back in.

In short, these two commands give you an inside look at exactly how the competition does it, along with the results they get. This is huge!

C. Use high-quality SEO software whenever possible.
If you can afford to spend one or two hundred dollars to save enormous amounts of time and get professional results, it’s well worth it. Like many SEO professionals whose livelihood depends on results, I have been using SEO software to get top search engine placement for years. The best tools help you identify great link partners, contact them, and make sure they don’t cheat you. I use SEO Elite and am still amazed by all it can do.

If possible, get a tool that does rank checking and reporting. Once you start, you’ll check rankings so often that an automated tool will save you time. I bought SEO Elite chiefly for rank checking, then discovered it was valuable as a linking tool as well. So whatever tool you use, get as much out of it as you can.

Thursday, July 24th, 2008 Google No Comments

Optimization Techniques for Successful Email Marketing

It’s a general trend that internet marketers will do anything they can to get the e-mail addresses of the people visiting their sites. Online, you have a small window of opportunity to convert a visitor into a client, and if they’re not prepared to buy whatever they’re browsing on your site, at least find out who they are so you can try a different sales approach on them another day. Just because they are not interested in what you’re selling today doesn’t mean you can’t find something else to tickle their fancy.

There are a lot of ways to do this such as:

- Present a free report with information they’ll find valuable
- Ask them to join your newsletter or subscribe to your website for updates
- Tempt them with entry into a free draw for a prize

Once somebody has signed up, you’ve got a valuable e-mail address that you can use to build your list of potential customers. Once you have that e-mail address, handle it carefully, because otherwise it won’t do much for you.

How can you stay out of the spam filter?

It’s an excellent idea to have people validate their address before you give away anything. Many will enter a false e-mail address just to get something for free. You can require them to confirm by receiving an e-mail from you and clicking a verification link. Remind them to check their junk folder; if they mark your message as “not spam,” their email software can automatically whitelist you so that you don’t land in the spam folder in future. By requiring verification, you also make sure that you are getting a real e-mail address.
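One common way to build such a verification link (not necessarily how any particular autoresponder package does it) is to sign the address with a server-side secret, so the token can be checked without a database lookup. The secret and URL below are placeholders, and a real implementation should also URL-encode the address:

```python
import hmac, hashlib

SECRET = b"change-me"  # placeholder: keep this private on your server

def verify_token(email):
    # HMAC of the address under the secret: unforgeable without SECRET.
    return hmac.new(SECRET, email.lower().encode(), hashlib.sha256).hexdigest()

def verification_link(email):
    return ("http://www.example.com/verify?email=%s&token=%s"
            % (email, verify_token(email)))

def is_valid(email, token):
    # Constant-time comparison avoids leaking token bytes via timing.
    return hmac.compare_digest(verify_token(email), token)

link = verification_link("reader@example.com")
print(is_valid("reader@example.com", verify_token("reader@example.com")))  # True
```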

Autoresponder software is a good investment for anybody marketing online. Buying a package like AutoResponsePlus or something similar will help you easily manage your mailing lists. The thing is, once you have a mailing list in place and a plan that includes autoresponders, you should work on your email content and deliverability.

Each future contact you make with a prospective client should do two things:

1) The e-mail needs to offer them something valuable, otherwise they’ll unsubscribe. Give information or tips that will have that person eager to open future emails from you.
2) You need a call-to-action line at the end that will entice them to do something after they close the email. This could be visiting your site via a handy link, signing up for another service, referring their friends to you, or even clicking a purchase link so they can buy what you’re selling.

Deliverability is certainly a key factor. You need to know that your e-mails arrive in their recipients’ inboxes, so here are a few things you can do to increase the probability of your target audience actually opening your e-mail. People are fed up with spam and are using all kinds of techniques to avoid wading through it to get to their important e-mails.

People use domain keys and SPF records so that they don’t get phishing e-mails from those posing as their bank or PayPal.

People use spam filters to help them build trust in their incoming messages. Once you make it past those filters, you’re set.

By getting yourself whitelisted, you can increase the probability of being delivered to an e-mail inbox instead of the spam folder. And by buying an autoresponder product that’s hosted on your own server, you can more easily get past spam filters: if you’ve done your bit to get whitelisted, you’ll be trusted by your recipient’s e-mail server, which increases the chances of your brilliant and persuasive copy being taken seriously.

Wednesday, July 23rd, 2008 Search Engine Optimization No Comments

Landing page optimization in Search Engine Optimization

Landing page optimization (LPO), also known as webpage optimization, is an Internet marketing method aimed at improving a visitor’s perception of a website. A landing page is the webpage that appears when a potential client clicks on an ad or a search engine result link. This webpage will typically show content that is a logical extension of the ad or link. Landing page optimization aims to provide page content and appearance that make the webpage more appealing to the target audience.

Landing Page Optimization Bases

There are three main types of Landing Page Optimization based on targeting:

  1. Rule-based optimization – The page content is customized based on information obtained about the visitor’s search criteria, geographic data about the source traffic, or other known generic parameters that can be used for explicit, non-research-based customer segmentation.
  2. Active targeting – The page content is adjusted by correlating known data about the visitor to predict future behavior, based on predictive analytics.
  3. Social targeting – The page content is created using publicly available data through a system based on tagging, referrals, reviews, ratings, etc.
Wednesday, July 23rd, 2008 Link Building No Comments

Contextual Internal Links – Even Out the Link Strength

It’s quite common for certain pages of a website to receive disproportionately more inbound links than other pages of the site. These pages tend to rank well compared to other pages of a site. Use Yahoo Site Explorer, Google Webmaster Tools, or your analytics program to find which pages have the most link love and which ones have the least. You can strengthen the weaker pages by pointing some contextual links from the strong pages to the weak ones. It’s usually best not to link to them from every page of the site, just from a few that have strong linkage.

Internal links don’t carry as much weight as links from other sites, but they can still make an impact.

Usability Benefits

When done right, contextual internal links can also help improve the usability of a site. Adding links that are relevant for the user to the content of your site provides another path to the destination you’re trying to lead them to. Multiple paths are a good thing in site navigation. Usability studies have shown that users are more likely to click on a link in the text of a page than on a navigation bar, because it feels more natural.

Tuesday, July 22nd, 2008 Link Building No Comments