Search Engine Optimization

Properly use the HTTP protocol

Many sites do not correctly use the facilities available in the HTTP protocol. This often has no impact on users, but it can have a dramatic impact on the value a machine such as a search engine can derive from the site.

The first common problem is the use of old-fashioned “meta refresh” tags to send users from one page to another. While these work for human visitors, search engines generally will not follow them.

A better solution is to use the standard HTTP redirect response codes:

  • HTTP 301 “Moved Permanently” – Use this to direct visitors to a replacement URL for the content they were attempting to access. Search engines will remember the redirect and will visit the new URL from that point on.
  • HTTP 302 “Found” (temporary redirect) – Use this when a page has moved temporarily, or when you are redirecting based on some decision that you would not want the client to cache.

Using the correct HTTP redirection code is critical for search engines. If a page has moved permanently but visitors are directed to the new location with a 302 response, search engines will continue to send visitors to the old address.

Also, if your site needs to move to a new domain name, it is best to use a permanent 301 redirection of pages on the old domain to the new one, so the new domain comes up in search results instead of the old one.
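
As a minimal sketch of the idea (Python standard library only; the old and new paths are hypothetical), the snippet below issues a permanent 301 redirect instead of relying on a meta refresh tag:

from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of permanently moved pages to their replacement URLs.
MOVED_PERMANENTLY = {
    "/old-page.html": "/new-page.html",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in MOVED_PERMANENTLY:
            # A 301 tells search engines to remember the new URL and to
            # transfer the old page's value to it.
            self.send_response(301)
            self.send_header("Location", MOVED_PERMANENTLY[self.path])
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), RedirectHandler).serve_forever()

A crawler or browser requesting /old-page.html receives a Location header pointing at /new-page.html and treats the move as permanent; swapping 301 for 302 would mark it as temporary instead.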

For more information: http://www.e.govt.nz/resources/research/SEO.pdf

Search engine penalties

  • Google Sandbox effect – This is a filter applied to newer websites: Google wants established, quality-content websites to be listed first so that users receive quality results, and the filter also helps stop scam websites from becoming popular quickly. Many webmasters purchase large numbers of links quickly so that their website will rank in first place for highly competitive keywords from the day of launch. You can reduce the risk of your website landing in the sandbox by targeting less competitive keywords, giving Google little reason to apply the filter.

  • Expired domain penalty – An expired domain name is subject to a temporary penalty; Google may refuse to index its pages during that period due to concerns about content theft.

  • Duplicate content penalty – All search engines apply this penalty when content on your website is largely the same as content found elsewhere on your site or on other websites across the internet. The simple way to avoid this problem is not to rely on duplicate content to drive traffic to your site.

  • Google supplementary index – The importance of a page is measured by the number and quality of the links pointing at it, and by the degree to which Google trusts a site’s inbound links. Pages typically end up in the supplementary index because of duplicate content, lack of content, lack of PageRank, dynamic URLs, or orphan pages.

    Steps to follow in the creation of user-centric sites

    To create a user-centered web site you must think about the needs of your users throughout each step in the development of your site, including:

    • Planning your site
    • Collecting data from users
    • Developing prototypes
    • Writing content
    • Conducting usability testing with users

    Creating a user-centric web site

    The first step is to clearly define your organization’s and users’ needs, goals, and objectives. To get the project started, begin by asking yourself (and your Web development team) more detailed questions such as:

    • What are your agency’s primary business objectives, and how do they relate to the Web?
    • Who are the users of your Web site?
    • What are your users’ tasks and goals?
    • What information do your users need, and in what form do they need it?
    • What functions do your users want from the Web site?
    • How do users think your Web site should work?
    • What are your users’ experience levels with the Web site?
    • How can the design of your Web site facilitate users’ cognitive processes?
    • What hardware and software will the majority of your users use to access your site?

    For more info: http://usability.gov/basics/usercntrd.html

    How Do Text Links Help My Website Rank Higher in the Search Engines?

    Link popularity is the most important factor the search engines use to determine how well a website ranks for its keywords. Link popularity is a measure of the quantity and quality of the websites that link to yours. Getting text links from very popular websites will boost your link popularity and increase your site’s rankings.

    Additionally, over the past couple of years, the search engines have placed added value on links coming from relevant websites. It is essential to get links from sites in a related category. For instance, a real estate website should try to get links from other realtors, mortgage brokers, or any other kind of site that would naturally link to a real estate site.

    Textlinkbrokers’ main business is selling relevant links to help increase its customers’ link popularity.

    Guide the search engines to the useful content

    While search engines are relatively good at finding their own way around a site, occasionally they need further guidance. If there are any pages or areas of your site that you would prefer weren’t added to a search index, you should place a robots.txt file in the root of your site specifying them. See the Robots Exclusion Standard at http://www.robotstxt.org for more details. This mechanism should be honored by all search engines.

    Another, more recent, method is the Sitemap Protocol (see http://www.sitemaps.org), an XML format that is now supported and encouraged by most major search engines. Using this protocol, you can give search engines guidance on which pages of your site you would like to have indexed, how important those pages are in relation to each other, and how often you expect them to change. If your site is not currently capable of producing these XML sitemaps, it may be worth including them in a future redevelopment where practical.
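
    As a rough sketch, assuming hypothetical URLs, change frequencies, and priorities, a minimal Sitemap Protocol file can be generated with nothing more than the Python standard library:

    import xml.etree.ElementTree as ET

    # Hypothetical pages: (URL, change frequency, priority).
    PAGES = [
        ("https://www.example.govt.nz/", "weekly", "1.0"),
        ("https://www.example.govt.nz/services/", "monthly", "0.8"),
    ]

    SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

    def build_sitemap(pages):
        # The Sitemap Protocol is a simple <urlset> of <url> entries.
        urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
        for loc, changefreq, priority in pages:
            url = ET.SubElement(urlset, "url")
            ET.SubElement(url, "loc").text = loc
            ET.SubElement(url, "changefreq").text = changefreq
            ET.SubElement(url, "priority").text = priority
        return ET.ElementTree(urlset)

    if __name__ == "__main__":
        build_sitemap(PAGES).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

    The resulting sitemap.xml can then be referenced from robots.txt or submitted to the search engines directly.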

    Of course, classic page-based sitemaps and index pages on your site also benefit search engines, but be aware that most search engines only read the first 100 links they encounter in an HTML page, so a large sitemap or index should be broken up into multiple pages for maximum benefit. It may seem counterintuitive to restrict search engine access to certain pages if you are trying to increase your rankings, but remember that quality is more important than quantity to search engines: by only allowing them to index pages that would be useful to a searcher, you help keep the search engine indexes clean, so people find what they want faster.

    Common pages you should prevent search engines from indexing include:

    • Web-based applications such as a webmail service that would provide no value to a searcher
    • Content that is temporary in nature and would likely be gone by the time anyone tried to visit
    • Any search result pages from an internal search engine; search engines prefer not to index the results of other search engines
    • Any content that isn’t for public consumption; search engines often find pages that a normal user probably wouldn’t
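
    For example, a minimal robots.txt covering the kinds of pages listed above might look like the snippet below; the paths are hypothetical, and the script simply writes the file with the Python standard library:

    # Hypothetical robots.txt rules excluding a webmail application, temporary
    # content, and internal search results from all crawlers.
    ROBOTS_RULES = [
        "User-agent: *",
        "Disallow: /webmail/",
        "Disallow: /temp/",
        "Disallow: /search/",
    ]

    with open("robots.txt", "w") as robots_file:
        robots_file.write("\n".join(ROBOTS_RULES) + "\n")

    The file must be served from the root of the site (i.e. /robots.txt) for crawlers to find it.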

    For more information: e.govt.nz/resources/research/SEO.pdf

    Don’t abuse the search engines

    Many practices that were commonly used to rank well in the past now actually hurt your ranking. In general, any change you make to your site with the intention of making it appeal more to search engines, while having the effect of making it appeal less to a human user, is going to incur a penalty.

    Such practices include placing many general search terms in the title of a page even when they have nothing to do with what that particular page is about. This not only confuses users, but also earns demerit points from search engines, which note that you’re misrepresenting the actual content of that page. As mentioned earlier, your top-level heading should match your page title, so if you’re adding keywords to the title that you wouldn’t put in a large heading on the page, they probably shouldn’t be there.

    Another current common practice is to include the site’s name in the page’s title, such as About e-government – New Zealand E-government Programme, which is usually acceptable as it helps users see what site they’re on. The search engines will tolerate this, and it may even help your rankings if users search for a combination of your site’s name as well as keywords from page content.

    Always ensure that your title and meta tags correctly describe the content of the page they are on. Adding superfluous keywords to either will result in lower rankings and the possibility of being blacklisted for those terms.

    For more information: e.govt.nz/resources/research/SEO.pdf

    Link building strategies to increase PageRank

    Link building is very important for a website because it directly affects PageRank, one of the key features of Google’s ranking algorithm. Strategies to increase your PageRank include:

    • Submit your site to as many directories as possible; listings in local or industry-specific directories can also help you rank for niche keywords
    • Submit press releases
    • Be active on social networks, which can increase traffic to your website and raise PageRank to a certain level
    • Publish articles, which increases traffic and builds credibility for your business
    • Comment on blogs; some bloggers are glad to encourage participation by removing the “nofollow” tag
    • Publish classified ads
    • Provide free, quality content that people can use; this is a proven link building strategy that many experts use to earn one-way links

    Importance of tracking your website

    Many Web sites offer users information, goods, services, and entertainment. But many of these sites are hard to use, and ultimately don’t keep users.

    Government Web sites are being used more frequently and by more citizens than ever before:

    • The use of government Web sites to obtain information increased 50 percent from 2002 to 2003, according to a recent Pew Internet and American Life Report on e-government.
    • In fact, according to the Pew Report, one of the top online activities in 2004 was using government Web sites. In 2004, approximately 97 million people used government Web sites.

    Users struggle to find the information they need on Web sites:

    • Of these 97 million Americans, 46 percent said they encountered problems on government Web sites. These Americans say their top problem is not being able to find the right information, according to the Pew Internet and American Life Project.
    • Research by User Interface Engineering, Inc., shows that people cannot find the information they seek on Web sites about 60 percent of the time. This can lead to wasted time, reduced productivity, increased frustration, and loss of repeat visits and money.

    Test your website with visitors

    Doing regular usability testing with your website visitors is a best practice in managing your agency’s website. In a typical approach, users (one at a time, or two working together) use your website to perform tasks while one or more observers watch, listen, and take notes.

    Usability testing allows you to measure the quality of a user’s experience when they interact with your website. It’s one of the best ways to find out what is or isn’t working on your site.

    Millions of Web sites offer users information, goods, services, and entertainment. But many of these sites are difficult to use, don’t work properly, and ultimately don’t attract or keep users. By following a usability engineering process, users’ ability to find information and their satisfaction with Web sites improve significantly.
