Using the Bloglines OPML file

Rather than trying to find blogs and web feeds of interest by going to large, wholesale aggregators such as Technorati or Feedster, you can go to this Bloglines account and view specific feeds within topic areas that might be of interest to you. Clicking a feed name in the left-hand frame shows its items, newest first, in the right-hand frame.

You can also export the OPML file — in this instance a file that essentially names all of the feeds — and import that file into the feedreader of your choice.

On the bottom of the left hand frame is a link that reads “Export Subscriptions.” Simply right-click that link, choose “Save As” and put the resulting XML file in a place where you’ll find it again.

When you are in the aggregator of your choice, you can import that file, and the feeds, along with the folder structure, will be pulled into your desktop aggregator.
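Because OPML is plain XML, pulling the feed list out of the exported file takes only a few lines. Below is a minimal sketch using Python's standard library; the folder and feed names are invented, but the nested-outline structure with `xmlUrl` attributes is the typical layout of a Bloglines-style export.

```python
# Sketch: reading the feed list out of a Bloglines-style OPML export.
# The sample data here is made up for illustration.
import xml.etree.ElementTree as ET

opml = """<?xml version="1.0"?>
<opml version="1.0">
  <body>
    <outline title="Search">
      <outline title="SEO Blog" type="rss"
               xmlUrl="http://example.com/seo.xml"/>
      <outline title="Engine News" type="rss"
               xmlUrl="http://example.com/news.xml"/>
    </outline>
  </body>
</opml>"""

def list_feeds(opml_text):
    """Return (folder, title, feed URL) for every feed under a folder."""
    root = ET.fromstring(opml_text)
    feeds = []
    for folder in root.iter("outline"):
        for child in folder:
            if child.get("xmlUrl"):
                feeds.append((folder.get("title"), child.get("title"),
                              child.get("xmlUrl")))
    return feeds

for folder, title, url in list_feeds(opml):
    print(f"{folder} / {title}: {url}")
```

An aggregator's import feature does essentially this walk, then subscribes to each `xmlUrl` it finds.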

Why do this rather than just use that page to visit the links? Putting the feeds into your own aggregator lets you keep track of what you have and haven't read. It also gives you control of the subscriptions (add and delete feeds, change the folder structure so that it is more relevant to you) in a way that you can't do with the Bloglines account.

90% of the Rankings Equation Lies in These 4 Factors

I think that in the field of search marketing we sometimes make the concept of ranking more difficult than it really is. True, there are many ways to build a link, an endless number of keywords, and thousands of sources to drive traffic, along with analytics, usability, design, conversion testing, and so on. But when it comes to the very precise question of how to rank well for a particular keyword in the standard results at the engines, you are talking about a few big components.

#1 – Keyword Use & Content Relevance


While I don't believe in keyword density as a metric, there is no doubt that using your keywords intelligently to create a page that is relevant to the query and the searcher's intent is critical to ranking well. You can use the primary keyword phrase as follows:

- In the title tag once, or possibly twice if it makes sense
- Once in the H1 header tag of the page
- At least three times in the body copy of the page
- Once in bold
- Once in the alt attribute of an image
- Once in the URL
- Once in the meta description tag
- Not in link anchor text on the page itself

If you have run nonsense-word tests to see how the engines respond, you know you can certainly get extra value out of going wild and stuffing the keywords all over the page, but we have also seen that once you reach the level of saturation above, you are getting about 95% of the value available.
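The checklist above can be turned into a rough automated audit. The sketch below uses naive regex checks rather than a real HTML parser, so treat it as illustrative only; the sample page, URL, and keyword are all made up.

```python
# Rough sketch of auditing the keyword-use checklist against a page.
# Naive regex extraction, for illustration only.
import re

def keyword_checklist(html, url, keyword):
    kw = keyword.lower()
    h = html.lower()
    def section(tag):
        # Grab the inner text of the first <tag>...</tag> pair.
        m = re.search(rf"<{tag}(?:\s[^>]*)?>(.*?)</{tag}>", h, re.S)
        return m.group(1) if m else ""
    return {
        "title": kw in section("title"),
        "h1": kw in section("h1"),
        "body_3x": section("body").count(kw) >= 3,
        "bold": kw in section("b") or kw in section("strong"),
        "alt": kw in " ".join(re.findall(r'alt="([^"]*)"', h)),
        "url": kw.replace(" ", "-") in url.lower(),
        "meta_desc": kw in " ".join(
            re.findall(r'name="description" content="([^"]*)"', h)),
    }

page = """<html><head><title>Blue Widgets Guide</title>
<meta name="description" content="All about blue widgets."></head>
<body><h1>Blue Widgets</h1><p>Blue widgets are great.
<b>Blue widgets</b> last. Buy blue widgets today.
<img src="w.png" alt="blue widgets"></p></body></html>"""

result = keyword_checklist(page, "http://example.com/blue-widgets",
                           "blue widgets")
print(result)
```

Every check passes for the sample page; on a real page you would review any `False` entries against the list above.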

#2 – Raw Link Juice


Some call this PageRank or link weight; it refers to the raw quantity of global link popularity pointing to the page. You can raise it with both internal and external links. A page with an exceptional amount of global link popularity can rank remarkably well in Google and Yahoo! even when it is weaker on the other factors.

Link juice operates on the fundamental principle behind the early PageRank formula: pages on the web have some inherent level of significance, and the link structure of the web can help point out pages with greater and lesser value. Pages linked to by many thousands of other pages are very important, and when they link to other pages, those pages inherit some of that importance.
Applying this theory to your own pages, you can see how raw link juice has a large impact on how the search engines score their rankings. Growing global link popularity requires both external link building and an intelligent internal link structure.
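The principle described above can be sketched with the classic PageRank power iteration: each page's score is spread across the pages it links to, repeatedly, until the scores settle. This is a toy implementation of the published formula, not how any engine computes link juice today, and the four-page link graph is invented.

```python
# Toy PageRank power iteration over a tiny invented link graph.

def pagerank(links, damping=0.85, iters=50):
    """links: {page: [pages it links to]}. Returns {page: score}."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:                     # dangling page: spread evenly
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:                            # share rank across out-links
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

graph = {"home": ["about", "post"], "about": ["home"],
         "post": ["home"], "popular": ["post"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))   # "home": the most linked-to page wins
```

Note how internal structure alone moves scores around: "home" accumulates juice because every other page links back to it.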

#3 – Anchor Text Weight


As search engines evolved in the early 2000s, they picked up on the usage of anchor text. The anchor text of links is a critical part of the ranking equation, and in great quantity it can overshadow many other ranking factors: you can find web pages weaker in all of the other three factors I describe here that rank primarily because they have earned many thousands of links with the exact anchor text of the phrase they are targeting.

Note that anchor text comes from both internal and external links. When optimizing, think about how you link to your own pages: generic anchors and image links pass far less keyword relevance than descriptive, keyword-rich text.
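One way to audit your own anchor text is to extract every link's text and target with the standard library's HTMLParser. A small sketch; the sample markup is invented.

```python
# Sketch: collecting (anchor text, href) pairs from a page's HTML.
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.href = None
        self.text = []
        self.anchors = []            # (anchor text, target URL)

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.href = dict(attrs).get("href")
            self.text = []

    def handle_data(self, data):
        if self.in_link:
            self.text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self.in_link:
            self.anchors.append(("".join(self.text).strip(), self.href))
            self.in_link = False

html = ('<p>Read our <a href="/seo-guide">SEO guide</a> or just '
        '<a href="/contact">click here</a>.</p>')
p = AnchorCollector()
p.feed(html)
print(p.anchors)
```

Scanning the output for generic anchors like "click here" shows exactly where keyword relevance is being wasted.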

#4 – Domain Authority


This is the most complex of the factors described in this post. Basically, it refers to a variety of signals about a site that the search engines use to judge legitimacy. Does the domain have a history in the engine? Do many people search for and use the domain? Does the domain have high quality links pointing to it from other trustworthy sources?

To influence this positively, all you need to do is operate your site in a way consistent with the engines' guidelines. If you want to earn a lot of trust early in a domain's life, get sites that the engines already trust to link to you.

Site issues detected by the Google Toolbar

404 errors with default error pages

When a user tries to open your site with an invalid URL and your server returns a short, default error message, the Toolbar will suggest an alternate URL to the user. If this is a common problem on your site, you will see these URLs listed in the crawl errors section of your Webmaster Tools account.

If you want to set up a custom error page, ensure that it returns HTTP result code 404. The content of the 404 page can help your users recognize that they tried to reach a missing page and give suggestions on how to find the content they were looking for. When a site shows a custom error page, the Toolbar will no longer offer suggestions for that site. You can verify the behavior of the Toolbar by visiting an invalid URL on your site with the Google Toolbar installed.
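The key point, serving a helpful error page while still sending HTTP status 404, can be sketched with Python's built-in HTTP server. The page content and paths below are invented; a real site would configure this in its web server or framework instead.

```python
# Sketch: a custom "page not found" page that still returns status 404.
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGES = {"/": b"<h1>Home</h1>"}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = PAGES.get(self.path)
        if body is None:
            # Helpful content for the user, but the status stays 404.
            body = (b"<h1>Page not found</h1>"
                    b"<p>Try the <a href='/'>home page</a>.</p>")
            self.send_response(404)
        else:
            self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):    # silence per-request logging
        pass

server = HTTPServer(("localhost", 0), Handler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

status = None
try:
    urllib.request.urlopen(
        f"http://localhost:{server.server_address[1]}/missing")
except urllib.error.HTTPError as err:
    status = err.code        # 404, with the custom body attached
server.shutdown()
print(status)
```

A page that returned the friendly message with status 200 would be treated as a real page (a "soft 404") rather than an error.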

DNS errors

When a URL contains a non-existent domain name (like www.google.cmo), the Toolbar will suggest an alternate, similar-looking URL with a valid domain name.
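The Toolbar's actual matching logic is not public, but the general "did you mean" idea can be sketched with standard-library fuzzy matching against a list of known hostnames (the candidate list here is invented; a real system draws on far richer data):

```python
# Sketch: suggesting a similar-looking valid hostname for a typo.
import difflib

KNOWN_HOSTS = ["www.google.com", "www.yahoo.com", "www.msn.com"]

def suggest(host):
    """Return the closest known hostname, or None if nothing is close."""
    matches = difflib.get_close_matches(host, KNOWN_HOSTS, n=1, cutoff=0.8)
    return matches[0] if matches else None

print(suggest("www.google.cmo"))   # www.google.com
```

The transposed ".cmo" is still more than 80% similar to ".com", so the matcher recovers the intended domain.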

Connection failures

When your server is inaccessible, the Google Toolbar will automatically show a link to the cached version of your page. This feature is only available when Google is not explicitly prevented from caching your pages through a robots meta tag and crawling of the page is not blocked through the robots.txt file. If your server is frequently unreachable, you will probably want to fix that first; but it may also be a good idea to check the Google cache for your pages by looking at the search results for your site.

Anchor Text Optimization

If you have done some serious reading about website optimization, you have probably come across references to the phrase "anchor text optimization".

What is anchor text on a web page?

Anchor text is the visible, hyperlinked text on the page.

On a normally built site, anchor text is typically used to indicate the subject matter of the page that it links to. For instance, the text "SEO case studies" indicates to visitors that they can expect to see content about case studies relevant to SEO if they follow the link. This pattern of usage has been adopted by search engine algorithms to improve the relevance of the "target" or "landing page" URL for the keywords that appear within the anchor text.

Anchor Text Improves the Relevance of the Target Page

Please note that the keywords used in anchor text improve the relevance of the target page for those keywords. While the relevance of the page that contains the anchor text is also improved to some degree, the real gainer is the target page URL. Use this knowledge to improve the relevance of every page of your site by linking to it from your other pages with optimized anchor text containing important keywords relevant to the subject of each page.
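To make the direction of the benefit concrete, here is a toy index sketch in which each link's anchor words are credited to the target page rather than the page carrying the link. The pages and link are invented, and real engines weigh this far more subtly.

```python
# Toy inverted index: anchor words are credited to the *target* page.
from collections import defaultdict

pages = {
    "/case-studies": "Detailed write-ups of client projects.",
    "/about":        "Who we are.",
}
# (source page, anchor text, target page)
links = [("/about", "SEO case studies", "/case-studies")]

index = defaultdict(set)
for path, body in pages.items():
    for word in body.lower().replace(".", "").split():
        index[word].add(path)
for _src, anchor, target in links:
    for word in anchor.lower().split():
        index[word].add(target)      # credit the landing page

print(sorted(index["seo"]))   # ['/case-studies']
```

The page /case-studies never mentions "SEO" in its own copy, yet it becomes retrievable for that word purely through the anchor text pointing at it.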

Search Engine Algorithms Like Anchor Text

The inclusion of your main keywords in anchor text can make a big difference in the final ranking of your site's pages. All search engines that matter give considerable weight to the anchor text pointing at your pages. In fact, Google even has a special operator, "allinanchor:keyword", which matches against text found inside the anchor text of indexed pages.

Common SEO Mistakes

1. Keyword stuffing: Placing the same keyword over and over, or using a hundred different spellings or tenses of the same keyword in your keywords meta tag, is known as keyword stuffing. You must avoid it, as it may harm your search engine rankings.

2. Duplicate content: Make sure to have unique and informative content for users on all web pages, and keep it related to your business. Having the same content on different pages of your website should be avoided, as it may have an adverse effect on your search engine rankings.

3. Navigation and internal linking: Good navigation and internal linking also matter a lot. The navigation menu should be easily accessible to users. Make certain that the anchor text linking to pages within your own website is relevant to the target page.

4. Anchor text of inbound links: Having a lot of inbound links is not enough; the anchor text of those links is also very important. The anchor text should be targeted to your major keywords, and the web pages they point to should contain those keywords.

5. Cloaking: Cloaking is a method used by some webmasters to show different pages to the search engine spiders than the ones regular visitors see. You should always avoid any type of cloaking, as it is strictly prohibited by most major search engines.

6. Over-optimization: Over-optimization signals that your site has been designed for search engines and not for users. It may drop your search engine rankings, as search engines are now able to detect over-optimized sites, so you must avoid excessive optimization.

7. Impatience: Search engine optimization requires a lot of patience. You may need to wait a few months for results after optimizing your website. Have a little patience and you will get your desired results if you have correctly optimized your website using ethical SEO techniques.

Categories for comparing social bookmarking services

Common Functionality:

Tagging
Ability to add ad hoc metadata in the form of keywords to URLs

Categories
Ability to pre-determine a series of words or phrases under which URLs can be stored.

Descriptions

Ability to write a brief description of the content.

Syndication (RSS, Atom)
Various levels of output in the form of RSS or Atom feeds. These may include syndication of a user’s bookmarks, of a tag and/or category within a user’s set of bookmarks, or of a tag as used by all system users.
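As a sketch of what such syndication output looks like, here is a minimal RSS 2.0 feed of a user's bookmarks built with Python's standard library. The bookmark data and feed title are invented.

```python
# Sketch: emitting a user's bookmarks as a minimal RSS 2.0 feed.
import xml.etree.ElementTree as ET

bookmarks = [
    {"title": "OPML spec", "url": "http://example.com/opml", "tag": "feeds"},
    {"title": "RSS primer", "url": "http://example.com/rss", "tag": "feeds"},
]

def bookmarks_rss(items, feed_title):
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = feed_title
    for b in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = b["title"]
        ET.SubElement(item, "link").text = b["url"]
        ET.SubElement(item, "category").text = b["tag"]
    return ET.tostring(rss, encoding="unicode")

xml = bookmarks_rss(bookmarks, "alice's bookmarks: feeds")
print(xml.count("<item>"))   # one <item> per bookmark
```

Per-tag or per-user feeds are just this same generation run over a filtered subset of the bookmark list.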

Posting bookmarklets

These are “widgets” that you can add to your browser that let you easily bookmark URLs directly to the service. Typically, other users create these widgets based on XYZ programming code and share them freely with other users. Some bookmarklets provide additional functionality such as importing or exporting URLs from the service to your desktop.

Sharing
Unless “private” functionality is offered, all bookmark collections in users’ accounts are visible to everyone who uses or visits the site. Some social bookmarking tools allow you to designate a particular URL as “private” so it is only visible to you.

Browse by tag/category

You can browse all bookmarks that are tagged with a particular tag within a service. This lets you see the entire collection of URLs related to a topic. In some services, you can then examine a specific user’s other bookmarks.

Subscribe to tags
This feature allows you, whether in the social bookmarking application itself or via RSS, to follow additions to a specific tag.

Subscribe to users
This feature allows you, whether in the social bookmarking application itself or via RSS, to follow additions by a specific user.

Integration with other tools
Related to 3rd party tools, add-ons and APIs, this indicates whether or not it is possible to easily add the data or certain pieces of functionality of the social bookmarking tool to other applications. A common example of this is the ability to integrate a user’s del.icio.us account and daily links with the posts on their weblog.

Development of add-on, 3rd party tools
There are additional tools, functionality, and services built on top of the social bookmarking tool.

Import
The ability to easily add an existing list of bookmarks to the user’s account.

Export
The ability to easily export the existing list of bookmarks as either HTML or XML.

Publishing
The ability to make the lists available in another format on the web. This is commonly solved by enabling RSS feeds of user bookmarks.

Saves cache of webpage
The service saves a cached copy of the webpage so that it can be retrieved later if the page no longer exists.

User Experience

Interface
Entirely subjective, this discusses the ease with which the interface can be approached and navigated.

Documentation
An evaluation of the service’s help documents, FAQs, user forums and technical support. A review of tutorials or documentation developed by other users.

Technical (Under the Hood)

Open source

The source code is available for inspection, modification and use.

Closed source or proprietary
The source code is not available for inspection, modification and use.

Open API
A standard method for accessing the data and a mechanism that allows applications to be built on top of the social bookmarking tool.

Meets usability criteria (for alternative browsing methods etc)
Settle on a service (Bobby, etc.) and test the tools.

Reliability
Uptime, speed. The challenge here will be to find something that provides a measurement of this.

Integration with other tools
Duplicates the entry above. Should it be here or in the above list?

Multilingual interface options
Not sure what I meant by this. Maybe the ability to create alternate ways to post to the site (see the various del.icio.us bookmarklets as an option)

Text reading options
Again, no guess on what I meant here.

Community

  1. Number of active users
  2. Number of links/bookmarks in system
  3. Sense of the content/topics collected
  4. Developers
  5. Support (financial or otherwise)
  6. Likely to be around in five years

Organic Search Engine Optimization

Search engine optimization is a demanding and time-consuming task, and there is much more to it than link building and on-page optimization. To better understand the current approach to search engine optimization, it is vital to understand the part content plays in SEO. Google was the first search engine to deploy a technology called LSI (Latent Semantic Indexing) for generating search results. LSI requires Googlebot to take note of the keyword density of particular words on a web page, apart from caching the page. For example, a page with 500 words and the keyword “Lenovo laptops” appearing 50 times would have a keyword density of 10% for the keyword “Lenovo laptops”.
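The arithmetic in that example is easy to check: density is occurrences of the phrase per total words on the page. A sketch (counting each appearance of the multi-word phrase once against the total word count; real tools differ in how they count phrases):

```python
# Keyword density as in the example above: a 500-word page where the
# phrase appears 50 times gives 50/500 = 10%.

def keyword_density(text, phrase):
    """Occurrences of `phrase` per 100 words of `text`."""
    words = text.lower().split()
    occurrences = text.lower().count(phrase.lower())
    return 100.0 * occurrences / len(words)

# Synthetic page: 50 x 4 words containing the phrase, plus 300 filler words.
page = "lenovo laptops are popular " * 50 + "filler " * 300

density = keyword_density(page, "lenovo laptops")
print(round(density))   # 10
```

The invented page text simply reproduces the article's numbers: 50 occurrences over 500 words.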

Once Googlebot has indexed the page and checked the keyword density of the various phrases on it, Google can then return the webpage for search phrases such as “Lenovo laptops”. Though Google takes many other factors into account when generating search results, there is no denying that content plays a key role in generating organic results. Apart from this, Googlebot also indexes textual content such as page headers, titles and links. In essence, Google stores a complete account of every website and the keywords related to it.

A key step in any successful search engine optimization effort is creating content targeted at particular keywords. Search engine optimizers carry out a thorough analysis of popular phrases and isolate the keywords that need to be targeted. Once the keywords are chosen, the next step is optimizing the website for them. Contrary to accepted belief, the domain name does play a role in search engine optimization, and a domain name like lenovolaptops.com would rank well for the search query “Lenovo laptops”. What is also true is that the domain name alone does not decide the search ranking of a website.

After the domain name, the name of each individual web page also plays a fundamental role in LSI-based search engine optimization. For example, a page named lenovolaptops.html would stand a better chance of ranking well in a Google search for “Lenovo laptops”. But not all websites have the luxury of choosing a domain name before starting optimization, and the next best option is to name individual pages for the target keywords.

Apart from the name of a page, page titles and headers also play a significant role in search engine optimization. Google lays heavy emphasis on the keyword relevance of the whole web page, so placing target keywords in headings alone is no longer enough to carry out search engine optimization. It is also true that meta tags have lost much of their importance and have been reduced to providing basic information to a web crawler.

Content is not the only criteria
There is a popular myth that a website can do well purely on content, but this is not true. There are other parameters, such as web elements, link building and the age of a website, that determine its success with the search engines. A website with a high PageRank gains more from well-written content than a website with a low PageRank. Yet the reason content is given great weight is that websites with low PageRank are constantly outdoing websites with higher PageRank on the strength of their content. It is almost impossible to cover all the elements involved in a successful search engine optimization campaign; in a nutshell, however, content alone is not enough to improve the organic ranking of a website.

Why do we require Search Engine Optimization?

There are now billions of web pages on the web, all pursuing the same idea of enhancing their market share in the virtual world. This gives rise to rivalry among the various players offering the same services, which has necessitated the improvement and enhancement of the services offered on the web. On average, 50% of buyers search for products on search engines before purchasing, and 80% of online traffic is driven by search engines like Yahoo, Google and MSN. As a result, it becomes critical for an online business to show up on search engines for search terms associated with its products and services.

In order to understand the benefits of SEO, let's examine the definition below and the ancillary activities involved, which help prepare the foundation for a website.

Search Engine Optimization (SEO) is the complete exercise of customizing and retooling a website so that it achieves continuously high rankings on Search Engine Results Pages (SERPs).

Nice Ways to Get Tons of Back Links

In SEO we know how important it is to get back links to your site. Google, Yahoo, and all of the other major search engines love back links; in fact, that is a large part of how they rank pages: by the number and worth of the sites linking back to them. So the more links that point to your site from other sites, the better the position you will obtain in the search engines, and consequently the more traffic you will get.

There are multiple ways to do this. Here are some of the methods that people use.

Politically Incorrect: If you post something controversial on your site or your blog, many people who agree with you will link back, because they want to show other people what a good idea you have. And people who disagree with you will also link back, because they will put a link to your article followed by a counter-argument. Either way you should be able to acquire tons of back links.

Free Stuff: People love free stuff. If you give away something for free that others want, people are going to tell their friends. Some will tell their friends face to face, and others will place a link on their site. Either way you will get a ton of traffic, and will be more likely to get a good ranking in the search engines.

Current News: You can write about anything that is going on in the news. This works better for some sites than for others; for instance, celebrity sites do really well with this kind of marketing. But you can still do it with any site. Find a topic in the news related to your site and write about it. People will often link back to that article because they are interested in it, and so will their readers.

Lists: The final method is lists. People love to read them, and they love to link to them. Lists offer an easy-to-read, easy-to-follow guide with tips and suggestions for doing things. If you write a long list, maybe a list of a hundred or so things related to a topic, you are bound to get good results from people linking back to you. To create a list, think of something that people want to know, or something that they often ask.

Will Domain Age Affect Search Rankings?

The order in which pages appear in search results may be influenced by the number of pages that link to each page and by the rankings of those linking pages. When a site is linked to by a popular site, that link may carry more value than a link from a site that is less popular and trusted.

Ages of linking sites

A new patent application from Microsoft adds an extra twist by also ranking domains based upon the ages of the sites that link to them. The cost of purchasing a domain has decreased considerably in recent years, and some registrars have offered free domain registrations on a trial basis.

A spammer may take advantage of an offer like that to develop something known as a link farm, a spam method in which spammers “purchase or otherwise get a large number of domains and interlink the sites jointly to increase the site rankings by unnaturally increasing the number of contributing sites for some or all of the sites.”

The patent application suggests that newer sites have a “higher possibility of being spam and/or being a part of a web farm that attempts to falsely inflate domain rankings for domains in the web farm.”

By looking at the ages of the sites that link to newer domains when determining a domain's rank, domains which have links from older sites “may be ranked higher than spam sites and/or less relevant sites.”
