search engines

301 redirect: the search engine friendly redirect

I am describing here how to set up a 301 redirect on various servers. We have posted various helpful articles and tools on our site, and we keep getting requests for an article on making redirects search engine friendly. When you are moving one page to another, moving a site from http to https, or fighting canonical issues, a 301 redirect is the way to go.


301 redirect setup is not the same across servers; each server works in a different way. For IIS servers you need to make changes directly on the server, which usually requires a dedicated server or remote server access. For other servers it can be done just through FTP. Here is some sample code for the major servers.

If you are moving pages it is important to keep your rankings intact. Remember, search engines rank pages, not websites. You need to make sure no page has two live versions, because that creates duplicate content issues. Learning about search engine friendly redirects is the way to go.
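Whatever server you are on, the end result is the same: the old URL must answer with a "301 Moved Permanently" status and a Location header pointing at the new URL. Here is a minimal local sketch in Python (hypothetical handler and URLs, just for illustration) that serves a 301 and then inspects it the way a crawler would:

```python
# Sketch: serve a 301 locally and inspect the raw redirect response.
# The handler, port, and example.com target are illustrative only.
import http.server
import threading
import urllib.error
import urllib.request

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)  # permanent redirect status
        self.send_header("Location", "https://www.example.com/")
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

class NoFollow(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # don't follow the redirect; inspect it instead

opener = urllib.request.build_opener(NoFollow)
try:
    opener.open("http://127.0.0.1:%d/old-page" % server.server_port)
except urllib.error.HTTPError as err:
    status, location = err.code, err.headers["Location"]

print(status, location)  # 301 https://www.example.com/
server.shutdown()
```

You can run the same kind of check against your live site after setting up any of the redirects below, to confirm the status is a 301 and not a temporary 302.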

Apache Server

Redirecting in a PHP file is a simple way to do it on an Apache server if you want to redirect only one page to another. Here is a sample of how to redirect in PHP:
<?php
header( "HTTP/1.1 301 Moved Permanently" );
header( "Location: http://www.example.com" );
exit;
?>

Adding a redirect in .htaccess is another efficient way to redirect on an Apache server. .htaccess is a powerful file used for both server access control and SEO purposes. Most URL rewriting and URL redirects can be done in the .htaccess file.

htaccess redirect

There are many types of redirects you can do using .htaccess.

To redirect from one domain to another domain. There are various reasons you might want to do this: you want to move your site to a new domain name, or you want to keep the existing domain for another purpose but use the new domain as your main site. You can do this in the .htaccess file. .htaccess is a server file, so handle it with care; if you don't know how to handle it, leave it to an expert. You can create a .htaccess file with Notepad: just paste the code below into Notepad and save it as .htaccess.

Options +FollowSymLinks
RewriteEngine on
RewriteRule ^(.*)$ http://www.yournewdomain.com/$1 [R=301,L]

Just replace yournewdomain.com with the domain you want to redirect to.

Search engines like Google used to have problems figuring out the difference between the www and non-www versions of a site. John Mueller even suggested verifying both versions in Webmaster Tools to see the data, which shows Google still treats the www and non-www versions of a site separately. Webmaster Tools has an option to select which version we want to use, and we can select either of the two. But to avoid confusion and play it safe, you can just 301 redirect from one version to the other. This way Google will have no problem identifying the right version to use.

Just include the code below in your .htaccess file to redirect all requests for example.com to www.example.com:
Options +FollowSymlinks
RewriteEngine on
rewritecond %{http_host} ^example.com [nc]
rewriterule ^(.*)$ http://www.example.com/$1 [r=301,nc]

Just change example.com in the above code to your own site URL. That's it, you are done fixing the www and non-www versions.
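If you want to sanity-check the pattern logic before touching the live server, the rewrite above can be mimicked in a few lines of Python. This is only a rough sketch (the real matching is done by mod_rewrite, and the helper name here is made up); note it anchors the host and escapes the dot, which the live rule should arguably do too:

```python
import re

def www_redirect(host, path):
    # Mirrors the .htaccess pair:
    #   RewriteCond %{HTTP_HOST} ^example.com [NC]
    #   RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,NC]
    # (here with the dot escaped and the host fully anchored)
    if re.match(r"^example\.com$", host, re.IGNORECASE):
        return 301, "http://www.example.com" + path
    return None  # host is already the www version; no redirect needed

print(www_redirect("Example.com", "/about.html"))
# → (301, 'http://www.example.com/about.html')
print(www_redirect("www.example.com", "/about.html"))
# → None
```

The None case matters: without the RewriteCond guard, the rule would redirect the www version to itself in a loop.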

Remember, .htaccess works only on Apache servers. Just because PHP can run on IIS and other servers does not mean .htaccess will work there; .htaccess has nothing to do with PHP. It is an Apache server file.

Windows Server.

Redirect using ASP pages. Similar to PHP pages, you can redirect from one page to another on an IIS server using this simple ASP code. Just include it at the top of your ASP page.
<%@ Language=VBScript %>
<%
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.example.com/"
%>

ASP .NET Redirect
<script runat="server">
private void Page_Load(object sender, System.EventArgs e)
{
    Response.Status = "301 Moved Permanently";
    Response.AddHeader("Location", "http://www.example.com");
}
</script>

PERL

The code below will help you redirect on a Perl server. Copy it and replace example.com with your own domain.

CGI PERL Redirect
use CGI;
$q = CGI->new;
print $q->redirect( -uri => "http://www.example.com/", -status => 301 );

Simple 301 redirect on Ruby on Rails

def old_action
  headers["Status"] = "301 Moved Permanently"
  redirect_to "http://www.example.com/"
end

JSP server

Java Server Pages (JSP) is an important server-side technology used on Java-enabled servers. Many sites use Java as their backend platform, and they will need an SEO friendly redirect if they are moving one page to another. Copy the code below to your server and it should serve your SEO purpose well.

JSP (Java) Redirect
<%
response.setStatus( 301 );
response.setHeader( "Location", "http://www.example.com/" );
response.setHeader( "Connection", "close" );
%>

IIS server

Setting up a redirect in IIS is the most difficult part. When I first started doing SEO I always found it difficult to set up redirects in IIS; we used to buy a simple piece of software called ISAPI redirect. But things have changed now, and it is much easier to set up a 301 redirect on the server side. If you have access to a dedicated server through remote access, you can do the following edit. It is simple and you don't need any programming skills. Just log in to your remote server.

  1. Go to Internet Services Manager and right-click on the folder for which you want to set up a redirect.
  2. Select the radio button titled "A redirection to a URL".
  3. Enter the redirection page.
  4. Check "The exact URL entered above" and "A permanent redirection for this resource".
  5. Click "Apply".
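On IIS 7 and later the same permanent redirect can instead be declared in a web.config file via the HTTP Redirection module. This is only a sketch, assuming that module is installed on the server; www.example.com is a placeholder for your own destination:

```xml
<configuration>
  <system.webServer>
    <!-- Requires the HTTP Redirection role service/module -->
    <httpRedirect enabled="true"
                  destination="http://www.example.com/"
                  exactDestination="true"
                  httpResponseStatus="Permanent" />
  </system.webServer>
</configuration>
```

httpResponseStatus="Permanent" is what makes this a 301 rather than the default 302.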

Cold Fusion Server

There are a couple of methods you can use to set up a ColdFusion redirect.

The first method is to use cflocation; the second is cfheader.

Just add the code below on the page you want to redirect. Replace newpage.cfm with your own page name.

<cflocation url="newpage.cfm" statuscode="301" addtoken="false">

Using cfheader, just use the code below.
<cfheader statuscode="301" statustext="Moved Permanently">
<cfheader name="Location" value="http://www.example.com/newpage.cfm">

Redirect from http to https on an Apache server

With Google's new algorithm tweak that favors https sites, there is a rush to set up https and redirect http pages to https. I will write a separate article on this; for now, here is a simple example of how to redirect on an Apache server.

Apache recommends configuring the redirect directly in the virtual host configuration as the preferred method. But that requires direct access to the server config, which is not possible on shared hosting. So I recommend using a .htaccess redirect.

To redirect your whole site to https, use the code below. It will redirect whichever page is opened to the https version of it. For example, http://www.example.com will redirect to https://www.example.com.

RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^/?(.*) https://%{SERVER_NAME}/$1 [R=301,L]

This code can be used to redirect a specific page or directory.

RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^/?secure/(.*) https://%{SERVER_NAME}/secure/$1 [R=301,L]

If you have questions, email me at webpromotions@gmail.com. I am more than willing to help if I have enough time on my hands.

Thursday, September 11th, 2014 search engines No Comments

How can we create unique meta details for each product page?

Today's question is, "We have an e-commerce site with around a thousand product pages. How can we create unique meta details for those pages?"

Matt Cutts answers,

The question should not be "I have a thousand pages, how can I make them unique?" The question is "How many pages can I make that provide quality and value to the user?" If you can't manage to have a thousand pages with something unique, something different from just an affiliate feed, on each page of that site, then why should your thousand pages of barely reworded affiliate feed content rank compared to someone else's thousand pages of that same affiliate feed?

So instead of what you're asking, I think I would ask yourself: what makes your site unique? What is the value-add, what's the compelling proposition that makes people want to visit your e-commerce site? If you can have unique content, if you can come up with some unique angle, even if it's just something that makes it fun, something that makes it better or easier for people so that they like your site, then that's a really strong proposition. So I wouldn't set yourself the goal of having a thousand pages and then ask how to add a little bit of value to each one. It is sometimes better to think, "OK, how much value can I add?" and let that determine the number of pages on your site, rather than the other way around. That may sound a little bit harsh, but on the other hand, if everybody on the entire internet wanted a thousand-page site and wasn't able to come up with unique content or some other differentiating factor for their website, you'd be left with a lot of pages that are not that different, not that high quality, and not that unique. And that's not good for the web, it's not good for the ecosystem of the web, and it's not good for users, who are just trying to find good stuff and don't necessarily want a copy of the same content; they want something with unique value-add. So that might sound a little bit harsh, but give it some thought, because we do see a lot of people who say, "OK, I can make these mini sites; now how do I make them useful?" And sometimes, by the time you reach that point where you have a number of sites and you try to make them useful, you've already made, in my opinion, some errors in how you're thinking about creating your site. Just some food for thought.


SOPA Act and Its potential effects on Search Engines

SOPA is an abbreviation for the Stop Online Piracy Act, a bill introduced in the US House of Representatives on October 26th, 2011 in order to protect online copyrights. The bill would allow the US government to fight online trafficking in copyrighted intellectual property and counterfeit goods. It builds on the earlier Protect IP Act bill. It allows copyright holders to seek court orders against websites accused of copyright infringement. If the site is found guilty, the actions include barring online advertising networks and payment facilitators such as PayPal from doing business with the accused, barring search engines from listing such websites, and in general requiring internet service providers to completely block the accused website from being viewed.

The bill also makes unauthorized streaming of copyrighted content a felony, which means both the viewer and the website owner could face legal ramifications. The bill reads more like internet censorship: it gives immunity to internet services that voluntarily take action against websites involved in infringement, which makes it all the more profitable for copyright holders.

SOPA is actually quite different from the Protect IP Act. It is more like a companion bill, aimed at websites or web companies hosting unauthorized content from movies, songs or software. Many copyright holders have been fighting such websites for a long time, as they lose jobs or profits because of these copyright issues. Movie makers invest so much in their movies, and when websites release torrent files while the movie is still running in theaters, people prefer to download and watch it without paying, and the movie makers suffer a tremendous loss. It is mostly a piracy concern that triggers the tremors between the website industry and other industries.

Website owners, on the other hand, believe that SOPA could break the internet itself, which is quite alarming, and that there would be legal ramifications against almost every website invariably. They believe this act could be quite stringent and strangle the entire web industry.

SOPA is not just about internet piracy. The Act emphasizes that online infringement has become epidemic, and copyright owners feel extreme measures ought to be taken to combat such infringers. For the time being, they say that only "egregious" violators will be dealt with severely. But you know about politicians; that is just a false promise in order to safely pass the bill for the moment. Those in favor specifically suggest that only rogue sites that purposely steal content and engage in illegal distribution would be targeted. But the bill clearly states that every site that uses song clips or trailers, or creates GIFs using scenes from copyrighted movies, could potentially be forced to remove the content and shut down permanently. This tremendously cripples the internet and puts every site in danger of violating SOPA.

SOPA could make websites disappear:

China already has censorship issues with the Google search engine. People who are against the SOPA bill can argue in many ways that the censorship proposed in SOPA isn't much different.

To hinder foreign websites from stealing content that belongs to a particular country, or foreign sites from streaming movies that belong to a specific owner, SOPA requires internet service providers to block the websites responsible for copyright violations, effectively making them disappear. It implies that it is OK to endanger internet security and censor sites as long as it is in the name of IP enforcement.

It makes things more complicated by blurring the distinction between a site's host and the members who post content on it, eliminating the internet safe harbors for shared content established by earlier bills on the same subject. The site owner could also be dealt with severely and legally for not monitoring his or her site, or for not taking sufficient action against members who posted copyrighted content.

If SOPA starts blacklisting domains, thousands of websites associated with an offender will also get banned, even if none of those websites apart from the one in question offended or violated any laws. What happened to WikiLeaks could now systematically happen to many other websites in a streamlined, linked fashion as long as someone believes their IP rights have been violated.

This by itself is enough to threaten the entire internet system, which simply means that if this bill does get passed, every website invariably will be affected and the internet might no longer exist as we know it. It will make the task of search engines even more difficult and complicated, as they will have to identify such websites, mark them and ban them; and if such policies are included, search engines will be very stringent, banning every website for the slightest issue. Eventually search engines will become jobless, as there won't be any websites left.

SOPA also creates an online monopoly. It puts barricades in front of advertising networks and payment sites such as PayPal: it gives copyright owners every authority to tell these organizations to stop funding or providing services to accused websites. This means these organizations will have to cut off all services to an accused website from the moment it is accused, even before the court passes its verdict and the website is proven guilty. That financially chokes the website to death and it incurs heavy losses. It creates a monopoly by allowing copyright owners to dictate terms, creating overheads and strangling many other business firms. It also threatens popular search engines such as Google or Yahoo, which fall under the bill: for any violation, even one small copyright-infringing item on their search result pages, these search engine organizations also face the threat of legal ramifications. The solutions outlined in the SOPA Act are draconian, forcing URLs to be removed and domains to be banned, causing major trauma and strangling many business organizations that market using the internet, including popular search engines such as Google and Yahoo. All in the name of internet censorship; and the irony is, where will the internet be if this bill gets passed and there is nothing left to censor anymore?

Fundamentally this has somehow become about politics and jobs. SOPA might aid copyright owners for the time being, but the future looks bleak, as it poses a great threat to the internet itself and to many other business organizations as well. It might give great returns to copyright owners at present, but it would drastically change our world as it shrinks and strangles many other industries, cutting jobs across many sectors and leading to worldwide unemployment, something like the Great Depression period. It manipulates our entire world toward its own destruction as poverty and unemployment spread like a virus. As we know it, popular sharing sites such as YouTube wouldn't even exist, and YouTube stands as a great marketer for those same copyright owners. People often build their interest around copyrighted content online and then watch the movies in theaters as well, which brings in profits for the copyright owners. Once this gets affected, copyright owners might thrive for the time being, but in the long run even they will run out of business.

Many people and business organizations have already started to take a stand against the bill and demonstrate how drastic the cons of passing it could be, as the US government seeks to reform the bill before making its judgment.

To summarize, one can simply state that SOPA would destroy our internet, search engines included. It doesn't so much monitor and protect IP in the name of copyright as strangle every business organization, threatening many sectors and the flow of money, which makes our world economy look bleak. And without the internet, ease of access to information, and the First Amendment right of free speech, will have no meaning. There won't be any websites left to share information or market organizations after some point.


Tuesday, November 29th, 2011 search engines, SOPA Act No Comments

Operation Kill competitor – successful

Google has long claimed no one can hurt a competitor's site. We believed Google all this time, but we wanted to test it recently. We took a main tail keyword where our client ranks at position 12. Our client wanted us to push into the top 10 results. So we tried our secret weapon against our competitors: we added a bunch of low quality links from sites deemed low quality by Google. We added just 30 unique links from different sites to each competitor, and BOOM, within 2 weeks we saw the competitor sites drop 6 to 8 positions. One competitor dropped from position 6 to 14; another dropped from position 9 to 17. Well, it could be a coincidence, but there was no update during the time we tested, and those sites had been sitting there for about 2 years. So whether what Google claims is true we don't know, but we believe we can sabotage a competitor's rankings by pushing low quality links. When I say low quality links, the links we added are from good PageRank pages and are strong themselves, but they come from a negative area.

We wish Google would seriously look into this issue. If you think I am lying, we don't care, but it is 100% true: we tested and the results are positive. So why disclose a secret weapon? LOL, we don't want to use it as a weapon; we just tested it. We prefer to do SEO the search engine friendly way. We tested this to bring awareness to people.

Will we disclose the test results?
Of course NOT. We don't want to put our client's rankings and online business in jeopardy.

Monday, August 1st, 2011 search engines, SEO, Spam 2 Comments

New Sitelink-type links with meta descriptions on some searches

Google has recently made a lot of changes to their sitelinks. When sitelinks (the links that appear below a site when the domain name, or another major name that the domain represents, is searched) came into existence, people thought it was a weird thing. But now Yahoo and Bing both use them, copied from Google. It's an interesting way for people to find pages on a site: they get multiple pages to click right from the search results. I have personally used sitelinks for a long time; one out of every 2 times I have clicked a sitelink rather than the primary result. So yes, as an active internet user I use sitelinks, which I feel is a big achievement for Google. The trial has worked for them, and we usually see tracking URLs in sitelinks, so Google is monitoring the clicks on sitelinks too.

Today Google has improved a lot when it comes to sitelinks. They came up with a sitelink similar to the simple single-line text links we see these days. Take a look at the following example.

Now, with further improvement, they are able to show actual links in the descriptions, which was not the case before. It shows how much effort Google is putting into their organic search too. They know organic search is their primary area and they need to keep focusing on improving it. I am sure we will see a lot more improvement in the future.


Great going Google, keep it up.

Thursday, October 8th, 2009 link building, search engines 2 Comments

Top 10 Influential People In Search Engine And Search Engine Optimization Industry

Matt Cutts

Matt Cutts joined Google as a software engineer in January 2000 and is currently the head of Google's web spam team. Before joining Google, he worked on his Ph.D. in computer graphics at the University of North Carolina at Chapel Hill. He holds an M.S. from UNC-Chapel Hill and B.S. degrees in both mathematics and computer science from the University of Kentucky. He wrote the first version of SafeSearch, Google's family filter, and has worked on search quality and web spam at Google for the last several years.

Danny Sullivan

Danny Sullivan is the editor-in-chief of Search Engine Land, a blog that covers news and information about search engines and search marketing. He graduated from the University of California, Irvine, and spent a year in England working for the BBC.

Sullivan started Search Engine Watch in June 1997. Search Engine Watch was a website with guidelines on how to get good search engine results. Matt Cutts of Google considered Search Engine Watch "must reading," and Tim Mayer of Yahoo! called it the "most authoritative source on search". Search Engine Land is a news website covering search engine marketing and search engine optimization. It was founded in 2006 by Sullivan after he left Search Engine Watch.

Brett Tabke

Brett Tabke is an American programmer and SEO professional who has worked in the computer industry for almost three decades. Tabke is at present the CEO and founder of WebmasterWorld Inc. He is also the founder of PHD Software Systems, a specialty software manufacturer that produced a line of software for Commodore computers in the '80s and '90s.

Tabke coined several SEO staples such as "Link Farm", "SEO Themes", and the classic "SERP" (Search Engine Results Page). He was also a founding board member of SEMPO and wrote a chapter for the best-selling "Google Hacks" book. He is the first identified person to have fully decoded the Excite, and most of the AltaVista, search engine algorithms. Tabke's document "26 Steps to 15k a Day" is one of the most widely read in SEO history.

Shawn Hogan

Shawn Hogan is currently the President and CEO of Digital Point Solutions. He is also a very good programmer, analyst, technical writer and artist. However, around these parts he's perhaps better known as digitalpoint, webmaster of the Digital Point forum. In the world of eCommerce, Shawn is known for numerous things, not the least of which is his very popular set of SEO tools. These highly regarded tools are available free of charge and belong in the arsenal of every serious webmaster, including forum owners and administrators.

Aaron Matthew Wall

Aaron Matthew Wall is a California-based blogger and search engine optimization expert who writes the popular blog SEO Book. He is a frequent speaker at the Search Engine Strategies and PubCon conferences.

Rand Fishkin

Rand Fishkin is the CEO and co-founder of SEOmoz, a leader in the field of search engine optimization tools, resources and community. In 2009, he was named among the 30 Best Young Tech Entrepreneurs Under 30 by BusinessWeek, and he has been written about in The Seattle Times, Newsweek and the NY Times, among others. Rand has keynoted conferences on search from Sydney to Reykjavik, Montreal to Munich, and spoken at dozens of shows around the world. He's predominantly passionate about the SEOmoz blog, read by more than 40,000 search professionals each day.

Michael Gray

Michael Gray is another man "in the know" when it comes to search engine optimization. He blogs about SEO at Graywolf's SEO Blog. He is an SEO, although he works in a lot of different areas of internet marketing. He has traditional SEO projects and also manages his own PPC campaigns for some affiliate products and a few e-books which he sells.

Loren Baker

Loren Baker, a graduate in Mass Communications (marketing and advertising) from Towson University, did good justice to his degree by entering search engine marketing. His interest in SEO and the web made him Director of Search Engine Marketing at WebAdvantage.net, where he managed the search engine marketing and SEO team. With that learning and experience, in July 2003 he launched his own Search Engine Journal, where he is now the editor.

Darrin J.Ward

Darrin J. Ward leads the Darrin Ward team, a group of SEO and SEM professionals offering top-tier professional SEO and search marketing services. Darrin has been heavily involved in the SEO industry since day one and has founded a number of successful SEO and search related websites and companies.

Jeremy Zawodny

Jeremy Zawodny is currently a Technology Evangelist at Yahoo; formerly he was part of Yahoo's platform engineering group. He has been at Yahoo for years, helping to put MySQL and other open source technologies to use in amusing, interesting, and often very big ways. In 2000, he started writing for Linux Magazine, and he continues to do so today as a columnist and contributing editor.

Friday, September 4th, 2009 search engines, SEO 9 Comments

SEOBABA.com Jokers PR-inside cannot be so ignorant

The stupid Seobaba.com guys are sending out a press release in our company's name. They don't even know to edit the name of the company. Check here: pr-inside.com/seo-sem-ppc-company-india-usa-uk-canada-r1430835.htm . Why are some of these Indian SEO companies becoming a joke?

Just one request to these guys: grow up. Copy-paste is not the way to develop your business. The most stupid thing they (SEObaba.com) can ever do is copy content and send it out as a press release. The worst joke is that pr-inside.com doesn't even check these. They are just jokers who need to learn the way the internet works.

SEOBABA joke

Tuesday, August 11th, 2009 search engines No Comments

Google gets all the bashing but why?

From what I have seen in forums and blogs, Google gets so much bashing for things they do to defend their algorithm. Why do people do that? Don't they know they wouldn't be doing SEO for their sites to rank in Google if Google never existed?

I have been watching Google ever since I started my online business. I have seen major Google updates over a period of more than 7 years; almost all of them were aimed at protecting the algorithm and getting rid of spam and sites that entertain aggressive search engine ranking tactics. Today Google has grown into a high quality search engine with good results. If they had not targeted the aggressive search engine optimization crowd, they would not be what they are today.

The hottest topic in today's SEO world is Google's ability to detect and penalize paid links. Whether you buy or sell, if you get caught by the Google police you are gone. Once, in a SEOmoz post, Matt Cutts replied to Rebecca's post, talking about natural links being like very strong tires, and paid or other artificial links being like weak tubes and tires that can burst any time. It's actually true, and from what I have seen, every site that got hit over links had some sort of problem with artificial links.

Personal experience

Our own site had some problems with Google rankings when we created the search engine promotion widget and got lots of backlinks without knowing we were abusing it. Then we were hit with a ranking filter which prevented our site from being in the top 10. Did we whine? Well no; personally we were not aware that widget links can hurt a site. We were not abusing the system in any way with the widgets: we spent money on our widgets, and the only way we got back our investment was through links. We do that for all our tools and Google never complained, but when we redirected the links from the widgets to our homepage, the Google algorithm got angry with our site and reduced our rankings.

What did we do?

We never whined. We made all the widget links optionally no-follow, cleaned up some links to the homepage, removed the link to the homepage and pointed it to the widget directory page instead, checked for any other potential problems with our website, and submitted a reconsideration request, and in 1 month we were back in the rankings.

So was Google wrong with our website?

Of course not. Even though we thought widget links, when not abused, would not affect rankings, we still shouldn't have linked to the homepage with keywords. It's our mistake and Google has every right to make us regret this mistake in their own way. But Google was nice, in fact very nice: after rectifying our mistake and explaining it to them, we got our rankings back. So Google definitely wants us back in their rankings. Over 4000 people use our SEO tools (http://www.searchenginegenie.com/seo-tools.htm) and of those, almost 2000 come from search engines. Google knows that, and they know our tools get a lot of traffic from them, and they are happy to send people because people like the tools.

We don’t come under link buying / selling category

We never bought a single link to our site; almost all of them are links to our tools, widgets and some custom built links through articles, directories, blogging, etc. We don't buy links, but we were still hit by a link buying/selling detection algorithm. Was Google wrong in doing this? Of course not, because abusing a widget is the same as buying links. Those links are not editorially given links; people linked to us in exchange for our widget. They didn't link to our homepage because they liked our site. I understand, and when everyone in our company understands Google's position, we are all good with anything Google decides. But not everyone takes it that way; I see so much Google bashing out there whenever Google does something to protect itself and its algorithm.

Being an SEO is nothing to be proud of.

Some people think SEO is something great and that they are the best in the world. I'll tell you, from Google's point of view most SEOs are very close to spammers. Not everyone, but most, I said, including places like SEOmoz, which is popular among SEOs and discusses so much link buying and selling. Even Rand Fishkin is an active supporter of Text-Link-Ads, and he also supports buying and selling links for ranking. If this industry supports so much text link buying and selling for ranking purposes, and Google tries to defend itself, is Google wrong? For most SEOs, yes, Google is wrong. I would call that **** ****. Without Google you would have never existed; who are you to give commands to Google? Google's massive improvement in transparency with webmasters has helped webmasters a lot. But still webmasters and SEOs want more and more. They don't want Google to penalize link buying, link selling, and other aggressive and abusive link building tactics. I would say: let the SEOs try to run a search engine and they will learn how difficult it is. Even the so-called Google supporters abuse the search engines when they lose rankings. If you lost your rankings, see the mistake you made. Rectify your mistakes, fix them, and ask Google to reconsider rather than whining that Google is useless.

Confession from an SEO.

I have been in this industry for more than 7 years. Am I proud to be an SEO? No, never; this industry is hated by so many people, including the search quality engineers themselves. I am passionate about search engines. I like them, I like the miracle algorithms that work behind them, and I like all the PhDs. I personally wanted to become a scientist, which never happened. I want to be friends with search engineers, not for SEO benefit, but to admire and gain knowledge from the wonderful work they do. I sometimes wonder why I came into this SEO industry. The truth is I came into SEO from my programming background only for the money involved. This industry has far more money in it than programming and web design. People will pour in money if they get good business from the search engines. I have seen that practically in some PPC campaigns our company handles. Some big clients spend around $100,000 a month on PPC. Though it’s not the same scale in SEO, the rewards are still high. But I am always looking for alternate ways, because I am not the bad-guy type who goes after money. I like to earn money in a way everyone appreciates, not in a way that makes everyone glare at you. To all the SEOs out there: realize the type of work you are doing and please give respect to my lovable search engine. If Google never existed, I wouldn’t be here running an SEO business. Love Google and appreciate everything they do, whether it’s right or wrong. Everyone appreciates it when Google does something right, and everyone bashes them when they do something harsh to protect their algorithm. Love Google and all its efforts.

My suggestion to all the SEOs and newbies (the so-called SEOs out there): Google is a search engine for people; it’s not for you to play with.

Tuesday, July 7th, 2009 Google, search engines, Webmaster News

Age of search engine crawlers:

From the old days, when search engine crawlers could take up to a month to crawl a page and another 15 to 30 days to re-crawl it, things have changed a lot. Today’s search engine crawlers are the best yet. A lot has changed with them: thanks to ever-evolving algorithms and new features, they crawl websites much more easily and faster than ever before.

Way back in 2001, search engine crawlers were not so sophisticated. Most crawlers, including Googlebot, took a long time to crawl a page and even more time to refresh the data. They slowly evolved over the years, and today Googlebot picks up pages from millions of websites in minutes, sometimes in seconds. Features like RSS, XML and Atom feeds help Google track blogs and other relevant sites, and the crawlers fetch new pages as soon as they go live on the blogs. It’s not just blogs: Google at times has the capability to detect changes to a website, and if a new page is added, search engines are good enough to detect it and crawl it.
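The feed-based discovery described above can be illustrated with a minimal RSS feed (the domain, titles and dates here are made-up examples, not from any real site). When a blog publishes a file like this, a crawler that polls or is pinged about the feed can find the new post from the `<item>` entry without re-crawling the whole site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>http://www.example.com/</link>
    <description>Sample feed showing how crawlers discover new posts</description>
    <!-- Each new post gets an item; crawlers only need to fetch the new links -->
    <item>
      <title>New Post Published Today</title>
      <link>http://www.example.com/new-post</link>
      <pubDate>Mon, 25 May 2009 10:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

The same idea applies to Atom feeds and XML sitemaps: the site advertises what changed, so the crawler does not have to guess.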
I call this the age of crawlers, because they just amaze me with their speed and effectiveness. Imagine: millions of web pages are updated every day, yet Googlebot is able to detect and spider them almost immediately. I wonder what the future holds for search engines; I don’t see much more room for improvement, so let’s wait and see.

Monday, May 25th, 2009 search engines

Age of domains – Importance in search engines:

I’d say the age of a domain is one of the top factors in obtaining “authority” status. And within the age factor are all the other things that shape a domain’s history: changes of owners, changes of IPs, changes of servers, changes of just about everything that takes place within DNS.

I also feel that “stability” in the hosting network is another factor. I feel that anyone who has a serious Internet business is going to have their own IPs and possibly server(s) to do their thing. They won’t be relying on a cheap web host to compete in a market that is reserved for the BIG BOYs. That’s like showing up for a drag race in a Prius or something. :)

It is not the “only” factor though. Success takes time. There was a point in Internet history when it “could” happen overnight but that has slowly dwindled to lotto type statistics. You’ll have to invest the time and patience knowing that what you are building is a long term proposition.

Sure, you can open your doors and have success overnight and it still happens. If you have the right product and/or service along with a few strategically placed marketing efforts, you can start the ball rolling. Nature will take its course from there. Yes, things will “naturally” happen as it goes viral. Unfortunately many of us may not reach that level and we’ll continue to work in the trenches, feed our families, and make a decent living off the leftovers. :)

Yes, domain age is an important factor in “all” things Internet related. With each year that passes, the value of a domain increases. It’s like fine wine. “Since 1995” has a bit of meaning when talking Internet these days. Companies who have been online for any period of time would be wise to start advertising their “since” dates.

Wednesday, March 11th, 2009 search engines