SEO
Log file analysis is the way to better ROI – Analysing log files, SEO tip Friday Feb 25 2005
Log file analysis is very important to the success of a website.
Log file information provides a baseline of statistics that indicate use levels and support comparisons of use and/or growth among parts of a site or over time. Such analysis also provides technical information regarding server load, unusual activity, or unsuccessful requests, and can assist in marketing, site development, and management activities. Server log file analysis is also an important part of search engine optimization: it helps in determining which keywords convert best and which keywords are used most often.
What’s in a Log File
Every communication between a client’s browser and a Web server results in an entry in the server’s log recording the transaction.
In general, a log file entry contains:
the address of the computer requesting the file
the date and time of the request
the URL of the file requested
the protocol used for the request
the size of the file requested
the referring URL
the browser and operating system used by the requesting computer
What Can You Learn From a Log File?
Data available from a log file can be compiled and combined in various ways, providing statistics or listings such as:
total files and kilobytes successfully served
number of requests by type of file, such as HTML page views
distinct IP addresses served and the number of requests each made
number of requests by domain suffix (derived from IP addresses)
number of requests by HTTP status codes (successful, failed, redirected, informational)
totals and averages by specific time periods (hours, days, weeks, months, years)
URLs from which users came to the site
browsers and versions making the requests
number of requests made (“hits”)
number of requests for specific files or directories
How good are log files?
We should be able to differentiate between hits, page views, and unique visitors; not all hits and page views represent distinct visitors. There are good free tools for separating the data that way, such as Funnel Web Analyzer, AWStats, and Webalizer.
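To make the ideas above concrete, here is a minimal sketch of pulling the fields listed earlier (requesting IP, date/time, requested URL, protocol, status, size, referrer, user agent) out of a raw log line. It assumes the common Apache/NCSA "combined" log format; the field names and the sample line are illustrative, not taken from any particular server.

```python
import re

# Apache/NCSA "combined" format (an assumption; adjust for your server).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) (?P<protocol>[^"]+)" '
    r'(?P<status>\d{3}) (?P<size>\d+|-) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Return the log fields as a dict, or None if the line doesn't match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

sample = ('66.249.1.1 - - [25/Feb/2005:10:15:32 +0000] '
          '"GET /index.html HTTP/1.1" 200 5120 '
          '"http://www.google.com/search?q=seo" "Googlebot/2.1"')
entry = parse_line(sample)
print(entry['ip'], entry['status'], entry['referrer'])
```

Counting entries per URL, per IP, or per referring search query from such dicts gives exactly the hit/page-view/keyword statistics the tools above report.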
For more information on tracking and logging, visit this forum: Tracking and Logging, http://www.webmasterworld.com/forum39/
Is JavaScript a hindrance for search engines? – search engine optimization tip Feb 24 2005
JavaScript is a complex language, and search engine crawlers are not sophisticated browsers; they do not have the capabilities of Internet Explorer or Firefox for understanding JavaScript.
So it is best to reduce the use of JavaScript on pages intended to be search engine friendly. Large blocks of JavaScript also push content down the page, and search engines find it difficult to index content buried below the JavaScript. The best option is to move the JavaScript used on your site into an external .js file; that way the size of the page is reduced.
Search engines like Google used to crawl only the first 101 KB of a page's HTML, so it is important that the upper part of the page is well accessible to crawlers. It is also best to avoid JavaScript dropdown menus, rollovers, popups, etc., since search engines find it difficult to crawl them. If you have a JavaScript menu, the best approach is to add plain text links to your inner pages; that way search engines won't struggle to find your precious inner pages.
Googlebot, Google's crawler, has recently been reported to crawl some JavaScript. That is a great improvement, but it is still at an early stage, so it is better to avoid JavaScript menus and large blocks of JavaScript code.
Use external .js files to reduce the size of the page.
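As a rough illustration of the "JavaScript crowding out content" point, this sketch estimates how much of the first 101 KB of a page's HTML is taken up by inline script blocks. The 101 KB figure comes from the tip above; the regex-based approach and the sample page are simplifying assumptions, not a real crawler's method.

```python
import re

def inline_script_bytes(html, limit=101 * 1024):
    """Rough sketch: characters of inline <script> content within the
    first `limit` characters of the page (approximately bytes for
    ASCII HTML) -- the portion a crawler was said to index."""
    head = html[:limit]
    scripts = re.findall(r'<script[^>]*>(.*?)</script>', head,
                         re.IGNORECASE | re.DOTALL)
    return sum(len(s) for s in scripts)

page = ('<html><head><script>function menu(){/* 60 lines of menu code */}'
        '</script></head><body><p>Actual content.</p></body></html>')
print(inline_script_bytes(page))
```

If this number is large relative to the page size, moving the script into an external .js file frees that space for indexable content.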
SEO Blog Team,
Detecting "rel=nofollow" – a way to detect the new anti-comment-spam rel=nofollow attribute using the Firefox browser – search engine optimization tip Feb 23 2005
There is a way to detect the rel=nofollow attribute using the Firefox browser. Detecting this attribute helps in catching cheating webmasters who use it in their directories and link exchanges.
rel=nofollow links can be highlighted in Firefox by adding a simple rule to the userContent.css file, which controls the appearance of web pages in the Firefox browser.
We have to find the path of the userContent.css file and add a simple rule so that rel=nofollow links appear in a different colour.
For Windows XP / windows 2000 users:
C:\Documents and Settings\[User Name]\Application Data\Mozilla\Firefox\Profiles\hmgpxvac.default\chrome
The profile folder name (hmgpxvac here) will differ on your system.
For PCs using Windows 98 / ME:
C:\WINDOWS\Application Data\Mozilla\Firefox\Profiles\hmgpxvac.default\chrome
Open up your userContent.css file (create it if it does not exist) and add the following rule:
a[rel~="nofollow"] {
  border: thin dashed firebrick !important;
  background-color: rgb(255, 200, 200) !important;
}
Save the file. With this rule in place, Firefox will automatically highlight nofollow links in red.
You can highlight these links in Internet Explorer too; refer to the attached file:
http://www.searchenginegenie.com/seo-blog/ie-detection-nofollow.txt
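The same check can be done server-side when auditing a link partner's pages. This is a hedged sketch using Python's standard html.parser, not the browser trick above; the class name and sample markup are illustrative.

```python
from html.parser import HTMLParser

class NofollowFinder(HTMLParser):
    """Collect the hrefs of links whose rel attribute contains "nofollow"."""
    def __init__(self):
        super().__init__()
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            attrs = dict(attrs)
            # rel is a space-separated token list, e.g. rel="external nofollow"
            rel = (attrs.get('rel') or '').lower().split()
            if 'nofollow' in rel:
                self.nofollowed.append(attrs.get('href'))

finder = NofollowFinder()
finder.feed('<a href="/a">ok</a>'
            '<a rel="nofollow" href="http://example.com/">cheater</a>')
print(finder.nofollowed)
```

Running this over a directory page quickly reveals whether your "link exchange" partner is quietly nofollowing your link.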
SEO Blog Team,
Will automated submission to search engines hurt your site or hurt rankings? – search engine optimization tip Feb 22 2005
Automated submissions are not good. Today's search engines are sophisticated crawler-based engines, and they find sites through the links pointing to them; the better and more numerous the links, the better search engines will crawl the site.
Automated submission rarely hurts your site: search engines understand that even a competitor can submit your URLs repeatedly, so repeated submission is rarely penalized. Still, automated submissions should be avoided. For self-satisfaction you can do a single manual submission, then leave it to the search engines to find the site, crawl it, index it, and rank it.
Keep building links to your site and search engine crawlers will definitely find it.
Avoid automated search engine submissions.
SEO Blog Team,
Subdomains or subfolders: which is better for search engine optimization? SEO tip – Monday Feb 21 2005
Subdomains or subfolders for search engines? This has been an important question in the forums.
In our view, subfolders are better for building a big quality site: search engines usually treat subdomains as different sites, and they handle subdomains and subfolders completely differently.
A subdomain acts as a different site. Places where subdomains are useful:
1. Suppose you want to start a site completely unrelated to your existing site, but you don't want to buy new hosting or use another domain; subdomains serve that purpose very well. If your site is real-estate.yourbrand.com, you can start a site on a completely different topic at cosmetics.yourbrand.com: the real estate site talks about real estate and the cosmetics site talks about cosmetic items.
2. If you are a free hosting provider, giving away full domain hosting is costly, so many free hosting companies offer subdomain hosting instead; sites like FreeServers, Netfirms, and Tripod give free subdomains. Since subdomains are treated as completely different sites, people can use them freely without any problems.
3. If your organization is an affiliate, subsidiary, or chapter of a national or international organization, you might ask the headquarters if you can get a subdomain within the organization’s domain; if you can, this would save you the InterNIC registration fee (subdomains don’t cost anything to register) as well as give your organization an address that indicates its affiliation.
4. One more thing to note is that, if you use cookies in your site, they only work within one domain. For security and privacy purposes, browsers will not transmit cookies to any site in a different domain from the one that set the cookie. They can be shared across subdomains and hosts within a domain, but not from one domain to another.
5. Subdomains are appropriate especially if you have a domain name with a high level of brand recognition. For example, sites like Google, Excite, and Yahoo use subdomains to show diversity among their services.
Subfolders, by contrast, are always considered part of a site.
For example, google.com has research and services subfolders, and those two subfolders cover completely different topics. The research subfolder covers all research done by Google, their research materials, and so on, while the services subfolder covers all services provided by google.com. Although they cover completely different areas, they stay on the same site, and this type of division helps a site build authority.
For better search engine rankings and to become an authority site, we recommend using subfolders rather than subdomains. A major disadvantage of subdomains is that they are subject to a lot of spamming: spammers create hundreds of subdomains targeting many keywords and manipulate search engine results.
Because of this, search engines frown slightly upon such sites, so go for subfolders if you want to run a quality site.
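The distinction above comes down to the hostname. This small sketch (using Python's standard urllib.parse; the helper name and example URLs are illustrative) shows why subfolder pages count toward one site while subdomain pages do not:

```python
from urllib.parse import urlparse

def same_site(url_a, url_b):
    """Sketch: subfolder URLs share a hostname, while subdomain URLs
    do not -- and search engines tend to treat distinct hostnames as
    distinct sites."""
    return urlparse(url_a).hostname == urlparse(url_b).hostname

# Subfolders: one hostname, so links and content accrue to one site.
print(same_site('http://yourbrand.com/real-estate/',
                'http://yourbrand.com/cosmetics/'))

# Subdomains: different hostnames, treated as different sites.
print(same_site('http://real-estate.yourbrand.com/',
                'http://cosmetics.yourbrand.com/'))
```

Every page you add under a subfolder strengthens the one hostname; every page under a new subdomain starts building authority from scratch.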
SEO Blog Team,
Are frames a hindrance for search engines? SEO tip Feb 20 2005
Search engines have difficulty indexing framed sites, and we SEO people recommend not building them. Frames can help with navigation on large sites, but they are not search engine friendly. A frame can load a third-party site just as easily as any page inside your own site, which is one reason search engines stay away from indexing content inside a frame: that content is not necessarily from your site and can come from anywhere.
Suppose your site is built with frames and you don't want to rebuild it; the best way to get some content crawled is to add it inside a noframes tag. This tag helps search engines identify the frames and index the content placed inside it.
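To make the noframes point concrete, here is a hedged sketch (Python's standard html.parser; the class name and sample frameset are illustrative) that checks whether a page uses frames and whether it provides a noframes fallback for crawlers to index:

```python
from html.parser import HTMLParser

class FrameChecker(HTMLParser):
    """Sketch: flag framed pages and note whether they carry a
    <noframes> fallback that crawlers can index."""
    def __init__(self):
        super().__init__()
        self.has_frames = False
        self.has_noframes = False

    def handle_starttag(self, tag, attrs):
        if tag in ('frameset', 'frame', 'iframe'):
            self.has_frames = True
        elif tag == 'noframes':
            self.has_noframes = True

checker = FrameChecker()
checker.feed('<frameset cols="20%,80%"><frame src="menu.html">'
             '<frame src="main.html">'
             '<noframes><p>Text links to inner pages here.</p></noframes>'
             '</frameset>')
print(checker.has_frames, checker.has_noframes)
```

A framed page where `has_noframes` comes back False offers crawlers nothing at all to index.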
If you want the best search engine rankings, avoid frames entirely and design sites that render proper HTML to the browser. Whether it is a dynamic site or a static site, proper rendering of HTML matters; search engine crawlers are just simple browsers, much like Netscape or Firefox.
If you have a framed site, we at Search Engine Genie will help remove the frames and optimize the site for better search engine rankings. Contact us if you want to get rid of frames on your site.
SEO Blog Team,
Do off-topic inbound links cause problems for your site, or do search engines ignore them? – Search engine optimization tip Feb 19 2005
Many of us worry about off-topic inbound links, and some people think they are bad for a site. In fact, that is not the case: the Internet is a huge network of websites, and any site can link to any site; that is a simple rule of thumb.
For example, if I run an ice cream site and my friend runs a real estate site, is it wrong to link to my friend's site, recommending his services to my visitors? No. In fact, SEOs (search engine optimizers) are the ones most worried about off-topic inbound links, along with some misguided webmasters and site owners.
From the Search Engine Genie team's point of view, there is nothing wrong with getting off-topic links for a site; just get them naturally, not artificially through link buying. Our site gets a lot of natural backlinks from various sites simply for its quality content; do that for your site too, and attract people to link to you.
Search engines today are not sophisticated enough to judge the quality or relatedness of every inbound link; if they ever tried, it would cost them a lot and use up a lot of resources.
Best rules:
Don't get too many sitewide off-topic links.
Don't go for too many unrelated links.
SEO Blog team,
List of HTTP status codes – an important tip on the header status of a site – SEO tip Friday Feb 18 2005
List of HTTP status codes:
100 Continue
The client SHOULD continue with its request. This interim response is used to inform the client that the initial part of the request has been received and has not yet been rejected by the server.
101 Switching Protocols
The server understands and is willing to comply with the client’s request, via the Upgrade message header field (section 14.42), for a change in the application protocol being used on this connection.
200 OK
The request has succeeded. The information returned with the response is dependent on the method used in the request:
GET an entity corresponding to the requested resource is sent in the response;
HEAD the entity-header fields corresponding to the requested resource are sent in the response without any message-body;
POST an entity describing or containing the result of the action;
TRACE an entity containing the request message as received by the end server.
201 Created
The request has been fulfilled and resulted in a new resource being created.
202 Accepted
The request has been accepted for processing, but the processing has not been completed.
203 Non-Authoritative Information
The returned metainformation in the entity-header is not the definitive set as available from the origin server, but is gathered from a local or a third-party copy.
204 No Content
The server has fulfilled the request but does not need to return an entity-body, and might want to return updated metainformation.
205 Reset Content
The server has fulfilled the request and the user agent SHOULD reset the document view which caused the request to be sent.
206 Partial Content
The server has fulfilled the partial GET request for the resource.
300 Multiple Choices
The requested resource corresponds to any one of a set of representations, each with its own specific location, and agent- driven negotiation information (section 12) is being provided so that the user (or user agent) can select a preferred representation and redirect its request to that location.
301 Moved Permanently
The requested resource has been assigned a new permanent URI and any future references to this resource SHOULD use one of the returned URIs.
302 Found
The requested resource resides temporarily under a different URI. Since the redirection might be altered on occasion, the client SHOULD continue to use the Request-URI for future requests.
303 See Other
The response to the request can be found under a different URI and SHOULD be retrieved using a GET method on that resource.
304 Not Modified
If the client has performed a conditional GET request and access is allowed, but the document has not been modified, the server SHOULD respond with this status code.
305 Use Proxy
The requested resource MUST be accessed through the proxy given by the Location field.
306 (Unused)
The 306 status code was used in a previous version of the specification, is no longer used, and the code is reserved.
307 Temporary Redirect
The requested resource resides temporarily under a different URI. Since the redirection MAY be altered on occasion, the client SHOULD continue to use the Request-URI for future requests.
400 Bad Request
The request could not be understood by the server due to malformed syntax. The client SHOULD NOT repeat the request without modifications.
401 Unauthorized
The request requires user authentication.
402 Payment Required
This code is reserved for future use.
403 Forbidden
The server understood the request, but is refusing to fulfill it.
404 Not Found
The server has not found anything matching the Request-URI. No indication is given of whether the condition is temporary or permanent.
405 Method Not Allowed
The method specified in the Request-Line is not allowed for the resource identified by the Request-URI. The response MUST include an Allow header containing a list of valid methods for the requested resource.
406 Not Acceptable
The resource identified by the request is only capable of generating response entities which have content characteristics not acceptable according to the accept headers sent in the request.
407 Proxy Authentication Required
This code is similar to 401 (Unauthorized), but indicates that the client must first authenticate itself with the proxy.
408 Request Timeout
The client did not produce a request within the time that the server was prepared to wait. The client MAY repeat the request without modifications at any later time.
409 Conflict
The request could not be completed due to a conflict with the current state of the resource.
410 Gone
The requested resource is no longer available at the server and no forwarding address is known.
411 Length Required
The server refuses to accept the request without a defined Content- Length.
412 Precondition Failed
The precondition given in one or more of the request-header fields evaluated to false when it was tested on the server.
413 Request Entity Too Large
The server is refusing to process a request because the request entity is larger than the server is willing or able to process. The server MAY close the connection to prevent the client from continuing the request.
414 Request-URI Too Long
The server is refusing to service the request because the Request-URI is longer than the server is willing to interpret.
415 Unsupported Media Type
The server is refusing to service the request because the entity of the request is in a format not supported by the requested resource for the requested method.
416 Requested Range Not Satisfiable
A server SHOULD return a response with this status code if a request included a Range request-header field (section 14.35), and none of the range-specifier values in this field overlap the current extent of the selected resource, and the request did not include an If-Range request-header field.
417 Expectation Failed
The expectation given in an Expect request-header field (see section 14.20) could not be met by this server, or, if the server is a proxy, the server has unambiguous evidence that the request could not be met by the next-hop server.
500 Internal Server Error
The server encountered an unexpected condition which prevented it from fulfilling the request.
501 Not Implemented
The server does not support the functionality required to fulfill the request. This is the appropriate response when the server does not recognize the request method and is not capable of supporting it for any resource.
502 Bad Gateway
The server, while acting as a gateway or proxy, received an invalid response from the upstream server it accessed in attempting to fulfill the request.
503 Service Unavailable
The server is currently unable to handle the request due to a temporary overloading or maintenance of the server.
504 Gateway Timeout
The server, while acting as a gateway or proxy, did not receive a timely response from the upstream server specified by the URI (e.g. HTTP, FTP, LDAP) or some other auxiliary server (e.g. DNS) it needed to access in attempting to complete the request.
505 HTTP Version Not Supported
The server does not support, or refuses to support, the HTTP protocol version that was used in the request message.
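When scanning log files for the codes listed above, Python's standard library carries the same code/phrase table, so you can translate raw numbers without keeping this list at hand. A minimal sketch (the `describe` helper is illustrative):

```python
from http import HTTPStatus

def describe(code):
    """Translate a numeric status code into "code phrase" form,
    e.g. 301 -> "301 Moved Permanently"."""
    try:
        status = HTTPStatus(code)
        return f'{status.value} {status.phrase}'
    except ValueError:
        # Non-standard codes simply aren't in the table.
        return f'{code} (unknown)'

for code in (200, 301, 302, 404, 503):
    print(describe(code))
```

Pairing this with the log-parsing sketch earlier in the blog turns a raw status column into a readable report of successes, redirects, and errors.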
SEO Blog Team,
Do keywords in a domain name help? Search engine optimization tip Thursday Feb 17 2005
Lots of people in the SEO industry still worry about having keywords in their domain names. Are they worth having? No.
In our opinion, the best domain name is one that is good for users. Keyword domain names are usually hyphenated, and users don't place much value on them. These days search engines place so little value on keywords in domain names that it is a negligible factor.
There are more important factors than keywords in a domain name. Keywords in a domain name mostly help if you are a spammer: a spammer doing automated blog comment spam and guestbook spam might not always get backlinks with the anchor text he expected, but if the keyword is in the domain name, search engines still give some value to it.
For legitimate search engine optimization, keyword-hyphenated domain names are a waste of time and effort. We recommend selecting a domain name that best suits your brand and business. Keyword domains are a thing of the past; don't worry about keyword-rich domain names unless you are a spammer who wants to spam search engines.
We hate spam and we hate people who do that,
SEO Blog Team,
Is a sitemap necessary for a site? Search engine optimization tip Wednesday February 16 2005
A site map is a page created to list the important pages of a site. To avoid cluttering the homepage or any other page, sitemaps are created and linked from an important page of the site, most often the homepage.
There have been various debates across forums on whether a sitemap is essential for a site. In Search Engine Genie's opinion, we suggest adding a site map to your site; site maps help in various ways, both for visiting search engines and for regular visitors.
Some benefits of sitemaps on a site:
1. Users won't get lost on your site if you run a big site. If you link your important pages in your site map, users can navigate easily to any page listed in it and return to the main page without losing their way.
2. Large sites are difficult to navigate deeper and deeper. A good sitemap prevents this, and users can reach any page without getting lost. Most of the inner pages also come closer to the homepage, which helps navigation considerably.
3. Not all search engine crawlers crawl a site very deep. Some crawlers, like Yahoo's Slurp, tend to go into a site only to a certain extent, so pages shouldn't be buried deep. A sitemap combats this problem: when a site map is used, pages come closer to the top level, which helps Yahoo's crawler Slurp reach the inner pages too.
4. If a site has JavaScript menus, DHTML menus, or other menus that are not search engine friendly, a sitemap definitely helps the site get crawled better. A sitemap with text-only links to inner pages is very useful.
5. Sitemaps also help spread PageRank, Google's trademarked link-analysis measure, more evenly across the pages of a site.
Site maps are good for large sites as well as small sites. We hope our tips on sitemaps give your site better exposure to search engines.
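A text-link sitemap page of the kind recommended above can be generated from a simple list of pages. A minimal sketch (the function name, page list, and layout are illustrative assumptions):

```python
from html import escape

def sitemap_html(pages):
    """Sketch: render a plain text-link sitemap page from
    (url, title) pairs -- the crawler-friendly kind, with no
    JavaScript menus involved."""
    items = '\n'.join(
        f'<li><a href="{escape(url)}">{escape(title)}</a></li>'
        for url, title in pages
    )
    return f'<h1>Site Map</h1>\n<ul>\n{items}\n</ul>'

pages = [('/services/', 'Our Services'),
         ('/articles/', 'SEO Articles'),
         ('/contact/', 'Contact Us')]
print(sitemap_html(pages))
```

Linking the generated page from the homepage brings every listed page within two clicks of the top level, which is exactly what helps shallow crawlers like Slurp.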
SEO Blog team,