Log file analysis is the way to better ROI – Analysing log files, SEO tip, Friday, Feb 25, 2005
Log file analysis is very important to the success of a website.
Log file information provides a baseline of statistics that indicate use levels and support comparisons among parts of a site or over time. Such analysis also provides technical information regarding server load, unusual activity, or unsuccessful requests, and can assist in marketing, site development, and management activities. Server log file analysis is also an important part of search engine optimization: it helps determine which keywords convert best and which keywords are used most often.
What’s in a Log File
Every communication between a client’s browser and a Web server results in an entry in the server’s log recording the transaction.
In general, a log file entry contains:
the address of the computer requesting the file
the date and time of the request
the URL for the file requested
the protocol used for the request
the size of the file requested
the referring URL
the browser and operating system used by the requesting computer.
What Can You Learn From a Log File?
Data available from a log file can be compiled and combined in various ways, providing statistics or listings such as:
total files and kilobytes successfully served
number of requests by type of file, such as HTML page views
distinct IP addresses served and the number of requests each made
number of requests by domain suffix (derived from IP addresses)
number of requests by HTTP status codes (successful, failed, redirected, informational)
totals and averages by specific time periods (hours, days, weeks, months, years)
URLs from which users came to the site
browsers and versions making the requests
number of requests made (“hits”)
number of requests for specific files or directories
How good are log files?
We should be able to differentiate between hits, page views, and unique visitors; not all hits and page views represent distinct visitors. There are good free tools for separating the data that way, such as Funnel Web Analyzer, AWStats, and Webalizer.
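To illustrate that separation, here is a minimal sketch of how the fields listed above can be pulled out of Apache-style access log lines and rolled up into hits, page views, and unique visitors. The sample log lines and the rule for what counts as a page view are illustrative assumptions, not what any particular analyser does:

```python
import re
from collections import Counter

# Apache-style access log line, matching the fields described above:
# host ident user [date] "method url protocol" status bytes
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def summarize(log_lines):
    """Split raw hits into page views and unique visitors (by IP)."""
    hits = 0
    page_views = 0
    visitors = Counter()
    for line in log_lines:
        match = LOG_PATTERN.match(line)
        if not match:
            continue
        hits += 1
        visitors[match.group("host")] += 1
        # Count only successful HTML requests as page views, so hits on
        # images, CSS, and scripts don't inflate the numbers.
        url = match.group("url").split("?")[0]
        if match.group("status").startswith("2") and (
            url.endswith((".html", ".htm")) or url.endswith("/")
        ):
            page_views += 1
    return {"hits": hits, "page_views": page_views,
            "unique_visitors": len(visitors)}

# Illustrative sample: two requests from one visitor, one from another.
sample = [
    '1.2.3.4 - - [25/Feb/2005:10:00:01 +0000] "GET /index.html HTTP/1.1" 200 5120',
    '1.2.3.4 - - [25/Feb/2005:10:00:02 +0000] "GET /logo.gif HTTP/1.1" 200 2048',
    '5.6.7.8 - - [25/Feb/2005:10:00:05 +0000] "GET /seo/ HTTP/1.1" 200 4096',
]
print(summarize(sample))  # -> {'hits': 3, 'page_views': 2, 'unique_visitors': 2}
```

Three hits collapse to two page views and two unique visitors, which is exactly the distinction the tools above are making for you.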
For more information on tracking and logging, visit the Tracking and Logging forum: http://www.webmasterworld.com/forum39/
Is JavaScript a hindrance for search engines? – search engine optimization tip, Feb 24, 2005
JavaScript is a complex language that needs a capable browser to interpret it. Search engine crawlers are not sophisticated browsers; they don't have capabilities like Internet Explorer or Firefox for understanding JavaScript.
So it is best to reduce the use of JavaScript on pages intended to be search engine friendly. Large blocks of JavaScript also push content down the page, and search engines find it difficult to index content buried below the JavaScript. The best option is to move your site's JavaScript into an external .js file; that way the size of the page is reduced.
Search engines like Google used to crawl only the first 101 KB of a page's HTML, so it is important that the top of the page is well accessible to crawlers. It is also best to avoid JavaScript dropdown menus, rollovers, popups, etc.; search engines find these difficult to crawl. If you have a JavaScript menu, the best fix is to add plain text links to your inner pages, so that search engines won't struggle to find your precious inner pages.
Googlebot, Google's crawler, has recently been reported to crawl some JavaScript. That is a great improvement, but it is still at an early stage, so it is better to avoid JavaScript menus and large blocks of JavaScript code.
Use external .js files to reduce the size of the page.
SEO Blog Team,
Do search engines have to act on all spam reports? – An extraordinary whining thread on webmasterworld.com
While browsing webmasterworld.com, the ultimate forum, I found a thread where the thread starter was worried that his competitor was using hidden text and the site had not been penalized.
Read more in this thread here http://www.webmasterworld.com/forum30/28211.htm
Read the answers of great experts like Brett Tabke, Ciml, etc.; they have seen this from newbies all the time, and they take great pains to correct these newbie so-called SEO experts.
One thing people should know is that search engines are not run for SEOs, webmasters, or site owners; they are run solely for the end users who perform searches. Webmasters are just another group of Google users. Google need not worry about how happy it makes webmasters; it just has to worry about its search users. If something is hurting its search users, it will definitely take action. SEOs and webmasters need not whine all the time that a competitor's site has not been banned by Google or any other search engine for spam (spam in the SEO's mind).
According to experts, search engine spam reports from so-called search engine optimization experts are simply a waste of time.
One great expert gave a clean expansion of the useless SEO's definition of SPAM:
SPAM – Sites Positioned Above Mine.
Don't whine if your site doesn't rank. Keep working on content and backlinks, and stop worrying about what others use to rank their sites.
Another expert clearly said, “If you think sites using hidden text or hidden links rank high and you can't outrank them, it simply means you SUCK at search engine optimization.”
Yahoo's image search index increases to 1.5 billion images
Yahoo's image search index has grown to over 1.5 billion images. Initially Yahoo had 1 billion images; then Google updated its index to about 1,187,630,000 images; now Yahoo's index has increased to over 1.5 billion images.
Detecting "rel=nofollow" – a way to detect the new anti-comment-spam attribute rel=nofollow using the Firefox browser — search engine optimization tip, Feb 23, 2005
There is a new way to detect the rel=nofollow attribute using the Firefox browser. Detecting this attribute helps catch cheating webmasters who use it in their directories and link exchanges.
rel=nofollow links can be highlighted in Firefox by adding a simple rule to userContent.css, the stylesheet in your Firefox profile that controls how web pages are displayed in the browser.
We have to find the path of the userContent.css file and add a simple rule so that rel=nofollow links appear in a different colour.
For Windows XP / windows 2000 users:
C:\Documents and Settings\[User Name]\Application Data\Mozilla\Firefox\Profiles\hmgpxvac.default\chrome
The profile folder name (hmgpxvac here) will be a different random string on your system.
For PCs using Windows 98 / ME:
C:\WINDOWS\Application Data\Mozilla\Firefox\Profiles\hmgpxvac.default\chrome
Open up your userContent.css file (create it if it doesn't exist) and add the following rule, then save the file:
a[rel~="nofollow"] { border: thin dashed firebrick !important; background-color: rgb(255, 200, 200) !important; }
With this rule in place, Firefox will automatically mark nofollow links with a dashed red border on a pink background.
You can highlight nofollow links in Internet Explorer too; refer to the attached file for it:
http://www.searchenginegenie.com/seo-blog/ie-detection-nofollow.txt
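The same check the CSS rule performs in the browser can be scripted outside it. Here is a minimal sketch using Python's standard-library HTML parser; the markup and URLs are illustrative, and the token-list matching mirrors the ~= selector used above:

```python
from html.parser import HTMLParser

class NofollowFinder(HTMLParser):
    """Collect the targets of links that carry rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.nofollow_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel holds space-separated tokens, which is what the
        # a[rel~="nofollow"] CSS selector matches against.
        rel_tokens = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel_tokens:
            self.nofollow_links.append(attrs.get("href"))

# Illustrative snippet: a "link exchange" page that quietly nofollows a link.
html = '''
<a href="http://example.com/partner1">Partner 1</a>
<a rel="nofollow" href="http://example.com/partner2">Partner 2</a>
'''
finder = NofollowFinder()
finder.feed(html)
print(finder.nofollow_links)  # -> ['http://example.com/partner2']
```

Feed it the saved HTML of a directory or link exchange page, and any link that would have lit up red in Firefox shows up in the list.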
SEO Blog Team,
Hotbot says bye to Yahoo search results, now uses Google search results
hotbot.com, a search engine used by a very low number of users, was powered by Inktomi search, now Yahoo search. Until recently I used to check HotBot for our clients, since HotBot results showed the dates on which the Yahoo Slurp crawler visited our sites and our clients' sites. Then, when I performed a search, the results were suddenly completely different.
I compared the results to Google's, and they were similar. It seems the HotBot search engine has stopped using Yahoo search results and is now using Google search results.
Not a hugely significant change, but worth knowing.
Google introduces a new command, movie:, Google's movie review search hits the ground
Google has introduced a new search operator, movie:. This new search helps users find movie-related information; for example, we can find reviews and other movie-related material using this search.
The movie: search also helps you find movie names you happen to have forgotten. Read what Google says about the movie: command:
Just in time for the Oscars, we’ve created a new “movie:” operator that enables you to find movie-related information faster and more easily, whether you’re looking for titles or actors, director or genre, famous lines or obscure plot details. Can’t remember the name of that film where Tom Hanks made friends with a volleyball? Search for [movie: Tom Hanks talking to a volleyball] and Google will tell you: it was Cast Away.
Google is reaching great heights, and this new feature will be absolutely helpful.
Will automated submission to search engines hurt your site or hurt rankings? – search engine optimization tip, Feb 22, 2005
Automated submissions are not good. Today's search engines are sophisticated crawler-based search engines, and they find sites through the links pointing to those sites; the better and more numerous the links, the better search engines will crawl the site.
Automated submission rarely hurts your site. Search engines understand that even your competitor could submit your URLs repeatedly, so automated submission rarely hurts, but we should still avoid it. Just for self-satisfaction we can do one manual submission and then leave it to the search engines to find the site, crawl it, index it, and rank it.
Keep building links to your site and all the search engine crawlers will definitely find it.
Avoid automated search engine submissions.
SEO Blog Team,
Subdomains or subfolders: which is better for search engine optimization? SEO tip – Monday, Feb 21, 2005
Subdomains or subfolders for search engines? This has been an important question on the forums.
In our view, subfolders are better for building a big quality site; search engines usually treat subdomains as different sites. Subdomains and subfolders follow completely different conventions.
A subdomain acts as a different site. Places where subdomains are useful:
1. Suppose you want to start a site completely unrelated to your existing site, but you don't want to buy new hosting or use another domain; subdomains serve that purpose very well. If your site is real-estate.yourbrand.com, you can start a site on a completely different topic at cosmetics.yourbrand.com; the real estate site talks about real estate and the cosmetics site talks about cosmetic items.
2. If you are a free hosting provider, giving away full domain hosting is a costly process, so lots of free hosting companies give subdomain hosting instead; sites like Freeservers, Netfirms, Tripod, etc. give free subdomains. Since subdomains are treated as completely different sites, people can use them freely without any problems.
3. If your organization is an affiliate, subsidiary, or chapter of a national or international organization, you might ask the headquarters if you can get a subdomain within the organization’s domain; if you can, this would save you the InterNIC registration fee (subdomains don’t cost anything to register) as well as give your organization an address that indicates its affiliation.
4. One more thing to note is that, if you use cookies in your site, they only work within one domain. For security and privacy purposes, browsers will not transmit cookies to any site in a different domain from the one that set the cookie. They can be shared across subdomains and hosts within a domain, but not from one domain to another.
5. Subdomains are appropriate especially if you have a domain name with a high level of brand recognition. For example, sites like Google, Excite, and Yahoo use subdomains to show diversity among their sites.
Subfolders, on the other hand, are always considered part of a site.
For example, take google.com/research and google.com/services.
Those two subfolders cover completely different topics: the research subfolder talks about all the research done by Google, their research materials, etc.
The services subfolder talks about all the services provided by google.com. Though they cover completely different areas, they stay on the same site, so they literally belong to the same site, and this type of division helps a site establish itself as an authority.
For better search engine rankings and for becoming an authority site in the search engines, we recommend using subfolders rather than subdomains. One major disadvantage of subdomains is that they are subject to a lot of spamming: spammers create hundreds of subdomains for lots of keywords and simply manipulate the search engine results.
Because of this, search engines frown slightly upon these types of sites, so go for subfolders if you want to run a quality site.
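The distinction is visible right in the URL structure: subdomains change the hostname, subfolders don't. A small sketch (the hostnames are illustrative):

```python
from urllib.parse import urlparse

# Subdomains produce different hostnames, so crawlers that key on the
# hostname see two separate "sites"...
a = urlparse("http://real-estate.yourbrand.com/listings.html")
b = urlparse("http://cosmetics.yourbrand.com/lipstick.html")
print(a.hostname == b.hostname)  # -> False

# ...while subfolders share one hostname, so their pages pool together
# under a single site.
c = urlparse("http://www.google.com/research/")
d = urlparse("http://www.google.com/services/")
print(c.hostname == d.hostname)  # -> True
```

This is the mechanical reason subfolder pages accumulate authority for one site while subdomain pages are judged separately.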
SEO Blog Team,
Do search engines hate directories? Lots of big directories disappear from the results or are PR0ed
Recently lots of big directories have started to disappear from Google results. Many have speculated about the reasons. Some people blame Google tightening up the duplicate content penalty, which makes directories disappear since they have lots of duplicate pages.
Some speculate Google is removing directories manually from its results, since empty directories don't add any value for its search users.
It is a true fact that empty or very thinly populated directories don't provide value to visitors. There are some ultimate resource directories, like dmoz.org and the Yahoo directory, which provide great value by helping users find good quality, relevant sites. But there are new, empty directories emerging whose sole purpose is to earn money: they simply start some junk directory, take $40 for a listing, and in turn the people who list their sites there get nothing in return but a single link.
There are directories out there whose sole advertising tactic revolves around Google PageRank. They get high PageRank by buying links from industry-established sites and indirectly sell that PageRank to the people who list their sites. So do those directories provide value to users? NO. Most of the categories in those junk directories are empty or have only one or two listings.
So we at Search Engine Genie would like to warn everyone who submits their site to directories based on PageRank alone. Better to check the number of pages indexed in Google, check whether the category you are submitting to lists other good quality sites, check whether your category page has Google PageRank, and check whether the category page your site will feature on is crawled by top search engines like Google, Yahoo, MSN, etc. Be warned that there are people who run directories with the sole purpose of getting rich quickly. We don't like to mention their names here, since that wouldn't add any value to our awareness campaign; we just want to warn people before they submit their sites to junk directories.
SEO Blog Team.