SONY ERICSSON PLANNING TO RELEASE GOOGLE ANDROID PHONE BY SUMMER 2009
The 14 new members who joined Google’s Open Handset Alliance have shown their support for the development of the Google Android mobile operating system. Among the fresh additions is Sony Ericsson, and it looks like the company is not wasting any time and has hit the ground running.
According to several sources, Sony Ericsson is planning to release its first Android handset by summer 2009. A company spokesman says the first model will be a higher-end device, with more mass-market devices to follow later. In addition, HTC is said to be working on a whole portfolio of Android devices and on a broader Android release program. HTC is the manufacturer of the first Android smartphone, the T-Mobile G1, whose hardware was questionable, but now that HTC has acquired One & Co Design Inc. for its handset designs, perhaps we will see some sleeker devices. Summer can’t get here quickly enough.
SEO experiment: keyword-rich links to the homepage (WebmasterWorld member)
A WebmasterWorld member writes: “Hi guys, I am doing an interesting experiment on two of my more throwaway domains. The experiment tries to determine more about how linking to the homepage affects rankings. The testing involves various controls – linking to the root domain from the nav only using ‘home’, linking from the nav using the ‘main keyword’, linking from the nav using ‘variations’ of the keyword, linking from the content only (while the nav link says ‘home’) to home using ‘keywords’, and so on.
First, I should mention some points about the domain.
- 4 years old
- Owned by me
- Dedicated IP
- Canonicalized
- HTML only
- Ranks top 5 in Google.com for the main, second, and third keyword phrases
- 90 pages in total, all unique content (written by me)
Testing was done over a 3-month period, with grace periods between tests.
Here is what I have found so far. It might tell us a little about the threshold and re-ranking filters:
1. Linking home from every page in the content using the same keyword caused a 6-page drop in rankings.
2. Linking home using the keyword in the nav on all pages caused the same drop.
3. Linking home from every page in the content using variations caused a 3-page drop.
4. Linking home from the first 10 pages listed in Google.com for site:domain.com/* brought an increased ranking (from 5th to 3rd).
What is really interesting is that I have gotten this down to the ‘by page’ factor. When I *slightly* cross the threshold and add links to two extra pages, and then wait until they are cached, I tip the scales and drop to page 6.
What is further interesting is that linking home from the content using variations of the keywords WAS quite effective up to a point, after which the site plummeted.
This might also point to a ‘hard line’ being crossed in terms of the threshold: at one point I had the website flipping back and forth between position 4 and positions 51-60 for the same keyword every second day.
My next test will be to try to -950 the website by being ridiculously deliberate in nav linking, and then to see if I can reverse the results by removing those links (and how long it takes for trust to be reinstated to the website).”
Google Calendar officially comes to Apple’s iCal
On Monday, Google announced full support for the CalDAV protocol along with the release of a small piece of software for Mac computers, so that users can easily link up their Google Calendars with the iCal application.
Google originally launched CalDAV support back in July; however, users still had to manually add their calendars to a CalDAV-supporting application such as Mozilla Sunbird or Apple’s iCal. The newly launched Mac utility, named “Calaboration,” lets users plug in their Google Calendar username and password to hook their Google calendars up to iCal. It provides two-way synchronization, which means whatever changes you make on either end show up on both within a few minutes.
Once Calaboration was set up, it started and worked without any problems. With the current implementation, we are able to see other people’s schedules, as well as reply yes, no, or maybe to calendar invitations. The only problem we ran into was a syncing error that prevented writing data back to Google’s servers, which was remedied by closing and reopening the program after the initial CalDAV setup.
If you are a Sunbird user, you can skip Calaboration, as there is a simple Provider extension that does the same thing.
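For anyone who would rather script the same link-up instead of using Calaboration, the CalDAV protocol itself is easy to exercise. Below is a minimal sketch, assuming the third-party python caldav package, the historical Google CalDAV endpoint URL, and plain username/password authentication; none of this comes from Google’s announcement, so treat it purely as an illustration of what the protocol exchange looks like.

```python
# Minimal sketch of talking to Google Calendar over CalDAV with the
# third-party "caldav" package. The endpoint URL below is the historical
# Google CalDAV address, and plain username/password auth is assumed here
# for illustration only - check Google's current documentation before use.
import caldav

CALDAV_URL = "https://www.google.com/calendar/dav/you@gmail.com/events"  # assumed endpoint

client = caldav.DAVClient(
    url=CALDAV_URL,
    username="you@gmail.com",   # placeholder credentials
    password="your-password",
)

principal = client.principal()
for calendar in principal.calendars():
    print("Calendar:", calendar)
    for event in calendar.events():   # each event is an iCalendar (VEVENT) object
        print(event.data[:80], "...")
```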
Googlebot and If-Modified-Since
New record in sorting 1 petabyte of data
According to the official Google blog:
“At Google we are fanatical about organizing the world’s information. As a result, we spend a lot of time finding better ways to sort information using MapReduce, a key component of our software infrastructure that allows us to run multiple processes simultaneously. MapReduce is a perfect solution for many of the computations we run daily, due in large part to its simplicity, applicability to a wide range of real-world computing tasks, and natural translation to highly scalable distributed implementations that harness the power of thousands of computers.

In our sorting experiments we have followed the rules of a standard terabyte (TB) sort benchmark. Standardized experiments help us understand and compare the benefits of various technologies and also add a competitive spirit. You can think of it as an Olympic event for computations. By pushing the boundaries of these types of programs, we learn about the limitations of current technologies as well as the lessons useful in designing next generation computing platforms. This, in turn, should help everyone have faster access to higher-quality information.

We are excited to announce we were able to sort 1TB (stored on the Google File System as 10 billion 100-byte records in uncompressed text files) on 1,000 computers in 68 seconds. By comparison, the previous 1TB sorting record is 209 seconds on 910 computers.

Sometimes you need to sort more than a terabyte, so we were curious to find out what happens when you sort more and gave one petabyte (PB) a try. One petabyte is a thousand terabytes, or, to put this amount in perspective, it is 12 times the amount of archived web data in the U.S. Library of Congress as of May 2008. In comparison, consider that the aggregate size of data processed by all instances of MapReduce at Google was on average 20PB per day in January 2008.

It took six hours and two minutes to sort 1PB (10 trillion 100-byte records) on 4,000 computers. We’re not aware of any other sorting experiment at this scale and are obviously very excited to be able to process so much data so quickly.”
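The blog post describes MapReduce only at a high level, so here is a tiny, single-machine sketch of the map/shuffle/reduce pattern a distributed sort of this kind relies on. It is an illustration of the idea only, not Google’s implementation: the record format, key length, partitioning scheme, and reducer count are assumptions chosen to loosely mirror the benchmark’s 100-byte records with leading keys.

```python
# Toy, single-process sketch of the map/shuffle/reduce pattern behind a
# distributed sort benchmark. NOT Google's MapReduce - record format,
# partitioning scheme, and reducer count are illustrative assumptions.
from collections import defaultdict

def map_phase(records):
    # Emit (key, record) pairs; for a sort, the key is the record's leading bytes.
    for record in records:
        yield record[:10], record

def partition(key, num_reducers):
    # Range-partition by the key's first byte so that every key sent to
    # reducer i sorts before every key sent to reducer i+1.
    return ord(key[0]) * num_reducers // 256

def reduce_phase(bucket):
    # Each reducer sorts only its own bucket.
    return sorted(bucket, key=lambda kv: kv[0])

def toy_sort(records, num_reducers=4):
    buckets = defaultdict(list)
    for key, record in map_phase(records):          # "map" plus "shuffle"
        buckets[partition(key, num_reducers)].append((key, record))
    output = []
    for i in sorted(buckets):                       # concatenate buckets in partition order
        output.extend(record for _, record in reduce_phase(buckets[i]))
    return output

if __name__ == "__main__":
    data = ["zebra-000097", "apple-000001", "mango-000042", "delta-000007"]
    print(toy_sort(data))  # ['apple-000001', 'delta-000007', 'mango-000042', 'zebra-000097']
```

The point of range-partitioning the keys is that each reducer can sort its bucket independently, and simply concatenating the buckets in partition order yields a globally sorted output, which is what lets a real system spread the work across thousands of machines.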
Google launches Search-based Keyword Tool
The Search-based Keyword Tool provides keyword ideas:
- Based on actual Google search queries
- Matched to specific pages of your website, with your ad and search share
- New to your AdWords account (typically excluding keywords matching those already in your account)

What is the Search-based Keyword Tool?
The Search-based Keyword Tool generates keyword and landing page ideas highly relevant and specific to your website. In doing so, the tool helps you identify additional advertising opportunities that aren’t currently being used in your AdWords ad campaigns.
Based on your URLs, the Search-based Keyword Tool displays a list of relevant user queries that have occurred on Google.com (and on other Google search properties, such as google.co.uk) with some frequency over the past year; these suggestions can be found under the Keywords tab, in the New keywords related to (site) section. In the Keywords related to your search section, you can see a broad list of keyword ideas that are also relevant, but aren’t necessarily based on your site.
The keywords are also organized by category. Click any category to expand and view its subcategories. If applicable, you’ll also see the keywords organized by brand names.
You can try the tool here: http://www.google.com/sktool/
Losing pages from the search engine index: a concern
According to a member: “I have been doing a lot of digging lately because of a site I have that has been losing pages in the index… at least I thought it was losing pages. In the course of investigating, I have been finding a lot of discrepancies and have come to the conclusion that though tools and search operators may be helpful, they seem to be far from accurate and do not fully portray what is in the data and returns. What I found fascinating is that while I perceived that I was losing pages in the index, I actually have been increasing position for some relatively hard-to-get keywords and phrases. In fact, the site in question just went to #6 for widgets. It seems the more I search and investigate, the more glaring the discrepancies.
I was having a lot of problems with the site and duplicate content. It seems there were several ways of getting to the same page (different URLs) and, as we know, this can be a bad thing. The site has a forum that has generated 16,000 topics (some of them on multiple pages), so in essence I am going to estimate that I have around 19,000 pages total on the site. Now at the height of the duplicate problem, when I did a site:mysite.com, I was getting over 80,000 pages returned. WOW!

I fixed all the dupe content issues, and now each page has one URL and a uniquely generated title, description, and keywords, and of course the content is different since it is user generated. I used robots.txt to get rid of the duplicated pages and started to watch what would happen. This seemed to have corrected the problem. Pages started going supplemental and dropped, as far as I can tell. But the pendulum seems to have swung too far! Within the past month, the number of pages returned using site: has been slowly dropping. Now when I do a site:mysite.com, it only shows 4,000 pages. Huh? What’s the deal with that? Not only that, when I do a site:mysite.com/*, I only get about 800 pages.

So I am confused, of course. But are the missing pages really not there? I conducted about 200 searches for the pages that I thought were missing and found every single one of them, though the searches were fairly specific. So what does this tell me? The site: operator does not work. All of my pages are there; it’s just that Google doesn’t want to count them all with this operator. What does this mean? Not sure, but it is what it is. Every page I thought was missing, I can find in a search. The tool seems to be broken – like a lot of the tools on G.”
What causes a drop in indexed pages?
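Since the member above leaned on robots.txt to keep duplicate URL variants out of the index, it is worth sanity-checking such rules before relying on them. Here is a minimal sketch using Python’s standard urllib.robotparser (which only understands plain prefix rules, not wildcards); the Disallow lines and URLs are hypothetical stand-ins, not the member’s actual configuration.

```python
# Minimal sketch: sanity-check hypothetical robots.txt rules that block
# duplicate URL variants while leaving the canonical pages crawlable.
# Rules and URLs below are illustrative placeholders only.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /forum/index.php
Disallow: /print/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

urls = [
    "http://www.example.com/forum/topic-1234.html",       # canonical URL - should stay allowed
    "http://www.example.com/forum/index.php?topic=1234",  # old duplicate URL - should be blocked
    "http://www.example.com/print/topic-1234.html",       # printable duplicate - should be blocked
]

for url in urls:
    allowed = rp.can_fetch("Googlebot", url)
    print("ALLOWED" if allowed else "BLOCKED", url)
```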
Google third-quarter results summary
GOOGLE ANNOUNCES THIRD QUARTER 2008 RESULTS
MOUNTAIN VIEW, Calif. – October 16, 2008 – Google Inc. (NASDAQ: GOOG) today announced financial results for the quarter ended September 30, 2008.
“We had a good third quarter with strong traffic and revenue growth across all of our major geographies thanks to the underlying strength of our core search and ads business. The measurability and ROI of search-based advertising remain key assets for Google,” said Eric Schmidt, CEO of Google. “While we are realistic about the poor state of the global economy, we will continue to manage Google for the long term, driving improvements to search and ads, while also investing in future growth areas such as enterprise, mobile, and display.”

Q3 Financial Summary

Google reported revenues of $5.54 billion for the quarter ended September 30, 2008, an increase of 31% compared to the third quarter of 2007 and an increase of 3% compared to the second quarter of 2008. Google reports its revenues, consistent with GAAP, on a gross basis without deducting traffic acquisition costs (TAC). In the third quarter of 2008, TAC totaled $1.50 billion, or 28% of advertising revenues.
Google reports operating income, net income, and earnings per share (EPS) on a GAAP and non-GAAP basis. The non-GAAP measures, as well as free cash flow, an alternative non-GAAP measure of liquidity, are described below and are reconciled to the corresponding GAAP measures in the accompanying financial tables.
GAAP operating income for the third quarter of 2008 was $1.74 billion, or 31% of revenues. This compares to GAAP operating income of $1.58 billion, or 29% of revenues, in the second quarter of 2008. Non-GAAP operating income in the third quarter of 2008 was $2.02 billion, or 37% of revenues. This compares to non-GAAP operating income of $1.85 billion, or 34% of revenues, in the second quarter of 2008.
GAAP net income for the third quarter of 2008 was $1.35 billion as compared to $1.25 billion in the second quarter of 2008. Non-GAAP net income in the third quarter of 2008 was $1.56 billion, compared to $1.47 billion in the second quarter of 2008.
GAAP EPS for the third quarter of 2008 was $4.24 on 318 million diluted shares outstanding, compared to $3.92 for the second quarter of 2008 on 318 million diluted shares outstanding. Non-GAAP EPS in the third quarter of 2008 was $4.92, compared to $4.63 in the second quarter of 2008.
Non-GAAP operating income, non-GAAP operating margin, non-GAAP net income, and non-GAAP EPS are computed net of stock-based compensation (SBC). In the third quarter of 2008, the charge related to SBC was $280 million as compared to $273 million in the second quarter of 2008. Tax benefits related to SBC have also been excluded from non-GAAP net income and non-GAAP EPS. The tax benefit related to SBC was $63 million in the third quarter of 2008 and $48 million in the second quarter of 2008. Reconciliations of non-GAAP measures to GAAP operating income, operating margin, net income, and EPS are included at the end of this release.
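For readers who want to see how the non-GAAP figures relate to the GAAP ones, here is a quick back-of-the-envelope reconciliation using only the rounded numbers quoted in this release; the small gaps against the reported $1.56 billion and $4.92 come from that rounding, not from any extra adjustments.

```python
# Back-of-the-envelope reconciliation of Google's Q3 2008 non-GAAP figures,
# using the rounded numbers quoted in the press release above (in billions,
# except EPS). Small mismatches vs. the reported $1.56B / $4.92 are rounding
# artifacts of the two-decimal inputs.
gaap_operating_income = 1.74   # $ billions
gaap_net_income       = 1.35   # $ billions
sbc_charge            = 0.280  # stock-based compensation, $ billions
sbc_tax_benefit       = 0.063  # $ billions
diluted_shares        = 0.318  # billions of shares

# Non-GAAP operating income = GAAP operating income + SBC charge
non_gaap_operating_income = gaap_operating_income + sbc_charge            # ~2.02

# Non-GAAP net income = GAAP net income + SBC charge - related tax benefit
non_gaap_net_income = gaap_net_income + sbc_charge - sbc_tax_benefit      # ~1.57 (reported: 1.56)

# Non-GAAP EPS = non-GAAP net income / diluted shares outstanding
non_gaap_eps = non_gaap_net_income / diluted_shares                       # ~4.93 (reported: 4.92)

print(round(non_gaap_operating_income, 2),
      round(non_gaap_net_income, 2),
      round(non_gaap_eps, 2))
```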
"Results 1 to 35 of about 4" – what causes it ?
An interesting post by a senior member on WebmasterWorld:
“Two days ago a new CMS went live, replacing an old one that had loads of canonical problems. Some content pages had upwards of 4 or 8 different URLs indexed, and some content pages could be returned for a near-infinite number of different URLs.
The old CMS used horrible dynamic parameter-driven URLs and the new one uses short folder-like URL formats, and *all* canonicalization factors have been taken into account.
There are a lot of sites to be moved over to the new CMS, but we started with the smallest — so small that it doesn’t really need a CMS (except that using the CMS has made it very easy for the owner to keep it updated).
The site was already fully indexed, and some content pages show under multiple URLs, because basic non-www to www canonicalisation and so on was only added a few months ago. Many of the really old non-canonical URLs are also still listed.
Now that the new CMS has been installed and the old content reinstated, most of the old URLs in the SERPs (actually all except the domain root) are now 301 redirects (from long, horrible dynamic URLs to short, folder-like URLs).
Last night Google reported “1 to 35 of about 8” for a site:domain.com search.
Today it shows “1 to 35 of 4”. None of the new URLs are showing up yet (I expect they will in the next 24 to 36 hours).
So, their internal system “knows” that most of the URLs they already had are now redirects, and it seems that those URLs are no longer included as part of the “site count” (the “of nnn” number).
I would guess that the URLs that now redirect are already moved over to Supplemental.
Hence… “1 to 35” is what we are showing you; “of 4” is how many URLs we think are “real” (200 OK).
Now I understand a bit more, I think.
I would expect the new URLs for content to start appearing tomorrow, or soon after. “
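A quick way to confirm that the old dynamic URLs really do answer with 301s pointing at the new folder-style URLs is to probe a sample of them without following redirects. This is a minimal sketch only; the third-party requests dependency and the example URLs are assumptions, not taken from the post.

```python
# Minimal sketch: verify that old CMS URLs return a 301 pointing at the new
# folder-style URLs. Requires the third-party "requests" package; the URLs
# below are hypothetical placeholders, not from the post above.
import requests

old_urls = [
    "http://www.example.com/index.php?page=widgets&id=42",   # hypothetical old dynamic URL
    "http://www.example.com/index.php?page=about",
]

for url in old_urls:
    resp = requests.head(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "-")
    status = "OK" if resp.status_code == 301 else "CHECK"
    print(f"{status}  {resp.status_code}  {url} -> {location}")
```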
Vulnerable-site alert letters from Google now shown in Google Webmaster Tools
Google will now warn webmasters about security holes and vulnerable sites through messages in Webmaster Tools, as announced on the Webmaster Central blog.
According to the blog:
Recently we’ve seen more websites get hacked because of various security holes. In order to help webmasters with this issue, we plan to run a test that will alert some webmasters if their content management system (CMS) or publishing platform looks like it might have a security hole or be hackable. This is a test, so we’re starting out by alerting five to six thousand webmasters. We will be leaving messages for owners of potentially vulnerable sites in the Google Message Center that we provide as a free service as part of Webmaster Tools. If you manage a website but haven’t signed up for Webmaster Tools, don’t worry. The messages will be saved and if you sign up later on, you’ll still be able to access any messages that Google has left for your site.
One of the most popular pieces of software on the web is WordPress, so we’re starting our test with a specific version (2.1.1) that is known to be vulnerable to exploits. If the test goes well, we may expand these messages to include other types of software on the web. The message that a webmaster will see in their Message Center if they run WordPress 2.1.1 will look like this:
Quick note from Matt: In general, it’s a good idea to make sure that your webserver’s software is up-to-date. For example, the current version of WordPress is 2.6.2; not only is that version more secure than previous versions, but it will also alert you when a new version of WordPress is available for downloading. If you run an older version of WordPress, I highly encourage you to upgrade to the latest version.
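As a practical follow-up to Matt’s note, here is a minimal sketch of one way to check which WordPress version a site advertises, by reading the generator meta tag that older WordPress releases print in the page head. The URL is a hypothetical placeholder, and many sites strip this tag, so treat a missing match as “unknown” rather than “up to date”.

```python
# Minimal sketch: read the WordPress "generator" meta tag that older releases
# embed in the page head, and compare it against the version you expect to be
# running. The URL is a hypothetical placeholder; many sites remove this tag,
# so no match just means "unknown", not "up to date".
import re
import urllib.request

SITE = "http://www.example.com/"   # placeholder - replace with your blog's URL
EXPECTED = "2.6.2"                 # current WordPress version at the time of this post

html = urllib.request.urlopen(SITE, timeout=10).read().decode("utf-8", "replace")
match = re.search(r'<meta name="generator" content="WordPress ([\d.]+)"', html)

if match is None:
    print("No WordPress generator tag found - version unknown.")
elif match.group(1) == EXPECTED:
    print(f"Reported version {match.group(1)} matches {EXPECTED}.")
else:
    print(f"Reported version {match.group(1)} differs from {EXPECTED} - consider upgrading.")
```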