Wednesday, August 10, 2005

MSN adCenter PPC Launch News

Scoop on MSN adCenter PPC:

"Our vision for MSN adCenter is to continue to innovate and deliver a one-stop shop to help you maximize your online advertising ROI on MSN. Our first product is MSN Keywords – the versatile, new paid Search offering that can help you:


· Learn – Access comprehensive data to plan more strategic campaigns.

· Connect – Use advanced demographics to target the right audience at the right place and time.

· Refine – Make impactful changes on the fly with features for greater flexibility and control.


These are the three key MSN Keywords benefits that can simultaneously drive value and help you reach the most receptive audiences with your campaigns. Don’t miss your chance to stay ahead of the competition by being one of the first to use the MSN Keywords solution from MSN adCenter."



MSN Keywords Pre-registration/Pre-load Schedule:

Pre-registration period – August 1 to September 15, 2005

Pre-load – begins September 15, 2005

Launch – October 18, 2005 (limited traffic)

It is invite-only for the "Beta Test Period", but I am sure MSN has plans to make it public by year's end.

Thursday, July 21, 2005

Website Branding - The Favicon (Favorite Icon)

Have you ever noticed how certain websites in your favorites bookmarks have nifty little pictures next to the link title instead of the boring old IE symbol? This little piece of branding is the result of an option available to any website: the Favorite Icon. I thought it would be beneficial to talk about how to create a favorite icon, and then, as an added bonus to our loyal readers, how to use it as an analytic tool.

First, let's talk about the basics of a Favorite Icon. The Favorite Icon is a graphic file placed in your root directory that IE looks for whenever a visitor using IE enters your website. If you have a favorite icon installed, IE will use this icon in place of the IE icon normally associated with bookmarks. If you do not have a Favorite Icon, then IE will use its own icon. This means that any website can use the Favorite Icon option and replace Microsoft's branding (the IE icon) with an image that brands their own website.

The Favorite Icon requires that the image in question be saved in [.ico] format. In other words, the image you use for your favorite icon must end in [.ico]. It also requires the icon be 16 x 16, 24 x 24, 32 x 32, or 48 x 48 pixels. This raises the question: how do I save a graphic in this format? Thankfully there are some free programs that will help you convert any image into icon format. Furthermore, such a program will save multiple copies of the icon in as many or as few sizes as you require and store them all in that one .ico file.

My choice of programs would be AnytoIcon by aha-soft. It allows 30 trial uses before you need to purchase the software. Unless you are a web developer working on more than 30 websites, you will probably only need to use the program a few times, and maybe only once if the first image you choose suits your needs. This means you can incorporate a Favorite Icon into your website today, right now, for FREE! Aha-soft's program lets you simply drag your image into the program, choose your sizing or custom options, and it instantly creates the Favorite Icon in its proper .ico format. Then simply upload the icon to your root directory, and you are done. Every visitor who bookmarks your page from this point forward will see your chosen icon next to your link in their bookmark list!

Now, I promised I would tell you how to use the Favorite Icon as an analytic tool, but perhaps some of you can see where I am going with this. Once your Favorite Icon is active, simply use your analytics program or web logs to examine how many times the favorite icon was downloaded on a particular day or week. Remember, the Favorite Icon is only requested when a visitor chooses to bookmark your website, so this number indicates how many visitors bookmarked your website over any given time period.

Now take the number of times the Favorite Icon file was accessed and divide it by the total number of visitors to your website over the same time period. The resulting figure is the percentage of visitors who bookmark your website! Over time you can track this percentage to determine if your content/products are appealing to customers. As you make modifications to your site's content, you can measure whether the Favorite Icon percentage is increasing or decreasing, giving you a quick baseline figure to see if your changes are helping you retain and recirculate visitors!
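If your analytics package doesn't report favicon requests directly, a few lines of scripting against the raw access log will get you the number. Here is a rough sketch in Python, assuming an Apache-style access log; the log file name and the visitor total are placeholders you would swap for your own figures:

# Rough sketch: estimate the "bookmark rate" from a raw access log.
# "access.log" and the visitor total are placeholders - use your own figures.

def favicon_requests(log_path):
    """Count requests for favicon.ico in the access log."""
    count = 0
    with open(log_path) as log:
        for line in log:
            if "favicon.ico" in line:
                count += 1
    return count

total_visitors = 2000                       # from your analytics report
downloads = favicon_requests("access.log")  # favicon hits over the same period

bookmark_rate = 100.0 * downloads / total_visitors
print("Favorite Icon downloads: %d" % downloads)
print("Approximate bookmark rate: %.1f%%" % bookmark_rate)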

With about 5 minutes' work you can brand your website with a custom image icon and add another useful, easily calculated metric to your analysis that measures the appeal of your website for repeat visits. So why wait? Post your Favorite Icon image today and start branding with the big boys!

Tuesday, July 19, 2005

SEO & PPC Metrics - Website Page Depth

When managing an SEO initiative or PPC campaign, one of the more challenging aspects is measuring the lead quality of the inbound visitors. Most people know how to measure Return on Investment (ROI) and Return on Ad Spend (ROAS), but those should not be the only metrics used to evaluate the quality of the leads coming from different Search or PPC engines. One of the best metrics for measuring lead quality is Page Depth.

Page depth is simply the average number of pages a visitor sees during a session on your website. It should be fairly obvious that higher page depth is better than low page depth, but figuring the metric correctly and determining a page depth baseline can be a bit more involved. For this exercise we will look at some actual PPC data from a major online retailer, but first let's talk about how to measure page depth accurately.

We know minimum page depth must be 1.0, as it is impossible for any visitor to see fewer than 1 page. Most people would think page depth is simply calculated as (Total Number of Pages / Number of Visitors). While this is correct in the general sense, page depth should be adjusted to eliminate shopping cart pages. Let's look at a hypothetical example before proceeding to an actual data set.

Suppose your website showed 10,000 pages last month to 2,000 visitors. Let's also assume that your website converted those 2,000 visitors into 200 sales. Finally, let's assume that your shopping cart is 3 pages long (it takes 3 pages to complete checkout: the personal and billing info pages plus a confirmation page).

Using the traditional Page Depth formula, Page Depth would be calculated as:
10,000 pages / 2,000 visitors = 5.0 Page Depth

Now let's look at a calculation that adjusts page depth to remove shopping cart pages. The reason for this is that visitors who convert are forced to view 3 more pages, 1 of which (the confirmation page) is completely superfluous. These pages do not measure the depth of a visitor's session because they happen after the visitor has converted. Page depth as a metric should measure how far into a site a visitor goes, regardless of whether they convert. What Page Depth should be measuring is how conducive your website is to dispensing information about the products or services you offer. Also, service websites typically do not have a shopping cart while e-commerce sites do. By adjusting page depth to exclude shopping cart pages, page depth can be compared between types of sites without a shopping cart bias "helping" e-commerce sites.

So let's do the Page Depth Calculation again and exclude shopping cart pages:
(Total Pages - (Conversions * Shopping Cart Page Length)) / Visitors
(10,000 - (200 * 3)) / 2,000 = 9,400 / 2,000 = 4.7 Page Depth
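If you would rather not do this by hand every month, the adjustment is a one-liner. A quick sketch using the hypothetical numbers above:

def adjusted_page_depth(total_pages, visitors, conversions, cart_pages):
    """Page depth with shopping cart page views stripped out."""
    return (total_pages - conversions * cart_pages) / float(visitors)

# Hypothetical example from above: 10,000 pages, 2,000 visitors,
# 200 sales, and a 3-page shopping cart.
print(adjusted_page_depth(10000, 2000, 200, 3))  # 4.7
print(10000 / float(2000))                       # 5.0 (unadjusted)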

Notice that page depth dropped from 5.0 to 4.7 using the adjusted formula. This 4.7 figure tells us that the average visitor will typically see 4.7 pages before either leaving or converting. If the 5.0 figure were used, it would be skewed by shopping cart pages that are viewed after the visitor has already converted. Also, all we could say for certain is that visitors who do not convert see fewer than 5 pages and those who do convert see more than 5 pages; it does not tell us what the average visitor sees whether they convert or not. If you have a long shopping cart process or convert a high percentage of visitors, the traditional formula of [Total Pages / Total Visitors] becomes even more skewed. Therefore, I have found that the adjusted page depth calculation gives a more accurate indication of how many pages a typical visitor views, since we do not know beforehand whether they will convert.

Developing a baseline page depth figure should be done over time and should be done for each engine or campaign separately. You will also have an overall page depth figure for all campaigns, but using that figure makes it more difficult to spot meaningful fluctuations, as high traffic campaigns will be weighted far more heavily in the overall Page Depth figure. I would say that 1 month is a good amount of time to collect data in order to calculate reliable Page Depth figures for each natural and PPC campaign. However, there are red flags to look for that can indicate a problem in less time than a month.

Let's look at the PPC campaign data from a major online retailer doing around $10 million in gross revenues. [Chart: raw and adjusted Page Depth by PPC engine; the key figures are discussed below.]



Notice the raw (unadjusted) Page Depth figures for Kanoodle and Enhance [not surprisingly, these also happen to be the adjusted Page Depth figures, as neither engine converted a single visitor]. Anyone who has read this blog or my contributions to various search engine forums knows that I have a dislike for these two engines, and here is the reason why. Enhance has a page depth of 1.34 and Kanoodle a rather unbelievable 1.10! Kanoodle's figure says that for every 10 people who visit the site, 9 of the 10 bounce (a bounce is a visitor who sees only the landing page they are directed to and then leaves), and the tenth sees a second page and then bounces! This is the telltale sign of poorly qualified traffic or click fraud. I will get into click fraud in a future post, but be aware that the smaller PPC engines and shopping feeds are notorious for click fraud. Among these would be Kanoodle, Enhance, Bizrate Shopping, and Nextag Shopping. Not only do they send unqualified traffic, but many of them also charge for 200-300% more clicks than any analytics program tracks. The Page Depth metric is an excellent tool for spotting these types of fraudulent traffic sources. Statistically speaking, it is highly unlikely for any visitors to convert from sources that generate a raw (unadjusted) Page Depth of less than 3.0. Why? Because most e-commerce sites have anywhere from a 2-5 page shopping cart, so a raw Page Depth of less than three indicates that few if any visitors from those sources are making it to or through the shopping cart. If they were converting, raw Page Depth would have to be higher due to the additional pages a shopping cart adds to the metric.
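Once you are pulling pages and visitors per traffic source out of your analytics package, flagging the suspect engines is trivial. A sketch along those lines - the per-engine numbers below are made up purely to illustrate, not the retailer's actual data:

# Sketch: flag traffic sources whose raw page depth suggests poorly
# qualified traffic or click fraud. Numbers are illustrative only.
SUSPECT_THRESHOLD = 3.0  # raw page depth below this rarely converts

sources = {
    # engine: (total pages viewed, visitors)
    "kanoodle": (1100, 1000),
    "enhance": (1340, 1000),
    "google": (5200, 1000),
}

for engine, (pages, visitors) in sorted(sources.items()):
    depth = pages / float(visitors)
    flag = "SUSPECT" if depth < SUSPECT_THRESHOLD else "ok"
    print("%-10s raw page depth %.2f  %s" % (engine, depth, flag))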

Naturally, this retailer no longer advertises with Kanoodle or Enhance. Furthermore, the Page Depth figures for the new Miva engine give cause for concern [Miva combined FindWhat (miva-f) and Espotting (miva-e)]. During July, Miva will be monitored for Page Depth, and if things do not improve significantly, advertising will be halted on Miva as well.

Hopefully this gives you an idea of how to calculate and use Page Depth as a significant metric when evaluating your campaigns. Remember that it could be your site's architecture (confusing navigation), bad landing page choices, or your site's appearance that is hindering visitors from seeing more pages. Assuming there are no appalling usability issues plaguing your site, Page Depth can be an invaluable tool for measuring the quality of leads coming from your different traffic-generating sources. Since the calculation is simple, take an hour or so and examine this metric to determine which campaigns are working for your site, and which are simply trying to get into your wallet.

Friday, July 15, 2005

PPC Management - Google Changing Adwords Again

This Google Adwords news is brought to you by the fine people at Google who are sick of the phone calls asking what kind of pot they have been smoking the last few weeks....


Hello from the Google AdWords Team:

You asked, and we listened. In the coming weeks, we'll simplify our keyword status system and introduce quality-based minimum bids, giving you more control to run on keywords that you find important. We believe these changes will result in higher quality ads.

What's changing
Simplified account management: Your keywords will be active or inactive — instead of normal, in trial, on hold, and disabled. In addition, accounts will no longer be slowed. Currently, accounts are slowed when they don't meet our performance requirements and your ads appear rarely for your keywords.

Quality-based minimum bids: Soon, each keyword will be assigned a minimum bid based on its Quality Score. Keywords with a higher Quality Score will be given lower minimum bids to stay active and trigger ads. Keywords with a lower Quality Score (including those that are currently on hold) will have the opportunity to run if your keyword or Ad Group's maximum cost-per-click (CPC) meets the minimum bid.

The Quality Score is determined by your keyword's clickthrough rate (CTR), relevance of your ad text, historical keyword performance, and other relevancy factors.
Ad Rank, or the position of your ad, will continue to be based on the maximum CPC and quality (now called the Quality Score).

What you should do
Here are some suggestions for what you should do before and after we implement the changes described above:

Before implementation: If you're happy with your current keyword list, there's no need to make any changes. However, if you have any on hold keywords that you don't want to trigger ads, we suggest you delete them from your account. This is because any keywords with a high enough Quality Score and maximum CPC could be activated and accrue ad clicks. You can use our Find and Edit Keywords tool, available in your account's Tools page, to quickly search for and delete any keywords in your account.

After implementation: We'll email you after we implement these changes. You should then log in to your account and monitor your keyword performance under these new guidelines. Any disabled keywords at the time of implementation will remain labeled as disabled in your account. Several weeks later, we'll delete them. This period of time is meant to give you an opportunity to review your disabled keywords and activate them.

No word on the cannabis however.....

Thursday, July 14, 2005

Pulling out of the Adwords Campaign Death Spiral

I've been on Google's case about this Adwords algorithm change, but let's face it, our moaning about it isn't helping any of us deal with the situation.

Here are my initial thoughts on how to "manage" Adwords since the region parameter was added. Please remember the statistical pool of data since the change is not large enough to make definitive conclusions, but I believe you can do some things right now to try to protect yourself.

1) Network: Blogs and forums are great tools for networking, so take advantage of them! If you can make one contact in each time zone (you and 3 others here in the US), as a group you can all report on the top 25 keywords for each campaign. This means you'll be looking up 75 keywords for your network each week and reporting their positions in your area, but it also means you will get information back on your top 25 keywords from 3 other regions. If you are savvy enough to get contacts in your key cities/states, even better.

2) Get a Solid Analytics Program: Information is going to be your best weapon and defense in dealing with the new Adwords. With the algorithm change you are going to have keywords getting higher CTR and clicks, which may be good or bad, but only an analytics program can tell you exactly how much revenue was created and how much adspend it took to create that revenue. More clicks may mean more sales, but it may also mean more adspend per sale. Having a good handle on ROI down to the keyword level is going to be invaluable when it comes to managing Adwords in the new system. You don't have to spend $1000's a month on the Cadillac of analytics software (although if your revenues support it, by all means get Omniture or Coremetrics); a program like Urchin at $200/month will give you the type of information you need to collect at a very reasonable price.

3) Re-examine Your Maximum Bids: Almost everyone has a higher maximum bid set than the average CPC they are actually charged. For example, you may max bid $1.00 for term "X" while your average CPC has traditionally been something like $0.66 for an average position of 2.9, and you sat in position 3 most of the time. However, if Google decides to place your ad #1 in a region, you still have room between your average CPC and your max bid. If there is enough room there for you to afford position #1 in a new region, your average CPC will start to rise, AND your total number of clicks will also rise since you are now in the #1 position. So in this instance, let's say you traditionally did 200 clicks per week at your average CPC of $0.66, so your weekly spend was about $132. Now with the #1 regional placement, you can expect an increase of probably 25% in clicks and 25% in average CPC (these 25% rises are probably on the conservative side, as costs between 1st position and 3rd position can differ by 100% or more in some cases). The 25% increase in clicks makes 250 clicks, while the 25% increase in average CPC makes each click cost around $0.82. So your new adspend has become roughly $206, an increase of about 56% (see the quick sketch below)!
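You can model that headroom yourself before Google does it for you. A back-of-the-envelope sketch using the example figures above; the 25% lifts are my own conservative assumptions, not anything published by Google:

# Back-of-the-envelope: what a surprise #1 regional placement does to spend.
clicks, avg_cpc = 200, 0.66        # historical weekly clicks and average CPC
old_spend = clicks * avg_cpc       # $132.00

click_lift, cpc_lift = 0.25, 0.25  # assumed lifts from moving up to position 1
new_spend = clicks * (1 + click_lift) * avg_cpc * (1 + cpc_lift)

print("Old weekly spend: $%.2f" % old_spend)   # $132.00
print("New weekly spend: $%.2f" % new_spend)   # $206.25
print("Increase: %.0f%%" % (100 * (new_spend / old_spend - 1)))  # 56%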

I would consider reviewing max bids and perhaps pulling them down a bit to keep adspend from getting out of control due to keywords appearing in a hot region in the #1 or #2 position when they have not been bid to that height in the past. If you were already bidding for #1 or #2, then the change is not going to be as drastic in the adspend area, but rather in the loss of traffic due to being positioned below your previous rankings. To deal with that problem......

4) Formulate an SEO Plan: Currently you do have some control over your ability to effect change in natural rankings. While SEO projects can be costly in the short run, their long-run costs are far lower than a PPC campaign. Make sure you work with or bring in someone who has SEO experience. My mom can claim to be an SEO specialist, even though she can't attach a picture to an email, because there is currently no formal educational training for SEO. Quite simply, the people who are the best at it are the people who have worked or do work for companies with successful SEO campaigns. Outsourcing SEO can run $125/hr for top SEO companies, so bringing in a talented SEO specialist for $60,000-$80,000 a year ends up being a bargain. A third option is to hire SEO specialists who work for other companies to do your SEO work as freelancers. Typically they can be hired for $40-$75/hour and know as much or more than some of the big companies, so they make an excellent alternative. The sooner you get a successful SEO project in place, the less reliance you will have on the PPC engines and you can actually decrease adspend. Typically the savings in adspend can pay for the SEO specialist and leave extra money that is now pure profit.

Since the change is fairly new, we don't want to do anything drastic by panicking. But it never hurts to take a few defensive measures to protect your business from a sudden change in adspend. As I mentioned above, your best weapon is information, so get on that networking and get that analytics program installed or upgraded. Err on the side of caution with your max bids and look into SEO initiatives to decrease dependence on PPC campaigns. Take a moment, analyze your data, and determine the maximum value of your keywords and you will be back flying the friendly skies, albeit a bit bumpy, of Google Adwords.

Tuesday, July 12, 2005

More on Google Adwords Algorithm Change

Judging by the mail I am getting, I will attempt to explain the Google Adwords algorithm change many of you may be noticing.

Some time between June 14th and June 27th I started to notice distinct changes in many Adwords accounts. Keyword buckets that had been inherently stable in their respective spend and CTR were now performing at entirely different levels in each category. As a whole, different Adwords accounts have seen either a very noticeable increase or decrease in CTR and ad spend.

The second thing I noticed was that ads were not appearing in the positions they were supposed to be in. Sometimes they were in higher spots, sometimes lower. I have seen ads drop from 3rd position to 31st position, and jump from 8th position to 1st position, without any change to bids or ad content.

Today I called Google and spoke to an account rep. I mentioned the strange statistics and rankings I was tracking and asked for clarification as to what they have done to the Adwords algorithm. After much wrangling I finally got the answer I was after.

The answer is that Google has instituted a local search parameter into Adwords. Different regions of the country will see Google Adwords ads in different positions. So while you may be bidding for 3rd position for the term "widget", its placement in each region will be based on its historical performance in that region.

So let's say you sell surfboards. Historically you have bid for 3rd position for the term "surfboard" and you have an average position of 3.4 (as we all know, rankings fluctuate depending on CTR and bid, so it is rare that your average position is exactly the position you try to attain). Now let's assume that historically you have an 8.0% click-through rate in the "California" region (Google hasn't released the boundaries or number of different regions), but only a 2% click-through rate in the "Iowa" region. Google's new Adwords algorithm may place your "surfboard" ad #1 in California region searches, but #10 in Iowa region searches.

This wouldn't be so bad, but since we do not know the regions, the only way to know how your Adwords ads are being placed is to gather sample search results from across many regions. This is going to be tremendously difficult for a small business that has no regional or national staff or offices.

Google believes that this new parameter will result in better search results for the end user. However, this comes at a steep price: controlling a Google Adwords account has been made exponentially more difficult. I fail to see why Google did this to the PPC engine. If this were applied to natural search rankings it wouldn't be nearly as bad. But Google has decided to bite the hand that feeds it, taking away control from the businesses that pay to use the Adwords service. Businesses are now at the mercy of where Google believes their products will sell best, instead of where the business believes they will sell best. If I was bidding for 3rd position and my sales didn't warrant the ROI, I have myself to blame. With this new change, most businesses will have Google to blame, because they will not be able to control the placement of their ads.

This also brings to light many potential hazards:
1) The sparking of a bidding war to try to reclaim ranks.
2) Increased spend on terms that were known to be marginal and bid to low positions now appearing in high positions.
3) The loss of revenue from previously high ranked, high revenue terms that have now been dropped out of major markets.

Google claims that they believe that this will result in higher CTR and Conversions for most businesses. Of course higher CTR rates means higher ad spends for everyone, and this benefits one business more than any other...

I knew I should have bought more of that Google stock.

Google Adwords Algorithmic Change

Google has made an algorithmic change to Adwords. No longer will you be guaranteed the position at which you bid for keywords. Instead, Google has instituted a local parameter in the Adwords formula. Basically, different regions see different sets of PPC listings. So a term you used to be #3 for everywhere could now be #3 in NY, #17 in TX, and #1 in PA. They base these new positions on how your keywords have performed historically in each region. So in regions where you had a high CTR on "bandana" you would still be #3 or higher, but in places where you have done poorly historically, your #3 bid could be dropped to 30th or lower. There is also no way to see each region's listings unless you are in that region. Hence Adwords has removed a lot of control from businesses, and keyword bidding is going to be exponentially more difficult and confusing. This is yet another reason for an analytics package, so that you can see which regions leads and revenues are coming from and untangle the new, messy Adwords web. I estimate that this change took place in the 3rd week of June, so you may notice keywords that you are spending a lot more or less on over the past two weeks even though you didn't change a bid or the content of a Google ad.

I confirmed all this with a Google account rep today on the phone once I noticed something seemed amiss. Over the next few days I am going to try to work up a new Adwords strategy that works with their new algorithm. I just thought I would give you a heads up as you would know better than I at this point if you saw any unusual changes in cost or ROI in Adwords keywords the past 2 weeks.

Friday, July 08, 2005

8 Simple Rules for SEO Copywriting

We all know that the lion's share of web traffic comes through the search engines. We also know that keywords and links to your site are the two things that affect your ranking in the search engines. Your keywords tell the search engines what you do, and the inbound links tell them how important you are. This combination is what determines your relevance. And relevance is what the search engines are after.


There's a lot of information around about how to incorporate keyword phrases into your HTML meta tags. But that's only half the battle. You need to think of these tags as street-signs. That's how the search engines view them. They look at your tags and then at your copy. If the keywords you use in your tags aren't used in your copy, your site won't be indexed for those keywords.

But the search engines don't stop there. They also consider how often the keyword phrase is used on the page.
To put it simply, if you don't pepper your site with your primary keywords, you won't appear in the search results when a potential customer searches for those keywords.
But how do you write keyword-rich copy without compromising readability?
Readability is all-important to visitors. And after all, it's the visitors that buy your product or service, not search engines.


By following these 8 simple guidelines, you'll be able to overhaul the copy on your website ensuring it's agreeable to both search engines and visitors.


1) Categorize Your Pages
Before writing, think about the structure of your site. If you haven't built your site yet, try to create your pages around key offerings or benefits. For example, divide your Second Hand Computers site into separate pages for Macs, and PCs, and then segment again into Notebooks, Desktops, etc. This way, you'll be able to incorporate very specific keyword phrases into your copy, thereby capturing a very targeted market. If you're working on an existing site, print out each page and label it with its key point, offering, or benefit.

2) Find Out What Keywords Your Customers Are Searching For
Go to WordTracker.com and subscribe for a day (this will only cost you about AUD$10). Type in the key points, offerings, and benefits you identified for each page, and spend some time analyzing what words customers use when they're searching for these things. These are the words you'll want to use to describe your product or service. (Make sure you read WordTracker's explanation of their results.)


3) Use Phrases, Not Single Words
Although this advice isn't specific to web copy, it's so important that it's worth repeating here. Why? Well firstly, there's too much competition for single keywords. If you're in computer sales, don't choose "computers" as your primary keyword. Go to Google and search for "computers" and you'll see why... Secondly, research shows that customers are becoming more search-savvy - they're searching for more and more specific strings. They're learning that by being more specific, they find what they're looking for much faster. Ask yourself what's unique about your business. Perhaps you sell cheap second hand computers? Then why not use "cheap second hand computers" as your primary keyword phrase? This way, you'll not only stand a chance in the rankings, you'll also display in much more targeted searches. In other words, a higher percentage of your site's visitors will be people after cheap second hand computers. (WordTracker's results will help you choose the most appropriate phrases.)


4) Pick the Important Keyword Phrases
Don't include every keyword phrase on every page. Focus on one or two keyword phrases on each page. For your Macs page, focus on "cheap second hand macs". For the PCs page, focus on "cheap second hand pcs", etc.

5) Be Specific
Don't just say "our computers". Wherever you would normally say "our computers", ask yourself if you can get away with saying "our cheap second hand Macs" or "our cheap second hand PCs". If this doesn't affect \ your readability too badly, it's worth doing. It's a fine balance though. Remember, your site reflects the quality of your service. If your site is hard to read, people will infer a lot about your service...

6) Use Keyword Phrases In Links
Although you shouldn't focus on every keyword phrase on every page, it's a good idea to link your pages together with text links. This way, when the search engines look at your site, they'll see that the pages are related. Once again, the more text links the better, especially if the link text is a keyword phrase. So on your "Cheap Second Hand Macs" page, include a text link at the bottom to "Cheap Second Hand PCs". If you can manage it without affecting readability, also include one within the copy of the page. For example, "As well as providing cheap second hand Macs, we sell high quality cheap second hand PCs". TIP: If you don't want your links to be underlined and blue, include the following in your CSS file:
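The exact rule will depend on your site's design, but a minimal sketch looks something like this (adjust the color to match your body text):

/* Minimal sketch: make text links look like ordinary body text.
   Adjust the color to suit your site's design. */
a:link, a:visited {
    color: #000000;
    text-decoration: none;
}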



Then format the HTML of each link so that the keyword phrase itself is the anchor text. For example: "As well as providing cheap second hand Macs, we sell high quality cheap second hand PCs", where "cheap second hand PCs" is the linked text.
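In the HTML itself, that in-copy link would look something like the following - the file name is just a placeholder for your actual PCs page:

<!-- The href below is a placeholder; point it at your real page. -->
As well as providing cheap second hand Macs, we sell high quality
<a href="cheap-second-hand-pcs.html">cheap second hand PCs</a>.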


7) Use Keyword Phrases In Headings
Just as customers rely on headings to scan your site, so do search engines. This means headings play a big part in how the search engines will categorize your site. Try to include your primary keyword phrases in your headings. In fact, think about inserting extra headings just for this purpose. Generally this will also help the readability of the site because it will help customers scan read.


8) Test Keyword Phrase Density
Once you've made a first pass at the copy, run it through a density checker to get some metrics. Visit GoRank.com and type in the domain and keyword phrase you want to analyze. It'll give you a percentage for all the important parts of your page, including copy, title, meta keywords, meta description, etc. The higher the density the better. Generally speaking, a density measurement of at least 3-5% is what you're looking for. Any less, and you'll probably need to take another pass.
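If you want to sanity-check a draft before it ever goes live, the raw body-copy density is easy to compute yourself. A rough sketch that counts simple phrase occurrences in plain text, which is essentially what the free checkers do for the copy portion of the page:

import re

def phrase_density(copy, phrase):
    """Percentage of the copy's words accounted for by the keyword phrase."""
    words = re.findall(r"[a-z0-9']+", copy.lower())
    phrase_words = re.findall(r"[a-z0-9']+", phrase.lower())
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words) if words else 0.0

# Toy two-sentence copy, so the density comes out artificially high.
copy = ("We sell cheap second hand computers. Our cheap second hand "
        "computers are tested and guaranteed before they leave the shop.")
print("%.1f%% density" % phrase_density(copy, "cheap second hand computers"))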


Follow these guidelines, and you'll be well on your way to effective SEO copy.
Just remember, don't overdo it. It's not easy to find the balance between copy written for search engines and copy written for customers. In many cases, this balance will be too difficult to achieve without professional help. Don't worry, though. If you've already performed your keyword analysis, a professional website copywriter should be able to work your primary keyword phrases into your copy at no extra charge.

Article by Glenn Murray for SiteProNews ©Copyright 2004

Thursday, July 07, 2005

SEO & PPC Management - Keyword Mining Part III

Finally: Making sense of the numbers (here comes the shock).
Ok, now that you understand the artificial skew and the alternatives that can correct for it, let's move on to analyze the numbers given by Overture's STST and Wordtracker's keyword selection service (KSS) using the search term(s) keyword(s).

An in depth look at Overture's STST numbers...
Overture's STST shows 180,468 searches were conducted. This represents the combined count of the search terms keyword, keywords, Keywords, KEYWORD and KEYWORDS - the combined total of all singular, plural, capitalized, upper and lower-case searches.
When we divide Overture's count (180,468) by 30 (because Overture's figures are for a 30-day period), the inference is that there are 6,016 searches per day that meet this criterion. In actuality, they receive just 40-60 per day total (are we shocked yet?).

Here's how we're crunching the numbers.
Fact: Overture's STST suggests a combined average of 6,016 page views took place between Overture and its major partners - e.g. AltaVista, Yahoo, and others - each day for the month of December '03. We're referring to search result pages like this one at Yahoo.

Fact: Each of these results pages lists between 10 and 40 URLs with descriptions.

Factor in Zipf's Law which predicts that traffic for any particular keyword on a search engine will be proportional to its popularity rank.

Factor in how the title and description affect a user's propensity to click on a Web site.

Factor in the Penn State University's findings that 55% of users check out one search result only, and 80% stop after looking at three results.

Factor in known elements leading to an estimated, but educated, conclusion as such...

Since it's a fact that Wordtracker's Web site appears in the top ten of Overture's results throughout their partner realm, they should be getting a guesstimated 10% of the overall click-throughs from all major engines, pay-per-clicks, and directories.

That would equate to about 602 visitors per day.

However, Wordtracker is currently ranked 1-10 on only about 25% of the major engines, directories and pay-per-click portals for the search term, keyword(s)... Calculate the estimate...
...therefore, the Wordtracker site should expect roughly 25% of this predicted click-through traffic, which is 150 visitors per day.

Compare calculated estimate to known facts...
In fact, Wordtracker receives 10 - 15 visitors per day for the search term(s) keyword(s). In fact, Overture's STST overestimates this search query by a factor of 10.

Furthermore, since Wordtracker is estimating they receive approximately 25% of the total traffic then that would put the total traffic generated at 40 to 60 per day (25% of 40 to 60 = 10 to 15 visitors a day).

In fact, Overture's STST overestimates the total search query count by a factor of 100 ...based on 6,016 being more than 100 times greater than the 40 to 60 figure suggested by Wordtracker's actual visitors.

Experience shock and awe at the difference between the numbers!
Wordtracker's service provides very different numbers...

Using the same search term(s) keyword(s), we pulled a representative result from the Wordtracker database (on January 13, 2004) that predicts searches per day conducted throughout the major engines, directories and pay-per-clicks on the Internet.
The results were...

keyword - 93 searches (lower case, singular)
Keyword - 39 searches (Capitalized, singular)
keywords - 187 searches (lower case, plural)
Keywords - 184 searches (Capitalized, plural)
KEYWORD - 115 searches (UPPER case, singular)
Total Predicted Daily Searches for all Engines = 618

This figure - 618 - Wordtracker compiled directly from results taken from the meta-engines Metacrawler and Dogpile in order to eliminate the artificial skew.

Wordtracker further adjusted the number downward by filtering out keyword sp@m (as defined above) based upon a proprietary formula used to identify search terms that are being searched at intervals too regular to have been conducted by actual humans.
These suspiciously regular and assumed to be artificially generated searches are therefore discounted in arriving at the final number - 618.

Even when taking into account such dependent variables such as position, title, and description, we would expect (logically guesstimate) the website to receive about 10% of the total traffic due to top-ten placement, targeted title and relevant link-description.

And finally, we should expect no better than 25% of that total traffic, due to the fact that Wordtracker has top-ten placement in only 25% of the relevant engines.

So the calculations show...
618 x 10% = 61.8; 61.8 x 25% = approx 15 visits per day.
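Both estimates boil down to the same two multipliers applied to two very different starting counts. A sketch of that arithmetic - the 10% click-through share and 25% engine coverage are the article's own working assumptions:

def expected_visits(daily_searches, ctr_share=0.10, engine_coverage=0.25):
    """Predicted daily visits given a daily search count and the two multipliers."""
    return daily_searches * ctr_share * engine_coverage

overture_daily = 180468 / 30.0                   # STST's 30-day count -> ~6,016/day
print("%.0f" % expected_visits(overture_daily))  # ~150 predicted visits per day
print("%.0f" % expected_visits(618))             # ~15, close to the actual 10-15/day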

This is more in line with Wordtracker's actual 10-15 per day average number of visits generated by the 5 variations of the search term keyword across all of the major engines.
So, whose numbers should we trust?

When it comes to trusting the numbers, you should take into account what you are using them for. If you're looking to determine relative popularity of a given item, service, topic, or category, then Overture's STST can fill the bill nicely - and for free!

For instance, Overture's STST returns the following numbers for the following searches...
58,312 - home insurance
57,315 - home owner insurance
233,854 - auto insurance
570,337 - car insurance
This tells us (for free) that car insurance gets more than twice as many searches as auto insurance. It also tells us that home insurance gets about the same number of searches as home owner insurance ...and that car insurance is TEN times more popular than home owner insurance.

No doubt about it, when researching what to sell online, this is valuable preliminary information that Overture's STST provides for free.

However, based upon what we now know about artificial skew, we'd want to get a third-party-review of the search terms - one that adjusted the numbers for skew - before we bought advertising on a pay-per-click engine or spent good time and money optimizing a site for organic (think Google) Web search results.

After all, if Overture shows 6,016 "hits" per day out of which Wordtracker is experiencing 15 visitors, then reality suggests we should do the math (i.e., apply the information) that distills the raw numbers into useful data. Let's first decide if "15" visitors per day will pay the advertising bill (duh!) ...and, if the reality count is anywhere near 6,016, we'll be ecstatic, right?

Always remember it's the amateurs that believe optimistically romanced numbers just before they lose their wallets on the way to bankruptcy. Professional marketers learn to err on the downside of expectations and then smile when the pleasant surprises shower down riches.
They know that nothing beats accurate information - the most powerful marketing tool on earth.

Article by Robin Nobles for SiteProNews

SEO & PPC Management - Keyword Mining Part II

Reason #2 - Duplicate Searches
As you most certainly must know, Overture's strength as a viable advertising medium for online businesses lies in the fact they provide results to "tens of thousands of Web sites" which include AltaVista, Yahoo, MSN Search, HotBot, and AllTheWeb just to name a few. They claim to reach more than 80% of active U.S. Internet users.

Potentially, this is great for advertisers! ...yet this very same structure is what so greatly contributes to the artificial skew leading to extremely over-inflated reporting of keyword queries.

According to Overture itself, statistics on searches in any previous month are compiled from Overture's partner search engines. To further understand how partnering tends to facilitate skewed query counts, let's examine what happens when a visitor conducts a search at AltaVista.
What's actually happening is that two searches are being conducted at one time - one at AltaVista, and another that lists the SPONSORED MATCHES supplied by Overture's pay-per-click engine.

Although it is next to impossible to know the exact figures, suffice it to say that a single human often generates multiple queries when doing a single search as calculated by Overture's STST. In some cases that same human could even generate additional "hits" for a given keyword simply by conducting the same search again on a different engine if such engine is also an Overture partner.

For instance, searching Yahoo, then searching again on MSN, then searching again on AltaVista, then again on AllTheWeb.com would tally at least five "hits" for the selected search term. In comparison, if Overture (like Google, for instance) counted only the searches that were done "on-site," such duplicate searches would not be counted and their search query numbers would be far more accurate.

This scenario, combined with the myriad artificial duplicate searches conducted by the various software tools (explained above), severely pumps up the number of queries for virtually every legitimate search term imaginable.

Reason #3 - Plurals and Singulars
Remember our STST example (above) regarding the 180,468 "searches" for the term "keyword"? Well, another factor to consider is that Overture's STST combines both the plural term (keywords) and the singular (keyword) in compiling that number.

And, Overture's STST not only combines the plural and singular versions of "keywords," they also combine upper and lower case searches as well. Obviously, these two factors also exert an upward effect on the query count tabulations.

Third: Examining The Alternatives.
So now the obvious question - Is there a "better" way to tabulate search term query counts? ...let's examine the alternatives.

Meta-engines - a better way to accurately tabulate queries.
Obviously we'd like to eliminate artificial and duplicate searches from our tabulations, and fortunately there is a way to do so. The solution is Meta-engines.

Composite (Meta) engines, like Metacrawler and Dogpile, are search engines that query all the major engines simultaneously. One of the key differences is that the ratio of human queries to automated queries for a meta-engine is much higher than for a major search engine. That's because it doesn't make sense for anyone to point their auto-bots at meta-engines.
Position monitoring, bid-optimizing, popularity checks, etc., are typically conducted directly at the search engines themselves. It would be pointless to conduct such automated queries on a meta-engine because meta-engines do not "add-url's" nor do they offer pay-per-click options. They are simply a search engine that queries other search engines. And, since there is no "metacrawler" of meta-engines, the search query counts are unlikely to be artificially skewed by such artificial searches. Furthermore, duplicate searches are eliminated because the query counts are being tabulated from a single source instead of combining results from myriad partners. Therefore, query counts taken from meta-engines are far, far more representative of the number of searches conducted by actual people - but even this is not yet a perfect solution due to a relatively obscure form of keyword spam.

Keyword spam (in this case not to be confused with word stuffing or repeating keywords within a Web page) refers to the practice of using cgi-scripting to manipulate the Metaspy metacrawler voyeur to artificially promote certain products or services.
By entering a flow of terms or phrases at predetermined intervals, such sp@mmers hope to inflate the importance and significance of certain search terms thereby artificially increasing the value of such terms related to their products.

In a perfect world, adjustments should be made to filter out this flavor of sp@m. In a minute we'll share with you how such filtering is done but first, let's address the issue of combining plurals with singulars and upper with lower-case searches.
Plural, singular, upper, and lower-case searches represent a decision-point for search engine optimizers because sometimes it's good to combine the search query numbers while other times it isn't.

For instance the search terms "keyword and keywords," whether singular, plural, or in upper or lower-case, are similar enough in meaning that they could arguably be combined into one search query number.
However, the search terms "tap, taps, Tap, and TAP" can have entirely different meanings. Take a look at the results for the search term "tap" on Overture. The following references were all found within the top ten sponsored listings:
Machine threading taps
Tap / Rap support software
Beer taps
Tap Dancing
TAP A Stock
TAP Terminal Phone Numbers
Note that none of the above has any relation to the others! Obviously if we are selling any of these items, we'd want more specificity regarding the search queries than the simple 10,485 searches that STST reports were conducted in the past 30 days.

The example above illustrates the importance of obtaining search query tabulations for each version of a selected keyword independently of the other.
After all, it's easy to manually combine the numbers while it's impossible to break them out into their own categories once they've been compressed by Overture's STST into a single search term regardless of potentially different meanings.

Article by Robin Nobles for SiteProNews

SEO & PPC Management - Keyword Mining

The root of all success in search engine marketing begins with keywords. Period. Get them wrong and virtually everything about your online endeavor will fail. Only by targeting the right keywords can one expect to ride that exhilarating magic carpet to online prosperity.
Stating the obvious you say? ...well, if so, then why is it that virtually everyone - professional and amateur alike - is oblivious to the fact they are selecting, and frequently buying, keywords based on highly skewed numbers?

The fact is that very few online marketers understand the results supplied by the two most basic keyword selection tools. These are the very same tools being used globally to hone keyword choices into supposedly laser sharp focus in an effort to keep pace with the challenges of increasingly keen competition and ever-rising keyword pay-per-click costs.

The Critical Differences — Overture's STST vs. Wordtracker's KSS

As one of Wordtracker's technical support team, one of the most frequent questions we receive these days is...
Why are the keyword search query numbers supplied by Overture's search term suggestion tool (STST) so incredibly different than those supplied by Wordtracker's keyword selection service (KSS)?

Frankly, there isn't a better search engine related question one could ask. And, now's a good time to pay close attention because the surprising answer will likely change forever how you evaluate keywords!

First: Understanding Their Motives.
To help you understand the details we're about to reveal, let's examine the motives of the services that are providing the keyword query numbers.

Motive Analysis: Purpose
On the one hand, there's Overture's STST whose purpose is to help customers buy keywords.
On the other hand, there's Wordtracker whose purpose is to help customers select keywords.

Proposal:
Overture's STST suggests what keywords to buy from them.
Wordtracker suggests what keywords to use in your optimization efforts and/or which to buy elsewhere.

Success:
Overture's success depends on you believing there are Lots of search queries for whatever you are selling.
Wordtracker's success depends on you getting accurate numbers upon which you can reliably base your optimization and keyword purchase decisions.

Profits:
Overture's STST is free. Overture profits by selling you the keywords that STST reports on.
Wordtracker's KSS is fee based. They profit by selling you access to accurate and impartial information. Since they don't sell the keywords, there's no vested interest in query numbers beyond accuracy.

It's important to note there is no good-guy, bad-guy here - just two companies that provide information and do so with different incentives in mind.

Second: Understanding The Artificial Skew.

In researching the search term "keyword," Overture's STST indicates there were 180,468 searches for the 30-day period ending the last day of December '03. Of course, when we divide this number by 30 (days), one naturally assumes that's an average of 6,016 combined searches per day for the term "keyword" - (180,468/30=6,016).

Now, if you happen to be in a business that sells keywords (like Overture), then 6,016 pairs of eyeballs per day is a pretty encouraging number indeed! The problem is, there aren't anywhere close to 6,016 per-day queries for the search term(s) "keyword(s)". In fact, the actual number, which we'll share with you in a minute, will no doubt shock you!
But, for the moment, let's look at why that number is skewed.

Reason #1 — Artificial Searches

Overture's STST numbers are increased upward by automated queries. These include automated bid optimizers, position and ranking monitors, page popularity analyzers - anything other than a real person manually performing a search is considered an automated query. Monitoring a site's positioning at, say, AltaVista for the search term "keyword" tallies a "hit" within Overture's STST system for that search term. That's in spite of the fact that it was actually automated software that generated the hit. The same holds true for page-popularity checkers, pay-per-click bid optimizers or any other machine generated monitor or tabulator that queries an engine for a "pet" keyword and generates a hit in the process.
Then, when the same positioning query is done at, say, MSN (another Overture partner), STST records yet another hit. Understandably, STST cannot differentiate between automated and human queries. Neither can they tell when the auto-query has already been queried at another partner's site.

Now, when we take into consideration all of the position monitoring, page popularity checking and pay-per-click bid analyzing - there are well over 15 automated and semi-automated bid checking software programs alone - it's staggering to realize the significant effect these automated queries are having on the overall search term query tabulations.
However, artificial searches are only one aspect contributing to the artificial skew (defined as: the inflation of actual search queries for specific keywords performed by anything other than humans).

Article by Robin Nobles for SiteProNews

Wednesday, July 06, 2005

Google Page Rank - One Man's Theory

Google's PageRank (PR) is one of the most sought after, and yet misunderstood web page attributes. PageRank, named after one of the founders of the Google search engine, Larry Page, was the innovative foundation that the Google search engine was built on.


The theory was that a link from one web page to a web page of another site was in essence a vote for that page. The reasoning was that webmasters would only link to pages that they thought were interesting and of value to their viewers. Google used the number of inbound links (IBL) to a page to judge the importance and relevance of that page, and based on this calculation, and other factors, decided where to place that page on the search results page (SERP).


They devised a scale of measurement for PageRank from 0 to 10. Then, for the information of webmasters and interested people, they produced a toolbar that can be installed in Internet Explorer that indicates the PageRank value of any page being viewed in the browser. These values have become known as PR0 to PR10.


Since PR values are a result of IBLs, Google decided to give them their own name and refers to inbound links as backlinks. As part of the toolbar there is a quick lookup of the number of backlinks that Google reports for the page that is currently being viewed in the browser. This search can also be done without the aid of the toolbar by simply typing "link:http://www.yourURL.com" into the Google search box.


The one trick to this link search is that Google does not display all backlinks. At one time it was thought that they only listed pages with a value of PR4 or greater. Today however, you will find backlinks reported from pages of lower PR values. So, at best, Google's backlink search seems to present some sample of pages linking to the site. Suffice it to say that this search is not a reliable measure of all IBLs to a page.


How is PageRank Calculated?
In simplest terms, PR is calculated by the sharing of PR from all the IBLs to your page. This is not strictly accurate, because Google also uses the internal links within a site in the calculation of PR. Each link to a page carries with it and passes PR value to the target page. The PR points or value passed depend on the PR value of the page the link comes from and the total number of outbound links on that page. It is generally agreed that a page will only pass about 85% of its value to the pages it links to. So a PR5 page with a single outbound link will pass 85% of the value of a PR5 page to the page it links to.


But virtually no page has only a single link -- remember internal links are also used in the total outbound link count -- so the value passed to any page is 85% of the PR, divided by the total number of outbound links.


The question now becomes what is the PR point value of the different PR levels. Most observers believe that the relationship between PR levels is logarithmic rather than linear. In other words PR5 is not worth 25% more than a PR4, but may be worth 4 to 6 times more.
It is also understood that a PR value is not a single number, but is in fact a range of values. So not all PR6s are equal. As the chart below shows a PR6 may be just on the upper boundary of a PR5 or it may be just short of the entry point for a PR7.


The chart that follows shows the range for each PR value. It also shows how much PR value a page with 50 outbound links will pass, depending on its own PR rank. From this I have calculated the number of links required from each value of PageRank for a page to attain a desired PageRank.


Here is a PageRank Calculation Chart: (http://www.sitepronews.com/pagerank.html).


The Assumptions And The Mathematics:
For those who are interested, I have used logarithmic values of base 5.5. In other words, the value range for a PR1 lies between 5.5 to the power of 1 and 5.5 to the power of 1.99, PR2 lies between 5.5 to the power of 2 and 5.5 to the power of 2.99, etc. The rest of the chart is fairly straightforward. It assumes that there are 50 links per page and that 85% of the PR value is passed to the recipient page.


The number of links required to attain any ranking is based on the median value of the donor page and the entry threshold of the desired PR value. In other words to achieve a PR5 you need 5,033 points and the average points available from a PR6 page with 50 links is 1507.
The chart was calculated with an Excel spreadsheet and it can be downloaded if anyone wants to play with the calculations and assumptions. It might be interesting to work with a different base number for the logarithmic calculation. And it is also interesting to see the impact of more or less outbound links from a page.
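For anyone who would rather not open the spreadsheet, the same assumptions reproduce in a few lines. A sketch using the stated assumptions (base 5.5 point ranges, 85% of value passed, 50 outbound links on the donor page); the function names are mine, not anything official from Google:

import math

BASE = 5.5      # assumed logarithmic base for the PR point ranges
DAMPING = 0.85  # share of a page's value passed on through its links

def pr_range(pr):
    """Point range covered by a toolbar PR value: BASE^pr to BASE^(pr + 0.99)."""
    return BASE ** pr, BASE ** (pr + 0.99)

def points_passed(donor_pr, outbound_links=50):
    """Average points passed per link, using the median of the donor's range."""
    low, high = pr_range(donor_pr)
    return DAMPING * ((low + high) / 2.0) / outbound_links

def links_needed(target_pr, donor_pr, outbound_links=50):
    """Links from donor_pr pages needed to reach target_pr's entry threshold."""
    return int(math.ceil((BASE ** target_pr) / points_passed(donor_pr, outbound_links)))

print("%.0f" % (BASE ** 5))       # ~5,033 points to enter PR5
print("%.0f" % points_passed(6))  # ~1,507 points per link from a PR6 page
print(links_needed(5, 6))         # 4 such links to cross the PR5 threshold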


Are These Findings Valid?
Nobody knows for sure how Google calculates PR. I have shown this chart to a number of knowledgeable people and they have all agreed that my calculations look reasonable. One SEO guru from a major firm said the results were very similar to independent research that his firm had conducted.


So take it or leave it. It is probably a fair reflection of how PR is passed and accumulated.
The lesson that can be drawn from this explanation and chart is that if you want to increase your PR you need a few links from pages with equivalent or higher PR, or a great many links from sites with lower PR.

Article by Bob Wakfer for Sitepronews

Friday, July 01, 2005

Search Engine Optimization - Dead at age 7?

The question of the day causing all this ruckus, "Is SEO Dead"?
Well, I'm here to dispel any misconceptions or confusion that currently exist surrounding this question. Have you ever heard the words "content is king"? If you hear them enough times, content becomes your primary focus when you optimize your website, and then you ask yourself, "why don't I rank well on the search engines?"

The partial reality is that web surfers use words to find related websites, but that alone does not make content king if you are trying to drive targeted visitors to your site. On the other side of the coin, you can't earn quality links without quality content.

An SEO "blogger" has gone as far in saying "Organic SEO is dead", he's of the opinion that Content SEO is dead, and that anyone interested in raising their search engine rankings should focus only on link generation.

So which is it? What happens off the page is more important than it has ever been, and that trend will only continue; this is common knowledge these days. An SEO can no longer pass themselves off as useful if they do not go out and secure a respectable number of links for their clients.

Here is a brief summary of key points to keep in mind when optimizing your website:

Carry on building links forever.
Old links decay in value
Don't add them too fast, or risk the spam filter
Don't add them too slowly or you won't get enough
Get links from fresh pages
Ask linking sites to move the link to a different page to make it "fresher"
Vary anchor text over time
Don't change the content of your key pages so much that it no longer reflects the incoming anchor text
Register your domain for several years
Use a solid hosting company with guaranteed uptime
Add new pages/content to your site all the time
Put Adsense on your site and make sure it gets good click through rate
Past rankings matter: Google counts your old rank in your current rank
Don't jump up and down in the serps too much. Google likes stable rankings
Make your site sticky. Sticky sites are favored
Aggressively pursuing a strong linking campaign gives you a fighting chance for competitive keywords. Google emphasizes links, treating them as an indicator of whether your site deserves to rank highly, and it uses the link text to determine what the linked-to page is about.

Simply put, links imply that your content is worth linking to and worth ranking highly. More importantly, your links should contain the keywords you are optimizing your website for, and lastly, links must be relevant to the nature of your online business.

Thursday, June 30, 2005

SEO & the Google Search Engine Algorithm Patent

Google has recently filed a patent that details many of the factors it uses to rank web pages. The patent, titled "Information retrieval based on historical data", confirms the existence of the Google sandbox and indicates that it can apply to all web pages.

In this article, we're trying to find out what this means to your web site and what you have to do to optimize your web pages so that you get high rankings on Google.

Part 1: How your web page changes influence your rankings on Google

The patent specification revealed a lot of information about possible ways Google might use your web page changes to determine the ranking of your site.

In addition to web page content, the ranking of web pages is influenced by the frequency of page or site updates. Google measures content changes to determine how fresh or how stale a web page is. Google tries to distinguish between real and superfluous content changes.

This doesn't mean that it is always advisable to regularly change the content of your web pages. Google says that stale results might be desirable for information that doesn't need updating while fresh content is good for results that require it.

For example, seasonal results might go up and down in the result pages based on the time of the year.

Google possibly records the following web page changes:

the frequency of changes
the amount of changes (substantial or shallow changes)
the change in keyword density (see the short sketch below for how density is typically counted)
the number of new web pages that link to a web page
the changes in anchor texts (the text that is used to link to a web page)
the number of links to low trust web sites (for example too many affiliate links on one web page)
Google might use the results of this analysis to specify the ranking of a web page in addition to its content.
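Since keyword density comes up here, and again in the survey results further down the page, here is a minimal sketch of one common way it is counted: occurrences of the keyword phrase divided by total words on the page. Tools differ in how they treat multi-word phrases, so the percentages are only approximate, and the sample text below is invented.

import re

# One common definition of keyword density: occurrences of the keyword phrase,
# counted word by word, divided by the total number of words on the page.
# Different tools count this differently, so treat the result as an approximation.
def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase = keyword.lower().split()
    n = len(phrase)
    if not words or n == 0:
        return 0.0
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits * n / len(words)

sample = "Cheap DVDs and cheap DVD players shipped free. Order cheap DVDs today."
print(round(keyword_density(sample, "cheap dvds"), 1))   # ~33.3 for this sample text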

Section 0128 in the patent filing reveals that you shouldn't change the focus of too many documents at once:

"A significant change over time in the set of topics associated with a document may indicate that the document has changed owners and previous document indicators, such as score, anchor text, etc., are no longer reliable.

Similarly, a spike in the number of topics could indicate spam. For example, if a particular document is associated with a set of one or more topics over what may be considered a 'stable' period of time and then a (sudden) spike occurs in the number of topics associated with the document, this may be an indication that the document has been taken over as a 'doorway' document.

Another indication may include the disappearance of the original topics associated with the document. If one or more of these situations are detected, then [Google] may reduce the relative score of such documents and/or the links, anchor text, or other data associated the document."

This means that the Google sandbox phenomenon may apply to your web site if you change your web pages.

What does this mean to your web site?

First of all, you should make sure that your web page content is optimized for Google. If your web page content is not optimized, all other ranking factors won't help you much.

Try to find out if the keywords you target on search engines require static or fresh search results and update your web site content accordingly. Make sure that you don't change too much at once so that your web site won't be put in the sandbox.

Google Sandbox & Search Engine Optimization

In a recent thread in an online forum, webmasters discussed the question of how long it takes to get listed in Google. A webmaster had submitted a web site with 15 individual pages to Google six months earlier, and he was still not listed in Google although Googlebot visited his web site on a monthly basis.

It's normal Google behavior that a new web site is not listed in the natural (unpaid) search results for about six months. This Google practice is called the Google sandbox.

However, Google did not return any pages of this web site in its search results. Not even for obscure search terms or the company name.

Why was the website of this webmaster not listed in Google?

There were two main factors that prevented the site from showing up in Google's search results:

1. The age of the web site

Once a web site has been put into Google's sandbox, it takes six to eight months until it comes back to the normal index. It's likely that the web site of the webmaster is still in the sandbox.

2. The number and the quality of links to the web site

The webmaster admitted in the discussion that he only had a few links to his website and that these links were not of high quality. In fact, a link popularity query on Google returned no links at all for that web site.

Yahoo showed only one backlink to the web site and the web site with the backlink was not accessible.

Google will only list a web site in its result pages if other good web sites link to it. If only a few other web sites link to your site and these web sites are of low quality, it will be difficult to make it into Google's search results.

You can find out the link popularity of your web site with this freeware link popularity check tool.

What does this mean to you and your web site?

There are three things you can do to get into Google's index:

1. Make sure that your web site is not in Google's sandbox.

As soon as you have finished your web site, submit it to Google. It takes about six to eight months to get out of Google's sandbox, so the sooner you submit your site, the sooner you'll appear in Google's normal search results.

2. Get high quality links from related web sites and Internet directories.

It's important that other web sites with similar content link to your site if you want to be listed in Google. A senior member in the discussion said it this way:

"Don't buy links... there's a good chance that they'll turn out to be dodgy in the long-term, and will do you more harm than good.

[...] search for sites that are similar or complimentary to your own, and send them a polite email asking if it would be possible to exchange links. Don't bother doing this unless you think your site is worth linking to.

DMOZ and Yahoo Directory listings are tremendously valuable, even though some might tell you otherwise, (usually because they couldn't get into them)."

An easy way to exchange links that way is our link popularity tool ARELIS.

3. Optimize your web pages.

It is important that your web pages are optimized for Google if you want to get high search engine rankings. Google must be able to find out what your web pages are all about.

A combination of optimized web pages and high link popularity leads to high rankings on Google. Make sure that your web site has both and you'll benefit from high rankings on Google as soon as your web site is out of Google's sandbox.

Wednesday, June 29, 2005

Leave Cloaking to the Romulans

Cloaking shields may be great for spaceships, but can be extremely damaging to a website. Here is a short discussion about cloaking pages.

When you choose a web hosting company, you don't expect that they are going to change your web pages. You also don't expect that they change your web pages for their own benefit. Unfortunately, that's exactly what some web hosts seem to do.

What do these web hosts do to your web pages?

These web hosting companies change your web pages when a search engine spider requests them. For example, if Googlebot visits your web pages, the web host will return a different web page than the web page that is returned when a normal web surfer visits your web pages (a technique with the name cloaking).

The changed web pages contain links to the web host's web site so that the link popularity of the web host's site is improved. In addition, the web host creates new pages and complete subdirectories full of links on your web site.

Note that these changed URLs and the new pages cannot be seen by you. The URLs are not static and you cannot see them with your FTP client. Only search engine spiders can see the changed web pages because the web host intercepts the requests and dynamically creates the pages.

Some hosts also seem to add links to some of their clients on your web pages. They do this to artificially improve the link popularity of their clients' sites. If you see any "secret backdoor to Google" offers from your web host without further details, you should be skeptical. If a web host changes the web sites of other people, it is likely that they will also change your web site.

What does Google think about this?

Shortly after this issue was raised in an online discussion forum, a Google spokesman commented on it:

"What a scuzzy practice. [...] I've seen stuff like that before, but it's usually pretty rare. Legitimate hosts have a lot to lose from deceiving their hosting clients like that [...] Practices like that just go way beyond legitimate and into scamming."

How can you find out if your web host changes your web pages?

Go to Google, search for your domain name and click on the "Cached" link next to the result. You'll then see the web page that Google has indexed.

If the cached page contains links to other web pages that you don't recognize, chances are that your web host has changed your pages. In that case, you should contact your web host or Google so that the problem can be resolved.
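As a rough supplement to the cache check, a short script can request the same page twice, once with a normal browser User-Agent and once claiming to be Googlebot, and compare the two responses. This is only a sketch: the URL is a placeholder, and hosts that cloak by spider IP address rather than User-Agent will not be caught this way.

import urllib.request

URL = "http://www.example.com/"   # placeholder: replace with a page on your own site

def fetch(user_agent):
    # Request the page while presenting the given User-Agent string.
    req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as response:
        return response.read()

browser_copy = fetch("Mozilla/5.0 (Windows NT 10.0; rv:115.0) Gecko/20100101 Firefox/115.0")
spider_copy = fetch("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")

if browser_copy != spider_copy:
    print("Warning: the server returned different content to the spider User-Agent.")
else:
    print("No User-Agent-based differences detected (IP-based cloaking would not show up here).")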

Web hosting companies with ethical business standards don't use these techniques. If you find out that your web host changes your pages, you should consider a new hosting company.

If Google finds out that your web site uses cloaking, you will get into trouble, even if the cloaking has been done by your web host and not by you.

Search Engine Optimization Technique Survey

1) Which top level domains did you have the most success with?
.com 57 (82.61%)
.net 7 (10.14%)
.biz 2 (2.90%)
.org 8 (11.59%)
Other 12 (17.39%)

2) What total number of web pages per website have you had the most success with?
1 3 (4.35%)
1-5 12 (17.39%)
5-10 11 (15.94%)
10-20 13 (18.84%)
20-30 13 (18.84%)
30-50 4 (5.80%)
50-100 5 (7.25%)
100 or More 18 (26.09%)

3) What % of keyword density has been most effective for you in the Body text?
Less than .5% 2 (2.90%)
Between .5% to 1% 3 (4.35%)
Between 1% to 2% 4 (5.80%)
Between 2% to 3% 9 (13.04%)
Between 4% to 5% 16 (23.19%)
Between 5% to 6% 10 (14.49%)
Between 6% to 7% 6 (8.70%)
Between 7% to 8% 5 (7.25%)
Between 9% to 10% 6 (8.70%)
More than 10% 8 (11.59%)

6) How many words per page have you had the most success with?
Less than 10 7 (10.14%)
Between 10 to 100 16 (23.19%)
Between 100 to 200 6 (8.70%)
Between 200 to 300 12 (17.39%)
Between 300 to 400 14 (20.29%)
Between 400 to 500 8 (11.59%)
Between 500 to 1,000 5 (7.25%)
More than 1,000 1 (1.45%)

7) What % of keyword density has been most effective for you in the Title tag?
Less than 10% 13 (18.84%)
Between 10% to 20% 14 (20.29%)
Between 20% to 30% 12 (17.39%)
Between 30% to 40% 6 (8.70%)
Between 40% to 50% 4 (5.80%)
Between 50% to 60% 7 (10.14%)
Between 60% to 70% 4 (5.80%)
Between 70% to 80% 1 (1.45%)
Between 80% to 90% 3 (4.35%)
Between 90% to 100% 5 (7.25%)

8) Does a keyword rich domain name guarantee better ranking?
Yes 41 (59.42%)
No 28 (40.58%)

9) Do you place your keywords in your folder/directory names?
Yes 38 (55.07%)
No 31 (44.93%)

10) Do you name your web pages with your keywords?
Yes 53 (76.81%)
No 16 (23.19%)

11) Do you name your image files with your keywords?
Yes 46 (66.67%)
No 23 (33.33%)

12) Do you use hyphens or underscores when naming your directories and web pages?
Hyphens 36 (52.17%)
Underscores 32 (46.38%)

13) Have you created a search engine friendly site map?
Yes 47 (68.12%)
No 22 (31.88%)

14) Do you optimize text-based navigation of the site with an optimized inter-linking structure?
Yes 45 (65.22%)
No 24 (34.78%)

15) Do you control the web page content headings through optimized CSS (Cascading Style Sheets)?
Yes 44 (63.77%)
No 25 (36.23%)

16) Do you add keyword rich H1-H6 tags to your web pages?
Yes 48 (69.57%)
No 21 (30.43%)

17) Do you create an optimized robots.txt file to control indexing of your website by the search engines?
Yes 32 (46.38%)
No 37 (53.62%)

18) Do you add a visible keyword rich text to the anchor text with the Title tag attribute?
Yes 35 (50.72%)
No 34 (49.28%)

19) Do you add a visible keyword rich text to most of the images with the Alt tag attribute?
Yes 51 (73.91%)
No 18 (26.09%)

20) Do you use Bold, Strong, Italic and other font attributes to emphasize keywords?
Yes 49 (71.01%)
No 20 (28.99%)

21) Do you include and optimize the META keyword and description tags?
Yes 65 (94.20%)
No 4 (5.80%)

22) Do you place your domain name in the Title and META tags?
Yes 30 (43.48%)
No 39 (56.52%)

23) Do you engage in reciprocal linking and maintain a resource or link page?
Yes 44 (63.77%)
No 25 (36.23%)

24) Do you solicit or buy one way inbound links?
Yes 17 (24.64%)
No 52 (75.36%)

25) Is a link a link, or do you look for quality authority websites to get links from within the same business as yours?
Yes 44 (63.77%)
No 25 (36.23%)

26) Do you specify to your link partners which internal page(s) you want their links to point to, or do most of your inbound links point to your home page?
Yes 38 (55.07%)
No 31 (44.93%)

27) Do you submit your website(s) to DMOZ?
Yes 50 (72.46%)
No 19 (27.54%)

28) Do you submit your website(s) to the Yahoo! directory?
Yes 50 (72.46%)
No 19 (27.54%)

29) Do you agree creating a community on your website(s) with a forum or web blog will help your ranking?
Yes 54 (78.26%)
No 15 (21.74%)

30) Is high Page Rank as effective as it used to be to get high ranking?
Yes 33 (47.83%)
No 36 (52.17%)

Symbiotic Search Engines

A recent study by Nielsen/NetRatings revealed that only a minority of searchers use just one of the top three search engines: Google, Yahoo and MSN Search.

Which search engines do web surfers use?

According to the study, 58 percent of Google searchers also visited at least one of the other top two search engines, MSN Search and Yahoo, showing that even though Google's market share is dominant today, there is significant opportunity for its competitors to grow their share.

The use of multiple search engines is not limited to Google's searchers. Nearly 71 percent of those who searched at Yahoo also visited at least one of the other top two search engines, and 70 percent of those who searched at MSN also tried their luck at one or both of the other two.

"While it shouldn't surprise anyone that Google is the search engine to beat, it is critical that all of the major search players, including Google, recognize that they exclusively own only a minority of their users," said Ken Cassar, director, strategic analysis, Nielsen/NetRatings.

"This highlights an opportunity and a threat to all of the established players in the market, and underscores the importance of continued innovation in a highly competitive market that is anything but mature."

What does this mean to you and your web site?

We have mentioned it before in this newsletter: don't focus on a single search engine. To get best results, optimize your web site for all the top three search engines.

Optimize some of your web pages for Google, some for Yahoo and some for MSN Search. Both Yahoo and MSN Search have so many visitors that they can drive a considerable amount of traffic to your web site.

While everybody else concentrates on Google, it's much easier to get targeted visitors from the other two big search engines.

Google is an important search engine, but the others are quickly catching up. If you optimize your web pages for more than one search engine, you will get targeted traffic from multiple sources. In addition, you are less dependent on a single source, which is important if your rankings drop on one search engine.

Tuesday, June 28, 2005

PPC for Dummies Part Deux

Two of the most important factors of any Pay Per Click (PPC) campaign are creating successful ads and deciding how much to pay per click. There are many PPC options out there to choose from; I am going to focus on the two most popular: Google AdWords and Overture.

Creating Your Ads for AdWords
Creating your ad copy is the single most important part of any ad campaign. You want your ad to stand out amongst the others and scream 'click me!' If your ad looks the same, and says the same things, as everyone else's, searchers will simply pass it by.

Before creating your ads you need to determine your target market and keyword selections. If your company focuses on a specific market niche, try to target your ads in regards to that niche. Properly targeted ads will almost always out-perform those directed at a general audience.

The ad you create should include your main keywords either in the title or near the beginning of the body text. Draw attention by using call-to-action phrases and words that provoke enthusiasm and response. Use phrases like "Save on DVDs," "Get cheap stereos," or "Join now for a 20% discount." Just be cautious and be sure to follow Google's guidelines. If you advertise something that you don't offer, Google will pull your ad. Also, if your ad offers something free, make certain that it's listed on your landing page!

Once you are satisfied with your first ad, create 3 more ads that are radically different from the first. After 3 or 4 days, take a look at how your ads are doing. (If you are using less frequently searched terms you may have to wait 1-2 weeks for better results.) Check the click through rate (CTR) of each ad. In most cases, one of your 4 ads will be out-performing the rest. If this is the case, delete the poorly performing ads and create 3 new ads that are similar to the successful one, each with subtle differences in the title and body text.

Again, wait 3 or 4 days to see which ad is performing best. If you notice that one ad stands out, repeat the process. Eventually you will end up with 4 quality ads that are performing equally. Once the ads have leveled out, continue to keep an eye on them. I recommend that you do so daily. If one begins to slip, tweak the wording. Monitoring your ads is essential, if you want them to perform well.

Determining Your Max Cost Per Click with AdWords
With AdWords, when you enter your MAX CPC, Google will show the estimated average position for each keyword. (The position predictions provided by Google are based on historical data from previous advertisers and are not 100% accurate, but will give you an idea of what to expect.)

Unfortunately, there is no way to see what the competition is paying, so in most cases, it's a bit of a duck hunt. I suggest starting with a MAX CPC that is slightly higher than you might normally pay. This will ensure a slightly higher ranking for your ad and increase your chances of accumulating clicks. If your ad performs well, your rank will increase. Once you have attained a good click through rate (CTR), you can adjust your max CPC to reflect the position you wish to obtain. (See part one of this article to find out how Google ranks ads.)

Creating Your Ads for Overture
Writing the perfect ad for Overture is somewhat different than doing so for AdWords. Overture only allows you to create one ad per keyword, so testing various ads and going with the obvious winner is not an option. However, the basics of creating your initial ad are virtually the same. After you have selected your target market and main keywords, write a specific ad targeting each individual keyword. Be sure to include the keyword in the title or beginning of the main body text, along with a call-to-action phrase that will draw attention. Remember to check the status of your ads on a weekly basis and tweak as needed. Keep an eye on your click through rate and regularly modify poorly performing ads.
Determining Your Max Cost Per Click with Overture
Deciding how much to spend on Overture is simple: take a look at what the competition is spending and outbid them. With Overture you should always try to be in the top 3 if you wish to have your ad dispersed among partner sites (Yahoo, Lycos, MSN, etc.). If the number 1 spot is currently paying 25 cents per click, you need only bid 26 cents to grab the number 1 spot. If you want the number one spot but are also willing to pay more, you can bid 40 cents and will only be charged 26 cents: one penny above the competition. Keep in mind, though, that if someone else increases their bid, your actual cost will also increase, up to the max CPC you have entered.
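Here is a tiny sketch of that bid arithmetic, under the assumptions described above (you hold the top spot and pay one cent above the next-highest competing bid, never more than your own maximum); the bid figures are invented for illustration.

# Overture-style bid arithmetic as described above (an assumption drawn from
# this article, not an official pricing spec): the top bidder pays one cent
# above the next-highest competing bid, capped at their own maximum bid.
def actual_cpc(my_max_bid, competing_bids):
    highest_other = max(competing_bids)
    return min(my_max_bid, round(highest_other + 0.01, 2))

competitors = [0.25, 0.22, 0.18]      # competing bids, in dollars (invented)
print(actual_cpc(0.40, competitors))  # 0.26 -- the top spot for a penny above 25 cents
print(actual_cpc(0.26, competitors))  # 0.26 -- the same position at the same cost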

Managing an AdWords or Overture PPC campaign can be confusing at first, but it doesn't take long to get a handle on what works. Creating a highly successful ad the first time around with either AdWords or Overture is a rare occurrence, but with a bit of regular maintenance and a well targeted campaign, it doesn't take long to start seeing results.


Highlights from a SiteProNews article by Scott Van Achte

PPC for Dummies

For the beginner, understanding PPC (Pay Per Click) services can be utterly confusing. With so many search engines to choose from, and so many options within each one: different billing schemes, different terminology, and different techniques for ranking in the top spot, the learning curve is quite substantial. So why would anyone go to the trouble?

For quite some time now Google has been the primary source for web search. Nearly everyone who has ever used a computer has either used, or at least heard of, Google. But as the Florida update has shown us, free placements in the search engines are not as stable as we would like them to be. Sure, after an algorithm change, we can go back to the drawing board to figure out the newest line of attack, re-optimize a site, and bring back that first page placement, but how much traffic and sales are lost as a result of the down time?

When it comes to most PPC campaigns you can be sure of one thing: your rankings are stable. When you go to bed, you know that when you wake up the next morning your placements will still be there. Of course, in many cases you may be outbid in Overture and find yourself slipping a couple of notches, but after a quick adjustment to your maximum bid, you're back in contention. This is a far cry from the months potentially lost after slipping (in some cases off the charts) into the dark abyss of positioning.

Google is not going away any time soon, so it is still very important to optimize and try to get those top placements regardless of whether or not you wish to pursue a PPC campaign. If you are ranking well on Google, in many cases it is still well worth it to pursue PPC placements as well to get that extra exposure. With a PPC campaign it's important to remember that it isn't always as simple as paying top dollar to dominate the number one spot. Regular tweaking and maintenance are required.

So what is involved in achieving top spot in a PPC campaign?

Google Adwords
Your Google AdWords ad is given a ranking value by multiplying your maximum Cost Per Click (CPC) by your current Click Through Rate (CTR), and ads are sorted accordingly.

For AdWords, you must constantly monitor the performance of your keywords and ads. If the CTR of your keywords begins to slip, then your position will most likely drop, and it's time to either re-write your ads to draw attention, adjust your max CPC, or do a combination of both. What will work best depends on a variety of variables: your CTR, your current CPC, how competitive your keyword phrase is, and the wording of your competitors' ads. Remember, you want to stand out as the obvious best choice.
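A back-of-the-envelope illustration of that ranking rule (max CPC multiplied by CTR, ads sorted by the product); the bids and click-through rates are invented for the example.

# Back-of-the-envelope AdWords-style ranking as described above:
# ranking value = max CPC x CTR, and ads are sorted by that product.
# The bids and click-through rates below are invented for illustration.
ads = [
    {"name": "Ad A", "max_cpc": 0.50, "ctr": 0.020},
    {"name": "Ad B", "max_cpc": 0.35, "ctr": 0.045},
    {"name": "Ad C", "max_cpc": 0.80, "ctr": 0.010},
]

for ad in ads:
    ad["rank_value"] = ad["max_cpc"] * ad["ctr"]

for position, ad in enumerate(sorted(ads, key=lambda a: a["rank_value"], reverse=True), 1):
    print(position, ad["name"], round(ad["rank_value"], 4))

# Ad B (0.35 x 4.5% = 0.01575) outranks the pricier Ad C (0.80 x 1.0% = 0.008):
# a well-written, frequently clicked ad can beat a bigger bid.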

Looksmart
Looksmart has a PPC service that is somewhat different from AdWords and Overture. With Looksmart you write your own title and ad text for your listing and pay a set rate per click. The ranking order for listings is "based solely on their relevance to a user's search as determined by LookSmart's proprietary search algorithm. Payment does not influence the appearance or rank of the listings in the Reviewed Web Sites section." ( - Looksmart)

If you choose to use Looksmart, it is essential for your website to be properly optimized. The main downside to Looksmart is that your CPC payment is just to get you listed, and does not guarantee any positioning.

Overture
The ranking of your Overture listings is determined by one thing and one thing only: how much you are willing to pay. If your ad position drops, increase your bid and within seconds you are back where you left off. Now remember, being number one is not everything. If people see nothing of interest in your listing, they will simply click on number two. Of course, this doesn't cost you anything directly, but indirectly you may be losing the all-important sales. This is why it's important to have carefully written copy for your listing.

In the case of Overture and Looksmart, the copy you choose does not affect your position at all (with AdWords it does so only indirectly, through your CTR), so you don't need to worry about the ad being 'search engine friendly', but you do need to ensure it is searcher friendly. Carefully select the wording to use in your ad copy and be sure to include the keyword phrase in either the title or the beginning of the text. Say something that will jump out at the reader. You want them to see your ad as being highly relevant to their search, as well as interesting and inviting. Remember: just because you dominate the top spot does not mean you will necessarily draw all the traffic (although it does help!)

Before you get started with any PPC campaign, be sure you understand the search engine's billing practices. Google AdWords charges a one-time $5.00 setup fee, and after that you pay only for delivered traffic. Overture does not have a setup fee, but it does require a minimum charge of $25/month, regardless of whether or not your click-throughs have accumulated to that total. Looksmart bills at a flat rate of 15 cents per click. Each engine has a different billing plan for minimum usage, and it's important to understand them so that you don't get burned.

Once you have selected what search engine, or engines, you wish to use, start off by reading through their FAQ page, guidelines, tips pages, and absorb as much information as you can to get a good grasp on how their PPC system operates. If you are new to all of this, AdWords and Overture will seem overwhelming at first glance, but your understanding will grow the more you review the information offered by these engines. It doesn't take long to get a firm grasp of the various systems.

Highlight from a SiteProNews article by Scott Van Achte

Thursday, June 23, 2005

Search Engine Optimization using short URLs

Use Short Relative URLs


Brevity is a good thing. Absolute, verbose URLs are out; short, relative URLs are in. By carefully naming your files and directories, and by making judicious use of abbreviation with mod_rewrite and content negotiation, you can speed up your pages while maintaining legibility and search engine positioning. Here's an illustrative example of an overextended URL:

[img src="http://www.domain.com/files/images/
single_pixel_transparent_gif.gif" width="1" height="1"]

This element has a number of problems, including the length of its URL and its missing ALT attribute. Let's see how we can shorten the URL to a more reasonable size. But first, a brief URL tutorial.


Anatomy of an URL

A uniform resource locator, or URL, is a unique address that points to where a resource is located on the Internet. The URL consists of the document's name preceded by the directories where it resides, preceded by a domain name, preceded by the protocol required to retrieve the document.

protocol://server_domain_name/pathname


Relative URLs

When possible, use relative URLs rather than absolute URLs. Absolute URLs include the server and protocol so they are unambiguous, but can cause problems when moving your files to another location. They are also unnecessarily long. Relative URLs base their location on the document's base URL, and browsers fill in the rest. Our too-long-at-the-party single pixel GIF could be shortened to:
/files/images/single_pixel_transparent_gif.gif

Abbreviate File and Directory Names

Even better, abbreviate the filename of this non-functional spacer graphic, like this:

/files/images/t.gif
You can go even further and use content negotiation to omit the ".gif" part entirely. But why stop there? Move the image, your logo, and other frequently used resources up to the root level of your web server, and use the following minimalist URL.

/t.gif

The Base Element


To make relatively linked pages work offline and shorten your URLs, you can use the [base href] header element. The base tag must appear within the header of your XHTML document. Normally, browsers fill in relative URLs based on the URL of the current document. With the base element you can change that behavior, so that browsers resolve resources against the base URL rather than the current document's URL, like this:
[base href="http://www.domain.com/" /]
Now your relative URLs will resolve to this domain's top-level directory and also work from your hard drive.

Relative Base Element

One little-known technique is to use a relative URL as your base href to save space. By referencing a frequently used directory, you can save space within your XHTML files. For our single pixel GIF example you could use the following base element.
[base href="/files/images/"]

Now when you reference an image, you can just refer to the file name itself, without the directory location. For pages with numerous images that you may not want to move to the top level directory, this can save a substantial amount of space.

What About Search Engine Optimization?

Search engine optimizers and information architects naturally encourage descriptive filenames and directories. Search engine spiders feast on keyword-filled URLs, auto-breadcrumb scripts display directories and files, and logical hierarchies help users navigate. To avoid over-abbreviation, some webmasters choose to abbreviate and relocate only frequently used resources on high traffic pages, or use mod_rewrite or the base element for the best of both worlds.

Conclusion


Using short, relative URLs can make your pages download faster, and ease migration headaches. Using the base element and mod_rewrite can alleviate the need for absolute URLs, and save additional space.

Mod_rewrite makes linking easy

Abbreviate URLs with mod_rewrite

Called the "Swiss Army knife" of Apache modules, mod_rewrite can be used for everything from URL rewriting to load balancing. Where mod_rewrite and its ilk shine is in abbreviating and rewriting URLs. One of the most effective optimization techniques available to web developers, URL abbreviation substitutes short URLs like "r/pg" for longer ones like "/programming" to save space. Apache and IIS, Manilla, and Zope all support this technique. Yahoo.com, WebReference.com, and other popular sites use URL abbreviation to shave anywhere from 20% to 30% off of HTML file size. The more links you have, the more effective this technique.

How mod_rewrite Works

As its name implies, mod_rewrite rewrites URLs using regular expression pattern matching. If a URL matches a pattern that you specify, mod_rewrite rewrites it according to the rule conditions that you set. mod_rewrite essentially works as a smart abbreviation expander. Let's take our example above from WebReference.com. To expand "r/pg" into "/programming", Apache requires two directives: one turns on the rewrite engine (RewriteEngine On) and the other specifies the pattern matching rule (RewriteRule). The RewriteRule syntax looks like this:
RewriteRule Pattern Substitution [Flags]

For our example, that becomes:

RewriteEngine On
RewriteRule ^/r/pg(.*) /programming$1

This regular expression matches a URL that begins with /r/ (we chose this sequence to signify a redirect to expand) with "pg" following immediately afterwards. The pattern (.*) matches zero or more characters after the "pg". So when a request comes in for the URL /r/pg/perl/, the rewrite rule expands this abbreviated URI into /programming/perl/.

RewriteMap for Multiple Abbreviations
That'll work well for a few abbreviations, but what if you have lots of links? That's where the RewriteMap directive comes in. RewriteMaps group multiple lookup keys (abbreviations) and their corresponding expanded values into one tab-delimited file. Here's an example map file snippet from WebReference.com.
d dhtml/
dc dhtml/column
pg programming
h html/
ht html/tools/

The map file maps keys to values; a rewrite rule references it using the following syntax:

${ MapName : LookupKey | DefaultValue }

MapNames require a generalized RewriteRule using regular expressions. The RewriteRule references the MapName instead of a hard-coded value. If there is a key match, the mapping function substitutes the expanded value into the regular expression. If there's no match, the rule substitutes a default value or a blank string.

To use this MapName we need a RewriteMap directive to show where the mapping file is, and a generalized regular expression for our RewriteRule.
RewriteEngine On
RewriteMap abbr txt:/www/misc/redir/abbr_webref.txt
RewriteRule ^/r/([^/]*)/?(.*) ${abbr:$1}$2 [redirect=permanent,last]

The RewriteMap directive points the rewrite module to the text version of our map file. The revamped RewriteRule looks up the value for matching keys in the map file. The [last] flag stops rule processing once a matching abbreviation has been found, and the permanent redirect (301 instead of 302) tells clients the move is permanent so they can cache the new location.

Binary Hash RewriteMaps

For maximum speed you can convert your text map file into a binary DBM hash file, which is optimized for fast lookups. The RewriteMap line above would then look like this:

RewriteMap abbr dbm:/www/misc/redir/abbr_webref
Automating URL Abbreviation
The above URL abbreviation technique works well for URLs that don't change very often. But what about news or blog sites where URLs change every hour or every minute? You can create a shell script that automatically scans and abbreviates incoming URLs or use the free open source script available at WebReference.com (http://www.webreference.com/scripts/) that does just that.
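As a rough sketch of the idea (not the WebReference script itself), the following Python reads a tab-delimited abbreviation map like the one above and shortens matching href prefixes in an HTML page; the file and page contents are placeholders.

# Rough sketch of automated URL abbreviation (not the WebReference script
# itself): read a tab-delimited map like the one above and shorten matching
# href prefixes in an HTML document. File names are placeholders.
def load_map(path):
    # Map each expanded directory prefix (e.g. /programming/) to its
    # abbreviated form (e.g. /r/pg/).
    expansions = {}
    with open(path) as map_file:
        for line in map_file:
            if line.strip():
                key, value = line.split(None, 1)
                expansions["/" + value.strip().strip("/") + "/"] = "/r/" + key + "/"
    return expansions

def abbreviate_html(html, expansions):
    # Replace longer prefixes first so /dhtml/column/ wins over /dhtml/.
    for long_form, short_form in sorted(expansions.items(), key=lambda kv: -len(kv[0])):
        html = html.replace('href="' + long_form, 'href="' + short_form)
    return html

abbr = load_map("abbr_webref.txt")   # the map file shown above
page = '<a href="/programming/perl/">Perl</a>'
print(abbreviate_html(page, abbr))   # <a href="/r/pg/perl/">Perl</a>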

Conclusion
Abbreviating URLs with mod_rewrite is one of the most effective techniques available to optimize HTML files. File size savings can range from 20% to 30%, depending on the number of links in your HTML page. You can combine this technique with URL rewriting via content negotiation for maximum savings. Best used on high traffic pages like home pages, automated URL abbreviation can squeeze more bytes out of critical pages for server-savvy developers.