Archive for the ‘Search Metrics’ Category

Bing Growing, Yahoo Steady - Search Engine Market Share Update

July 28th, 2010 by Richard Zwicky


It’s been almost seven months since I last provided insight into search engine market share based on click-through activity. After holding relatively steady for months, this latest update shows Bing has grown by 2.0 percentage points. Perhaps most interestingly, it’s no longer growing at the expense of Yahoo, as was previously the case.

Here are the raw numbers:

Date                 Google   Yahoo    Bing    Other
September 7, 2009    78.68%   11.51%   6.80%   3.01%
September 14, 2009   78.35%   11.13%   6.50%   4.02%
September 21, 2009   77.43%   11.35%   7.11%   4.11%
September 28, 2009   77.65%   10.80%   7.27%   4.28%
October 4, 2009      77.78%   10.66%   7.23%   4.33%
October 12, 2009     77.78%   10.66%   7.21%   4.35%
October 18, 2009     77.89%   10.65%   7.29%   4.17%
October 25, 2009     77.83%   10.56%   7.56%   4.05%
November 1, 2009     77.75%   10.46%   7.66%   4.12%
November 8, 2009     77.96%   10.21%   7.75%   4.08%
November 15, 2009    77.60%   10.39%   7.59%   4.42%
November 22, 2009    77.59%   10.41%   7.67%   4.37%
December 22, 2009    78.43%    9.73%   7.86%   3.97%
Month of June 2010   75.93%    9.94%   9.82%   3.83%

Eightfold Logic collects data from a network of web sites distributed globally. The data in this report reflects click-through activity on those globally distributed sites by searchers located in the U.S.


Changes in Natural Linking - Death, Rebirth, or a Return to the Roots?

July 15th, 2010 by Richard Zwicky

When contemplating link-building strategies, you need to take the user experience into account. Users should see your citations (links) as references to substantiating or relevant documents, and search engines will use these signals to define your site. If you consider the user experience while building contextually relevant links for your business, you will be rewarded with more relevant direct referrals and better search engine placement.

There have been some excellent articles on this issue recently, including changes in natural linking by Eric Enge, Editorial Citation by Rand Fishkin, and natural link building strategies by Michael Gray.

Michael’s analysis was interesting in that he took Google’s guidelines at face value, created great content, and spent only 10% of his time building links manually.  After six months of blogging, Google represented just over 0.5% of his referral traffic.  That’s a pathetic amount, considering how popular the posts were on StumbleUpon, Digg, and other social networks. In fact, when he analyzed his traffic, he discovered that the blog didn’t rank in the top 100 for even the simplest keywords.  The exception was one post for which Michael did a little link building.  Apart from that limited effort, certain posts which received over 30,000 views from social marketing generated almost no natural inbound links.

Now, Michael isn’t arguing that content doesn’t matter.  It does, as it engages readers and entices them to return.  His article restates a point everyone in the industry has been making for years: if you build it, they don’t just come; in this case, the links don’t just happen, and the engines won’t just refer anyone your way. As Ian Lurie wrote: “content alone is not going to boost you into the top 10 for any even remotely relevant phrase.”

Of particular interest to me, in juxtaposition to the pieces by Ian and Michael, were the articles by Eric Enge and Rand Fishkin.  Rand hypothesizes that 20% of the web’s links exist to influence the search engines.  That’s a lot of noise, but it also means 80% are not there just to influence search engines. Eric makes the point that 80% still leaves lots of meat to work with and that links remain a big factor, but he also estimates that you need to devote 30% of your marketing energy to social media, which is interesting in the context of Michael’s lack of success with organic link building from social networks.

There’s an interesting question in Eric’s article, which every site operator should ask themselves: “If you aren’t good enough to be worth linking to, then what do you have anyway?” The answer, of course, is that you need to build better content.  But a hint at the broader correct answer can actually be found in the title of Rand’s article: Editorial Citation.

Rand notes this in his reference to three periods of linking: 1) the early web, when links were editorial, like footnotes and citations, helping people navigate the web; 2) the engines incorporating web page links as a value metric in ranking algorithms (Google / Alltheweb / Teoma); and 3) non-webpage citations.  Google’s recent patent publication, which was reviewed by Eric Ward, supports this last point - see Eric’s point #5.

However, another point in the patent adds context: user interaction with links may determine their value. That idea goes a long way toward resolving the dichotomy these articles circle, and it also points search marketers toward clues for link-building strategies going forward.

Links from different areas of a document will carry different value and pass different amounts of link juice.  Obviously, a link located in the main body content and relevant to both the origin and destination will score higher than an irrelevant link, or links within footers, template side navs, and the like.  Which really is the point: links as citations are the oldest form of linking, and they still carry the most value.



Bryan Eisenberg & Richard Zwicky at SES Toronto

July 13th, 2010 by Richard Zwicky

Bryan Eisenberg, best-selling author of “Waiting for Your Cat to Bark?: Persuading Customers When They Ignore Marketing” and many other books, and I did a panel together recently at SES Toronto.  After the panel, he interviewed me about meaningful metrics.  It was a good panel, and a good interview.

Perhaps the most salient point from both the panel and the interview is that a business trying to understand and evaluate key metrics in online marketing needs to look across the entire value chain. The changing marketplace makes it very difficult for marketers to measure all channels equitably and fairly - balancing search, social media, email, newsletters, and so on. Bryan highlighted the importance of cross-channel metrics, which I was able to substantiate with the example of a client who was struggling to find value in PPC after having invested in, and measured, only one channel. Upon examination, the client discovered that a significant portion of the business’s social and organic search traffic was preceded by visits from the paid channel, and that these multi-touch visits were converting and providing measurable results at a higher rate than single-visit traffic.

I hope you take the five minutes to listen to the interview, and feel free to send me any questions that come out of it.


Enquisite Suite Update - April 2010

April 22nd, 2010 by Richard Zwicky

Earlier this week we released an update to the Enquisite reporting suite, extending the functionality of some key components. We are very excited to share the news and outline the features and benefits with you. Many of these changes are very significant, and many are unique to Enquisite: you can’t do most of these things with any other analytics package on the market.

Cross-Domain and Sub-Domain Tracking
We’ve added the ability to track actions and conversions across different domains and display the results as a unified reporting set. This is particularly useful if a booking engine or shopping cart is hosted on a separate domain, or if multiple domains all point to one site for conversion purposes. Please note that cross-domain and sub-domain tracking requires assistance from the Enquisite team to implement: it’s not complicated, and shouldn’t require any effort from you. If you would like to take advantage of these new capabilities, please send us an email at [email protected].

Sub-Domain Tracking
Cookies can now be tracked across sub-domains, and we tie sub-domain activity together using a single Enquisite tracking code. This lets you take advantage of our entire platform while ensuring accurate cross-channel attribution of actions and conversions, no matter where your customers go on your site.
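Under the hood, sub-domain tracking generally relies on a first-party cookie scoped to the parent domain, so the same visitor ID is visible on every sub-domain. Here is a minimal sketch of that mechanism in Python; it is a general illustration with hypothetical names, not Enquisite’s actual implementation:

    from http.cookies import SimpleCookie

    # A cookie scoped to the parent domain (note the leading dot) is sent with
    # requests to www.example.com, shop.example.com, blog.example.com, and so on,
    # so a single visitor ID can tie sub-domain activity together.
    cookie = SimpleCookie()
    cookie["visitor_id"] = "abc123"              # hypothetical tracking ID
    cookie["visitor_id"]["domain"] = ".example.com"
    cookie["visitor_id"]["path"] = "/"

    print(cookie.output())
    # Set-Cookie: visitor_id=abc123; Domain=.example.com; Path=/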

Advanced Organic Keyword Predictions
We are already widely regarded as the industry leader in organic keyword research and predictive analysis. But being the best isn’t good enough; there’s always room to improve. During the last year, our research scientists have been working hard to enhance the core algorithms and validation routines that deliver predictive keyword suggestions to Enquisite Campaign users. This update is a major step forward on a lot of fronts, and it means you’ll benefit from our predictive insights far in advance of your competitors.

Advanced Regex Segmentation
Enquisite’s search and social analytics platform, Optimizer, is recognized by advanced search marketers as the fastest and most accurate way to segment your search traffic. We’ve enhanced your ability to segment by adding support for the most powerful commands in the regular expression (regex) query set.

In doing so, we have opened up a whole new way to look at your search data. These newly added commands also work with lists, meaning that if you need to work with a large selection of keywords, you simply create a list and use it within an expression. Advanced segmentation functions are available when using the ‘contains’ and ‘does not contain’ match types in the Optimizer Longtail segmentation panel. Commands and variables include the following (a sketch of how these patterns can be interpreted follows the list):

*   An asterisk matches any number of characters.
?   A question mark matches any single character.
|   The pipe symbol denotes alternation (any one of a number of alternatives).
()  Parentheses group items into a single item (e.g. (a|b|c)).
{x} Curly braces indicate a list substitution, where ‘x’ is the name of a defined list of words.
\   A backslash before a control character denotes the literal character rather than the command (e.g. ” \* ” is the ” * ” character, not a wildcard).
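To make the commands concrete, here is a minimal sketch, in Python, of how this wildcard-style syntax could be mapped onto a standard regular expression. This is my own illustration of the documented commands, not Enquisite’s implementation, and the list name is hypothetical:

    import re

    def to_regex(pattern: str, lists: dict) -> str:
        """Translate the wildcard-style syntax above into a standard regex."""
        out, i = [], 0
        while i < len(pattern):
            c = pattern[i]
            if c == "\\" and i + 1 < len(pattern):   # "\*" means a literal "*"
                out.append(re.escape(pattern[i + 1]))
                i += 2
                continue
            if c == "*":                             # any number of characters
                out.append(".*")
            elif c == "?":                           # any single character
                out.append(".")
            elif c in "|()":                         # alternation / grouping pass through
                out.append(c)
            elif c == "{":                           # {x} expands a named keyword list
                end = pattern.index("}", i)
                words = lists[pattern[i + 1:end]]
                out.append("(" + "|".join(re.escape(w) for w in words) + ")")
                i = end
            else:
                out.append(re.escape(c))
            i += 1
        return "".join(out)

    # Hypothetical named list, as used by the {x} command:
    lists = {"brands": ["acme", "widgetco"]}
    rx = re.compile(to_regex("{brands} blue *", lists))
    print(bool(rx.fullmatch("acme blue widgets")))   # True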

The search marketers to whom we previewed the regex features were absolutely thrilled. The new commands make managing large datasets much easier on a lot of levels, saving days of work each month.

We Want Your Feedback
We are eager to hear any thoughts or suggestions on improvements or new features you would like to see in the Enquisite Performance Suite, so that it exceeds your expectations and best meets your needs. You can submit your ideas at [email protected].


Search Engine Market Share by Click Through Activity - December 2009

December 21st, 2009 by Richard Zwicky

Surprisingly, I haven’t posted a search engine market share report in 30 days. We did post lots of other interesting data in the interim, however. This week, we’re getting back to the evolving search engine landscape. Of course, not a lot has changed overall since our last look at the data.

Google continues to own almost 80% of the actual click-through market share. We recognize that our numbers differ from some other reports. The core difference is that our reports reflect click-through activity, as opposed to general activity. As demonstrated in the post “How Long is Normal?”, while most search lookup activity is on one-word queries, click-throughs occur most often on three-word searches. The same holds true across the various engines. A lot of people apparently run searches on Bing / Yahoo, but they refine their searches prior to clicking through. Hence, Google shows a much higher market share when we examine just click-through activity.
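For readers curious how click-through share differs from query share in practice: each search referral carries its engine and query in the referrer URL, so share is computed over clicks that actually landed on a tracked site, not over queries issued. A minimal sketch with a simplified, hypothetical host mapping (not Enquisite’s parser):

    from collections import Counter
    from urllib.parse import urlparse, parse_qs

    # Hypothetical mapping of referrer hosts to engines and their query
    # parameter; a production parser handles many more hosts and variants.
    ENGINES = {
        "www.google.com": ("Google", "q"),
        "search.yahoo.com": ("Yahoo", "p"),
        "www.bing.com": ("Bing", "q"),
    }

    def classify(referrer):
        """Return (engine, query) for a search referral, or None otherwise."""
        url = urlparse(referrer)
        if url.netloc not in ENGINES:
            return None
        engine, param = ENGINES[url.netloc]
        return engine, parse_qs(url.query).get(param, [""])[0]

    # Hypothetical click log: one referrer per click that landed on a tracked site.
    referrers = [
        "http://www.google.com/search?q=blue+widgets",
        "http://www.google.com/search?q=widgets",
        "http://search.yahoo.com/search?p=blue+widgets",
    ]
    hits = [c for c in map(classify, referrers) if c]
    shares = Counter(engine for engine, _ in hits)
    total = sum(shares.values())
    print({engine: round(n / total, 2) for engine, n in shares.items()})
    # {'Google': 0.67, 'Yahoo': 0.33}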

As for the change in activity over the last month, Bing continues to show strong forward momentum, and Yahoo continues to fade away. Sad, really. Google’s decline, which started in June, appears to have stabilized at a dominating ~78.4% market share. If we look at areas outside the US, Google’s share is even higher.

For convenience, this graph shows the change in Yahoo / Bing / other non-Google shares since May 2009. If you want to look at the raw data that far back, you can view it on the prior blog post about search engine market shares. The data table is getting so long, however, that we’ll just show the last four months from here on out. I’m using an “all-time” chart to show the trends, though.

The raw data for those who prefer the numbers:

Date (2009)    Google   Yahoo    Bing    Other
September 7    78.68%   11.51%   6.80%   3.01%
September 14   78.35%   11.13%   6.50%   4.02%
September 21   77.43%   11.35%   7.11%   4.11%
September 28   77.65%   10.80%   7.27%   4.28%
October 4      77.78%   10.66%   7.23%   4.33%
October 12     77.78%   10.66%   7.21%   4.35%
October 18     77.89%   10.65%   7.29%   4.17%
October 25     77.83%   10.56%   7.56%   4.05%
November 1     77.75%   10.46%   7.66%   4.12%
November 8     77.96%   10.21%   7.75%   4.08%
November 15    77.60%   10.39%   7.59%   4.42%
November 22    77.59%   10.41%   7.67%   4.37%
December 22    78.43%    9.73%   7.86%   3.97%

Enquisite collects data from a network of thousands of web sites distributed globally. The data in this report reflects click-through activity on those globally distributed sites by searchers located in the U.S.


Is Longer Better? What’s the Best Length for a Query?

December 17th, 2009 by Richard Zwicky

In my recent post, “How Long is Normal?”, I published data showing that, based on click-through rate, four-word queries are more common than one-word queries, and five-word queries are almost as common.

Today, I’m adding a layer to that information: the correlation between the number of words in a query and both time on site and pages viewed. (For those who want to reproduce this kind of segmentation, a sketch follows the table below.)

One would assume that a more specific query would result in longer time on site and more pages viewed. Surprisingly, that’s not the case. In fact, it appears that the more specific a query, the more a search referral visitor’s behaviour reflects decisive intent and a higher level of sophistication in navigating web sites. These visitors use the search process to pre-filter results more aggressively, and then get to the point of their visit very quickly.

This of course has implications for bounce rate reporting, as a significant number of search referrals that would normally be classified as bounces more likely indicate a higher than expected level of satisfaction with the results.

Words in Query   Percentage of Queries   Avg Pages Viewed   Avg Time on Site (mm:ss)
1                11.08%                  6.64               4:32
2                24.56%                  4.13               2:53
3                25.77%                  3.06               1:57
4                17.68%                  2.62               1:42
5                10.03%                  2.29               1:27
6                 5.36%                  2.11               1:21
7                 2.65%                  1.97               1:14
8                 1.36%                  1.84               1:07
9                 0.70%                  1.74               1:04
10                0.37%                  1.69               0:59
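If you want to reproduce this kind of segmentation against your own logs, the word count comes straight from the query embedded in the search referrer. A minimal sketch with hypothetical visit records (not Enquisite’s pipeline):

    from collections import defaultdict
    from urllib.parse import urlparse, parse_qs

    # Hypothetical visit records: (search referrer, pages viewed, seconds on site).
    visits = [
        ("http://www.google.com/search?q=widgets", 6, 272),
        ("http://www.google.com/search?q=blue+widgets", 4, 173),
        ("http://www.google.com/search?q=cheap+blue+widgets", 3, 117),
    ]

    def query_word_count(referrer):
        """Count the words in the search query of a referrer URL."""
        query = parse_qs(urlparse(referrer).query).get("q", [""])[0]
        return len(query.split())

    buckets = defaultdict(list)
    for referrer, pages, seconds in visits:
        buckets[query_word_count(referrer)].append((pages, seconds))

    for words in sorted(buckets):
        rows = buckets[words]
        avg_pages = sum(p for p, _ in rows) / len(rows)
        avg_secs = sum(s for _, s in rows) / len(rows)
        print(f"{words}-word queries: {avg_pages:.2f} pages, {avg_secs:.0f}s on site")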

The longest query recorded in this data sample was a search referral with 594 “keywords” in it. Likely it was someone searching for exact copies of an article, either to identify plagiarism or to find link opportunities.

So, if this is “normal” for the Internet, how does your site match up? It’s interesting to think of this as one more way to determine whether your web site’s SEO strategy is healthy.

About the data: Enquisite works with thousands of sites worldwide and captures a trove of relevant search-related data every day. The figures reported here are based on data from a selection of Enquisite-tagged sites that cumulatively represent over 350 million page views/month, across most major industry sectors - a very significant sample size. The information reported reflects our data alone.


Which Mobile Browsers have the Most Sophisticated Users?

December 10th, 2009 by Richard Zwicky

Since I just posted about desktop browser usage and reported that Mac users may not be, by default, any more sophisticated than Microsoft users, I thought it might be interesting to look at mobile browser usage.

It looks like BlackBerry users view the fewest pages per mobile browser session (that’s me), and, surprisingly, Palm Pre users browse the fastest. On the whole, there’s not a lot of difference across the browsers, which surprised me. I thought that iPhone and Android users would exhibit dramatically different behavior than the others.

Mobile Browser   Avg Pages Viewed   Avg Time on Site (mm:ss)
iPhone           2.49               02:38
Android          2.45               02:51
BlackBerry       2.13               02:48
Palm Pre         2.78               02:36
IE Mobile        2.48               03:13

About the data: Enquisite works with thousands of sites worldwide and captures a trove of relevant search-related data every day. The figures reported here are based on data from a selection of Enquisite-tagged sites that cumulatively represent over 350 million page views/month, across most major industry sectors - a very significant sample size. The data reported reflects our data alone.


Browser Share Report and More…

December 10th, 2009 by Richard Zwicky

Last week I posted some information about user behavior in relation to depth of visit. This week I’m going to share some data on how user behavior varies across browsers.
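For context on how this kind of breakdown is typically produced (a rough sketch of the standard approach, not Enquisite’s actual classifier): the browser is inferred from the HTTP User-Agent header, and the matching order matters because Chrome’s UA string also advertises Safari:

    def classify_browser(user_agent):
        """Rough User-Agent classification. Order matters: Chrome's UA string
        also contains 'Safari', so Chrome must be tested first."""
        if "Chrome" in user_agent:
            return "Chrome"
        if "Safari" in user_agent:
            return "Safari"
        if "Firefox" in user_agent:
            return "Firefox"
        if "MSIE" in user_agent:
            return "MSIE"
        return "Other"

    print(classify_browser(
        "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0)"))   # MSIE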

For the month of November, I decided to break down the differences in user behavior across Microsoft Internet Explorer (MSIE), Firefox, Apple’s Safari, and Google’s Chrome. At first glance, one would assume that time on site and pages viewed should not be affected by which browser a visitor uses. Yet this is not the case. One could argue that Chrome and Firefox users are more sophisticated, as evidenced by the fact that they replaced their default browser. Safari and MSIE usage is almost identical, which is what you would expect where default browsers are in use, as it reflects the simplest behavior patterns. The most sophisticated users would switch away from the defaults, and be faster / less patient in navigating sites.

Are Mac users really any more sophisticated than Windows users? Perhaps not…

Browser   Percentage of Visitors   Avg Pages Viewed   Avg Time on Site (h:mm:ss)
MSIE      60.38%                   4.60               0:04:08
Firefox   25.08%                   3.85               0:03:42
Safari     8.58%                   4.33               0:04:01
Chrome     3.42%                   3.65               0:03:35

The shift in browser usage away from MSIE is truly stunning. I’m going to keep monitoring this drop, and Chrome’s surge, in case it was holiday-related. Stranger things have happened.

About the data: Enquisite works with thousands of sites worldwide and captures a trove of relevant search-related data every day. The figures reported here are based on data from a selection of Enquisite-tagged sites that cumulatively represent over 350 million page views/month, across most major industry sectors - a very significant sample size. The data reported reflects our data alone.


Google Search Update: Ranking Report Really is Dead (finally)

December 10th, 2009 by Richard Zwicky

This week I had the pleasure of moderating and speaking at SES Chicago. It was probably my favorite Chicago show yet. What a change from last year when everyone was nervous about how deep the economy would slide into chaos.

One subject that did create some buzz - no surprise - was Google’s announcement of always-on personalized search. There’s been lots written about it, and the change truly is spectacular. Unfortunately, spectacular doesn’t always equate to good.

Rather than dwell on all the questionable issues that the always-on personalized search system raises, I’m going to comment on something that’s actually good in this update: the death of the ranking report. Finally! Rankings are now totally meaningless as a reporting metric. Ranking reports which scrape results to identify a position in the search results have been deceptive for years, but now they are unquestionably and completely useless. Anyone presenting a ranking report as authoritative is deceiving their clients.

In a way, I am thrilled with Google’s personalization changes, as they make the performance reporting in Enquisite Optimizer even more valuable. It is now definitely the only real way to measure true page and rank positioning. Optimizer shows where people located anywhere in the world are finding your site in the results, based on actual click-through activity, not some bogus ranking report. It is the only analytical platform which reports back to you on what your customers are actually seeing in the search results.

People who use traditional ranking reports as a reporting metric are no longer able to report any meaningful data. First, the data collected are unique to that computer. Second, other activity from that computer affects the results. Run just one site’s reports from a system? Do anything else with it? Anything you search for on that computer can now affect the results you’re seeing. Wait until Caffeine rolls out, and anything you do with that computer will cause variations. Use Google Docs, Gmail, or any other Google products? Your results will vary.

So how can any ranking report based on one, or even 100, computers that repeatedly run ranking analyses be accurate? It can’t. The ranking report you used to rely on as a metric is dead.

If, as a user, you’re not comfortable with the new personalized search “benefit,” just wait for Caffeine to roll out in full next year. Me? I’ve already changed my default search engine in Firefox to Bing. Strangely, I’m not concerned about how responsibly Microsoft will handle my information.


Does Depth of Referral Affect Quality of Visit?

December 4th, 2009 by Richard Zwicky

Yesterday I published data around click-through rates from the search results. That data shows that 95% of all search referrals now arrive from page 1 of the search results. The number is higher in paid and slightly lower in organic search, but 5% for everything not on page 1 doesn’t leave a lot of room for any other positioning.

I thought it would be interesting to start comparing that data against quality of visit, from the perspective of engagement. Longer time on site and / or more pages viewed should give a good indication of engagement. What I found was quite surprising. You would think that a searcher who bothers to drill deeper into the search results would be more motivated to find the right information, and thus would stay engaged on a destination site longer. In fact, the opposite is true. As people drill deeper into the results, they become less patient.
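A quick note on method: the results page a visitor clicked from can be inferred from the referrer itself. A minimal sketch for Google-style referrers, where the start parameter gives the offset of the first result shown; this illustrates the general technique, not Enquisite’s implementation:

    from urllib.parse import urlparse, parse_qs

    def serp_page(referrer, results_per_page=10):
        """Infer which results page a Google-style referral came from: the
        'start' parameter holds the offset of the first result on the page,
        so start=0 (or absent) is page 1, start=10 is page 2, and so on."""
        params = parse_qs(urlparse(referrer).query)
        start = int(params.get("start", ["0"])[0])
        return start // results_per_page + 1

    print(serp_page("http://www.google.com/search?q=widgets"))             # 1
    print(serp_page("http://www.google.com/search?q=widgets&start=10"))    # 2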

The data demonstrates that there is a relationship between where in the search results people click and the quality of their visit to your business. In this case, longer time on site and more pages viewed indicate a better quality of visitor. Counter-intuitively, it’s not the people who drill deeper into the search results who show the greatest satisfaction when they land on a destination site; it’s the visitors from page one:

Referrals from Page   Avg Pages Viewed   Avg Time on Site (mm:ss)
1                     3.59               2:27
2                     2.16               1:06
3                     2.12               1:01
4                     2.08               0:57
5                     2.05               0:55

What this data demonstrates is that visitors from page one of the SERPs are, on average, spending twice as much time on, and viewing almost twice as many pages of, the web sites they visit as visitors who arrive by clicking deeper within the results pages.

Not only is page one more valuable in terms of the amount of traffic, but also in terms of quality. When viewed graphically - plotting time on site against the referring page number in the search results, and pages viewed against the referring page number - the similarity between the two curves is stunning.

This less patient user behavior is also reflected in how people search, using longer and longer queries. I published data a few weeks ago on how many words a typical referring query contains. What I found was that while people might start searching with one-word queries, they quickly move to longer, more specific requests. In the next few weeks I’ll expand on that post with some page view and time on site behavioral metrics as well.

As always, Enquisite collects data from a network of web sites distributed globally. The data in this report reflects click-through activity on those globally distributed sites.