Archive for the ‘Ranking Reports’ Category

Enquisite Suite Update - December 2009

December 23rd, 2009 by Richard Zwicky

To end 2009 we’ve updated and added to Enquisite’s Suite of search marketing tools and products. In fact, there are three major enhancements available to you today, plus a new product in pre-release:

  • The Enquisite Performance Dashboard
  • A Fresh New Look for a New Year
  • Feature Enhancement: Transferring Segment Data from Optimizer to Campaign
  • A really cool new product in pre-release

The Enquisite Performance Dashboard

We’ve added a new Enquisite Performance Dashboard to provide you with an at-a-glance overview of all your website’s critical search activity metrics as well as a summary of the performance data for each of your campaigns.

Marketers keep saying that 2010 is the year of the Dashboard, and if the prognosticators are correct, we’re kicking it off right with the first phase of a vital new report to help you understand what facets of your online marketing campaigns are driving your success.

By starting your day with a quick glance at your dashboard, you’ll be able to see trends in your search activity on a site-by-site basis, and know which campaigns you need to focus on for improvement and which ones you can refine to further enhance the performance of your business.

We recognize that everyone wants slightly different information on their own dashboard, or needs to pull information from one application into another. We’ve got APIs for you!
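Purely as an illustration, here’s a rough sketch of what pulling dashboard metrics into your own tooling might look like. The endpoint, parameter names, and authentication scheme below are placeholders made up for the example, not the actual Enquisite API, so treat this as a sketch of the idea rather than working integration code.

```python
# Hypothetical example: pull an at-a-glance summary for one site into your own tooling.
# The base URL, path, parameters, and auth header are illustrative assumptions,
# not the documented Enquisite API.
import json
import urllib.request

API_BASE = "https://api.example-enquisite.com/v1"  # placeholder base URL
API_KEY = "YOUR_API_KEY"                            # assumed per-account key

def fetch_dashboard_summary(site_id: str, days: int = 30) -> dict:
    """Fetch a summary of recent search activity metrics for one site."""
    url = f"{API_BASE}/sites/{site_id}/dashboard?days={days}"
    request = urllib.request.Request(url, headers={"Authorization": f"Bearer {API_KEY}"})
    with urllib.request.urlopen(request) as response:
        return json.load(response)

if __name__ == "__main__":
    summary = fetch_dashboard_summary("example-site")
    print(summary.get("organic_referrals"), summary.get("conversions"))
```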

A Fresh New Look for a New Year

When you next log into the Enquisite Performance Suite, you’ll immediately notice a dramatic improvement in the load speed and the look and feel of the platform. We are constantly listening to your feedback on how to make the application easier to use, and we have incorporated those suggestions into the new user interface. Among the many improvements in the new design, the new taskbar will greatly simplify the process of building and managing your campaigns. The overall navigation has been streamlined to make the application easier to use and to improve your overall experience with it. Of course, none of that matters if the application doesn’t get faster too.


Enquisite is the fastest real-time analytics application on the market. Search marketers want to spend their time improving SEO and PPC campaigns, not analyzing and reporting, and you need insights to act quickly. Enquisite is the only real-time search intelligence and decision support application available. With Enquisite, you don’t need to spend 80% of your time figuring out what to do; you can spend that time making a difference to the bottom line of your business!

Transferring Segment Data from Optimizer to Campaign

In our last release, Enquisite gave you the ability to export segments created in Enquisite Optimizer to Enquisite Campaign. In the current release we’ve enhanced this feature, giving you more options for where you export segment data from Optimizer into Enquisite Campaign.

This feature is found within the “Longtail” section of Enquisite Optimizer, which allows you to segment actual search referral traffic in real time. Within the Longtail, you can partition actual search referral data on the fly by a variety of dimensions, including geographic location, referring search engine, actions, and conversions. Once you have created a segment of search referral traffic, Optimizer lets you find the specific keyword phrases that led users in that segment to your website. The feature now allows you to take the keywords you discover and import them directly into either a new or an existing campaign within Enquisite Campaign.
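To make the workflow concrete, here’s a minimal sketch of the idea behind the Longtail segment-and-export flow: filter referral records by a few dimensions, then collect the keyword phrases behind that segment so they can be dropped into a campaign. The record fields and data layout are assumptions for the example, not Optimizer’s internal format.

```python
# Illustrative sketch of the Longtail workflow: segment search referrals by a few
# dimensions, then gather the keyword phrases behind that segment for export into
# a campaign. Field names are assumptions made up for the example.
from collections import Counter

referrals = [
    {"keyword": "red widgets", "engine": "google", "country": "CA", "converted": True},
    {"keyword": "cheap blue widgets", "engine": "yahoo", "country": "US", "converted": False},
    {"keyword": "red widgets", "engine": "google", "country": "CA", "converted": False},
]

def segment(records, engine=None, country=None, converted=None):
    """Filter referral records by any combination of dimensions."""
    return [
        r for r in records
        if (engine is None or r["engine"] == engine)
        and (country is None or r["country"] == country)
        and (converted is None or r["converted"] == converted)
    ]

# Keywords behind Canadian Google referrals, ready to drop into a campaign.
segment_keywords = Counter(r["keyword"] for r in segment(referrals, engine="google", country="CA"))
print(segment_keywords.most_common())  # [('red widgets', 2)]
```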

With this new feature, you can now quickly and easily take your analysis of past search referral data in Enquisite Optimizer and use it to better craft your future organic search-based campaigns within Enquisite Campaign.

A really cool new product in pre-release

As you are aware, Enquisite is a company founded on innovation. Our very first beta product back in 2005 included our patent-pending technology to extract and report on keyword referral data based on the position at which the result was listed in the search engine results. That reporting was a breakthrough for search marketers hoping to report value more accurately to their clients, and it also made more sophisticated optimization strategies possible. Now, with Google’s Personalized Search updates, that technology is more valuable than ever to search marketers.

That was our very first beta version. We now have many distinct patents pending, some of which are incorporated into our Auditor, Optimizer, and Campaign products. We’ve never sat still, and have continued to innovate to drive value for our clients.

This brings us to our latest product, which we plan to release in Q1 of 2010. In the short term, this initial phase of the product will be available to partners as we ramp up. It’s a very cool, simple-to-use application that addresses the links pillar of SEO in a novel and fundamental way. It doesn’t compete with analysis apps like SEOmoz’s Linkscape; in fact, we’ll be incorporating some of SEOmoz’s data into ours to help your analysis. It offers search marketers something completely different, and incredibly valuable.

If links and link building are important to you (yes, they’re important for everybody), then you’ll need this product, or you’ll quickly be left in the dust by your competitors.

We’ll be announcing more around this new product as it approaches release in the New Year. If you want to be part of the pre-release as we move forward, let me know, and we’ll add you to the list as soon as possible.


Google Search Update: Ranking Report Really is Dead (finally)

December 10th, 2009 by Richard Zwicky

This week I had the pleasure of moderating and speaking at SES Chicago. It was probably my favorite Chicago show yet. What a change from last year when everyone was nervous about how deep the economy would slide into chaos.

One subject that did create some buzz - no surprise - was Google’s announcement of always-on personalized search. There’s been a lot written about it, and the change truly is spectacular. Unfortunately, spectacular doesn’t always equate to good.

Rather than dwell on all the questionable issues that the always-on personalized search system raises, I’m going to comment on something that’s actually good in this update: the death of the ranking report. Finally! Finally, rankings are totally meaningless as a reporting metric. Ranking reports that scrape results to identify a position in the search results have been deceptive for years, but now they are unquestionably and completely useless. Anyone presenting a ranking report as authoritative is deceiving their clients.

In a way, I am thrilled with Google’s personalization changes, as they make the performance reporting used in Enquisite Optimizer even more valuable. It is now the only real way to measure true page and rank positioning. Optimizer shows where people located anywhere in the world are finding your site in the results, based on actual click-through activity, not some bogus ranking report. It is the only analytical platform that reports back to you on what your customers are actually seeing in the search results.

People who use traditional ranking reports as a reporting metric can no longer report any meaningful data. First off, the data collected are unique to that computer. Second, other activity from that computer affects the results. Run reports for just one site from a system? Use that system for anything else? Anything you search for on that computer can now affect the results you see. Wait until Caffeine rolls out, and anything you do with that computer will cause variations. Use Google Docs, Gmail, or any other Google products? Your results will vary.

So how can a ranking report based on one, or even a hundred, computers that repeatedly run ranking analyses be accurate? It can’t. The ranking report you used to rely on as a metric is dead.

If, as a user, you’re not comfortable with the new personalized search “benefit,” just wait for Caffeine to roll out in full next year. Me? I’ve already changed my default search engine in Firefox to Bing. Strangely, I’m not concerned about how responsibly Microsoft will handle my information.


Building a New Business Begins

June 3rd, 2009 by Richard Zwicky

continued from part 3, Starting to Build the Campaign Platform

So, in June 2005, I set about separating Enquisite as an entity from Metamend. This was a critical step for me and for the company. I needed to establish a firewall between the companies for many reasons, not the least of which was to ensure that competing agencies would never have to fear that their data was accessible to potential competitors. My co-founder at Metamend, Todd Hooge, along with Glenn Convey, a very talented individual in his own right, took over operating that business. It took a few months, but I removed myself from all operational involvement, which was personally challenging. By December 2005, it was done, and I was out.

By this time, I also had a complete outline for what would become the core elements of Enquisite Campaign, and for the next generation of products still to come after it. Over the next few months, I proceeded to break the functions down into manageable pieces we could build as foundational elements for the platform. I received a lot of valuable guidance for each foundational block along the way, and added some key people to the company whom I still depend upon today. My key internal technical counsel came from our Lead Developer, Rick Morris, and our VP of Technical Operations, Greg Caws. They foresaw many of the technical hurdles we would face long before we ran into them, and their advice made a dramatic difference every step of the way.

Building the foundation of Campaign was not trivial. Fundamentally, we needed five pillars to support the architecture of the system. We built the various pillars and released them as individual products. So while we’ve received great reviews and feedback for Pro (now Enquisite™ Optimizer), PPC Assurance (now Enquisite™ Auditor), and our Links Report (within Optimizer), we’ve really looked at these as stepping stones. The last two pillars - the internally facing “Collector” and a massively scalable, super-fast database - rounded out the foundation.

To build the opportunity analysis and the reporting functions for Campaign, we needed to deal with paid search traffic in a way no one before had. We needed to understand, and value, paid campaigns in relation to organic ones. This perspective, and the requirement to segment out paid search traffic in a new way, led to our PPC Assurance product. This application reveals click accuracy for PPC with amazing precision, enabling advertisers to credibly claim credits from Google, Yahoo! and others and potentially save thousands, even millions of dollars. Equally important, this first step allowed us to understand paid search in a different way from existing solutions.

As a bonus, the work on Enquisite Auditor led to our second product, Enquisite Pro. Earlier this year it was officially recognized by Yahoo!, which began recommending it via its Traffic Quality Center. We built the early version of Pro - since renamed Optimizer, for a reason - to provide search engine positioning reports and to segment traffic and campaigns in ways that had not been done before. We also pushed the development team to devise a new way to collect data, which resulted in the simply named “Enquisite™ Collector.” In case you haven’t noticed, we have very simple and functional product names. In this case, the Collector is a fantastically scalable, next-generation data logging system that uses over 26,000 web servers distributed worldwide to collect log file data for our reports.

Continued in part 5….


Starting to Build the Campaign Platform

June 2nd, 2009 by Richard Zwicky

continued from part 2, The Genesis of the Enquisite Campaign Idea

To make Campaign work technically, I knew we needed at least two things off the bat: (1) we needed to capture some critical information which wasn’t presently available; and (2) due to the absolute scale and magnitude of data, we needed to capture it much more efficiently than present web-based logging systems did. The first conundrum was capturing that critical information, and it was during an animated discussion with my business partner at Metamend, Todd Hooge, that we hit on a means to gather it. This led to Enquisite Optimizer’s (formerly Enquisite Pro) search engine positioning reports.
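For readers curious about the general idea: referrers from the major engines at the time carried the query (and often a results-page offset or click position) in the URL, so a rough sketch of extracting a keyword and its approximate position could look like the following. The parameter handling here is an illustrative assumption, not a description of the patented capture method we actually built.

```python
# Rough sketch of deriving keyword and SERP position information from a
# Google-style referrer URL. The parameter handling is illustrative only.
from urllib.parse import urlparse, parse_qs

RESULTS_PER_PAGE = 10  # assumed default results-page size

def parse_referrer(referrer: str):
    query = parse_qs(urlparse(referrer).query)
    keyword = (query.get("q") or [""])[0]
    if "cd" in query:                      # click position, when the engine exposes it
        position = int(query["cd"][0])
    elif "start" in query:                 # page offset; gives the page, not the exact rank
        position = int(query["start"][0]) + 1
    else:
        position = 1                       # no offset implies the first results page
    page = (position - 1) // RESULTS_PER_PAGE + 1
    return {"keyword": keyword, "position": position, "page": page}

print(parse_referrer("http://www.google.com/search?q=blue+widgets&start=10"))
# {'keyword': 'blue widgets', 'position': 11, 'page': 2}
```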

At first, I didn’t know if the rank reporting process would work the way I needed it to. So we started testing, and I kept dreaming up ideas around a better business model. I didn’t share these ideas with Todd, as we were far away from capturing the data necessary to power a new reporting system. I also suspected that if I had shared these thoughts, he probably would have looked at me like I was nuts (again), and made some comment like “it would also be nice if cars could fly.” Plus, I was still thinking about all the things we would need to build for the “finished” system, including the nuts-and-bolts implementation. Anyhow, Todd helped me tremendously to break down that first, crucial, barrier around capturing the necessary data. It may seem trivial today, but it was a big deal then. Even today it’s still a big deal to people when they get an explanation of the entire data capture system.

Once we worked out the germ of the idea on how to capture the data, we were off. I hired a developer to turn the theory into practice. Simultaneously, I started writing up some of the ideas for patent submissions. By June of 2005, we were ready to try out a basic data capture and reporting system, and I had written a phonebook’s worth of documents to file for patent protection.

When I showed the initial reports to some of my search colleagues, they all said, “When can we have it?” It was at this point that I seriously started considering building out the entire suite for a larger purpose. While it started as an interesting and meaningful project to help me build out my search marketing firm, the initial reaction to those first reports (which now sit within Enquisite Optimizer) made me realize we could build a system for everyone, not just ourselves.

Continued in part 4….


New Enquisite Feature - Opportunity Analysis Report

January 12th, 2009 by Richard Zwicky

Ever wonder if you’re missing out on fantastic opportunities lurking within your own web site? Now you can find out. The page 2 optimization strategy I’ve written about in the past is a great way to discover potentially lucrative opportunities, but that strategy focuses on identifying existing opportunities in the search rankings: your pages are out there and recognized by the search engines; they just haven’t made it onto page 1 yet.

The strategy I’m going to talk about today is different. The page 2 strategy works by identifying the low-hanging fruit that, with a little optimization work, can move pages onto page 1, a move that typically results in a 4500% increase in traffic. The Opportunity Analysis Report gives you another way to improve your search referral traffic and conversions!

Enquisite’s Opportunity Analysis Report is found within the Search Engine Comparison report section. It helps you identify which phrases are driving referrals, actions, and conversions from one or more search engines, but not all of them. Let me explain. Imagine you have a keyword phrase that’s driving conversions from both MSN and Yahoo, but not Google. Wouldn’t that be nice to know, at a glance, in 10 seconds or less? Would that have an impact on the search phrases you bid for on Google AdWords? Exactly. The Opportunity Report saves you hours of analysis and decision making by highlighting the phrases that have opportunities on specific search engines, and by exporting those phrases to a list you can drop right into your bid management system.
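If you want to picture the logic, here’s a tiny sketch of the underlying idea: find the phrases that convert on your source engines but show nothing on the target engine. The data layout is made up for the example; the real report runs against your live referral data.

```python
# Minimal sketch of the Opportunity Analysis idea: phrases that convert on the
# source engines but not on the target engine. The data layout is an assumption
# made up for the example.
conversions = {
    ("blue widgets", "yahoo"): 12,
    ("blue widgets", "msn"): 7,
    ("blue widgets", "google"): 0,
    ("red widgets", "google"): 9,
}

def opportunities(conv, sources, target):
    """Phrases converting on any source engine but not on the target engine."""
    phrases = {phrase for (phrase, engine), count in conv.items()
               if engine in sources and count > 0}
    return sorted(p for p in phrases if conv.get((p, target), 0) == 0)

# Phrases worth bidding on (or optimizing for) on the target engine.
print(opportunities(conversions, sources={"yahoo", "msn"}, target="google"))
# ['blue widgets']
```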

To access this report, simply log in to your Enquisite reports, go to the Comparison tab, and select the “Opportunities” option. Then choose the search engine (or engines) you want to use as a source, as well as the target search engine; the target is the engine that isn’t referring traffic (or conversions) for terms that are performing well on the other sources.

And that’s it. We think it’s the most innovative way to do real keyword research on your site. Try it out, and let us know what you think!


Enquisite Pro Now Available

March 17th, 2008 by Richard Zwicky

I’m happy to announce that Enquisite Pro is now available to all Enquisite users.

You’ll see lots of great changes and updates. Tons of new features, enhancements, and additions, including of course the Long Tail reports, completely flexible date ranges, custom reporting (build and save your favorites), the ability to group terms or engines, and an advanced comparison report.

We’ve spent a lot of time building, or rather re-building, from the foundation up. We’ve got the most accurate web-based logging system available, and the fastest one too. We engineered for scale, and built based on your feedback and requests.

We’ve got a lot more to come, but it’s already superb, so let’s start here.

Just log in as usual, and enjoy.

We will be migrating to a paid version, but we’re still going to keep portions free. It’s a commitment we made. In the new reports, the first two tabs will remain free for as long as possible. They’re your summary reports and your trends over time. We’ve enhanced them substantially since the Enquisite Beta, so you’re getting more information than ever in these free portions. Features like the Long Tail and advanced comparison tools will be paid features, but you’re getting them free for a few weeks. Try them out, tell us how you like them, and what else you’d like to see. We’re building in a lot more cool elements, and completely new reports, but we can only build the features we know people will want…

Thanks, and I look forward to hearing from you.


Search Stats Update - Search Market Share - Ask.com

March 11th, 2008 by Richard Zwicky

I haven’t made a Search Statistics update in a while. No excuses. Just haven’t. I’m going to rectify that now, and I’ll put up some more numbers later today or tomorrow.

With all the uncertainty around Ask, and a lot of people discussing how it looks like it’s dropping out of the race, I thought I should post some numbers reflecting what we’re seeing for its share of the search marketplace over the last year and a bit. We used data representing more than 250 million search referrals since January 1, 2007.

2007-01 2.50%
2007-02 2.99%
2007-03 1.74%
2007-04 1.68%
2007-05 1.67%
2007-06 1.26%
2007-07 1.02%
2007-08 0.94%
2007-09 1.15%
2007-10 1.23%
2007-11 1.17%
2007-12 1.19%
2008-01 1.25%
2008-02 1.03%
2008-03 0.90%

If a tree falls in the forest, does anyone hear?


Web Analytics World

January 23rd, 2008 by Richard Zwicky

Manoj Jasra writes the Web Analytics World blog. Recently, he invited me to start contributing, and today I made my first post there.

I’ll try and post there regularly, and am also going to strive to post more regularly here at my own blog as well. No, that’s not a New Year’s Resolution. I don’t make those.

Today’s first post draws on some information I used in explaining User Behavior at the SES Chicago and SES Paris shows recently. Basically it’s all about the value of being found on page one within the search results.

Please read the post, and think about it. Are you paying (or charging) a fair price for your SEO services?

Looking at the data, I think it’s pretty easy to argue that SEOs are not being properly compensated for getting sites into the top 10 for meaningful, competitive key phrases, at least not when you compare the cost per click of PPC versus SEO.


SES Chicago Notes

December 17th, 2007 by Richard Zwicky

Alright, I’m way behind. The Search Engine Strategies conference in Chicago ended ten days ago, and I haven’t yet posted anything.

One of the things I wanted to comment on was SES Chicago itself. SES Chicago is the smallest of the big three SES shows of the year, and each SES has its own flavor. SES New York in March seems to lead in attendance, with heavy participation from ad agencies and advertisers. SES San Jose in August draws heavily on the SEO and SEM practitioner crowd. Chicago draws a more corporate crowd, and is the only one that pulls almost exclusively from the Midwest; the other two draw a more national audience. All three shows are extremely worthwhile.

The conference itself seemed to enjoy about the same attendance as the previous year. I was told that registrations were off by less than 1% from the previous year. A lot of us were concerned that with WebMasterWorld running in Vegas at the same time as SES, there would be a significant drop in attendance. After all, only a few people would choose snowy Chicago over sunny Vegas, or so the thinking went. I actually like Chicago and snow! This year I accidentally stumbled into a German-style Christmas market at the corner of Dearborn and Washington. It’s worth checking out.

In fact, when I spoke to people who had been to Vegas (I flew into Vegas Thursday afternoon right after the SES show, just in time for the MSN party at Ghost Bar), I was surprised to hear that some were disappointed in attendance, while others told me attendance was on par with previous years and they were happy. The crowd at that show is more akin to an SMX Advanced conference, with the emphasis on practitioners. Many of the opinion makers in the search industry appeared in Vegas and will appear at SMX Advanced; almost all of those who weren’t there were in Chicago.

As to the sessions, I spoke on two panels: Search Marketers on Click Fraud, and User Behavior, Personalization & Universal Search. Having done the latter panel in San Jose in August, the second time around was a lot easier. What made it challenging was that presentation time was reduced to 5 minutes, with a longer Q&A session. While the shorter presentation time forced presenters to get to the point, there was a lot to discuss. Each presentation was unique, and I believe a lot of useful information was shared.

Based on the questions asked, and the comments I received afterwards, I believe the audience got a lot of value from both the presentations and the discussion. Greg Jarboe of SEO-PR did an excellent job moderating. Knowing that each presenter brought something different to the table, he ensured that questions were answered by everyone, so that the audience got a well-rounded perspective on the issues. He also posed a couple of questions to the presenters, which helped everyone highlight points of interest.

The second session I presented at was on click fraud. This became a two-part session at the conference. In the past there was one panel, with click fraud specialists, marketers, and the engines themselves all on the same panel. I strongly prefer the new format. Tom Cuthbert from Click Forensics apparently does not, as he actively complained from the stage that he preferred to sit on the panel with the engines. I disagree. The discussions that occurred when the engines and the click fraud specialists shared a panel were often not very productive.

Marketers want solutions. They know they are dealing with serious issues, and they want information and strategies for tackling the problems. By splitting the click fraud sessions into two parts, SES did something very positive for attendees. The search engines’ / ad networks’ session allowed marketers to learn what the engines are doing to combat the problem, and to ask them questions about specific issues. The engines also provided tips on how you, as a marketer, can help combat click fraud.

Immediately after the search engines’ session on click fraud came the search marketers’ session on click fraud. I believe all the presenters attended both sessions, so really, all SES did was allow twice as much time and give marketers an opportunity to focus on the issue from distinct perspectives. The session was moderated by Jeff Rohrs, and each presenter focused on different issues around click fraud. In my case, I focused on campaign issues that often get labeled as click fraud but are really cases of the ad networks serving ads improperly.

As you spread your ads out across the content networks, the incidence of mistakes increases. Click-through traffic that does not match the terms and conditions of your contract is undesired traffic, and detecting those clicks and providing you with a means to recover the associated costs is what PPC Assurance focuses on. Auditing and verifying your PPC traffic is what we do, and resolving campaign issues through our unique one-click refund claim submission is what sets us apart.
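To give a simplified picture of what that kind of audit involves, here’s a small sketch that flags clicks falling outside contracted targeting terms (allowed countries and serving hours, in this example). The record and contract fields are assumptions for illustration, not the PPC Assurance data model.

```python
# Simplified sketch of auditing PPC click traffic against contracted targeting
# terms (allowed countries and ad-serving hours). Field names are assumptions
# made up for the example.
from datetime import datetime

contract = {"allowed_countries": {"US", "CA"}, "serving_hours": range(8, 20)}

clicks = [
    {"id": 1, "country": "US", "time": datetime(2007, 12, 10, 14, 5), "cost": 1.25},
    {"id": 2, "country": "BR", "time": datetime(2007, 12, 10, 3, 40), "cost": 0.90},
]

def undesired_clicks(click_log, terms):
    """Flag clicks that fall outside the contracted targeting terms."""
    flagged = []
    for click in click_log:
        if (click["country"] not in terms["allowed_countries"]
                or click["time"].hour not in terms["serving_hours"]):
            flagged.append(click)
    return flagged

refund_candidates = undesired_clicks(clicks, contract)
print(sum(c["cost"] for c in refund_candidates))  # total cost of flagged clicks
```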

I attended a few other sessions in Chicago. The quality was excellent.

Anyone involved in search marketing who attends an SES conference will get tremendous value from the experience. Even after all these years, I learn more every time I show up. It’s not just a great environment for people who know what they’re doing to gather and exchange tidbits; there are opportunities for people at every level to get educated. If you’re a decision-maker and want to understand the marketplace, there are sessions that are right for you. If you’re a practitioner who deals with the nuts and bolts, there are sessions for you as well. It doesn’t matter what level you’re at, there’s always something for you at an SES conference.


Page 2 Search Engine Optimization

December 15th, 2007 by Richard Zwicky

Yes, the title is right. Have you ever thought about Page 2 SEO?

Having been an SEO, I know everyone focuses on page 1. But have you thought about focusing on page 2 listings, or pages 3, 4, or 5? Ludicrous, you say? Nope, hear me out.

Three different people in the last two weeks have told me they are using Enquisite’s free analytics reports for this very purpose. The first to mention it was Eric Enge from Stone Temple, at SES Chicago, where we were both speaking. I have to admit, I’d never thought about it the way Eric and the two others have since suggested. I’d looked at it from a different perspective, but never as actively as Eric has.

Enquisite allows you to see which of your web pages are getting traffic from which pages within the search results. If a page is getting relevant traffic from page 1, you probably don’t want to mess with it, even if it’s not ranking for your primary phrase. But how do you choose which pages to work on?

Using Enquisite, you can identify web pages that get traffic from pages 2, 3, 4, 5, and beyond. Focus your optimization work on those pages; these are the pages that can give you the biggest upside in any campaign.

The reason is simple. Over 90% of search engine referral traffic comes from page 1 of the search results. The web pages found on page 2 and beyond are almost good enough for page 1. Almost. They’re just not seen as quite relevant enough to make page 1. But imagine focusing your optimization work on those pages: they will move up. Do it right, and all your pages move up as the overall site authority increases.
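If you track which results page each search visit comes from, the analysis itself is straightforward. Here’s a quick sketch of the idea, with the record layout assumed for the example: rank your landing pages by how much traffic they pull from page 2 and deeper.

```python
# Quick sketch of the page 2 strategy: rank landing pages by how much search
# traffic they receive from page 2 or deeper in the results. The record layout
# is an assumption made up for the example.
from collections import Counter

referrals = [
    {"landing_page": "/widgets", "serp_page": 2},
    {"landing_page": "/widgets", "serp_page": 3},
    {"landing_page": "/gadgets", "serp_page": 1},
    {"landing_page": "/widgets", "serp_page": 2},
]

def page_two_candidates(records, min_page=2):
    """Landing pages getting the most traffic from results page `min_page` or deeper."""
    counts = Counter(r["landing_page"] for r in records if r["serp_page"] >= min_page)
    return counts.most_common()

# Pages to prioritize for optimization work.
print(page_two_candidates(referrals))  # [('/widgets', 3)]
```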

Get more pages on page 1, and your traffic skyrockets. How? Implement page 2 SEO strategies.