Archive for the ‘Ranking Reports’ Category
Google Search Update: Ranking Report Really is Dead (finally)
This week I had the pleasure of moderating and speaking at SES Chicago. It was probably my favorite Chicago show yet. What a change from last year when everyone was nervous about how deep the economy would slide into chaos.
One subject that did create some buzz - no surprise - was Google’s announcement of an always-on personalized search. There’s been lots written about it, and the change truly is spectacular. Unfortunately, spectacular doesn’t always equate to good.
Rather than dwell on all the questionable issues that the always-on personalized search system raises, I’m going to comment on something that’s actually good in this update: the death of the ranking report. Finally! Finally, rankings are totally meaningless as a reporting metric. Ranking reports that scrape results to identify a position in the search results have been deceptive for years, but now they are unquestionably and completely useless. Anyone providing a ranking report as authoritative is deceiving their clients.
In a way, I am thrilled with Google’s personalization changes, as they make the performance reporting used in Enquisite Optimizer even more valuable. It is now the only real way to measure true page and rank positioning. Optimizer shows where people located anywhere in the world are finding your site in the results, based on actual click-through activity, not some bogus ranking report. It is the only analytical platform that reports back to you on what your customers are actually seeing in the search results.
People who use traditional ranking reports as a reporting metric are no longer able to report any meaningful data. First off, the data collected are unique to that computer. Second, other activity from that computer affects the results. Run just one site’s reports from a system, but do anything else with it? Anything you search for with that computer can now affect the results you’re seeing. Wait until Caffeine rolls out, and anything you do with that computer will cause variations. Use Google Docs, Gmail, or any other Google products? Your results will vary.
So how can any ranking report based on one, or even a hundred, computers repeatedly running ranking analysis reports be accurate? It can’t. The ranking report you used to use as a metric is dead.
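To make the contrast concrete: Enquisite’s internals are our own, but the principle behind click-through-based position measurement is easy to illustrate. Here is a minimal sketch in Python, assuming Google-style referrer URLs of the kind browsers sent at the time, which carried the search query in a `q` parameter and a zero-based results offset in a `start` parameter (10 results per page). The function and its details are illustrative, not our production code:

```python
from urllib.parse import urlparse, parse_qs

def serp_position_from_referrer(referrer):
    """Infer the query and results page a visitor clicked through from.

    Illustrative sketch only: assumes a Google-style referrer with the
    query in `q` and a zero-based result offset in `start` (absent on
    page 1), at 10 results per page.
    """
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None  # not a Google referral
    params = parse_qs(parsed.query)
    if "q" not in params:
        return None  # no query captured
    offset = int(params.get("start", ["0"])[0])
    return {"query": params["q"][0], "page": offset // 10 + 1}

# A real visitor clicking through from page 2 of the results:
print(serp_position_from_referrer(
    "http://www.google.com/search?q=ranking+reports&start=10"))
# -> {'query': 'ranking reports', 'page': 2}
```

Aggregate that across every actual visitor, and you have position data as your customers really saw it - by geography, by engine, by day - with no scraping and no personalization skew.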
If, as a user, you’re not comfortable with the new personalized search “benefit,” just wait for Caffeine to roll out in full next year. Me? I’ve already changed my default search engine in Firefox to Bing. Strangely, I’m not concerned about how responsibly Microsoft will handle my information.
Building a New Business Begins
Continued from part 3, Starting to Build the Campaign Platform.
So, in June 2005, I set about separating Enquisite as an entity from Metamend. This was a critical step for me and the company. I needed to establish a firewall between the companies for many reasons, not the least of which was to ensure that competing agencies would never have to fear that their data was accessible to potential competitors. My co-founder at Metamend, Todd Hooge, along with Glenn Convey, a very talented individual in his own right, took over operating that business. It took a few months, but I removed myself from all operational involvement, which was personally challenging. But by December 2005, it was done, and I was out.
By this time, I also had a complete outline for what would become the core elements of Enquisite Campaign, and the next generation of products still to come after it. Over the next few months, I proceeded to break down the functions into manageable pieces we could build as foundational elements for the platform. I received a lot of valuable guidance for each foundational block along the way, and added some key people to the company whom I still depend upon today. My key internal technical counsel came from our Lead Developer, Rick Morris, and our VP for Technical Operations, Greg Caws. They foresaw many technical hurdles we would face long before we ran into them. Their advice made a dramatic difference every step of the way.
Building the foundation of Campaign was not trivial. Fundamentally, we needed five pillars to support the architecture for the system. We built the various pillars and released them as individual products. So while we’ve received great reviews and feedback for Pro (now Enquisite™ Optimizer), PPC Assurance (now Enquisite™ Auditor) and our Links Report (within Optimizer), we’ve really looked at these as stepping stones. The last two pillars - the internally-facing “Collector” and a massively scalable, super-fast database - rounded out the foundation.
To build the opportunity analysis and the reporting functions for Campaign, we needed to deal with paid search traffic in a way no one before had. We needed to understand, and value, paid campaigns in relation to organic ones. This perspective, and the requirement to segment out paid search traffic in a new way, led to our PPC Assurance product. This application reveals click accuracy for PPC with amazing precision, enabling advertisers to credibly claim credits from Google, Yahoo! and others and potentially save thousands, even millions of dollars. Equally important, this first step allowed us to understand paid search in a different way from existing solutions.
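To illustrate the segmentation idea (our production rules are considerably more involved; the signals below are just the well-known public Google ones): paid clicks arrive with a different fingerprint than organic ones. A hedged sketch:

```python
from urllib.parse import urlparse, parse_qs

def classify_search_click(landing_url, referrer):
    """Bucket a visit as paid search, organic search, or other.

    A simplified sketch using well-known Google signals only: AdWords
    auto-tagging appends a `gclid` parameter to the landing URL, while
    organic clicks arrive from a /search referrer. Real segmentation
    needs per-engine rules and far more care.
    """
    if "gclid" in parse_qs(urlparse(landing_url).query):
        return "paid"
    ref = urlparse(referrer)
    if "google." in ref.netloc and ref.path.startswith("/search"):
        return "organic"
    return "other"

print(classify_search_click("http://example.com/?gclid=CJjw", ""))
# -> paid
print(classify_search_click("http://example.com/widgets",
                            "http://www.google.com/search?q=widgets"))
# -> organic
```

Once paid and organic clicks are cleanly separated, you can start comparing what a phrase costs you in PPC against what it earns you organically - which is exactly the perspective Campaign needed.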
As a bonus, the work on Enquisite Auditor led to our second product, Enquisite Pro. Earlier this year it was officially recognized by Yahoo!, when they began recommending it via their Traffic Quality Center. We built the early version of Pro - now renamed Optimizer, for good reason - to provide search engine positioning reports, and to segment traffic and campaigns in ways that had not been done before. We also pushed the development team to devise a new way to collect data, which resulted in the simply named “Enquisite™ Collector.” In case you haven’t noticed, we have very simple and functional product names. In this case, the Collector is a fantastically scalable next-generation data logging system that uses over 26,000 web servers distributed worldwide to collect log file data for our reports.
Continued in part 5…
Starting to Build the Campaign Platform
Continued from part 2, The Genesis of the Enquisite Campaign Idea.
To make Campaign work technically, I knew we needed at least two things off the bat: (1) we needed to capture some critical information which wasn’t presently available; and (2) due to the absolute scale and magnitude of data, we needed to capture it much more efficiently than present web-based logging systems did. The first conundrum was capturing that critical information, and it was during an animated discussion with my business partner at Metamend, Todd Hooge, that we hit on a means to gather it. This led to Enquisite Optimizer’s (formerly Enquisite Pro) search engine positioning reports.
At first, I didn’t know if the rank reporting process would work the way I needed it to. So we started testing, and I kept dreaming up ideas around a better business model. I didn’t share these ideas with Todd, as we were far away from capturing the data necessary to power a new reporting system. I also suspected that if I had shared these thoughts, he probably would have looked at me like I was nuts (again), and made some comment like “it would also be nice if cars could fly.” Plus, I was still thinking about all the things we would need to build for the “finished” system, including the nuts-and-bolts implementation. Anyhow, Todd helped me tremendously to break down that first, crucial barrier around capturing the necessary data. It may seem trivial today, but it was a big deal then. Even today it’s still a big deal to people when they get an explanation of the entire data capture system.
Once we worked out the germ of the idea on how to capture the data, we were off. I hired a developer to turn the theory into practice. Simultaneously, I started writing up some of the ideas for patent submissions. By June of 2005, we were ready to try out a basic data capture and reporting system, and I had written a phonebook’s worth of documents to file for patent protection.
When I showed the initial reports to some of my search colleagues, they all said, “When can we have it?” It was at this point that I seriously started considering building out the entire suite for a larger purpose. While at first it was simply an interesting project to help me build out my search marketing firm, the initial reaction to those first reports (which now sit within Enquisite Optimizer) made me realize we could build a system for everyone, not just ourselves.
Continued in part 4…
New Enquisite Feature - Opportunity Analysis Report
Ever wonder if you’re missing out on fantastic opportunities that are lurking within your own web site? Now you can find out. The page 2 optimization strategy I’ve written about in the past is a great way to discover potentially lucrative opportunities, but that strategy focuses on identifying existing opportunities in the search rankings - your pages are out there and recognized by the search engines - you just haven’t made it onto page 1 yet.
The strategy I’m going to talk about today is different. The page two strategy works by identifying the low-hanging fruit that, with a little optimization work, can move pages onto page 1 - an action that typically results in a 4500% increase in traffic, which makes sense when you consider that page 1 captures over 90% of search referrals while page 2 sees only a sliver. The Opportunity Analysis Report gives you another way to improve your search referral traffic, and conversions!
Enquisite’s Opportunity Analysis Report is found within the Search Engine Comparison report section. It helps you identify which phrases are driving referrals, actions, and conversions from one or several search engines, but not all of them. Let me explain. Imagine that you have a keyword phrase that’s driving conversions from both MSN and Yahoo, but not Google. Wouldn’t that be nice to know, at a glance, in 10 seconds or less? Would that have an impact on the search phrases you bid for on Google AdWords? Exactly. The Opportunity Report saves you hours of analysis and decision making by highlighting those phrases that have opportunities on specific search engines, and by exporting those phrases to a list that you can easily drop right into your bid management system.
To access this report, simply log in to your Enquisite reports, go to the Comparison tab, and select the “Opportunities” option. Then, choose the search engine (or engines) you want to use as a source, as well as the target search engine - the target will be the engine that isn’t referring traffic (or conversions) for terms that are performing well on the other search sources.
And that’s it. We think it’s the most innovative way to do real keyword research on your site. Try it out, and let us know what you think!
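For the technically curious, the comparison at the heart of the report is easy to sketch. The hypothetical example below (the data shape and threshold are invented for illustration, not our implementation) takes conversion counts per phrase and engine, and surfaces phrases that convert on the source engines but not on the target:

```python
def find_opportunities(conversions, sources, target, min_conversions=1):
    """Return phrases converting on every source engine but not the target.

    `conversions` maps (phrase, engine) -> conversion count. The data
    shape and threshold are hypothetical, for illustration only.
    """
    phrases = {phrase for phrase, _ in conversions}
    return sorted(
        phrase for phrase in phrases
        if all(conversions.get((phrase, s), 0) >= min_conversions
               for s in sources)
        and conversions.get((phrase, target), 0) == 0
    )

conversions = {
    ("blue widgets", "msn"): 7,
    ("blue widgets", "yahoo"): 4,
    ("blue widgets", "google"): 0,
    ("red widgets", "msn"): 3,
    ("red widgets", "yahoo"): 0,
}

# "blue widgets" converts on MSN and Yahoo but not Google - a prime
# candidate for an AdWords bid (or organic work) on Google.
print(find_opportunities(conversions, ["msn", "yahoo"], "google"))
# -> ['blue widgets']
```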
Enquisite Pro Now Available
I’m happy to announce that Enquisite Pro is now available to all Enquisite users.
You’ll see lots of great changes and updates. Tons of new features, enhancements, and additions, including of course the Long Tail reports, completely flexible date ranges, custom reporting (build and save your favorites), the ability to group terms or engines, and an advanced comparison report.
We’ve spent a lot of time building - or rather, re-building - from the foundation up. We’ve got the most accurate web-based logging system available, and the fastest one too. We engineered for scale, and built based on your feedback and requests.
We’ve got a lot more to come, but it’s already superb, so let’s start here.
Just log in as usual, and enjoy.
We will be migrating to a paid version, but we’re still going to keep portions free. It’s a commitment we made. In the new reports, the first two tabs are going to remain free for as long as possible. They’re your summary reports and your trends over time. We’ve enhanced them substantially since the Enquisite Beta, so you’re getting more information than ever in these free portions. Features like the Long Tail and advanced comparison tools will be paid features, but you’re getting them free for a few weeks. Try them out, tell us how you like them, and tell us what else you’d like to see. We’re building in a lot more cool elements, and completely new reports, but we can only build the features we know people will want…
Thanks, and I look forward to hearing from you.
Search Stats Update - Search Market Share - Ask.com
I haven’t made a Search Statistics update in a while. No excuses. Just haven’t. I’m going to rectify that now, and I’ll put up some more numbers later today or tomorrow.
With all the uncertainty around Ask, and a lot of people discussing how it looks like it’s dropping out of the race, I thought I should post some numbers which reflect what we’re seeing for their share of the search marketplace over the last year and a bit. We used data representing more than 250 million search referrals since Jan 1, 2007.
2007-01 2.50%
2007-02 2.99%
2007-03 1.74%
2007-04 1.68%
2007-05 1.67%
2007-06 1.26%
2007-07 1.02%
2007-08 0.94%
2007-09 1.15%
2007-10 1.23%
2007-11 1.17%
2007-12 1.19%
2008-01 1.25%
2008-02 1.03%
2008-03 0.90%
If a tree falls in the forest, does anyone hear?
Web Analytics World
Manoj Jasra writes the Web Analytics World blog. Recently, he invited me to start contributing to it, and today I made my first post there.
I’ll try and post there regularly, and am also going to strive to post more regularly here at my own blog as well. No, that’s not a New Year’s Resolution. I don’t make those.
Today’s first post draws on some information I used in explaining User Behavior at the SES Chicago and SES Paris shows recently. Basically it’s all about the value of being found on page one within the search results.
Please read the post, and think about it. Are you paying (or charging) a fair price for your SEO services?
Looking at the data, I think it’s pretty easy to argue that SEOs are not being properly compensated for getting sites into the top 10 for meaningful, competitive key phrases. Not when you compare the cost per click of PPC vs. SEO.
SES Chicago Notes
Alright, I’m way behind. The Search Engine Strategies conference in Chicago ended ten days ago, and I haven’t yet posted anything.
One of the things I wanted to comment on was SES Chicago itself. SES Chicago is the smallest of the big 3 SES shows of the year. Each SES has its own flavor. SES NY in March seems to lead attendance, and there’s heavy participation from ad agencies and advertisers. SES San Jose in August draws heavily on the SEO and SEM practitioner crowd. Chicago draws a more corporate crowd, and is the only one which seems to draw almost exclusively from the Midwest; the other two draw a more national crowd. All three shows are extremely worthwhile.

The conference itself seemed to enjoy about the same attendance as the previous year. I was told that registrations were off by less than 1% from the previous year. A lot of us were concerned that with WebMasterWorld running in Vegas at the same time as SES, there would be a significant drop in attendance. After all, only a few people would choose snowy Chicago over sunny Vegas - or so the thinking went. I actually like Chicago and snow! This year I accidentally stumbled into a German-style Christmas market at the corner of Dearborn and Washington. It’s worth checking out.
In fact, when I spoke to people who had been to Vegas (I flew into Vegas Thursday afternoon right after the SES show, and just in time for the MSN party at Ghost Bar), I was surprised to hear that some were disappointed in attendance. Others told me attendance was on par with previous years, and they were happy. The Vegas crowd was more akin to an SMX Advanced conference, with the emphasis on practitioners. Many of the opinion makers in the search industry appeared in Vegas, and will appear at SMX Advanced; almost all of those who weren’t there were in Chicago.
As to the sessions, I spoke on two panels: Search Marketers on Click Fraud, and User Behavior, Personalization & Universal Search. Having done the latter panel in San Jose in August, the second time around was a lot easier. What made it challenging was that presentation time was reduced to 5 minutes, with a longer Q&A session. While the shorter presentation time forced presenters to get to the point, there was a lot to discuss. Each presentation was unique, and I believe a lot of useful information was shared.
Based on questions asked, and comments I received afterwards, I believe the audience got a lot of value from both the presentations and the discussion. Greg Jarboe of SEO-PR did an excellent job moderating. Knowing that each presenter brought something different to the table, he ensured that questions were answered by everyone, so that the audience got a well rounded perspective on issues. He also posed a couple of questions to presenters, which assisted everyone in highlighting points of interest.
The second session I presented at was on Click Fraud. This became a two-part session for the conference. In the past there was one panel, with click fraud specialists, marketers and the engines themselves all together. I strongly prefer the new format. Tom Cuthbert from Click Forensics apparently does not, as he actively complained from the stage that he preferred to sit on the panel with the engines. I disagree. The discussions which occurred when the engines and the click fraud specialists were on the same panel were often not overly productive.
Marketers want solutions. They know they are trying to deal with serious issues, and want to learn information and strategies for dealing with the problems. By splitting the click fraud sessions into two parts, SES is doing something very positive for attendees. The search engines / ad networks session allowed marketers to learn about what the engines are doing to combat the problem, and to ask them specific questions about specific issues. The engines also provided tips on how you, as a marketer, can help combat click fraud.
Immediately after the search engines on click fraud session came search marketers on click fraud. I believe that all presenters attended both sessions, so really, all SES did was allow for twice as much time, and gave marketers an opportunity to focus on the issue from distinct perspectives. The session was moderated by Jeff Rohrs, and each presenter attempted to focus on different issues around click fraud. In my case I focused on campaign issues which often get labeled as click fraud, but really are cases of the ad networks serving out ads improperly.
As you spread your ads out across the content networks, the incidence of mistakes increases. Click-through traffic which does not match the terms and conditions of your contract is undesired traffic, and detecting those undesired clicks - and providing you a means to recover the costs associated with them - is what PPC Assurance focuses on. Auditing and verifying your PPC traffic is what we do, and resolving campaign issues through our unique one-click refund claim submission is what sets us apart.
I attended a few other sessions in Chicago. The quality was excellent.
Anyone involved in search marketing who attends an SES conference will receive tremendous value from the experience. Even after all these years, every time I show up I learn more. It’s not just a great environment for people who know what they are doing to gather and exchange tidbits; there are opportunities for people at every level to get educated. If you’re a decision-maker and want to understand the marketplace, there are sessions that are right for you. If you are the practitioner who deals with the nuts and bolts, there are sessions for you as well. It doesn’t matter what level you are at, there’s always something for you at an SES Conference.
Page 2 Search Engine Optimization
Yes, the title is right. Have you ever thought about Page 2 SEO?
Having been an SEO, I know everyone focuses on page 1. But have you thought about focusing on page 2 listings, or pages 3, 4, 5? Ludicrous, you say? Nope - hear me out.
Three different people in the last two weeks have told me they are using Enquisite’s free analytics reports for this very purpose. The first to mention it was Eric Enge from Stone Temple, at SES Chicago, where we were both speaking. I have to admit, I’d never thought about it the way Eric and two others since have suggested. I’d looked at it from a different perspective, but never as actively as Eric has.
Enquisite allows you to see which web pages are getting traffic from which pages within the search results. If a page is getting relevant traffic from page 1, you probably don’t want to mess with it, even if it’s not ranking for your primary phrase. But how do you choose which pages to work on?
Using Enquisite you can identify web pages that get traffic from page 2, 3, 4, 5 etc. Focus your optimization work on those pages. These are the pages which can give you the biggest upside in any campaign.
The reason is simple. Over 90% of search engine referral traffic comes from page 1 of the search results. The web pages found on page 2+ are almost good enough for page 1. Almost. They’re just not seen as quite relevant enough to be found on page 1. But imagine you focus your optimization work on those pages. They will move up. Do it right, and all your pages move up, as the overall site authority increases.
Get more pages on page 1, and your traffic skyrockets. How? Implement page 2 SEO strategies.
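For readers who like to see the mechanics, the selection step is simple enough to sketch. Assuming you’ve already extracted a results-page number for each organic referral (for example, by parsing referrer URLs), hypothetical code to rank page 2+ candidates by the traffic they already earn might look like this:

```python
from collections import Counter

def page2_candidates(referrals, max_page=5):
    """Rank landing pages already earning clicks from SERP pages 2-5.

    `referrals` is an iterable of (landing_page, serp_page) pairs, e.g.
    extracted from search referrer URLs. Pages with the most page 2+
    clicks are closest to page 1. Hypothetical sketch, not Enquisite's
    implementation.
    """
    tallies = Counter(
        landing for landing, serp_page in referrals
        if 2 <= serp_page <= max_page
    )
    return tallies.most_common()

referrals = [
    ("/guides/widgets", 2), ("/guides/widgets", 2), ("/guides/widgets", 3),
    ("/pricing", 1),  # already on page 1 - leave it alone
    ("/blog/widget-trends", 4),
]
print(page2_candidates(referrals))
# -> [('/guides/widgets', 3), ('/blog/widget-trends', 1)]
```

The pages at the top of that list are your low-hanging fruit: they already earn clicks from deeper result pages, so a modest push is most likely to carry them onto page 1.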