Archive for the ‘Search Engines’ Category
Changes in Natural Linking - Death, Rebirth, or a Return to the Roots?
When contemplating link-building strategies, you need to take the user experience into account. Users should see your citations (links) as references to substantiating or relevant documents. Search engines use these signals to define your site. Consider the user experience when building contextually relevant links for your business, and you will be rewarded with more relevant direct referrals and better search engine placement.
There have been some excellent articles recently focused on this issue, including changes in natural linking by Eric Enge, Editorial Citation by Rand Fishkin, and natural link building strategies by Michael Gray.
Michael’s analysis was interesting in that he took Google’s guidelines at face value, created great content, and spent only 10% of his time building links manually. After six months of blogging, Google represented just over 0.5% of his referral traffic. That’s a pathetic amount, considering how popular the posts were in StumbleUpon, Digg, and other social networks. In fact, when he analyzed his traffic, he discovered that the blog didn’t perform in the top 100 for even the simplest keywords. The exception was one post for which Michael did a little link building. Apart from this limited effort, certain posts which received over 30,000 views from social marketing generated almost no natural inbound links.
Now, Michael isn’t arguing that content doesn’t matter. It does, as it engages readers and entices them to return. His article restates a point everyone in the industry has been making for years: if you build it, they don’t just come. In this case, the links don’t just happen, and the engines won’t just refer anyone your way. As Ian Lurie wrote: “content alone is not going to boost you into the top 10 for any even remotely relevant phrase.”
Of particular interest to me, in juxtaposition to the pieces by Ian and Michael, were the articles by Eric Enge and Rand Fishkin. Rand hypothesizes that 20% of the web’s links exist to influence the search engines. That’s a lot of noise, but it also means 80% are not there just to influence search engines. Eric makes the point that 80% leaves lots of meat to work with and that links are still a big factor, but he also estimates that you need to devote 30% of your marketing energy to social media, which is interesting in the context of Michael’s lack of success with organic link building from social networks.
There’s an interesting question in Eric’s article, which every site operator should ask themselves: “If you aren’t good enough to be worth linking to, then what do you have anyway?” The answer, of course, is that you need to build better content. But a hint to the broader correct answer can actually be found in the title of Rand’s article: Editorial Citation.
Rand notes this in his reference to three periods of linking: 1) the early web, when links were editorial, like footnotes and citations, helping people navigate; 2) the era when the engines incorporated web page links as a value metric in their ranking algorithms (Google / AllTheWeb / Teoma); and 3) non-webpage citations. Google’s recent patent publication, reviewed by Eric Ward, supports this last point (see Eric’s point #5).
However, another point in the patent provides additional context: user interaction with links may determine their value. This may go a long way toward resolving the dichotomy between the points these articles circle, and it may also point search marketers toward clues for link-building strategies going forward.
Links from different areas of a document will have different value, and will pass different amounts of link juice. Obviously, a link located in the main body content that is relevant to both the origin and destination pages will score higher than an irrelevant link, or links in footers, template side navigation, and the like. Which really is the point: links as citations are the oldest form of linking, and they still carry the most value.
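To make the idea concrete, here is a minimal sketch of position-based link weighting. The weights and the scoring rule are invented for illustration; search engines do not publish theirs.

```python
# Hypothetical illustration: weighting a link by where it appears in a document.
# All weights below are assumptions made up for this sketch.

POSITION_WEIGHTS = {
    "body": 1.0,     # editorial citation in the main content
    "sidebar": 0.3,  # template side navigation
    "footer": 0.1,   # boilerplate links
}

def link_score(position: str, relevant: bool) -> float:
    """Score a link by page position and topical relevance (toy model)."""
    base = POSITION_WEIGHTS.get(position, 0.2)
    # Penalize links that are not topically relevant to both pages.
    return base if relevant else base * 0.25

# A relevant body link outscores an irrelevant footer link by a wide margin.
print(link_score("body", True))    # 1.0
print(link_score("footer", False)) # 0.025
```

However the engines actually weight these signals, the ordering is the point: an in-content, relevant citation beats a templated footer link.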
Bryan Eisenberg & Richard Zwicky at SES Toronto
Bryan Eisenberg, best-selling author of “Waiting for Your Cat to Bark?: Persuading Customers When They Ignore Marketing” and many other books, and I recently did a panel together at SES Toronto. After the panel, he interviewed me about meaningful metrics. It was a good panel, and a good interview.
Perhaps the most salient point from both the panel and the interview is that when a business is trying to understand and evaluate key metrics in online marketing, it needs to look across the entire value chain. The changing marketplace makes it very difficult for marketers to measure all channels equitably, balancing search, social media, email, newsletters, etc. Bryan highlighted the importance of cross-channel metrics, which I was able to substantiate with an example of a client who was struggling to find value in PPC after having invested in, and measured, only one channel. Upon examination, the client discovered that a significant portion of the business’s social and organic search traffic was preceded by visits from the paid channel, and that these multi-touch visits were converting and providing measurable results at a higher rate than single-visit traffic.
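The client example above can be sketched as a simple multi-touch check: of the visitors who converted, how many had an earlier paid-search touch? The data, field names, and channel labels below are invented for the sketch.

```python
# Toy multi-touch look at conversions: what fraction of converting visitors
# had an earlier paid-search (PPC) touch? All data here is made up.

visits = [
    # (visitor_id, visit_order, channel, converted)
    ("a", 1, "ppc", False), ("a", 2, "organic", True),
    ("b", 1, "organic", True),
    ("c", 1, "ppc", False), ("c", 2, "social", True),
]

touched_paid = {v for v, _, ch, _ in visits if ch == "ppc"}
conversions = [v for v, _, ch, conv in visits if conv]
assisted = [v for v in conversions if v in touched_paid]

print(f"{len(assisted)} of {len(conversions)} conversions were paid-assisted")
```

Measuring only the PPC channel in isolation would credit it with zero of these conversions, which is exactly the blind spot the client ran into.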
I hope you take the 5 minutes to listen to the interview, and feel free to send me any questions that you have as a consequence.
The Launch of Linker - SEO’s Version of Matchmaking
Ten days ago, after a long period of development followed by extensive Beta testing, we launched Linker to the public; anyone can now sign up.
In the early days of the Internet, before the search engines came along, navigation was driven by links which allowed people to jump from one place (document) to another. People used hyperlinks as authors use footnotes in reference texts. The purpose of the links was to provide citations, and to advise readers of other valuable resources which they ought to consult.
When the Internet began to become popular with the masses, marketers started using links as a means of traffic acquisition. I did this type of link building as far back as the mid-1990s, “Before Google,” for my own businesses. This marketing trend was actually followed by, not preceded by, the search engines recognizing the value of links and embedding a weighting for links into their algorithms. Of course, at first the search engines did things differently, valuing any old link as a positive score. When links were discovered to be a performance score in the search results, it was unsurprising that some marketers saw the opportunity and took advantage of it. What is surprising is how far from the fundamental reasons for link building the noise in the marketplace has taken this strategy, especially in light of how strongly the search engines have moved away from volume. Their mantra could be described as “back to the future” with regard to link values.
The reality is high value links matter. Period. A high quality link is one which readers will find of value. Generally, these links are found within the body content of a document and the link points out to content on another page or site which is relevant to your content. Your link is adding perspective for your readers, and also helping build both your sites’ authority in the search results.
Finding great resources to link to is not easy, however. There’s so much content; how do you know which resources are the best, and which are the most relevant to you and your readers? Just as importantly, how do you get in contact with representatives of the right sites, the ones which are appropriate matches for yours? This was the challenge I used to face as an online marketer, and the manual process I was explaining internally when we came up with the idea behind Linker.
I’ll be starting a series of posts shortly on the philosophy behind the Linker product, and how it came to be. The reality is, no one likes the spammy “link to me” emails that flood our inboxes. They really are useless junk. Links from, or to, these low-value, low-relevance domains won’t add to your site’s user experience, or add any value to your business in terms of visibility in the search engines. That said, everyone with any online marketing knowledge recognizes that link building is a formidable tool in any good marketing campaign. The point everyone needs to focus on is that good quality, relevant link building improves the user experience of your web site, and at the same time drives traffic via search engines and direct referrals. Addressing this need is why we built Linker, a context- and relevance-driven introduction system which people have labeled a dating service for online marketers.
Google Search Update: Ranking Report Really is Dead (finally)
This week I had the pleasure of moderating and speaking at SES Chicago. It was probably my favorite Chicago show yet. What a change from last year when everyone was nervous about how deep the economy would slide into chaos.
One subject that did create some buzz - no surprise - was Google’s announcement of always-on personalized search. There’s been lots written about it, and the change truly is spectacular. Unfortunately, spectacular doesn’t always equate to good.
Rather than dwell on all the questionable issues that the always-on personalized search system raises, I’m going to comment about something that’s actually good in this update: The death of the ranking report. Finally! Finally, rankings are totally meaningless as a reporting metric. Ranking reports which scrape results to identify a position in the search results have been deceptive for years, but now they are unquestionably and completely useless. Anyone providing a ranking report as authoritative is deceiving their clients.
In a way, I am thrilled with Google’s personalization changes, as they make the performance reporting in Enquisite Optimizer even more valuable. It is now the only real way to measure true page and rank positioning. Optimizer shows where people located anywhere in the world are finding your site in the results, based on actual click-through activity, not a bogus ranking report. It is the only analytical platform which reports back to you on what your customers are actually seeing in the search results.
People who use traditional ranking reports as a reporting metric are no longer able to report any meaningful data. First off, the data collected are unique to that computer. Second, other activity from that computer affects the results. Run just one site’s reports from a system? Do anything else with it? Anything you search for with that computer can now affect the results you’re seeing. Wait until Caffeine rolls out, and anything you do with that computer will cause variations. Use Google Docs, Gmail, or any other Google products? Your results will vary.
So how can any ranking report, based on one or even 100 computers repeatedly running ranking analysis, be accurate? It can’t. The ranking report you used to use as a metric is dead.
If, as a user, you’re not comfortable with the new personalized search “benefit,” just wait for Caffeine to roll out in full next year. Me? I’ve already changed my default search engine in Firefox to Bing. Strangely, I’m not concerned about how responsibly Microsoft will handle my information.
Does Depth of Referral Affect Quality of Visit?
Yesterday I published data around click through rates from the search results. That data shows that 95% of all search referrals now arrive from page 1 in the search results. The number is higher in paid, and slightly lower in organic search, but 5% for everything not on page 1 doesn’t leave a lot of room for any other positioning.
I thought it would be interesting to start comparing that data against quality of visit, from the perspective of engagement. A longer time on site and / or more pages viewed should give a good indication of engagement. What I found was quite surprising. You would think that a searcher who is going to bother to drill deeper into the search results would be more motivated to find the right information, and thus would stay engaged in a destination site longer. In fact, the opposite is true. As people drill deeper into the results they become less patient.
The information shown demonstrates how there is a relationship between where in the search results people click, and the quality of their visit to your business. In this case longer time on site and more pages viewed would indicate a better quality of visitor. Counter-intuitively, it’s not the people who drill deeper in the search results that are showing the greatest satisfaction when they land on a destination site, it’s the visitors from page one:
| Referrals from Page # | Pages Viewed (average) | Time on Site (mm:ss) |
|---|---|---|
| 1 | 3.59 | 2:27 |
| 2 | 2.16 | 1:06 |
| 3 | 2.12 | 1:01 |
| 4 | 2.08 | 0:57 |
| 5 | 2.05 | 0:55 |
What this data demonstrates is that visitors from page one in the SERPs are, on average, spending twice as much time and viewing almost twice as many pages on the web sites they visit as visitors who arrive from clicking deeper within the results pages.
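A quick sanity check of that claim, directly from the table’s numbers (times converted to seconds):

```python
# Engagement by referring SERP page, using the figures from the table above.
# (pages viewed, time on site in seconds)
data = {1: (3.59, 147), 2: (2.16, 66), 3: (2.12, 61), 4: (2.08, 57), 5: (2.05, 55)}

deeper = [data[p] for p in (2, 3, 4, 5)]
deep_pages = sum(pages for pages, _ in deeper) / len(deeper)
deep_secs = sum(secs for _, secs in deeper) / len(deeper)

print(round(data[1][0] / deep_pages, 2))  # 1.71 - pages viewed, page 1 vs. deeper
print(round(data[1][1] / deep_secs, 2))   # 2.46 - time on site, page 1 vs. deeper
```

Page-one visitors view about 1.7x as many pages and spend about 2.5x as long on site as visitors arriving from pages two through five, which matches the “twice as much time, almost twice as many pages” summary.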
Not only is page one more valuable in terms of the amount of traffic, but also in quality. When viewed graphically, the similarity between the two metrics is stunning: time on site v. the referring page number in the search results tracks pages viewed v. the referring page number almost exactly.
This less patient user behavior is also reflected in how people search, with longer and longer queries. I published data a few weeks ago on how many words appear in a typical referring query. What I found was that while people might start with one-word queries, they quickly move to longer, more specific requests. In the next few weeks I’ll expand on that post with some page view and time on site behavioral metrics as well.
As always, Enquisite collects data from a network of web sites distributed globally. The data used in this report represents those sites and reflects click-through activity.
Search Engine Market Share Update
Greetings from Search Engine Strategies Berlin!
This week, our weekly trend data of search engine market share, as defined by click-through activity, shows Bing regaining its forward momentum after a slight slip last week. Looking at the last four weeks, however, Bing seems to be hovering quite steadily around the 7.7% market share mark. Over the next few weeks we should be able to see whether this holds as a normal position, or whether Bing recovers its forward momentum.
It should be interesting to observe what happens this week. Each year we see a big drop in search referral traffic associated with the week of the American Thanksgiving Holiday. Will all the engines drop the same proportionate amount, or will Google’s traditional strength in the IT and student marketplace result in a larger drop in market share for the week? Next week I’ll try and put together a chart showing how search volume drops in the run-up to the Holiday, and also how it bounces back.
As always, we’re providing the data in weekly breakdowns to try to identify trends in very granular ways. This data reflects actual click-through activity, not the number of queries run. That is, if someone performs a search on Yahoo but doesn’t click through to a result, we don’t count it; we only track searches which generated referrals.
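In code, that methodology amounts to counting referrers, not queries. A minimal sketch, assuming a list of referrer URLs from site logs (the sample data and the hostname-matching rule are invented for illustration):

```python
# Sketch of click-through-based market share: count only the searches that
# actually produced a referral, grouped by engine. Sample data is made up.
from collections import Counter
from urllib.parse import urlparse

referrers = [
    "https://www.google.com/search?q=widgets",
    "https://search.yahoo.com/search?p=widgets",
    "https://www.bing.com/search?q=widgets",
    "https://www.google.com/search?q=gadgets",
]

def engine(referrer: str) -> str:
    """Classify a referrer URL by search engine (naive hostname match)."""
    host = urlparse(referrer).hostname or ""
    for name in ("google", "yahoo", "bing"):
        if name in host:
            return name
    return "other"

counts = Counter(engine(r) for r in referrers)
share = {e: n / len(referrers) for e, n in counts.items()}
print(share)  # {'google': 0.5, 'yahoo': 0.25, 'bing': 0.25}
```

A query run on Yahoo with no click never appears in a site’s referrer logs, so it never enters the denominator, which is exactly the distinction described above.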
The raw data for those who prefer the numbers, not the graphics:
| Week of | Google | Yahoo | Bing | Other |
|---|---|---|---|---|
| September 7 | 78.68% | 11.51% | 6.80% | 3.01% |
| September 14 | 78.35% | 11.13% | 6.50% | 4.02% |
| September 21 | 77.43% | 11.35% | 7.11% | 4.11% |
| September 28 | 77.65% | 10.80% | 7.27% | 4.28% |
| October 4 | 77.78% | 10.66% | 7.23% | 4.33% |
| October 12 | 77.78% | 10.66% | 7.21% | 4.35% |
| October 18 | 77.89% | 10.65% | 7.29% | 4.17% |
| October 25 | 77.83% | 10.56% | 7.56% | 4.05% |
| November 1 | 77.75% | 10.46% | 7.66% | 4.12% |
| November 8 | 77.96% | 10.21% | 7.75% | 4.08% |
| November 15 | 77.60% | 10.39% | 7.59% | 4.42% |
| November 22 | 77.59% | 10.41% | 7.67% | 4.37% |
Enquisite collects data from a network of web sites distributed globally. The data used in this report represents those sites, accessed by searchers located in the U.S., and reflects click-through activity.
Should You Consider “Author Authority”?
August 6th, 2010 by Richard Zwicky
Search marketers are familiar with signals. One of the truisms goes: if no one links to your site, it can’t be considered important, so why should it appear in the search results? The more quality links referencing your website or web pages, the better.
Many signals and factors behind links can affect the quality, relevance, and value of these citations. Perhaps there’s another signal to consider: the author.
If you’re interested in learning more, earlier today Search Engine Watch published an article I submitted on the topic of Author Authority. The idea came to me while reading a recent patent which was issued and assigned to Google. I’d love to get your thoughts and feedback!
Thanks!
Richard / @rzwicky