Mark Glaser at MediaShift has a great two-part series on measuring Web traffic. It’s well worth reading for publishers, advertisers and anyone else who cares about how traffic is measured.
There are a number of different ways firms measure traffic:
- Panels recruited for the purpose, with special usage-monitoring software installed on each panelist’s computer. This is similar to the way advertisers and networks measure television viewers, and it tends to underrepresent small or highly local sites. Nielsen and comScore rely on panels.
- Buying ISP logs. This tends to undercount at-work usage, since office traffic typically flows through corporate networks rather than the consumer ISPs whose logs are for sale.
- Direct measurement by each site, using a tracking pixel or server logs. This is the most accurate method, though still not perfect. Advertisers tend to distrust these numbers because they come from the publishers, who have an incentive to inflate them.
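To make the server-log variant of direct measurement concrete, here is a minimal sketch in Python. It parses Apache-style "combined" log lines and counts successful page views, skipping obvious bot user agents. The bot list is a hypothetical placeholder; real measurement tools maintain far larger filter databases.

```python
import re

# Apache "combined" log format, a common input for server-side measurement.
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

# Hypothetical, minimal bot markers; real filters are much more thorough.
BOT_MARKERS = ("googlebot", "bingbot", "spider", "crawler")

def count_page_views(lines):
    """Count successful GET requests that appear to come from humans."""
    views = 0
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # malformed line, skip
        if m["method"] != "GET" or not m["status"].startswith("2"):
            continue  # only count successful page fetches
        if any(b in m["agent"].lower() for b in BOT_MARKERS):
            continue  # drop known junk traffic
        views += 1
    return views
```

Even this toy version shows where the disagreements creep in: which status codes count, which requests are "pages," and which agents are junk are all judgment calls each vendor makes differently.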
This quote from Interactive Advertising Bureau SVP Sheryl Draizen sums up my view:
There’s this philosophy in the entire media industry that it’s always better to use independent third-party numbers, because they’re independent and don’t have a vested interest. I would argue that that’s not the case, because no one’s independent and everyone has a vested interest. We also have to change our thinking because we have a unique medium that could give us more accurate numbers than we have ever had before…I would challenge the agencies and marketers to stop thinking that the only numbers that are valid are coming from a third party. It’s just not the case in our industry. If there’s a certification process against those numbers, there’s no reason those numbers can’t stand.
If direct measurement offerings such as Omniture, Google Analytics, WebTrends and Quantcast could agree on definitions for visitors, page views, sessions, time on site and other key metrics, as well as which bots, spiders and other junk traffic to ignore, they could give panel-based measurement firms a run for their money. That’s not to say they shouldn’t also offer custom metrics beyond that common core, but advertisers want apples to compare.
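To illustrate why shared definitions matter, here is a sketch of one possible set of definitions, not an industry standard: a "visitor" is a distinct cookie ID, and a "session" is a run of hits from one visitor with gaps under 30 minutes. A vendor using a different timeout or ID scheme would report different numbers from the same traffic.

```python
from datetime import datetime, timedelta

# Assumed definition: a new session starts after a 30-minute gap.
SESSION_TIMEOUT = timedelta(minutes=30)

def summarize(hits):
    """hits: list of (visitor_id, datetime) pairs, in any order.

    Returns counts of visitors, page views, and sessions under the
    definitions assumed above.
    """
    page_views = len(hits)
    by_visitor = {}
    for vid, ts in hits:
        by_visitor.setdefault(vid, []).append(ts)
    sessions = 0
    for times in by_visitor.values():
        times.sort()
        sessions += 1  # first hit opens a session
        for prev, cur in zip(times, times[1:]):
            if cur - prev > SESSION_TIMEOUT:
                sessions += 1  # gap too long: a new session begins
    return {
        "visitors": len(by_visitor),
        "page_views": page_views,
        "sessions": sessions,
    }
```

Change `SESSION_TIMEOUT` and the session count shifts, which is exactly why a certification process against agreed definitions matters more than who runs the counter.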
Rocky,
A couple of points for discussion, but I’m broadly in agreement with your sentiments here.
I’m not sure the whole ‘panel vs census’ debate is hugely relevant now, as more and more of the leading WA (web analytics) providers can offer both tool sets in a single product (with varying degrees of success).
What this means is that marketers can take the rich demographic and offline data sets that panels provide and combine them with the deep onsite behavioural data that page tagging yields. If this is done in the same tool, we unlock the power of site-centric and visitor-centric metrics. When we blend this with the ability to measure activity across streaming events, Ajax events, social networks, forums, blogs et al., we get pretty close to delivering a robust ‘single view of the customer’.
Where bodies like ABCe, OJD, KIA, JICWEB, AOP, and the IAB (I’m in Europe) are massively helpful is in ensuring that the numbers reported meet the requirements of marketers, the paying customers in the majority of WA plays.
Analytics providers must continually innovate and develop their tools and services to keep pace with emerging business needs. Those that do this well will, by natural selection, be more likely to become the currency of reporting than those that fail. Those that succeed are then charged with feeding back into the industry bodies to ensure that the apples we count are the right apples.
Then we can get on with the interesting task of figuring out what the apples mean and how to get more for less.