I’ve always viewed web traffic numbers with great suspicion, if for no other reason than they are all over the board. But the amazing Carl Bialik, the Wall Street Journal’s “numbers guy,” does us another great service today in his latest column, “The Trouble With Web-Traffic Numbers,” by walking us through exactly how big a mess these numbers really are. Carl is the closest thing we have to a statistical ombudsman for the Internet, repeatedly illustrating in his columns how numbers can deceive and distort.
In terms of bogus web traffic numbers, there’s plenty of distortion going on. He quotes Erin Pettigrew, marketing director for Gawker Media, as saying that “For an industry that relies so heavily on accurate data and numerical accountability, relying on an estimate is embarrassing, antiquated.” Too true. Of course, with so many people routinely deleting their cookies and accessing websites from multiple machines, it’s no surprise the numbers are such a jumble: every deleted cookie or new device makes the same person look like a brand-new “unique visitor.”
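To see concretely how cookie churn inflates the count, here is a minimal Python simulation (all figures are hypothetical, not drawn from Carl’s column): it counts “unique visitors” the way a cookie-based tracker would, where each cookie deletion makes the same person register as a new visitor.

```python
import random

ACTUAL_USERS = 1_000          # real people in the audience (hypothetical)
VISITS_PER_USER = 10          # visits each person makes (hypothetical)
COOKIE_DELETE_RATE = 0.3      # chance a user clears cookies after a visit

random.seed(42)
cookies_seen = set()          # what a cookie-based tracker calls "uniques"
next_cookie_id = 0

for _ in range(ACTUAL_USERS):
    cookie = None
    for _ in range(VISITS_PER_USER):
        if cookie is None:
            # First visit, or cookies were just cleared: the tracker issues
            # a fresh cookie and counts a brand-new "unique visitor."
            cookie = next_cookie_id
            next_cookie_id += 1
        cookies_seen.add(cookie)
        if random.random() < COOKIE_DELETE_RATE:
            cookie = None     # user clears cookies; the next visit looks new

print(f"actual people:              {ACTUAL_USERS}")
print(f"measured 'unique visitors': {len(cookies_seen)}")
```

With these made-up numbers, the tracker reports roughly three to four “unique visitors” for every real person, which is exactly the kind of gap between measured and actual audiences that makes these figures so hard to trust.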
One reason it’s so important to improve web traffic metrics is that accurate measurement is essential to the advertising business, which powers the web and all the great content and services we consume online. More accurate traffic metrics can help better direct and target ads across the web. But it won’t be easy.
Anyway, read Carl’s piece for all the details. And thank you, Carl, for always reminding us that there are “lies, damned lies, and statistics.”