Why It's So Hard to Measure Online Readership

What's the right way to measure an audience online—clicks, readers, time-spent, or shares?

Name a metric, any metric, for measuring audience attention, and there is (a) a reason why it's useful; (b) a reason why it's worthless; and (c) a way for digital media companies to corrupt it.
Page views (i.e., clicks) used to be the most common currency of online attention, only to be replaced by unique visitors (i.e., readers). But in the viral age, "readers" doesn't mean what it used to mean. If you were a newspaper in the 1980s, readers meant subscribers who received your bundle of paper each morning. In a bookmarked-site world, readers meant Web visitors who dropped by a few days a month at most. But in the viral age, readers can mean 1 million Facebook users who saw a sensational headline, clicked it, and scurried away with no recollection of which URL hosted the article, never to visit the site again.
A reader used to be a person. Now it's a spectrum. There are dedicated readers on one end, Tsetse-fly-brained Facebook browsers on the other, and fully engaged one-time readers in the middle. Surely they shouldn't count equally to an advertiser seeking a consistently engaged and knowable audience.
That's why some digital companies are trying to treat the Internet less like a newspaper and more like TV. YouTube now measures "Time Watched." Medium counts "Total Time Reading." Chartbeat tracks "Average Engaged Time." And now Upworthy, the viral firehose of the Web, has announced that it's developed a new metric. "Attention minutes" seeks to measure time spent watching or reading an Upworthy page by studying "length of time a browser tab has been open, how long a video player has been running, and the movement of the mouse on screen," according to Nieman Lab.
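The signals Nieman Lab describes suggest a heuristic along these lines. This is a minimal sketch, not Upworthy's actual implementation: the signal names and the five-second activity window are assumptions.

```python
# A sketch of an "attention minutes"-style heuristic, built from the signals
# Nieman Lab describes: tab open, video playing, mouse movement.
# The five-second activity window is an assumption, not Upworthy's actual rule.

ACTTIVITY_WINDOW = 5  # seconds one activity signal counts as "engaged"
ACTIVITY_WINDOW = 5

def attention_seconds(events, session_end):
    """Estimate engaged time from (timestamp, signal) events.

    `signal` is e.g. "mousemove" or "video_playing"; each event marks the
    reader as engaged for the next ACTIVITY_WINDOW seconds. Overlapping
    windows are merged so a burst of mouse movement isn't double-counted.
    """
    engaged = 0
    covered_until = 0  # end of the last engagement window already counted
    for timestamp, _signal in sorted(events):
        window_end = min(timestamp + ACTIVITY_WINDOW, session_end)
        start = max(timestamp, covered_until)
        if window_end > start:
            engaged += window_end - start
            covered_until = window_end
    return engaged

# A reader who wiggles the mouse at t=0 and t=3, then leaves the tab open
# idle until t=60, gets credit for 8 engaged seconds, not 60.
print(attention_seconds([(0, "mousemove"), (3, "mousemove")], session_end=60))
```

The point of the merged windows is exactly the distinction the metric is after: an open tab alone earns nothing, and only moments of demonstrated activity accumulate.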
The reflexive position among journalists regarding all things Upworthy is wry derision, and true to form, here was Felix Salmon's take on "attention minutes":
"If I sit slack-jawed watching a video for 4 mins rather than 2 [minutes], that’s not 2X the amount of attention paid to the subject," Salmon continued on Twitter. "When people share Upworthy videos, they almost never write anything substantive about the issue, or the video."
The deeper lesson is that just about any conceivable metric carries not only virtues and limitations, but also a temptation for digital media companies to corrupt it. Let's quickly review some of them.
Uniques: Unique visitors are a good metric because they measure monthly readers, not just meaningless clicks. They're a bad metric because they count people rather than meaningful engagement. Facebook viral hits, for example, now account for a large share of traffic at many sites. There are one-and-done nibblers on the Web and there are loyal readers, and monthly unique visitors can't tell you the difference.
Page Views: They're good because they measure clicks, an indication of engagement that unique visitors doesn't capture (e.g., a blog with loyal readers will have a higher ratio of page views to visitors, since the same people keep coming back). They're bad because the same feature makes them easy to corrupt. A 25-page slideshow of the best cities for college graduates can generate up to 25X more views than a one-page article with all the same information. If ads reload on each page of the slideshow, the PV metric says the slideshow is 25X more valuable. But that's ludicrous.
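The inflation is easy to sketch with hypothetical numbers, assuming every visitor clicks through every page and an ad reloads on each one:

```python
# Hypothetical illustration of slideshow inflation: the same 1,000 readers,
# the same information, very different page-view counts.
def page_views(visitors, pages_per_article):
    # Each visitor clicks through every page, and ads reload on each page.
    return visitors * pages_per_article

article_pv = page_views(1000, 1)     # one-page article: 1,000 page views
slideshow_pv = page_views(1000, 25)  # 25-page slideshow: 25,000 page views

print(slideshow_pv // article_pv)  # the slideshow looks 25X more "valuable"
```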
Time Spent/Attention Minutes: Page views and uniques tell you an important but incomplete fact: The article page loaded. They don't tell you what happened after the page loaded. Did the reader click away? Did he stay for 20 minutes? Did he open the browser tab and never read the story? These would be nice things to know, and measures like attention minutes can begin to tell us. But, as Salmon points out, they still don't paint a complete picture. Watching a five-minute video and deciding it was stupid seems less valuable than watching a one-minute video that you share with friends and praise. Page views matter, and time spent matters, but reaction matters, too. This suggests two more metrics ...
Shares and Mentions: "Shares" (on Facebook, Twitter, LinkedIn, or Google+) ostensibly tell you something that neither PVs, nor uniques, nor attention minutes can: Visitors aren't just visiting. They're taking action. But what sort of action? A bad column will get passed around on Twitter for a round of mockery. An embarrassing article can go viral on Facebook. Shares and mentions can communicate the magnitude of an article's attention, but they can't always tell you the direction of the share vector: Did people share it because they loved it, or because they loved hating it?
Measuring digital readers is technically easier than measuring newspaper readers or TV viewers. We can see where they are, page by page and tab by tab, and we can track what they're sharing and where they're staring. Theoretically, this makes it easier for digital publishers to "know" their audience. But as websites get in on the viral game, where articles attract massive traffic from an audience of Facebook grazers, that shift calls into question whether the old-fashioned metrics are really worth maximizing. No matter what metric we settle on, there will be reasons to doubt it and editors to manipulate it.