September 2008

By Berin Szoka & Adam Thierer
Progress Snapshot 4.19 (PDF)

Since the fall of 2007, a debate has raged in Washington over “targeted online advertising,” an ominous-sounding shorthand for the customization of Internet ads to match the interests of users. Not only are these ads more relevant, and therefore less annoying, to Internet users than untargeted ads, they are also more cost-effective for advertisers and more profitable for the websites that sell ad space. While such “smarter” online advertising scares some, prompting comparisons to a corporate “Big Brother” spying on Internet users, it is also expected to fuel the rapid growth of Internet advertising revenues from $21.7 billion in 2007 to $50.3 billion in 2011, an annual growth rate of more than 24%. Since this growing revenue stream ultimately funds the free content and services that Internet users increasingly take for granted, policymakers should think very carefully about what’s really best for consumers before rushing to regulate an industry that has thrived for over a decade under a layered approach that combines technological “self-help” by privacy-wary consumers, consumer education, industry self-regulation, existing state privacy tort laws, and Federal Trade Commission (FTC) enforcement of corporate privacy policies.

In an upcoming PFF Special Report, we will address the many technical, economic, and legal aspects of this complicated policy issue, especially the possibility that regulation may unintentionally thwart market responses to the growing phenomenon of users blocking online ads.

We will also issue a three-part challenge to those who call for regulation of online advertising practices:

  1. Identify the harm or market failure that requires government intervention.
  2. Prove that there is no less restrictive alternative to regulation.
  3. Explain how the benefits of regulation outweigh its costs.

Continue reading →

Comcast v. FCC: Now what?

by on September 24, 2008 · 11 comments

A divided FCC recently issued an order concluding that Comcast acted discriminatorily and arbitrarily to squelch the dynamic benefits of an open and accessible Internet, and that its failure to disclose its practices to its customers has compounded the harm. The commission does get a bit excited sometimes.  Anyway, the FCC required Comcast to end its network management practices and submit a compliance plan.

Richard Bennett reviews the Comcast “protocol agnostic” network management plan requested by the FCC:

[T]he new system will not look at any headers, and will simply be triggered by the volume of traffic each user puts on the network and the overall congestion state of the network segment. If the segment goes over 70% utilization in the upload direction for a fifteen-minute sample period, congestion management will take effect.

In the management state, traffic volume measurement will determine which users are causing the near-congestion, and only those using high amounts of bandwidth will be managed. The way they’re going to be managed is going to raise some eyebrows, but it’s perfectly consistent with the FCC’s order. High-traffic users – those who’ve used over 70% of their account’s limit for the last fifteen minutes – will have all of their traffic de-prioritized for the next fifteen minutes. While de-prioritized, they still have access to the network, but only after the conforming users have transmitted their packets. So instead of bidding on the first 70% of network bandwidth, they’ll essentially bid on the 30% that remains. This will be a bummer for people who are banging out files as fast as they can only to have a Skype call come in. Even if they stop BitTorrent, the first fifteen minutes of Skyping are going to be rough.
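Read as an algorithm, the scheme Bennett describes reduces to two threshold checks over rolling fifteen-minute windows: one on the segment, one on each user. The Python sketch below is only my illustration of that description, not Comcast’s actual code; the function and variable names are invented, and the 70% triggers and fifteen-minute window are taken straight from the passage above.

    # Illustrative sketch (not Comcast's implementation) of the protocol-agnostic
    # scheme described above: no header inspection, just per-segment utilization
    # and per-user volume counters measured over fifteen-minute windows.

    SEGMENT_TRIGGER = 0.70   # upstream utilization that puts a segment into management
    USER_TRIGGER = 0.70      # share of a user's provisioned rate that marks a heavy user
    WINDOW_MINUTES = 15      # sample period for both measurements

    def users_to_deprioritize(segment_utilization, user_shares):
        """Return the users whose traffic is de-prioritized for the next window.

        segment_utilization: fraction of upstream capacity used in the last window.
        user_shares: dict of user id -> fraction of that user's provisioned rate
                     consumed over the same window.
        """
        if segment_utilization <= SEGMENT_TRIGGER:
            return set()  # segment not congested; nobody is touched
        # Heavy users keep network access, but their packets are serviced only
        # after conforming users' packets during the next window.
        return {uid for uid, share in user_shares.items() if share > USER_TRIGGER}

    # Example: a segment at 82% upstream utilization with three subscribers.
    print(users_to_deprioritize(0.82, {"alice": 0.95, "bob": 0.40, "carol": 0.72}))
    # alice and carol are de-prioritized for the next fifteen minutes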

Aside from filing a compliance plan, Comcast is also filing suit. For one thing, Commissioner Robert McDowell claims that “the FCC does not know what Comcast did or did not do. The evidence in the record is thin and conflicting.” Ouch.

Yes, there could be years of litigation. Continue reading →

Yours truly shows up in a good story on surveillance cameras on the Christian Broadcasting Network today. Watching the whole thing, I was impressed by the sophistication of the host, who observed in the discussion segment: “We’re giving up so much privacy in order to obtain the illusion of security.”

[The following post discusses a matter of public interest and people who have brought public attention upon themselves. It contains only expressions of opinion and recitations of facts that I believe in good faith to be true. It should be clear, and I urge you to be clear, that the only service provider discussed in the post, involved in authoring the post, or involved in publishing the post is the law firm Jones Day and its attorneys. Let there be no confusing that this is all about Jones Day. Since it’s a post critical of Jones Day, there should be no implication to anyone that Jones Day could possibly endorse the content of this post or this blog. I say all this so that any nitwit attorney who thinks this blog post gives him or her a cause of action will be on notice that nothing said here violates trademark law, everything here is protected speech under the First Amendment, and that no cause of action can possibly lie against any person for this publication.]

It’s fascinating sometimes to see lawyers with an abundance of power and no sense of judgment – especially big-firm lawyers who shouldn’t exhibit their poor judgment and ignorance of basic legal doctrine to current and prospective clients.

But the law firm of Jones Day has some lawyers working for it who really don’t seem to have a clue about trademark law. I never practiced trademark law, and I seem to know more about it than they do.

The case is well described on the Citizen Media Law Project Web site.
Continue reading →

Forget net neutrality and the growing Googleplex. The real threat to Internet freedom comes from plain old criminal law.

In three weeks’ time, Missouri housewife Lori Drew will face trial for entering false personal details when she signed up for a MySpace account. Her indictment alone, whether or not she is convicted, should frighten anyone who’s ever filled out a form online.

The case, which captured the tabloid media when it broke last year, turns on unusual facts. Drew, posting as a teenage boy, created the MySpace account to probe why a neighbor’s daughter, Megan Meier, had broken off a friendship with her own daughter. She gave a few others access to the account, and things quickly spiraled out of control. Before long, “Josh Evans” (the fictional teen) and Meier were an online couple, and soon after that, they were hurling insults at one another on public message boards.

Meier, already suffering from depression, was devastated by Josh’s turnabout. A final private message from the Evans account–“The world would be a better place without you”–pushed her over the edge. Twenty minutes after receiving it, Meier hanged herself in her closet.

Even though she was not responsible for the worst of the messages (according to a prosecutor who investigated the case but declined to file charges), Lori Drew misled an emotionally troubled youth, and that was surely wrong.

But it’s more problematic to say that it’s a crime.

The theory of the prosecutor behind this case would make all Internet users criminals. Continue reading →

Veoh Considered

by on September 22, 2008 · 8 comments

I reviewed the Veoh case for DRMWatch recently:

The user-generated video site Veoh achieved a victory in court on August 27th when California District Judge Howard Lloyd ruled that it was entitled to the protection of the DMCA’s safe harbor provisions. Veoh was accused of copyright infringement by IO Group, a maker of adult films…

Like eBay v. Tiffany, another case in which one might trumpet a tech-side win… the tech gets at least some protection from liability. But only in a context in which the tech is already taking substantial steps to help the plaintiff trademark/copyright owner with their enforcement problem, steps that would have been hard to conceive of a decade ago, and that many would have grandly declared to be too ambitious and too invasive for online services to attempt. Prediction: the case law is now much more mature, but the business side is just getting started. More and fancier filtering to come.

It’s funny and scary how many of our grand ideas about justice, rights, freedom, fairness and property come down to what we can become accustomed to. Bad, in the sense that one can easily lose the customary baselines against which freedom is measured in a generation or so. Good, in the sense that one is not limited to identifying freedom with just one mythical, historic Golden Age; a free society has somewhat more leeway.

I’m fond of paradoxes these days. Tedious things. Almost as annoying to other people, I am sure, as those characters (you know who you are) who make puns all the time.

Boynton Beach, Florida’s experiment with municipal wi-fi has ended.  [Add it to the list of recent failures]. According to the South Florida Sun-Sentinel:

There’s a roadblock in Boynton Beach’s information superhighway. The city’s Community Redevelopment Agency decided this month it has no more money for free wireless Internet service in its district. Boynton Beach was the first city in Palm Beach County to offer Wi-Fi three years ago. It operated 11 “hot spots,” or access points, paying $44,000 annually for vendors to keep the system running. But the CRA dropped vendors who failed to meet their contracts. Other companies wanted to sell the Community Redevelopment Agency new equipment, but in a tough budget year, offering free wireless was no longer viable, said the agency’s executive director, Lisa Bright. […] “There is clearly no way for it to be a revenue generator at this time,” Bright said. “It’s premature for us to go to the next level.”

Whenever I read one of these articles about the small town or mid-sized town wi-fi experiments failing so miserably I have to admit that I am a bit surprised. After all, many muni wi-fi supporters have argued that it is precisely in those communities where government support is most necessary and will be most likely to fill in gaps left by sporadic / delayed private broadband deployment. Frankly, I always thought this was the best argument for muni wi-fi and it’s why I made sure to never go on record as opposing all government efforts, even though I am obviously a skeptic and don’t like the idea of wagering taxpayer money on such risky ventures. (By contrast, I could just never see the reason for government subsidies of wi-fi ventures in major metro areas with existing private broadband operators, like Philly and Chicago.)

But the fact that many small town or mid-sized town wi-fi experiments are failing is really interesting because it must tell us something about either (a) the viability of the technology or (b) demand for such service. Now, many municipalization believers will just say that clearly (a) is the case and argue that we just need to wait for Wi-Max solutions to come online and then all will be fine. It certainly may be the case that Wi-Max will help boost coverage in low density areas, but is that really the end of the story? What about demand? What really makes me mad when I read most of these stories about current failed experiments is that they rarely give us any solid numbers about how many people utilized the services. To the extent any journalists or analysts are out there contemplating a story or study on this issue, I beg you to dig into the demand side of the equation and try to find out how much of the current muni wi-fi failure is due to technology and how much is due to demand, or lack thereof. Of course, government mismanagement could also be a culprit. But I suspect there is far less demand for these services than supporters have estimated.

Sorry if it seems like I am beating a dead horse here, but the folks at the City Journal asked me to pen a review of Jonathan Zittrain’s new book, The Future of the Internet and How to Stop It. Faithful readers here will no doubt remember that I have already penned a review of the book and several follow-up essays. (Part 1, 2, 3, 4). I swear I am not picking on Jonathan, but his book is probably the most important technology policy book of the year–Nick Carr’s Big Switch would be a close second–and deserves attention. Specifically, I think it deserves attention because I believe that Jonathan’s provocative thesis is wildly out of touch with reality. As I state in the City Journal review of his book:

[C]ontrary to what Zittrain would have us believe, reports of the Internet’s death have been greatly exaggerated. […] Not only is the Net not dying, but there are signs that digital generativity and online openness are thriving as never before. […]

Essentially, Zittrain creates a false choice regarding the digital future we face. He doesn’t seem to believe that a hybrid future is possible or desirable. In reality, however, we can have a world full of some tethered appliances or even semi-closed networks that also includes generative gadgets and open networks. After all, millions of us love our iPhones and TiVos, but we also take full advantage of the countless other open networks and devices at our disposal. […]

Continue reading →

The ‘D’ Word?

by on September 19, 2008 · 13 comments

Barack Obama argues that John McCain “hurt everyday workers with his longtime support for deregulation,” according to Politico.

Thomas Frank adds,

There is simply no way to blame [the failure of several large financial institutions], as Republicans used to do, on labor unions or over-regulation. No, this is the conservatives’ beloved financial system doing what comes naturally. Freed from the intrusive meddling of government, just as generations of supply-siders and entrepreneurial exuberants demanded it be, the American financial establishment has proceeded to cheat and deceive and beggar itself — and us — to the edge of Armageddon. It is as though Wall Street was run by a troupe of historical re-enactors determined to stage all the classic panics of the 19th century.

But as Steve Forbes points out, the “easy-money” policy of the Federal Reserve helped financial institutions pile up debt and bad assets.

 According to former FDIC Chairman William M. Isaac,

The biggest culprit is a change in our accounting rules that the Financial Accounting Standards Board and the SEC put into place over the past 15 years: Fair Value Accounting. Fair Value Accounting dictates that financial institutions holding financial instruments available for sale (such as mortgage-backed securities) must mark those assets to market. That sounds reasonable. But what do we do when the already thin market for those assets freezes up and only a handful of transactions occur at extremely depressed prices?

The answer to date from the SEC, FASB, bank regulators and the Treasury has been (more or less) “mark the assets to market even though there is no meaningful market.” The accounting profession, scarred by decades of costly litigation, just keeps marking down the assets as fast as it can.

This is contrary to everything we know about bank regulation. When there are temporary impairments of asset values due to economic and marketplace events, regulators must give institutions an opportunity to survive the temporary impairment. Assets should not be marked to unrealistic fire-sale prices. Regulators must evaluate the assets on the basis of their true economic value (a discounted cash-flow analysis).

If we had followed today’s approach during the 1980s, we would have nationalized all of the major banks in the country and thousands of additional banks and thrifts would have failed. I have little doubt that the country would have gone from a serious recession into a depression.
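Isaac’s alternative, valuing the assets on their expected cash flows rather than at the last fire-sale trade, is ordinary present-value arithmetic. The short sketch below uses purely hypothetical numbers and simply illustrates how far the two approaches can diverge when the quoted market reflects a handful of distressed trades; it is not a claim about any actual security.

    # Toy comparison (hypothetical numbers) of a discounted cash-flow valuation,
    # the "true economic value" in Isaac's terms, against a mark-to-market
    # write-down to a fire-sale price observed in a frozen market.

    def discounted_cash_flow(cash_flows, discount_rate):
        """Present value of a stream of expected annual cash flows."""
        return sum(cf / (1 + discount_rate) ** t
                   for t, cf in enumerate(cash_flows, start=1))

    # A security expected to pay $12 a year for ten years, discounted at 8%.
    economic_value = discounted_cash_flow([12.0] * 10, discount_rate=0.08)
    fire_sale_price = 35.0  # the only observable trade in a thin, frozen market

    print(round(economic_value, 2))  # about 80.52 if held at economic value
    print(fire_sale_price)           # 35.0 if marked to the fire-sale price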

Easy money and mark-to-market are not deregulatory policies. They are examples of government intervention with unfortunate consequences.

The nature of unfortunate consequences is always unpredictable; the inevitability of unfortunate consequences, never so.

Easy money was supposed to speed the transition from the dotcom and telecom bubbles back to prosperity, and mark-to-market was adopted so that we would not have to suffer from similar speculative bubbles in the future. Yet here we have another burst speculative bubble.

According to Frank,

Thanks to the party of Romney and McCain, federal work is today so financially unattractive to top talent that it might as well be charity work. It’s one of the main reasons — other than outright conquest by the industries they’re supposed to be overseeing — that our regulatory agencies can’t seem to get out of bed in the morning.

France attracts its best and brightest to government service, but most of us don’t want to be like France — at least not in all respects. Although it is hard to fail in France, it is also hard to succeed.  

Maybe blaming the regulators is just a case of blaming the messenger. Perhaps the problem isn’t the regulators; it is regulation itself.

Although regulation always seems brilliant in theory, it usually fails in practice. Either it doesn’t work, or it spawns corruption, or both. Or it backfires, as it did here.

Broadband Prices Drop

by on September 19, 2008 · 10 comments

I expected to see more reaction to the Wall Street Journal’s recent observation of a surprising shakeup in the broadband industry. Vishesh Kumar reported that

Verizon Communications Inc., which last quarter became the first company ever to see a drop in DSL subscribers — some of whom went to its faster FiOS service — is now offering customers six months of DSL service free if they sign up for the company’s phone and Internet package. That makes the bundled package $45 a month, vs. $65 prior to the offer. AT&T Inc., meanwhile, is now guaranteeing its current prices, ranging from $20 to $55 a month, for two years.

I cite this because I always claim that less regulation of a highly-regulated industry promotes competition, consumer choice and ultimately lower prices. Occasionally someone claims that prices do not appear to be falling. And depending on the point in time they may be right. Of course, if you don’t have to lower prices to attract and retain customers you won’t. But good times never last forever.

Until the second quarter of this year, the cable and telephone industries were adding roughly equal numbers of broadband accounts. Then something changed, and the cable companies are now signing up three-quarters of new customers. Maybe the marketing efforts of some of the companies are better than others, or maybe the phone companies’ main broadband product, DSL, can no longer compete on speed, quality and/or features.

In any event, when all else fails, you have to slash your prices.