Articles by Hance Haney

Hance Haney is Director and Senior Fellow of the Technology & Democracy Project at the Discovery Institute. Haney spent ten years as an aide to former Senator Bob Packwood (R-OR), and advised him in his capacity as chairman of the Senate Communications Subcommittee. He subsequently held various positions with the United States Telecom Association and Qwest Communications. He earned a BA in history from Willamette University and a JD from Lewis and Clark Law School in Portland, Oregon.


The Federal Communications Commission began a broad inquiry into intercarrier compensation in 2001, and now it may finally be getting around to acting on it on Nov. 4, while everyone’s thoughts are on something else.

This is about 12 years overdue. In 1996 Congress foresaw that implicit phone subsidies were unsustainable and ordered the FCC to replace them with a competitively neutral subsidy mechanism. Due to political pressure, regulators have failed to complete the job.

Intercarrier compensation refers to “access charges” for long-distance calls and “reciprocal compensation” for local calls. A long-distance carrier may be forced to pay a local carrier more than 30 cents per minute to deliver a long-distance call, but local carriers receive as little as $0.0007 per minute to deliver calls they receive from other local carriers.
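To put the disparity in perspective, a bit of illustrative arithmetic. The figures are the ones cited above; $0.0007 per minute is the commonly cited low end for reciprocal compensation:

```python
# Illustrative arithmetic only: compares the high-end access charge with
# the low-end reciprocal-compensation rate cited in the text.

ACCESS_CHARGE = 0.30      # dollars per minute, high-end access charge
RECIPROCAL_COMP = 0.0007  # dollars per minute, low-end reciprocal compensation

ratio = ACCESS_CHARGE / RECIPROCAL_COMP
print(f"An access charge can exceed reciprocal compensation by {ratio:.0f}x")
```

In other words, the regulated price for delivering essentially the same call can differ by a factor of roughly 400, depending only on how the call is classified.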

Once upon a time, before fiber optics, there were significant distance-related costs. Now distance isn’t a major factor.

The high access charges remain only because the recipients, typically small and mid-size phone companies serving sparsely populated areas, have successfully lobbied regulators and legislators to keep them.

Continue reading →

Comcast v. FCC: Now what?

by Hance Haney on September 24, 2008 · 11 comments

A divided FCC recently issued an order concluding that Comcast acted discriminatorily and arbitrarily to squelch the dynamic benefits of an open and accessible Internet, and that its failure to disclose its practices to its customers has compounded the harm. The commission does get a bit excited sometimes.  Anyway, the FCC required Comcast to end its network management practices and submit a compliance plan.

Richard Bennett reviews the Comcast “protocol agnostic” network management plan requested by the FCC:

[T]he new system will not look at any headers, and will simply be triggered by the volume of traffic each user puts on the network and the overall congestion state of the network segment. If the segment goes over 70% utilization in the upload direction for a fifteen-minute sample period, congestion management will take effect.

In the management state, traffic volume measurement will determine which users are causing the near-congestion, and only those using high amounts of bandwidth will be managed. The way they’re going to be managed is going to raise some eyebrows, but it’s perfectly consistent with the FCC’s order. High-traffic users – those who’ve used over 70% of their account’s limit for the last fifteen minutes – will have all of their traffic de-prioritized for the next fifteen minutes. While de-prioritized, they still have access to the network, but only after the conforming users have transmitted their packets. So instead of bidding on the first 70% of network bandwidth, they’ll essentially bid on the 30% that remains. This will be a bummer for people who are banging out files as fast as they can only to have a Skype call come in. Even if they stop BitTorrent, the first fifteen minutes of Skyping are going to be rough.
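The two-state scheme Bennett describes can be sketched in a few lines. This is purely illustrative; the thresholds and 15-minute windows come from the quote above, while the function and variable names are my own:

```python
# Sketch of the two-state congestion-management logic described above.
# Thresholds (70% segment utilization, 70% of a user's own limit, measured
# over 15-minute windows) come from the quoted description; everything
# else here is an illustrative assumption.

SEGMENT_THRESHOLD = 0.70  # segment upload utilization that triggers management
USER_THRESHOLD = 0.70     # share of a user's own limit that marks a heavy user

def users_to_deprioritize(segment_utilization, usage_by_user):
    """Return the users whose traffic is de-prioritized for the next window.

    segment_utilization -- fraction of segment upload capacity used over
        the last 15-minute sample period
    usage_by_user -- dict mapping user id to the fraction of that user's
        own account limit consumed over the same period
    """
    if segment_utilization <= SEGMENT_THRESHOLD:
        return set()  # segment not near congestion: nobody is managed
    return {user for user, frac in usage_by_user.items() if frac > USER_THRESHOLD}
```

During a de-prioritized window, the heavy user’s packets are still delivered, but only after conforming users’ packets — which is exactly why the first fifteen minutes of that Skype call would be rough.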

Aside from filing a compliance plan, Comcast is also filing suit. For one thing, Commissioner Robert McDowell claims that “the FCC does not know what Comcast did or did not do. The evidence in the record is thin and conflicting.” Ouch.

Yes, there could be years of litigation. Continue reading →

The ‘D’ Word?

by Hance Haney on September 19, 2008 · 13 comments

Barack Obama argues that John McCain “hurt everyday workers with his longtime support for deregulation,” according to Politico.

Thomas Frank adds,

There is simply no way to blame [the failure of several large financial institutions], as Republicans used to do, on labor unions or over-regulation. No, this is the conservatives’ beloved financial system doing what comes naturally. Freed from the intrusive meddling of government, just as generations of supply-siders and entrepreneurial exuberants demanded it be, the American financial establishment has proceeded to cheat and deceive and beggar itself — and us — to the edge of Armageddon. It is as though Wall Street was run by a troupe of historical re-enactors determined to stage all the classic panics of the 19th century.

But as Steve Forbes points out, the “easy-money” policy of the Federal Reserve helped financial institutions pile up debt and bad assets.

According to former FDIC Chairman William M. Isaac,

The biggest culprit is a change in our accounting rules that the Financial Accounting Standards Board and the SEC put into place over the past 15 years: Fair Value Accounting. Fair Value Accounting dictates that financial institutions holding financial instruments available for sale (such as mortgage-backed securities) must mark those assets to market. That sounds reasonable. But what do we do when the already thin market for those assets freezes up and only a handful of transactions occur at extremely depressed prices?

The answer to date from the SEC, FASB, bank regulators and the Treasury has been (more or less) “mark the assets to market even though there is no meaningful market.” The accounting profession, scarred by decades of costly litigation, just keeps marking down the assets as fast as it can.

This is contrary to everything we know about bank regulation. When there are temporary impairments of asset values due to economic and marketplace events, regulators must give institutions an opportunity to survive the temporary impairment. Assets should not be marked to unrealistic fire-sale prices. Regulators must evaluate the assets on the basis of their true economic value (a discounted cash-flow analysis).

If we had followed today’s approach during the 1980s, we would have nationalized all of the major banks in the country and thousands of additional banks and thrifts would have failed. I have little doubt that the country would have gone from a serious recession into a depression.
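The discounted cash-flow valuation Isaac refers to can be shown with a minimal sketch. The function is the standard present-value formula; the security, cash flows and discount rate are purely illustrative:

```python
# Minimal sketch of a discounted cash-flow valuation: an asset's economic
# value is the sum of its expected future cash flows, each discounted back
# to the present. All figures below are purely illustrative.

def discounted_cash_flow(cash_flows, rate):
    """Present value of a series of annual cash flows at a given discount rate."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows, start=1))

# A hypothetical security expected to pay $100 a year for 5 years,
# discounted at 6%, is worth about $421 -- regardless of what a frozen
# market would pay for it today.
value = discounted_cash_flow([100] * 5, 0.06)
```

The point of Isaac’s argument is the gap between this number and a fire-sale price: marking the asset to a frozen market might record a fraction of that value even though the expected cash flows are unchanged.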

Easy money and mark-to-market are not deregulatory policies. They are examples of government intervention with unfortunate consequences.

The nature of unfortunate consequences is always unpredictable; the inevitability of unfortunate consequences, never so.

Easy money was supposed to speed the transition from the dotcom and telecom bubbles to prosperity, and mark-to-market was meant to spare us from similar speculative bubbles in the future. Yet here we have another burst speculative bubble.

According to Frank,

Thanks to the party of Romney and McCain, federal work is today so financially unattractive to top talent that it might as well be charity work. It’s one of the main reasons — other than outright conquest by the industries they’re supposed to be overseeing — that our regulatory agencies can’t seem to get out of bed in the morning.

France attracts its best and brightest to government service, but most of us don’t want to be like France — at least not in all respects. Although it is hard to fail in France, it is also hard to succeed.  

Maybe blaming the regulators is a case of blaming the messenger. Perhaps the problem isn’t the regulators; it is regulation itself.

Although regulation always seems brilliant in theory, it usually fails in practice. Either it doesn’t work, it spawns corruption, or both. Or it backfires, as it did here.

Broadband Prices Drop

by Hance Haney on September 19, 2008 · 10 comments

I expected to see more reaction to the Wall Street Journal’s recent observation of a surprising shakeup in the broadband industry. Vishesh Kumar reported that

Verizon Communications Inc., which last quarter became the first company ever to see a drop in DSL subscribers — some of whom went to its faster FiOS service — is now offering customers six months of DSL service free if they sign up for the company’s phone and Internet package. That makes the bundled package $45 a month, vs. $65 prior to the offer. AT&T Inc., meanwhile, is now guaranteeing its current prices, ranging from $20 to $55 a month, for two years.

I cite this because I always claim that less regulation of a highly regulated industry promotes competition, consumer choice and, ultimately, lower prices. Occasionally someone claims that prices do not appear to be falling, and depending on the point in time, they may be right. Of course, if you don’t have to lower prices to attract and retain customers, you won’t. But good times never last forever.

Until the second quarter of this year, the cable and telephone industries were adding roughly equal numbers of broadband accounts. Then something changed, and the cable companies are now signing up three-quarters of new customers. Maybe the marketing efforts of some of the companies are better than others, or maybe the phone companies’ main broadband product, DSL, can no longer compete on speed, quality and/or features.

In any event, when all else fails, you have to slash your prices.

Winners and Losers

by Hance Haney on August 29, 2008 · 7 comments

The Federal Communications Commission picks winners and losers, which is why we ought to get rid of it. During the chairmanship of Reed E. Hundt, the losers were incumbent phone companies, whom Hundt considered too Republican. Now it is a cable company, which some consider too Democratic.

The FCC issued an order last week concluding that Comcast acted discriminatorily and arbitrarily to squelch the dynamic benefits of an open and accessible Internet, and that its failure to disclose its practices to its customers has compounded the harm. Wow. The FCC will require Comcast to end its network management practices and submit a compliance plan, which is code for submitting to bureaucratic micromanagement.

FCC Chairman Kevin Martin recently asked, “Would you be OK with the post office opening your mail, deciding they didn’t want to bother delivering it, and hiding that fact by sending it back to you stamped ‘address unknown – return to sender’?”

Martin, who the Wall Street Journal identifies as one of the Bush administration’s more questionable personnel picks, lately has become a bit excitable.

Martin is upset with Comcast because it rejects his hypothesis that allowing consumers to pay only for the cable channels they prefer would reduce cable rates.

Martin sided with the commission’s two Democrats to slam Comcast for managing its broadband network like a traffic cop who works hard to prevent gridlock.

Continue reading →

As expected, the FCC has chosen Comcast as the target of its biggest net neutrality enforcement action to date.  I wonder whether the FCC has actually chosen a good set of facts to serve as the foundation for what may possibly be a broad new precedent (we won’t know how broad until the commission publishes the order), considering that the commission will likely be forced to defend it in court.  Like it or not, FCC decisions are required to have a “rational basis.”

FCC Chairman Kevin Martin suggests Comcast acted atrociously:

While Comcast claimed its intent was to manage congestion, the evidence told a different story:

  • Contrary to Comcast’s claims, they blocked customers who were using very little bandwidth simply because they were using a disfavored application;
  • Contrary to Comcast’s claims, they did not affect customers using an extraordinary amount of bandwidth even during periods of peak network congestion, as long as they weren’t using a disfavored application;
  • Contrary to Comcast’s claims, they delayed and blocked customers using a disfavored application even when there was no network congestion;
  • Contrary to Comcast’s claims, the activity extended to regions much larger than where it claimed congestion occurred.

In short, they were not simply managing their network; they had arbitrarily picked an application and blocked their subscribers’ access to it.

Yet Commissioner Robert McDowell seems to claim that the evidence is insubstantial:

The truth is, the FCC does not know what Comcast did or did not do. The evidence in the record is thin and conflicting.  All we have to rely on are the apparently unsigned declarations of three individuals representing the complainant’s view, some press reports, and the conflicting declaration of a Comcast employee. The rest of the record consists purely of differing opinions and conjecture. [footnote omitted]

Continue reading →

FCC Chairman Kevin Martin received a reprimand from the Republican Leader of the House of Representatives, John A. Boehner, based upon reports that Martin plans to side with the commission’s two Democrats on Friday to interfere with the network management decisions of broadband providers in the matter of Comcast delaying the uploading of P2P file sharing when necessary to relieve network congestion:

When a small minority of subscribers – often using these applications to share pirated music and movies – began clogging the networks to the harm of the large majority of users, broadband providers began taking steps to alleviate the congestion. This, in turn, has prompted peer-to-peer developers to collaborate with broadband providers to find better ways to manage traffic.  It is this market-based self-governing nature of the Internet that is the key to its success.  Your heavy-handed attempts to inject the FCC into the middle of that process threaten to hijack the evolution of the Internet to everyone’s detriment.  It will also deter the very broadband investment we need for the Internet to continue growing to meet the increasing demands being placed upon it.

Comcast has already adjusted its policy based upon public reaction and perhaps the threat of regulation.  The question is whether this incident needs to be enshrined in permanent regulation or whether it indicates that the market actually works to protect legitimate consumer interests in the absence of regulation.  I think it’s the latter.

For the FCC commissioners, this is a choice between good politics and good policy.  Good politics would be to hammer Comcast, although that wouldn’t buy popularity for the Bush administration or any of its appointees.  Their enemies are their enemies.  Good policy would be to declare that this matter has been resolved.  Ultimately, appointees of the Bush administration will be judged on their policies, not their politics.

The Federal Communications Commission, according to the Wall Street Journal, is prepared to stop Comcast from blocking peer-to-peer file sharing later this week — although the commission won’t fine the company because it wasn’t “previously clear what the agency’s rules were.”

Now, according to Multichannel News, comes word that there is a wireless broadband provider who explicitly prohibits all uses that may cause extreme network capacity issues, and “explicitly identif[ies] P2P file sharing applications as such a use.” 

I am not familiar with the wireless broadband provider’s practices in this area (nor even with its relevant terms of service, even though I am a customer).  However, Comcast delayed file sharing only when necessary to relieve network congestion.  Absent congestion, Comcast permitted file sharing.  A cable broadband network typically experiences congestion during the early evening hours, which means that if file sharers were willing to avoid those hours, they could share files on the Comcast network the rest of the time.

So it will be interesting to see whether the FCC bans network management that prohibits file sharing, in which case cable and wireless networks could become congested to the annoyance of millions of ordinary users. Or whether it allows broadband providers to practice network management so long as they clearly disclose it, in which case file sharers may discover they can’t use a broadband wireless or cable connection to share files, ever. Or maybe the brilliant politicians at the commission will require disclosure in sufficient detail to enable hackers to defeat network management altogether, permitting congestion to reign but ensuring that providers, not the commission, will be blamed.

As everyone who reads this blog knows, the architectures of cable, wireless and wireline networks are completely different.  Each has unique congestion challenges, and in the short term all providers must have flexibility to find appropriate solutions.

The key point is that all broadband providers are trying to increase bandwidth as fast as they can.  The proper role for the commission is to eliminate barriers to investment, of which regulatory uncertainty is one of the most significant.

If a particular company, Comcast, is the target here primarily because it refused to pay certain political dues or tribute, as I suspect it is, we should acknowledge that and take the company’s side.

Should antitrust enforcers be concerned about entry barriers in the search ad market? Some believe the market exhibits “network effects,” according to the New York Times.

Although the concept was traditionally applied to Industrial Age industries with high fixed costs, like railroads and telephone exchanges, anything now exhibits a network effect if its value increases because more people use it. Network effects are “everywhere,” according to a top former antitrust official. Coke and Pepsi drinkers, for example, “benefit from the network of their fellow consumers because Coke and Pepsi are widely available in restaurants and in vending machines,” claims Timothy J. Muris.

A preexisting network of vending machines is admittedly tough for soft drink imitators to replicate. But a barrier to imitation can also be viewed as a spur to innovation because it acts as a reward which inspires creators and investors. Not an incentive to create a barely distinguishable alternative, to be sure, but to create something transformative.

The alleged network effects in search advertising are more subtle than in the case of railroads, telephone exchanges or soft drinks (in fact, they even bear a striking resemblance to what one might also term legitimate and hard-won competitive advantages).

Continue reading →

Trade War

by Hance Haney on June 16, 2008 · 20 comments

Picking up on Braden’s recent post, “Abuse of Power? Competition Commissioner that Pushes ‘Smart Business Decisions,’” it’s no secret that Europe’s software industry is years behind Microsoft, and it’s not surprising that the industry is seeking help from politicians in Brussels.

When Kroes, a politician, talks about open standards one must assume she is referring to the European software industry, not to the open source movement generally. Of course, for the moment “the enemy of my enemy [may be] my friend,” as they say.

In her remarks last week Kroes said,

“I know a smart business decision when I see one — choosing open standards is a very smart business decision indeed,” Ms. Kroes told a conference in Brussels. “No citizen or company should be forced or encouraged to choose a closed technology over an open one.”

This statement could be read either as an innocent statement of personal opinion, or more like an informal, unofficial statement of official policy with plausible deniability. I suspect it is the latter, and that if you are a European bureaucrat or business leader you now understand what is expected of you as far as your future software procurement is concerned.

Why would Kroes need to be opaque? Because there are both structural (e.g., excessive tariffs, unreasonable licensing terms, etc.) and nonstructural trade violations (e.g., certain winks and nods) which are actionable. And because two or more can play this game.

A good reason for governments not to encourage boycotts of foreign goods is that foreign governments can do the same thing. That can lead to a trade war, in which your efforts to protect one of your small, struggling industries may result in foreign retaliation against your most successful exporters.

Trade wars don’t always have serious repercussions, but they have sparked global recessions, and many believe a trade war helped spark the Great Depression.

That’s another good reason why maybe politicians on both sides of the Atlantic ought to leave software procurement decisions up to the marketplace.