Articles by Steven Titch

Steven Titch (@stevetitch) is an independent telecom and IT policy analyst. His policy analysis has been published by the Reason Foundation and the Heartland Institute and covers topics such as municipal broadband, network neutrality, universal service, telecom taxes and online gambling. Titch holds a dual Bachelor of Arts degree in journalism and English from Syracuse University. He lives in Sugar Land, Texas. He burns off energy running 5K races, is an avid poker player, and likes to mellow out in cellar jazz bars.


Low-income states have a much higher degree of facilities-based competition than wealthier ones, according to a new report from ID Insight, a consulting firm that provides authentication, verification and fraud prevention solutions to financial services companies, credit issuers, retailers, online merchants and broadband providers.

The results, which surprised even the report’s two authors, Adam Eliot and municipal broadband advocate Craig Settles, upend the notions that only consumers in wealthy markets are seeing the benefits of broadband competition and that Internet service providers have abandoned low-income rural areas as too costly to serve. One policy consequence already, the report says, is that most of the $7.2 billion in broadband stimulus awarded so far has been directed to states and regions where there is robust competition and no shortage of service.


After reading over some of the postings from the past few weeks and exchanging emails with TLF’s Richard Bennett, I am coming to see how disastrous a decision it was for the FCC to pursue sanctions against Comcast over its throttling of BitTorrent files.

True, the case and the subsequent court decision have allowed activists to foam at the mouth about a “crisis” in Internet service.

Yet despite the breathless warnings, none of this resonates with the public. A recent Rasmussen Reports poll, posted here by Adam Thierer, found that 53% of Americans oppose FCC regulation of the Internet.

Perhaps Americans are sanguine because there is no Internet censorship problem. Even though the issues in the BitTorrent case are a bit technical, the public groks on some level that the claim by regulation proponents that the recent U.S. Court of Appeals decision in favor of Comcast would lead to rampant Internet censorship doesn’t ring true.

That’s because first and foremost, the BitTorrent case was not about blocking or “censorship.” In fact, in the more than four years of debate, the only real instance of a network neutrality violation, that is, an outright flouting of the guidelines set up by former Chairman Michael Powell, came in 2005 when Madison River Communications blocked Vonage’s VoIP service. And Madison River got caught and fined.


The Federal Communications Commission keeps grabbing and the judges keep slapping its hands.

The big news today is that a federal appeals court has ruled the FCC has no legal authority to regulate the Internet. This throws the entire FCC broadband policy agenda into turmoil.

The decision, by the United States Court of Appeals for the District of Columbia Circuit, concerns sanctions the FCC imposed on Comcast after the cable company slowed down the rate of transfer for certain peer-to-peer files using the BitTorrent protocol. Although Comcast and BitTorrent settled the dispute, the FCC nonetheless sought to fine Comcast for violating the FCC’s network neutrality guidelines against content discrimination. Comcast sued, claiming it had the right to manage its own network to serve the interest of the 95 percent of its customers who don’t use BitTorrent.


There’s been considerable comment on my February 11 post arguing that, had the FCC’s proposed Network Neutrality regulation been in force a few years ago, products like the Apple iPhone and Amazon Kindle would not have been possible.

In fact, the otherwise levelheaded Mike Masnick at TechDirt called my assertion “ridiculous.”

I beg to differ.

The premise behind mandated network neutrality is the concern that ISPs like AT&T, Verizon and Comcast are in a position to exploit their control of the “last mile” broadband connections to unfairly influence the market success or failure of a third-party Internet-related product or application.

Former vice-president Al Gore summed up the position best in a Reason TV video posted here the other day: “I just think that it’s unacceptable to have the folks that control the pipes get into anything that smacks of controlling the content or favoring their content over other content. Whoa!”


The Reason Foundation releases my policy brief today looking at the effect network neutrality regulation will have on wireless applications and services.

Much has been written about the deleterious effect that regulating network management would have on broadband investment and innovation, and if those rules are extended to wireless, as FCC Chairman Julius Genachowski proposes to do, the problems would only get worse.

The non-discrimination principle that Genachowski seeks to mandate would prohibit service providers such as AT&T, Verizon Wireless, T-Mobile and Sprint from using their network resources to prioritize or partition data as it crosses their networks so as to improve the performance of specific applications, such as a streaming movie or a massively multiplayer game. Yet quality wireless service is predicated on such steps. The iPhone, for example, would not have been possible if AT&T and Apple had not worked together to ensure AT&T’s wireless network could handle the increase in data traffic the iPhone would create.


Catching up on some magazines while waiting for my car to pass its annual emissions test the other day, I came across an article on cable TV bundling. Not too long ago, the issue of cable TV multichannel packaging–and whether cable companies should be required to offer channels “a la carte,” allowing customers to pick and choose the channels they watch–was a hot issue. Former FCC Commissioner Kevin Martin pushed heavily for it, even though the FCC’s own research, and later some real-world market trials, found that a la carte options would not gain market traction.

The article nicely summed up the reasons.

The simple argument for unbundling is: “If I pay sixty dollars for a hundred channels, I’d pay a fraction of that for sixteen channels.” But that’s not how a la carte pricing would work. Instead, the prices for individual channels would soar, and the providers, who wouldn’t be facing any more competition than before, would tweak prices, perhaps on a customer-by-customer basis, to maintain their revenue. That doesn’t necessarily mean that Bravo would suddenly cost fifteen dollars a month, but there’s little evidence to suggest that a la carte packages would be generally cheaper than the current bundles. One recent paper on the subject, in fact, estimated the best-case gain to consumers at thirty-five cents a month. But even if it wasn’t a boon to consumers, an a la carte system would inject huge uncertainty into the cable business, and many cable networks wouldn’t get enough subscribers to survive. That’s a future that the industry would like to avoid.
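The repricing logic in that passage is easy to check with a toy calculation. The $60, 100-channel and 16-channel figures come from the quote; the assumption that the provider reprices to hold per-subscriber revenue constant is mine, purely for illustration:

```python
bundle_price = 60.0     # $/month for the full 100-channel bundle (from the quote)
bundle_channels = 100
wanted_channels = 16

# The "naive" expectation: pay the pro-rated share of the bundle
naive_bill = bundle_price * wanted_channels / bundle_channels
print(f"pro-rated a la carte bill: ${naive_bill:.2f}")        # $9.60

# If the provider instead reprices channels to keep per-subscriber
# revenue at $60, each channel must carry far more of the load
per_channel = bundle_price / wanted_channels
print(f"revenue-neutral channel price: ${per_channel:.2f}")   # $3.75
print(f"implied per-channel price in the bundle: ${bundle_price / bundle_channels:.2f}")  # $0.60
```

Under that (admittedly crude) assumption, the per-channel price rises more than sixfold, which is the quote’s point: unbundling by itself does not discipline prices when the competitive landscape is unchanged.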


In researching the wireless competitive picture for my comments on the FCC Network Neutrality NPRM, one of my contacts was kind enough to point me to a Bank of America/Merrill Lynch paper that used the Herfindahl-Hirschman Index (HHI) to compare the market concentration of wireless service providers in the 26 Organization for Economic Co-operation and Development (OECD) countries. HHI is one of the metrics used by the Department of Justice to determine market concentration. It is calculated by squaring the market share of each firm competing in the market and then summing the resulting numbers. For example, for a market consisting of four firms with shares of 30, 30, 20 and 20 percent, the HHI is 2600 (30² + 30² + 20² + 20² = 2600). The higher the number, the greater the market concentration. When the formula is applied to the U.S. wireless market share percentages determined by Bank of America/Merrill Lynch (28.5, 26.7, 18.2, 12.1 and 14.5), the U.S. HHI, at 2213, is the smallest of the group. It is substantially less than the HHIs for all the other OECD countries with the exception of the U.K.; otherwise, no other HHI is under 2900.
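The arithmetic is simple enough to verify directly; here is a minimal sketch, with shares expressed as percentages per the DOJ convention described above:

```python
def hhi(shares_pct):
    """Herfindahl-Hirschman Index: sum of squared market-share percentages."""
    return sum(s ** 2 for s in shares_pct)

# Four-firm example from the text: 30, 30, 20 and 20 percent
print(hhi([30, 30, 20, 20]))                          # 2600

# U.S. wireless shares from the Bank of America/Merrill Lynch data
print(round(hhi([28.5, 26.7, 18.2, 12.1, 14.5])))     # 2213
```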

Here’s the OECD market share data for Q4 2007 as it appears in the Bank of America/Merrill Lynch’s Global Wireless Matrix.


As the annual Winter Consumer Electronics Show (CES) is set to convene in Las Vegas tomorrow, it will be interesting to see the temper of the policy climate. On the federal and state policy level, the hostility towards every facet of the high-tech sector has done nothing but grow. Not too long ago, politicians were extolling America’s high-tech leadership as its primary vehicle for continued global economic leadership. Now it seems the entire tech sector, from semiconductors to wireless phones to TV, is under attack, from the White House to Congress to state-level bureaucrats.

Just this week, as Adam reported, the left-leaning Free Press has inexplicably gone on the offensive against the idea of pushing TV to wireless devices. Such activists are no doubt emboldened by the example of the current administration, which has launched an antitrust campaign against Intel (just as the European Union was all but surrendering its own), and continues to press for antitrust action against Google before, as antitrust chief Christine Varney has freely admitted, it is “too late”— that is, before the speed of technology change undermines the government’s case, as it did in the Clinton-era Microsoft suit over browser bundling.

Add to this the California Energy Commission’s ban on big-screen TVs, the FCC’s push for sweeping new Internet regulations under the guise of “network neutrality,” and the Internet, in general, being blamed for everything from the decline of newspapers to postal rate increases to weight gain in teenage girls (for more, type “Internet blamed for” into the search engine of your choice), and one might expect the mood at the show, at least in the policy sessions, to be dour. Even though I am watching from afar (Adam will descend into the CES maelstrom on our behalf), I will be waiting to see if that is the case.

The Left has been drumbeating about high-tech market failure for more than 10 years (plus ça change: see this rebuttal paper from 2001). The big difference is that today’s Washington technocrats have bought in, despite all the evidence to the contrary.  Berin provided some solid data on mobile OS competition earlier today. Here’s some more data courtesy of Digital Society as to the growth of applications and revenues in this alleged stagnant, failing sector:

– Number of e-mails sent per day in 2000: 12 billion
– Number of e-mails sent per day in 2009: 247 billion
– Revenues from mobile data services in the first half of 2000: $105 million
– Revenues from mobile data services in the first half of 2009: $19.5 billion
– Number of text messages sent in the U.S. per day in June 2000: 400,000
– Number of text messages sent in the U.S. per day in June 2009: 4.5 billion
– Number of pages indexed by Google in 2000: 1 billion
– Number of pages indexed by Google in 2008: 1 trillion
– Amount of hard-disk space $300 could buy in 2000: 20 to 30 gigabytes
– Amount of hard-disk space $300 could buy in 2009: 2,000 gigabytes (2 terabytes)
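As a quick sanity check on just how non-stagnant this sector is, the decade growth multiples implied by the figures above can be computed directly (for the 2000 disk figure I take the 25 GB midpoint of the quoted 20-to-30 gigabyte range):

```python
# (value in 2000, value in 2009) for each metric quoted above
metrics = {
    "e-mails per day": (12e9, 247e9),
    "mobile data revenue, first half ($)": (105e6, 19.5e9),
    "U.S. text messages per day": (4e5, 4.5e9),
    "pages indexed by Google": (1e9, 1e12),     # 2000 vs. 2008
    "hard-disk GB per $300": (25, 2000),        # 25 = midpoint of 20-30 GB
}

for name, (then, now) in metrics.items():
    print(f"{name}: {now / then:,.0f}x")
# e-mails per day: 21x
# mobile data revenue, first half ($): 186x
# U.S. text messages per day: 11,250x
# pages indexed by Google: 1,000x
# hard-disk GB per $300: 80x
```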

Metrics such as these are the best weapon against attempts at regulation, especially from an administration keen to find a “market failure” rationale wherever it looks. High-tech consumer electronics remains a bright spot in what has been a down economy. It is best left on its own to thrive.

FCC Chairman Julius Genachowski linked spectrum management, universal service and network neutrality in a speech yesterday at the Innovation Economy Conference in Washington, and in the process may have signaled some comprehension of the negative consequences network neutrality regulation could have.

His most significant statement was a concession that network management will be required to keep wireless networks and services economically sustainable. The FCC’s Notice of Proposed Rulemaking on network neutrality seeks to apply a “non-discrimination” principle to wireless—that is, to prohibit service providers such as AT&T, Sprint, T-Mobile and Verizon Wireless from using network intelligence to groom, partition or prioritize data to ensure quality performance of voice, data, gaming or video applications.

Yet yesterday, as reported by Wireless Week, Genachowski acknowledged that the explosion of data use was placing “unsustainable strains” on operators’ wireless networks.

“[There] are real congestion and network management issues that operators must address, particularly around wireless networks, and we must allow reasonable network management…,” he said, emphasizing the importance of developing policies that “encourage investment and the development of successful business models.”

While the NPRM does allow for “reasonable network management,” it never really defines what that may be. In light of this, Genachowski’s injection of “investment” and “business models” into his side of the debate is both startling and welcome.

For one, it acknowledges the rising chorus of critics (examples here and here) who, independent of ideology, have questioned the economic wisdom of barring carriers from recouping their investment in intelligent network technology from the large applications providers who generate these costs. A recent conference on Capitol Hill, sponsored by the American Consumer Institute (full disclosure: I was a participant), featured comments from a number of economists who said mandated non-discrimination was a recipe for higher consumer prices, lower quality and, ultimately, declining investment in broadband infrastructure. Much of the Q&A discussion focused on whether the numerous wireless data services and applications that have arisen out of innovations such as the iPhone would be possible under a network neutrality regime, given the continual network management they require.

Genachowski also deserves credit for tying the network neutrality issue in with spectrum management and universal service. Previous commissions have tended to pursue these issues separately, as if the policies addressing one had no effect on the others. Contemporary telecom policy must be holistic, understanding how spectrum allocation affects wireless network management, and how allowing the industry greater freedom to formulate business models and partnerships can yield the investment needed to deliver universal service.

The U.S. Treasury and the Federal Reserve have pushed back the deadline for banking-industry compliance with regulations pursuant to the Unlawful Internet Gambling Enforcement Act of 2006 (UIGEA). UIGEA, a controversial tack-on to the Bush administration’s SAFE Port Act, aims to curtail online gambling by making it illegal for U.S. banks and financial institutions to participate in funds transactions between U.S. citizens and corporations that operate online casinos, effectively banning Internet gambling.

In a joint statement, the Treasury and the Fed delayed the compliance date, which had been set for today (December 1), to June 1, 2010, the Gambling Today blog reports. The decision also comes just days before Thursday’s scheduled hearing in the House Financial Services Committee on H.R. 2267, a bill introduced by Rep. Barney Frank (D-MA) that would overturn UIGEA and create a full licensing and regulatory framework for the Internet gambling industry in the United States.

As Financial Services Committee chairman, Frank has been a vocal opponent of UIGEA and has been working for its repeal over the past two years. In authorizing the delay, the two agencies said that financial institutions were not prepared with the mechanisms they needed to block unlawful Internet gambling transactions, but they also noted that the rules did not provide a clear definition of unlawful Internet gambling. This last observation could be significant, as it acknowledges one of the law’s principal vulnerabilities—it broadly defines Internet gambling as games of chance. Opposition groups, notably the Poker Players Alliance, have repeatedly argued (correctly, IMHO) that certain online casino games, especially poker, are games of skill.

Online gambling blogs generally greeted the delay positively and hope it is another step in the direction of restoring the freedom to gamble online.

As the Gambling Today blog notes:

The postponement was greatly appreciated by the supporters of online gambling. House Financial Services Committee chairman Barney Frank has two of his sponsored bills coming up for hearing on December 3. Frank said, “This will give us a chance to act in an unhurried manner on my legislation to undo this regulatory excess by the Bush administration and to undo this ill-advised law.”