Carl Gipson – Technology Liberation Front
https://techliberation.com
Keeping politicians' hands off the Net & everything else related to technology

San Francisco backs off controversial cell phone radiation ordinance (Tue, 10 May 2011)
https://techliberation.com/2011/05/10/san-francisco-backs-off-controversial-cell-phone-radiation-ordinance/

San Francisco, often a breeding ground for “interesting” public policy proposals, recently decided to back off its mandate that would have required cell phone retailers to label handsets with their radiation levels and pass out material explaining each device's SAR (Specific Absorption Rate).

No other jurisdiction has attempted anything like this, and the ordinance faced stiff opposition from the wireless industry, which filed suit against it last year.

We’ve commented on the ridiculous nature of this ordinance before. Suffice it to say, in the year since this issue surfaced, no evidence has been offered that cell phones pose any sort of carcinogenic threat. This great piece in the NY Times Magazine goes into more detail on the difficulty of testing this hypothesis, and also highlights the common error of confusing causation with coincidence. The author’s main point:

“In truth, many substances of modern life do not — cannot — cause cancer. Some do, and it’s absolutely critical to identify and reduce exposure to them. Others don’t, and it’s absolutely worthwhile identifying these, so that we can focus on the real carcinogens around us. If we lump everything into the category of “potentially carcinogenic,” from toxic potatoes to McCarthy’s grave, then our scientific language around cancer begins to degenerate. The effect is like crying “wolf” about cancer…” (emphasis added)

An interesting factoid pointed out in the San Francisco Chronicle’s article (one of which I must admit I had previously been unaware) is that the Specific Absorption Rate measures the peak output of electromagnetic radiation. This could lead consumers to treat a lower SAR number as the end-all-be-all statistic if they are concerned about radiation.

In fact, because SAR captures only peak radiation levels, not average output, a handset with a higher SAR rating may actually emit less radiation on average than one with a lower rating, and vice versa. A customer shopping by SAR alone could therefore end up purchasing a handset that emits more radiation overall than they think. Again, numerous scientific bodies and the FCC have found no evidence that cell phone radiation causes cancer.
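The peak-versus-average distinction is easy to see with made-up numbers. A minimal sketch, using entirely hypothetical emission profiles (not real SAR measurements):

```python
# Hypothetical exposure samples (W/kg over the course of a call) for two
# imaginary handsets -- illustrative only, not real SAR data.
handset_a = [0.2, 0.3, 1.6, 0.2, 0.2]  # brief spike, otherwise quiet
handset_b = [1.0, 1.1, 1.2, 1.0, 1.1]  # lower peak, steadier output

def peak(samples):
    return max(samples)

def average(samples):
    return sum(samples) / len(samples)

# Handset A carries the higher advertised (peak) SAR number...
assert peak(handset_a) > peak(handset_b)
# ...yet delivers less radiation on average over the call.
assert average(handset_a) < average(handset_b)
```

Shopping by the labeled peak number alone, a consumer would pick handset B as the “safer” phone even though its average output is higher.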

We’re seeing a “precautionary principle” develop in the technology/digital arena and the result is this type of policymaking. An upcoming paper of mine will explore how this is taking place and why it’s not a good thing for technology or innovation.

The bottom line is that San Francisco moved forward with this ordinance because it thought regulations like these would help people decide for themselves which handset to purchase. But by failing to incorporate science and plain facts, it created even more confusion and stoked unnecessary fears. There really are genuine health concerns in our environment, ones with a strong correlation to cancer, and we should remain focused on those rather than waste time making politicians look good.

UK report’s “resiliency” sure looks a lot like “anticipation” (Tue, 10 May 2011)
https://techliberation.com/2011/05/10/uk-reports-resiliency-sure-looks-a-lot-like-anticipation/

A UK government report issued this week warns that climate change, in addition to threatening many other parts of everyday life, also threatens the Information and Communications Technology (ICT) industry. The report, available online, warns that regulatory measures must be taken to lessen the threat of rising temperatures and stormy weather, which it says would have adverse effects on the radio signals that wireless communications depend on.

Specifically, the report’s authors assume that rising temperatures and rainy storms will interfere with radio waves, and that those rising temperatures and rainy storms are a foregone conclusion. For the sake of argument, let’s assume they are correct.

The study claims that rising temperatures will cause cell towers to lose efficiency, but nothing in the document backs this up; a claim like that requires scientific data, and none was offered. A skeptical reader might note, anecdotally, that cell towers are sited in all sorts of conditions all over the globe, taking into account the varying temperatures in which they operate. Towers sited in Alaska are presumably built to handle extreme cold, otherwise the carrier would not waste money placing them there. Likewise, a tower sited in Arizona has to cope with 100°F+ temperatures. And at last count, wireless service is available in both Alaska and Arizona.

The basis for the authors’ concern is an assumed 2–10 degree Celsius increase in mean temperature across England by 2080. This exceeds the IPCC’s projections by quite a large margin, and the actual increase is likely to be far smaller. In the United States the worst-case scenario is expected to be about 7 degrees Fahrenheit over the next several decades. And again, many climatologists expect roughly half that.

My point is not to criticize their climate science, but rather the public policy approach the report asks policymakers and the private market to undertake in order to address these concerns.

Interestingly, the report calls for a “climate resilient world,” but then doesn’t describe what a climate resilient world looks like, or the regulations that would be required to bring this utopia about (most of the recommendations require further study and prioritization). But the report does assert that “countries will need to increase their investment in their infrastructure to adjust to a more challenging climate.” So, the devil is in the details — UK economic regulators would be tasked with planning (aka regulating) on behalf of the ICT industries.

And while I applaud the introduction of “resiliency” into the conversation, I disagree with the report’s interpretation of it. One problem: the authors concede that “the impact of climate change on telecommunications is not well understood,” yet the report goes on to paint some pretty fantastical pictures of what might happen if steps are not taken to mitigate these risks. Regulators shouldn’t write rules using the worst-case scenario as their baseline. This is why cost/benefit analyses are necessary.

The UK report, and its subsequent recommendations, actually embody a strategy of “anticipation” rather than “resiliency.” They anticipate that catastrophic harm to the ICT industry will (rather than “may”) come about due to climate change. This despite the fact that we have been dealing with electromagnetic communication for a hundred years, and so far the only true threats to the continuation and expansion of this technology have been physical infrastructure development (as in taller buildings that block line-of-sight or cause signal attenuation) and solar activity in the form of solar flares.

A true adoption of “resiliency” would take into account the fantastic evolutionary improvements in WiFi and other telecommunications over the last decade alone, and even account for the high probability that WiFi as we know it will not be around by the time the worst-of-the-worst climate change repercussions hit. The same goes for traditional wireless voice telecommunications. Just check your local newspaper or tech blog for advertisements of 4G networks (3G networks are only 4–5 years old!). Before long we will see improved iterations of today’s networks emerge, followed by revolutionary and possibly disruptive advances in communications over the next few decades. And by 2080? Chances are that we won’t even need cell towers; maybe not even fiber optics. (See Adam’s rundown on resiliency for more information.)

Innovation moves rapidly, much more rapidly than government regulators. We need to take this into account when considering rulemaking and long-term climate change.

Wireless Fidelity (WiFi for short) has been around for a while and in very different forms, but most consumers are familiar with only a few systems, since they came onto the consumer market in the early 2000s. This chart shows the evolution of consumer WiFi (though there are more than a dozen other standards that most people don’t know about).

System    Range          Speed           Notes
802.11b   150-300 feet   11 Mb/s         1999
802.11g   150-300 feet   54 Mb/s         2003
802.11n   300-600 feet   600 Mb/s        Multiple-input/multiple-output (MIMO)
WiMax     31 miles       40 Mb/s-1 Gb/s  mobile voice/broadband
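To put those numbers in perspective, here is a back-of-the-envelope calculation of how long a 1 GB file takes to transfer at each standard's theoretical peak rate from the chart above (real-world throughput is of course lower):

```python
# Theoretical peak rates from the chart above, in megabits per second.
rates_mbps = {"802.11b": 11, "802.11g": 54, "802.11n": 600}

FILE_BITS = 8 * 10**9  # 1 GB (decimal) expressed in bits

for standard, mbps in rates_mbps.items():
    seconds = FILE_BITS / (mbps * 10**6)
    print(f"{standard}: {seconds:,.0f} seconds")
# 802.11b: 727 seconds, 802.11g: 148 seconds, 802.11n: 13 seconds
```

A twelve-minute download in 1999 becomes a thirteen-second one on 802.11n, which is the kind of decade-scale leap the report's 2080 projections ignore.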

The evolution of both wired and wireless technology continues unabated. And as the FCC moves toward opening up more wireless spectrum, companies and researchers will learn how to capitalize on these new airwaves, bringing more mobile broadband to citizens. If warmer temperatures or increased storms pose any threat to the global communications industry, you can bet that it won’t catch by surprise the governments and businesses that have already poured billions of dollars into this infrastructure.

The report touches on other parts of the UK infrastructure that no doubt need to be addressed before 2080, such as roads, bridges, dams, airports and the like. This type of physical infrastructure may indeed be more susceptible to floods, hurricanes and tornadoes (again, assuming both that a much worse climate is heading our way and that none of this infrastructure is ever maintained or upgraded, which is not very realistic).

But policymakers still need to weigh the probability of massive floods and temperature increases against the cost of regulating against these risks. It’s tempting for government regulators to come to the rescue via economic regulation, but the costs and unintended consequences may very well outweigh any potential benefits. No one can see with crystal clarity where human technological advancement will take us in the next seven decades, but acknowledging that such advancement exists would be a good start.

Balancing Risk and Regulation (Mon, 18 Apr 2011)
https://techliberation.com/2011/04/18/balancing-risk-and-regulation/

A few weeks ago I asked Adam Thierer, who has done and continues to do great work on all things regulation, for some materials for a project I was working on regarding the precautionary principle in the digital space. It turns out Adam was in the middle of his own Digital Precautionary Principle piece as well. I’ll take our simpatico as a sign that this phenomenon may actually be taking place and that I’m not paranoid. (If you haven’t read his earlier piece on TLF, please do so.)

While my piece on the DPP is coming, hopefully this week, I’ll start things off with my article in today’s RealClearMarkets.com on regulation and risk, and how regulatory agencies engage in traditional “risk aversion behavior” to the detriment of the risk takers (aka entrepreneurs) in the private market. A smarter approach to regulating would weigh the benefits and risks of NOT regulating as well. Too often the discussion starts from the premise that something has to be done and how we can minimize the negative impacts, rather than asking whether we should be doing anything at all, or whether we should encourage the trial-and-error mechanisms that markets utilize.

While the piece isn’t targeted directly at the technology industry, I think it can apply there just as much as any other industry.

 

Why are some states disproportionately taxing a service we want more people using? (Tue, 15 Feb 2011)
https://techliberation.com/2011/02/15/why-are-some-states-taxing-a-service-we-want-more-people-using/

A new report out this week in State Tax Notes shows the discriminatory way in which federal, state and local governments treat citizens who subscribe to wireless services — and according to CTIA, that’s about 93% of Americans.

Federal, state and local taxes and fees for wireless services averaged 16.3% in 2010; the highest combined average was 16.85%, in 2005. This far surpasses the average retail sales tax rate, which of course varies by state.

Some blame can rest squarely on the shoulders of state or local officials who have targeted wireless services for a specific tax. The report points out a few examples:

  • Baltimore: increased its per-line tax from $3.50 per month to $4
  • Montgomery County, MD: increased its per-line tax from $2 to $3.50 per month
  • Olympia, WA: imposed a 9 percent telecommunications tax on top of the state-local combined sales tax of 8.5 percent
  • Chicago: imposed a 7 percent excise tax on wireless services on top of the state’s 7 percent excise tax
  • Nebraska: imposes a local “utility” tax of up to 6.5 percent in addition to the 6.5 percent combined state-local sales tax
  • Tucson, AZ: increased its telecommunications license tax from 2 percent to 4 percent
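Flat per-line fees like these are also regressive: the same dollar amount takes a much bigger percentage bite out of a cheap plan than an expensive one. A quick sketch, where the $4.00 figure is Baltimore's from the list above and the plan prices are hypothetical:

```python
# Baltimore-style flat fee of $4.00 per line per month, applied to three
# hypothetical monthly plan prices.
FEE = 4.00

for bill in (25.00, 50.00, 100.00):
    effective_rate = FEE / bill * 100
    print(f"${bill:.0f}/month plan: {effective_rate:.1f}% effective tax from the fee alone")
# $25 plan -> 16.0%, $50 plan -> 8.0%, $100 plan -> 4.0%
```

The subscriber on the cheapest plan pays four times the effective rate of the subscriber on the most expensive one.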

But the federal Universal Service Fund (USF) fee is where the biggest increase has come of late. Increases in the federal USF have added 0.9 percentage points to the rate since 2007, while state and local increases added just 0.2 percentage points. Also of concern is the rate at which wireless taxes and fees are increasing: almost three times as fast as general sales taxes.

I’m privileged enough to live in Washington state, where we impose a 23.00% combined federal, state and local tax/fee burden on wireless customers. That’s second highest in the nation. Granted, Washington has one of the highest combined state and local sales tax rates in the nation according to the Tax Foundation, but the 17.95% state-local wireless rate is twice the state-local sales tax rate of 9%. What is the excuse for that?

Nebraska imposes the highest combined tax/fees on wireless at 23.69%. Lowest is Washington’s neighbor to the south, Oregon, at 6.86% then Nevada at 7.13% and Idaho at 7.25%. Rounding out the top five of the worst are New York (22.83%), Florida (21.62%) and Illinois (20.90%).

This raises the question: why are some government entities disproportionately taxing a service everyone agrees is vital to the health and future growth of our economy? CTIA says that fully one-quarter of U.S. households have ditched their landline phone in favor of wireless only or wireless plus IP telephony (count this author among them).

Everyone from the Obama Administration on down is touting the future of wireless and mobile broadband as one we should heartily embrace. Examples include the FCC’s recent Net Neutrality Order that exempts wireless carriers, the Administration’s push for spectrum re-allocation, all the hullabaloo about 3G/4G/LTE, and all the talk about increasing carriers’ backhaul capacity.

A recent Pew Internet poll highlighted that, for many people (and especially minorities), a mobile smartphone is the primary way they connect to the Internet. Now that many smartphones can be had for free with a contract, the barrier to entry to things like social networking and search has been lowered. No longer do you have to own a desktop or laptop; you just whip out the computer in your pocket and connect to a cell service or WiFi. Another Pew poll found that among “non-adopters” of broadband, 14% had accessed the Internet via their mobile phones.

For years, the two main reasons people give for not adopting broadband Internet have been price and relevance. Non-adopters either can’t afford broadband, don’t see the value in subscribing, or think it’s simply a waste of time. We can’t do anything about the folks who think the Internet is a fad, but why make wireless services and mobile broadband even more expensive, especially if demand for this type of service is so price-elastic?

Twenty-three states now impose a state-local wireless rate above 10%, an unwritten psychological barrier that sales taxes, too, are creeping toward.

Wireless services and customers shouldn’t be exempt from all taxes and fees, but imposing a disproportionately higher tax rate on a service we as a society are pushing people to use more of is self-defeating.

Preserving the Open Internet by Changing Everything (Tue, 21 Dec 2010)
https://techliberation.com/2010/12/21/preserving-the-open-internet-by-changing-everything/

Citing nefarious, and completely imaginary, examples of “Big Mobile” and “Big Cable” shutting down access to the Internet, the FCC voted today to move toward greater regulatory oversight of broadband Internet through Net Neutrality principles. (Indeed, even the HuffPo is claiming, in the most hyperbolic way, that this is “The Most Important Free Speech Issue of Our Time.”)

For a primer of Net Neutrality, visit here or here.

The public is still waiting for the specific language of the regulatory order (the FCC Commissioners themselves only received it just before midnight last night), so I cannot comment on the language that lawyers are sure to argue over for years to come. However, based on the comments of the FCC Commissioners on the order, both pro and con, I have come to some preliminary conclusions:

1) Those who think this regulatory framework “preserves” the openness of the Internet are wrong. It fundamentally changes many aspects of the infrastructure of the Internet, even if below the radar. My foremost concern is that paid prioritized access will be barred. So, count Level3 among the winners here today. How is paid prioritized access a bad thing? If I want a quality experience with Netflix or Amazon or Hulu, I need access to a network that prioritizes video over someone’s email, which won’t be harmed if its delivery is delayed by less than a second. If my video packets are delayed unevenly, guess what, jitter occurs. And if too much jitter occurs, then I’m turning off my video. How is that good?

2) The legal framework on which the majority of FCC Commissioners base this argument is devoid of sound reasoning. The D.C. Circuit Court of Appeals struck down the Commission’s authority in this matter, and the FCC’s response is to thumb its nose at the Court and move forward unilaterally, even as members of Congress in both parties recognize the FCC’s overreach. Chairman Genachowski needs a rulemaking 101 refresher course. As Commissioner Attwell Baker points out, the Commission was NOT given an affirmative grant of authority to regulate in this area by the Telecom Act of 1996. Therefore, there is no basis on which the FCC can stand to make these rules.

It doesn’t matter how well-meaning these rules may be: as a non-legislative body, the FCC must be granted the authority to regulate in this area specifically by a legislative body (Congress), and it has received no such authority. As Commissioner McDowell pointed out, why would Congress have introduced legislation just a few months ago addressing this very issue if the FCC already had direct regulatory authority here?

3) The FCC is putting the onus on the ISPs of the world to prove that their network management practices do not harm the consumer. This is backwards. Companies need the freedom to innovate, and if they go too far in harming the consumer, then we must revisit the question: are there adequate enforcement or consumer protection mechanisms in place to curtail infringement of consumers’ rights? This new regulatory regime is analogous to a digital “precautionary principle” and has the potential to open up myriad frivolous claims by consumer groups who won’t be happy until the Commission IS the ISP, or at least a highly regulated public utility.

This is just the end of the beginning, though, as it looks as if both sides (industry and the consumer leftist groups) are lawyering up to take this fight to the courts. So it appears that the only folks happy with this vote are those who will rack up billable hours duking out the definition of “reasonable network management,” among other opaque language in the forthcoming order.

Gladwell’s take on social networking as a social force (or lack thereof) (Tue, 28 Sep 2010)
https://techliberation.com/2010/09/28/gladwells-take-on-social-networking-as-a-social-force-or-lack-thereof/

An interesting and thought-provoking piece by Malcolm Gladwell over at The New Yorker this month takes a look at the intersection between true civic activism (the kind that could get you killed) and “social networking” activism (the kind that only takes a retweet or hitting the “like” button on Facebook).

Gladwell’s piece starts off retelling how the Civil Rights “sit-in” movement of the early 1960s spread like wildfire among the younger set without the aid of, god forbid, Facebook or Twitter. Contrast that historical example with the more recent happenings in Iran and the Twitter Revolution, where tens of thousands of Twitter users seemed to stand in solidarity with the protesting Iranians, some of whom were literally dying in the streets. Gladwell’s point, and one with which I concur, is that for all the hype surrounding social networking tools, relying on them to advocate significant change ends in a losing battle or an ineffective result.

A big reason, Gladwell postulates, is that social networks are at their core good at increasing participation but inefficient at execution. It’s easy to hit the “like” button on Facebook to agree that “I support Darfur victims” or “down with big government,” but it’s another thing to put your literal neck on the line — as the protesters in North Carolina and Iran did.

So it will be interesting to see how this social networking dynamic affects today’s Tea Party. Unlike in 2008, when the Obama campaign made history through social network participation, the Tea Party has no official head and no official hierarchy. It seems to be more a network of independent operators than a movement orchestrated by a small group of decision-makers with a clear agenda and defined strategy and goals.

For all the “populism” put forth by Obama and his supporters on Facebook, Twitter, et al., there was a distinct hierarchy — strategic decisions backed up with execution by people who were in charge and accountable to higher-ups. Not so with the Tea Party.

So, at the risk of sounding clichéd, we are entering another new experiment in politics and technology. Social networking gives people the chance to participate merely through the click of a button. But by lowering the barrier to participation to the least inconvenient level, will change actually manage to surface?

Free Press Pressing Communities of Color on Net Neutrality (Thu, 17 Jun 2010)
https://techliberation.com/2010/06/17/free-press-pressing-communities-of-color-on-net-neutrality/

A fun little tidbit from the Huffington Post today. Cook County Commissioner Robert Steele penned an op-ed revealing that Free Press, strong advocates for Net Neutrality regulation, is pushing its agenda on minority communities in order to gin up support for further regulation of the Internet. I’m sure there is no connection with today’s FCC decision to move forward with its Notice of Inquiry on reclassifying the Internet to fit Chairman Genachowski’s controversial “third way.”

Take a minute to read the entire piece by Commissioner Steele, but one of the more salient points is this,

“My first thought when reading this [Free Press] email was, ‘what do these folks know about the needs and wants of communities of color, especially on an issue as impactful as Net Neutrality?'”

In assessing a couple of recent surveys on broadband adoption among minority communities (especially African-American and Hispanic), a couple of things become evident. First, the nation is facing an adoption problem, not an access problem. Those who are not connected to broadband are in this position largely due to their own choice. The FCC’s own report shows that, while African-Americans and Hispanics trail the average in broadband access, the gaps have narrowed just in the last year.

Not only that, but within the African-American community it is the older folks who are not connecting (as is true of non-minorities as well). Minorities under the age of 30 have basically the same broadband adoption rates as whites, which means younger adults are recognizing the benefits of broadband. The same can be said of educated households, though educated households also have higher income levels, and higher income is another factor associated with higher adoption rates.

Another interesting factoid is that minority groups are more likely to access the Internet via a handheld device. This means mobile broadband growth may very well help pick up the slack in the “digital divide.” More and more people, it seems, are relying on their smartphones to handle their Internet needs.

Really, though, the bottom line is that people in low-income households, and those who tend to be older, are the ones who by and large do not want to connect to the Internet. There is nothing in Genachowski’s “third way” regulatory scheme, nor in anything Free Press is pushing, that will help bridge this gap.

It’s a shame that Free Press is using racial division as a motivation to push unnecessary government regulation.

Then What’s the Point? (Thu, 17 Jun 2010)
https://techliberation.com/2010/06/16/then-whats-the-point/

A “funny until you realize it’s real” story out of San Francisco today (courtesy of the NY Times): the city will soon require cell phone retailers to display the radiation level of each phone for sale on their shelves.

Even though this concern has been debunked time and again — the article mentions the latest major study, which found no evidence linking cancer to cell phone usage — the city maintains that, though there are plenty of resources for those wishing to research the radiation of particular cell phones, “for the consumer…it ought to be easier to find.”

This brings to mind one of the old axioms of public policy: “there ought’a be a law” legislation is often poorly thought out and poorly executed. A few years back, legislators in Arizona wanted to enact a “cell phone bill of rights” to help unhappy cell phone customers, including a ban on contracts longer than one year. It sounds great to legislate away dropped calls, or to legislate how a company writes its contracts, until you think about it. Such a move would be a disincentive for mobile carriers to, well, provide service. In today’s world it is simply not possible to build a 100% perfect, non-interruptible network. Even behemoth Google, with its tens of thousands of servers (probably hundreds of thousands), can only promise a 99% uptime service level agreement on its Google Apps business suite. And banning long-term contracts, while appealing on the surface, would mean much higher prices for the phones up front. That fancy new iPhone 4? Expect to pay $299 for the version that currently goes for $199. The longer-term contracts help defray some of the upfront cost of the handset. As an aside, you can purchase just about any phone contract-free, but it will cost you.

Also at stake here is consumer confusion. As CTIA points out, consumers may believe they are buying a safer phone by choosing one with a lower radiation level. Again, this presumes an actual, solid connection between cancer and the low-level radiation signatures of cell phones. There is none. This is somewhat analogous to Washington’s latest “hands-free” cell phone driving law. As The Seattle Times and others have opined, the public may be lulled into a false sense of security: the roads will not actually be any safer, since numerous studies have concluded that using hands-free devices, such as Bluetooth headsets, does not negate driver distraction. As commonsensical as barring drivers from texting or talking on their cell phones sounds, the law as written is practically meaningless (texting is illegal, but holding the phone as a speakerphone is not; and using your phone to check email or sports scores isn’t texting, so presumably it’s still legal).

Really, though, the final quote from San Francisco’s mayoral spokesperson is telling: “Really, it’s not about telling people not to use cell phones.” Then what is it about? Forcing retailers to broadcast radiation levels smacks of fear mongering, or of something that will somehow be justified as consumer protection. I’m not buying it. In reality, it’s a manufactured solution to something that isn’t actually a problem, in order to score political points.

Broadband is great but, magical? really? (Mon, 05 Apr 2010)
https://techliberation.com/2010/04/05/broadband-is-great-but-magical-really/

The city of Bellingham, Washington, lies close to the Canadian border. It is a sleepy town of 70,000 or so with a decent-sized university, a pleasant waterfront, and a charming downtown. (Full disclosure: the author attended said university a decade ago.)

The town’s motto is, “the city of subdued excitement,” something that probably better fits a description of this author than the town, but whatever.

I did, however, get a kick out of the video that city leaders spent $5K putting together to accompany the city’s application for the Google fiber rollout project. I love a good broadband connection as much as the next guy, but the video, while very professionally done, made my hair stand on end. For one thing, Bellingham already has good broadband networks, including Clear’s WiMax, numerous coffee shops with complimentary WiFi, a networked university system, etc. We’re not dealing with backwoods hicks here or stone-cobbled streets.

But I suppose a video looks less desperate than changing the name of your city.

Google Fiber: Put the G in Bellingham

Don’t subject the Internet to politicians, bureaucrats https://techliberation.com/2010/03/02/dont-subject-the-internet-to-politicians-bureaucrats/ https://techliberation.com/2010/03/02/dont-subject-the-internet-to-politicians-bureaucrats/#comments Tue, 02 Mar 2010 20:09:16 +0000 http://techliberation.com/?p=26658

I recently wrote an op-ed for the American Legislative Exchange Council’s Inside ALEC publication. It’s decidedly non-technical, as most correspondence with a majority of the legislative branch must be. In my dealings with those in state government positions, it seems that only in the last few months have many of them become aware of the FCC’s Net Neutrality proposals, or even of the issue itself. I don’t blame them. State legislators are often more concerned with local issues, such as solving their budget deficits or finding funding for critical government operations.

But it’s important that they also keep an eye on what’s happening in “the other Washington” (as we Washington staters like to call it), as policies from Congress, the Administration and federal agencies trickle down to affect each and every one of us.

The text of the op-ed is after the break.

“Four years ago, the Federal Communications Commission (FCC) issued an advisory statement that laid out four principles of government regulation of the Internet. The principles, which have no statutory authority, include the right of consumers to access lawful Internet content of their choice, the right to run applications and services of their choice, the right to connect legal devices that do not harm networks, and competition among network, application, service and content providers.

New FCC Chairman Julius Genachowski, with strong backing from the Obama Administration, is pushing for this “statement of principles” to become enforceable regulations, along with two other rules that would regulate how Internet Service Providers (ISPs) manage their own networks, whether wired or wireless, and require ISPs to be “transparent” about their network management practices.

Supporters of the concept of Net Neutrality tout their desire for openness and competition while guaranteeing consumer access to data and content.

However, as innocuous as the proposed FCC rules might sound (who could be against network management transparency and access to legal content?), subjecting the Internet and ISPs to an entirely new regulatory structure threatens to curtail the Internet’s explosive growth. This is ironic, given that one of the Obama Administration’s goals is to accelerate broadband deployment to Americans, a goal which will cost ISPs tens of billions of dollars.

Since the beginning of the commercial Internet in the early 1990s, most consumers have accessed it via an all-you-can-eat subscription plan: everyone pays the same price per month for as much data as they want. As technology advanced, consumers began using more data. This culminated in services such as Napster and BitTorrent, peer-to-peer (P2P) services that allow the direct sharing of large files between consumers. Often, these P2P connections were used to share pirated copyrighted material such as movies and music (but that is a whole other conversation).

Internet Service Providers, in order to lessen the disproportionate impact that P2P users were having on the network, began looking into ways of protecting other consumers whose connections were being slowed down by the P2P bandwidth hogs. Some of the ideas floated or adopted include usage caps and tiered pricing, the ability to pay more for faster, better service. Meanwhile, some of the scenarios Net Neutrality supporters use as scare tactics have never actually occurred. There have been no instances in the United States where consumers cannot access legal content, and control of the Internet has not been wrested away from the people into the hands of greedy corporations. No one controls the Internet, and no one, including the government, should.

Need proof that the Internet has not suffered from a lack of strong regulatory oversight? Look at the immense growth in both users and data since the turn of the century. In the year 2000, only 5.1 million Americans subscribed to broadband connections. At that time, broadband often meant a 1 Mbps download connection for cable subscribers, or a paltry 500 Kbps connection for DSL users.

In 2000 there was no YouTube, no Facebook or Twitter, no Netflix or Amazon.com streaming video services, no Blackberry or iPhone. These types of innovations simply would not have worked. The capacity to carry that kind of data did not exist, nor did the demand.

Contrast that with 2008, by which point broadband usage, both wired and wireless (wireless data connections were barely a thought in 2000), had increased 500-fold in just eight years. Today there are over 80 million households subscribing to broadband.

Looking forward, the rate of increase in broadband users will level off, but data demands will continue to skyrocket, particularly in the mobile broadband arena. Cisco Systems estimates that the Internet in 2012 will be seventy-five times larger than it was in 2002, and that Internet traffic will generate the equivalent of seven billion DVDs each month. Cisco also estimates that Internet video in 2012 will be nearly 400 times the size of the entire U.S. Internet backbone in 2000.

Given this spectacular growth, a new regulatory structure like that pushed by Net Neutrality proponents makes no economic sense.  When a powerful third party, such as a federal agency, regulates a limited resource, such as broadband capacity, the market itself becomes subject to political whims and special-interest carve-outs, which will only harm consumers.

Unfortunately, several state and city officials around the nation have also tried to get in on the act of regulating the Internet within their own jurisdictions. Regulatory proliferation in this area, it seems, knows no bounds. Fortunately, courts have time and again rebuffed efforts at anything less than federal regulatory authority. As a result, some states and cities are instead petitioning the FCC to move forward with Net Neutrality.

Few would dispute that the growth of the Internet has been transformative for our society and economy, and that growth has happened with minimal government interference. It will continue if we leave the Internet alone. Regulating an industry to achieve peace of mind comes at a price, and most often that price is paid in missed opportunities and lost innovations, and therefore cannot be measured.

With new restrictions in place, innovators will have to overcome artificial barriers and find success despite regulatory obstacles, not because of them.

The federal government should protect intellectual property rights and continue to encourage long-term investment in our online network by keeping the regulatory barrier low.”

Weekend Reading — NBCU/Comcast/GE Public Interest Statement https://techliberation.com/2010/01/29/weekend-reading-nbcucomcastge-public-interest-statement/ https://techliberation.com/2010/01/29/weekend-reading-nbcucomcastge-public-interest-statement/#comments Fri, 29 Jan 2010 23:30:20 +0000 http://techliberation.com/?p=25544

For those of you inclined to read protracted legalese filings, NBC Universal, Comcast and GE submitted their Public Interest Statement to the FCC this week. You can read the filing here.

Many conspiracy theories have been floated claiming that this venture will wrest control of communications media away from the public and that consumers stand to lose the most. Adam did a good job debunking these concerns earlier this month. The fact is that this merger would in no way result in the dreaded “M” word, aka monopoly.

Whatever the case, this process is still bound to take another year or so to finalize, which gives you, dear reader, time to digest the entire 145-page document. Happy reading!

Emergence of Cloud Computing = Government Regulation Can’t Be Far Behind https://techliberation.com/2010/01/22/emergence-of-cloud-computing-government-regulation-cant-be-far-behind/ https://techliberation.com/2010/01/22/emergence-of-cloud-computing-government-regulation-cant-be-far-behind/#comments Fri, 22 Jan 2010 15:36:28 +0000 http://techliberation.com/?p=25271

Brad Smith, Microsoft’s Senior Vice President and General Counsel, addressed the Brookings Institution earlier this week, calling for government to get involved to enhance the safety, security and privacy of the “Cloud.” (Here’s a transcript of his remarks.)

Smith noted that cloud computing is undergoing a powerful transformation and correctly pointed out that, even though millions of Americans use cloud computing platforms today (and have for years), the vast majority of them have no real concept of what cloud computing actually is or does, and neither do most policymakers.

This speech was very well timed, given the Google-China kerfuffle of the past couple of weeks. The essential questions: who is in charge of the data in the cloud? How can we guarantee that providers follow best practices? And what role will the federal government play in regulating this powerful emerging technology?

Without getting into too many specifics, Smith called for a Cloud Computing Advancement Act, which would promote innovation, protect consumers and provide the Executive Branch with new tools for a new technology area. Of those three points, it seems that protecting consumers is the only one in which government should play an active role. Legislating innovation seldom works out as legislators foresee, and giving any administration more tools for controlling the Internet has been met with some skepticism.

But, as Smith points out, there are modernizations that can be made to existing consumer protection laws. One of the more interesting and valid examples he highlights is that many existing laws were written with the single PC in mind. A data thief breaks into a business’s or a person’s computer and steals data; when caught, the perpetrator may face only a single charge for each break-in. But when dealing with data centers holding thousands of servers that host data for hundreds of thousands or millions of users, the stakes need to be raised. Perpetrators of data center break-ins should face charges for each user affected, or thereabouts.

Another point on which I agree with Smith is his insistence that much of this new or strengthened policy be worked out at the federal level, instead of having the states take on cloud computing regulation separately. It would be unfortunate if all 50 states jumped onto the regulatory bandwagon for a product that knows no boundaries.

The cloud computing platform is changing and growing extremely fast; it’s been a scant couple of years since the term really even took off. Government regulators will, as always, be forced to play catch-up (which most of the time is just fine). As for the transparency, or “truth in cloud computing,” that Smith wants among cloud providers, that may be tougher than he thinks; or it could be a veiled swipe at other providers who aren’t as transparent as Smith would like. He does suggest, lastly, that the cloud provider industry come together around a new self-regulatory code. That option should come first and foremost, before bringing in the FTC.

Nevertheless, it was an interesting speech. Microsoft, for all its ups and downs fighting for market share in the cloud, is now a major player in this discussion, and perhaps this speech set the tone for how policy and cloud computing will collide over the next year.

For more on the issue of regulatory control and the cloud, check out this piece by Holman Jenkins in today’s Wall Street Journal.
