Tim Wu was kind enough to comment on my general overview and critique of his new book, The Master Switch: The Rise and Fall of Information Empires. That essay was the first of many I plan to pen about Wu’s important book. I appreciate Prof. Wu’s willingness to engage me in a debate over some of these issues, since I’m sure he has better things to do with his time. Some of the points he raised in his comment will be addressed in subsequent posts.
In this post, I want to respond briefly to his assertion that I was “missing the point of the book” which is “to describe the world we live in.” He says that his book, “suggests that we tend to go through open and closed cycles in the Information Industries, and that, roughly, both have their strengths and weaknesses, and both become popular at different times for various reasons.” But he fears there are “greater risks in the closed periods.”
Contrary to what he suggests, I certainly understand that’s the point of his book; it’s just that I don’t fully agree with his analysis or conclusions. Let me be clear about a crucial point, however: I accept that almost every industry goes through “cycles” of some sort and that, typically, after a “Wild West” period of greater “openness” and more atomistic competition, some degree of “consolidation” into more “closed” (or proprietary) models often sets in. (A somewhat different and far more descriptive interpretation of such cycles can be found in Deborah Spar’s 2001 book, Ruling the Waves: Cycles of Discovery, Chaos, and Wealth from Compass to the Internet. She outlines a more refined four-part cycle: Innovation, Commercialization, Creative Anarchy, and Rules.)
My primary beef with Prof. Wu is that, contrary to his assertion yesterday in commenting on my post, his book seems to regard the progression of “the Cycle” as mostly linear and one-directional: straight down toward a perfectly closed, corporate-controlled, anti-consumer Hell. By my reading of his book – much like Lessig and Zittrain’s work – Wu is painting an overly pessimistic portrait of technologies being subjected to the “perfect control” of largely unfettered markets.
I believe history – especially recent history — teaches us something very different. Continue reading →
Carl Malamud is a breakthrough thinker and doer on transparency and open government. In the brief video below, he makes the very interesting case that various regulatory codes are wrongly withheld from the public domain while citizens are expected to comply with them. It’s important, mind-opening stuff.
It seems a plain violation of due process that a person might be presumed to know laws that are not publicly available. I’m not aware of any cases finding that inability to access the law for want of money is a constitutional problem, but the situation analogizes fairly well to Harper v. Virginia, in which a poll tax that would exclude the indigent from voting was found to violate equal protection.
Regulatory codes that must be purchased at a high price will tend to cartelize trades by raising a barrier to entry against those who can’t pay for copies of the law. Private ownership of public law seems plainly inconsistent with due process, equal protection, and the rule of law. You’ll sense in the video that Malamud is no libertarian, but an enemy of an enemy of ordered liberty is a friend of liberty.
William Powers, a writer who has been a columnist and media critic for such publications as The Washington Post, The New Republic, and National Journal, discusses his new book, Hamlet’s BlackBerry: A Practical Philosophy for Building a Good Life in the Digital Age. In the book, Powers writes, “You can allow yourself to be led around by technology, or you can take control of your consciousness and thereby your life.” On the podcast, he discusses historical philosophers’ ideas that can offer shelter from our present deluge of connectedness, how to create gaps that allow for currently elusive depth and inward reflection, and strategies that help him and his family regain control over their technology.
I’ve grown increasingly tired of the fight over not just retransmission consent, but almost all TV regulation in general. Seriously, why is our government still spending time fretting over a market that is more competitive, diverse and fragmented than most other economic sectors? It’s almost impossible to keep track of all the innovation happening on this front, although I’ve tried here before. Every metric — every single one — is not just improving but exploding. Just what’s happening on the kids’ TV front is amazing enough, but the same story is playing out across other programming genres and across multiple distribution platforms.
More proof of just how much more diverse and fragmented content and audiences are today comes in this excellent new guest editorial over at GigaOm, “The Golden Age of Choice and Cannibalization in TV,” by Mike Hudack, CEO of Blip.tv. Hudack notes that, compared to the Scarcity Era, when we had fewer choices and were all forced to watch pretty much the same thing, today’s media cornucopia is overflowing, and audiences are splintering as a result. “Media naturally trends towards fragmentation,” he notes. “As capacity increases so does choice. As choice increases audiences fragment. When given a choice people generally prefer media that speaks to them as individuals over media that speaks to the ‘masses.’”
Indeed, he cites Nielsen numbers I’ve used here before illustrating how the top shows of the ’50s (like Texaco Star Theater) netted an astonishing 60-80% of U.S. television households, while a more recent hit like American Idol is lucky if it can manage over 15% audience share. He concludes, therefore, that: Continue reading →
In the current issue of *Foreign Affairs*, Deputy Defense Secretary William J. Lynn III has one of the [more sober arguments](http://www.foreignaffairs.com/articles/66552/william-j-lynn-iii/defending-a-new-domain) for government involvement in cybersecurity. Naturally, his focus is on military security and the Pentagon’s efforts to protect the .mil domain and military networks. He does, however, raise the question of whether and how much the military should be involved in protecting civilian networks.
One thing that struck me about Lynn’s article is the wholesale rejection of a Cold War metaphor for cybersecurity. “[The United States] must also recognize that traditional Cold War deterrence models of assured retaliation do not apply to cyberspace, where it is difficult and time consuming to identify an attack’s perpetrator,” he writes. Given the fact that attribution is nearly impossible on the internet, he suggests that the better strategy would be “denying any benefits to attackers [rather] than imposing costs through retaliation.”
What’s interesting about this is that it is in utter contrast to the recommendations of cybersecurity enthusiasts like former NSA chief Michael McConnell, who wrote earlier this year in a [1,400-word op-ed](http://www.washingtonpost.com/wp-dyn/content/article/2010/02/25/AR2010022502493.html) in the *Washington Post*: Continue reading →
Of course, that doesn’t mean I agree with everything in it. In fact, I disagree vehemently with Wu’s general worldview and recommendations, and even much of his retelling of the history of information sectors and policy. Nonetheless, for reasons I will discuss in this first of many critiques, the book’s impact will be significant because Wu is a rock star in this academic arena as well as a committed activist in his role as chair of the radical regulatory group Free Press. Through his work at Free Press as well as the New America Foundation, Professor Wu is attempting to craft a plan of action to reshape the Internet and cyberspace.
I stand in opposition to almost everything that Wu and those groups stand for; thus, I will be spending quite a bit of time addressing his perspectives and proposals here in coming months, just as I did when Jonathan Zittrain’s hugely important The Future of the Internet & How to Stop It was released two years ago (my first review is here and my latest critique is here). In today’s essay, I’ll provide a general overview and foreshadow my critiques to come. (Note: Tim was kind enough to have his publisher send me an advance uncorrected proof of the book a few months ago, so I’ll be using that version to construct these critiques. Please consult the final version for cited material and page numbers.) Continue reading →
I recently picked up a copy of Robert Wuthnow’s Be Very Afraid: The Cultural Response to Terror, Pandemics, Environmental Devastation, Nuclear Annihilation, and Other Threats. According to the dust cover, the Princeton sociologist’s book “examines the human response to existential threats…” Contrary to common belief, we do not deny such threats but “seek ways of positively meeting the threat, of doing something—anything—even if it’s wasteful and time-consuming.” Interesting batch of ideas, no?
Well, the fifth paragraph of the book joins up with some pretty obscure and disorganized writing in the introduction to disqualify it from absorbing any more of my precious time. That paragraph contains this sentence: “Millions could die from a pandemic or a dirty bomb strategically placed in a metropolitan area.”
It’s probably true that millions could die from a pandemic. Two million deaths would be just under 0.03% of the world’s population—not quite existential. But the killer for the book is Wuthnow saying that millions could die from a dirty bomb placed in a metropolitan area. There will never be that many deaths from a dirty bomb, placed anywhere, ever.
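(For what it’s worth, here is a quick back-of-the-envelope check of that figure: a minimal sketch in Python, assuming a world population of roughly 6.8 billion at the time of writing.)

```python
# Back-of-the-envelope check: how big a share of the world's population
# would 2 million pandemic deaths be? Assumes ~6.8 billion people (circa-2010 estimate).
world_population = 6.8e9
hypothetical_deaths = 2_000_000

share = hypothetical_deaths / world_population
print(f"{share:.3%}")  # about 0.029%, i.e., just under 0.03%
```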
One suspects that the author doesn’t know what a dirty bomb is. A dirty bomb is a combination of conventional explosives and radioactive material, designed to disperse the radioactive material over a wide area. A dirty bomb is not a nuclear explosive, and its lethality is little greater than that of a conventional weapon, as the radiological material is likely to be too dispersed and too weak to cause serious health issues.
Dirty bombs are meant to scare. Incautious discussion of dirty bombs has induced more fright in our society than any actual bomb. Professor Wuthnow asserts, as fact, that a dirty bomb could kill millions, which is plainly wrong. If he doesn’t know his subject matter, he doesn’t get any more time from this reader.
Given my brief experience with the book, I advise you to be very afraid of Be Very Afraid.
An important anniversary just passed with little more notice than an email newsletter about the report that played a pivotal role in causing the courts to strike down the 1998 Child Online Protection Act (COPA) as an unconstitutional restriction on the speech of adults and website operators. (COPA required all commercial distributors of “material harmful to minors” to restrict their sites from access by minors, such as by requiring a credit card for age verification.)
The Congressional Internet Caucus Advisory Committee is pleased to report that even 10 years after its release, the COPA Commission’s final report to Congress is still being downloaded at an astounding rate – between 700 and 1,000 copies a month. Users from all over the world are downloading the report from the COPA Commission, a congressionally appointed panel mandated by the Child Online Protection Act. The primary purpose of the Commission was to “identify technological or other methods that will help reduce access by minors to material that is harmful to minors on the Internet.” The Commission released its final report to Congress on Friday, October 20, 2000.
As a public service, the Congressional Internet Caucus Advisory Committee agreed to virtually host the deliberations of the COPA Commission on the Web site COPACommission.org. The final posting to the site was the COPA Commission’s final report itself, making it available for download. In the subsequent 10 years, it is estimated that close to 150,000 copies of the report have been downloaded.
The COPA Report played a critical role in fending off efforts to regulate the Internet in the name of “protecting our children,” and it marked a shift toward focusing on what, in First Amendment caselaw, is called “less restrictive” alternatives to regulation. This summary of the report’s recommendations bears repeating:
After consideration of the record, the Commission concludes that the most effective current means of protecting children from content on the Internet harmful to minors include: aggressive efforts toward public education, consumer empowerment, increased resources for enforcement of existing laws, and greater use of existing technologies. Witness after witness testified that protection of children online requires more education, more technologies, heightened public awareness of existing technologies and better enforcement of existing laws. Continue reading →
It’s wonderful to see that the FCC is putting spectrum front and center on its agenda. Yesterday it held a spectrum “summit” at which it released several [papers](https://docs.google.com/viewer?url=http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-302330A1.pdf) looking at the challenges and opportunities mobile broadband faces, and it was [announced](http://techdailydose.nationaljournal.com/2010/10/spectrum-on-november-fcc-agend.php) that at its November meeting, the chairman will introduce several items related to spectrum reallocation. NTIA is keeping pace, [identifying over 100 MHz](http://techdailydose.nationaljournal.com/2010/10/ntia-identifies-federal-spectr.php) now in federal hands (mostly DoD) to be moved over for commercial use.
The consensus that has led us to this happy time is that there is a spectrum “shortage” or spectrum “crunch,” as many said yesterday. Here’s how Chairman Genachowski [explained it](https://docs.google.com/viewer?url=http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-302331A1.pdf&pli=1):
>The explosive growth in mobile communications is outpacing our ability to keep up. If we don’t act to update our spectrum policies for the 21st century, we’re going to run into a wall—a spectrum crunch—that will stifle American innovation and economic growth and cost us the opportunity to lead the world in mobile communications.
>Spectrum is finite. Demand will soon outpace the supply available for mobile broadband.
Every natural resource is finite, however. So how exactly did we end up with this “spectrum crunch”? Continue reading →
The Federal Communications Commission has established a new advisory group called the “Technological Advisory Council.” Among other things it will advise the agency on “how broadband communications can be part of the solution for the delivery and cost containment of health care, for energy and environmental conservation, for education innovation and in the creation of jobs.”
This is an agency that is radically overspilling its bounds. It has established goals that it has no proper role in fulfilling and that it has no idea how to fulfill. As we look for cost-cutting measures at the federal level, we could end the pretense that the communications industry should be regulated as a public utility. Shuttering the FCC would free up funds for better purposes, such as lowering the national debt or reducing taxes.