September 2008

I recently took over as Chairman of the Space Frontier Foundation, a citizens’ advocacy group dedicated to opening the space frontier to human settlement.  Established in 1988 to preserve the ideas of Dr. Gerard O’Neill, author of The High Frontier, the Foundation has worked to enable the fulfillment of O’Neill’s vision of humans living and working in space to the benefit of all humanity through:

  • Cultural change—spreading awareness of the vast, untapped potential of space to make humanity richer, safer, healthier and freer;
  • Supporting the growth of the entrepreneurial NewSpace industry; and
  • Promoting government policies that support NewSpace.

I’m hosting an anniversary bash for Sputnik at my home, so if you’re interested in space—as a place, not just a government program—and happen to be in DC on October 4, please consider joining us.  Just email me to RSVP and I’ll add you to the evite (berin dot szoka at gmail dot com).

(In case you were wondering:  We’re working now to upgrade our rather outdated website.  If you’d like to help, just let me know!)

“Hasn’t Steve Jobs learned anything in the last 30 years?” asks Farhad Manjoo of Slate in an interesting piece about “The Cell Phone Wars” currently raging between Apple’s iPhone and Google’s new Android-based G1 phone. Manjoo wonders whether Steve Jobs remembers what happened the last time he closed up a platform: “because Apple closed its platform, it was IBM, Dell, HP, and especially Microsoft that reaped the benefits of Apple’s innovations.” So if Jobs didn’t learn his lesson then, will he now with the iPhone? Manjoo continues:

Well, maybe he has—and maybe he’s betting that these days, “openness” is overrated. For one thing, an open platform is much more technically complex than a closed one. Your Windows computer crashes more often than your Mac computer because—among many other reasons—Windows has to accommodate a wider variety of hardware. Dell’s machines use different hard drives and graphics cards and memory chips than Gateway’s, and they’re both different from Lenovo’s. The Mac OS, meanwhile, has to work on just a small range of Apple’s rigorously tested internal components—which is part of the reason it can run so smoothly. And why is your PC glutted with viruses and spyware? The same openness that makes a platform attractive to legitimate developers makes it a target for illegitimate ones.

I discussed these issues in greater detail in my essay on “Apple, Openness, and the Zittrain Thesis” and in a follow-up essay about how the Apple iPhone 2.0 was cracked in mere hours. My point in these and other essays is that the whole “open vs. closed” dichotomy is greatly overplayed. Each has its benefits and drawbacks, but there is no reason we need to make a false choice between the two for the sake of “the future of the Net” or anything like that.

In fact, the hybrid world we live in — full of a wide variety of open and proprietary platforms, networks, and solutions — presents us with the best of all worlds. As I argued in my original review of Jonathan Zittrain’s book, “Hybrid solutions often make a great deal of sense. They offer creative opportunities within certain confines in an attempt to balance openness and stability.”  It’s a sign of great progress that we now have different open vs. closed models that appeal to different types of users.  It’s a false choice to imagine that we need to choose between these various models.


For years there’s been talk of broadband over power lines as an alternative way to deliver Internet into the home. Today I heard about an interesting concept–using broadband to complement the delivery of energy into homes. I’ll call it power over broadband lines (PBL).

Today the Technology Policy Institute hosted an interesting conference on energy public policy issues. Kathryn Brown of Verizon discussed the idea of a “smart grid” and the ways that broadband and ICT can help add to the smarts. Energy meters in the home could tap into the ‘Net to help users monitor and evaluate their energy use. Energy companies could also use broadband communication networks to better monitor distribution and be alerted to problems on the energy grid.
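To make the smart-grid idea a bit more concrete, here is a minimal sketch, in Python, of how a broadband-connected home energy meter might report readings to a monitoring service. The endpoint URL, message fields, and meter-reading stub are all invented for illustration; nothing here reflects any actual utility or Verizon system.

    # Hypothetical sketch: a broadband-connected home energy meter
    # reports its readings to a monitoring service over the Internet.
    # The endpoint URL and message fields are invented placeholders.

    import json
    import time
    import urllib.request

    MONITOR_URL = "https://example-utility.invalid/api/readings"  # placeholder

    def read_meter_kwh():
        """Stand-in for reading cumulative kWh from the meter hardware."""
        return 12345.6  # a real meter would query its metering chip here

    def report_reading(meter_id):
        payload = {
            "meter_id": meter_id,
            "timestamp": int(time.time()),
            "cumulative_kwh": read_meter_kwh(),
        }
        req = urllib.request.Request(
            MONITOR_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            print("reported reading, status:", resp.status)

    if __name__ == "__main__":
        # A real meter would do this on a schedule (say, hourly); the utility
        # could aggregate the reports to spot distribution problems, and the
        # homeowner could chart usage over time.
        report_reading("home-meter-001")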

Maybe, just maybe, the power of the Internet can help rescue an energy industry that faces many technical, regulatory and environmental challenges.

Many Eyes


I’m currently at a talk by Martin Wattenberg, who runs a fantastic visualization site from IBM Research. Here’s my favorite visualization to date:

Apparently this got an immediate reaction from someone with a different partisan orientation:

The site is chock full of interesting tidbits. Here is a chart of the inflation-adjusted size of historical bailouts. Here is a graph of personality types by state. Here is a graph comparing historical immigration rates.

The best thing is that you can upload your own data sets, choose your visualization, and share it in a web 2.0-savvy manner. It’s a really cool site, and I encourage you to check it out.
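If you are curious what feeding the site looks like, here is a tiny, purely illustrative Python sketch that writes the kind of plain tab-delimited table a visualization tool like Many Eyes can take as input. The column names and figures are made-up placeholders, not data from the site.

    # Illustrative only: build a small tab-delimited table of the sort that
    # data-visualization sites like Many Eyes let you paste in or upload.
    # The figures below are invented placeholders, not real bailout numbers.

    import csv

    rows = [
        ("Event", "Year", "Cost (billions, placeholder)"),
        ("Example bailout A", 1989, 100),
        ("Example bailout B", 2008, 700),
    ]

    with open("bailouts.tsv", "w", newline="") as f:
        writer = csv.writer(f, delimiter="\t")
        writer.writerows(rows)

    print(open("bailouts.tsv").read())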

Our conference, “Broadband Census for America,” is fast approaching…. The event is tomorrow. If you want to attend, follow the instructions in the press release below:

FOR IMMEDIATE RELEASE

WASHINGTON, September 25, 2008 – California Public Utilities Commissioner Rachelle Chong, a member of the Federal Communications Commission from 1994 to 1997, will kick off the Broadband Census for America Conference with a keynote speech on Friday, September 26, at 8:30 a.m.

Eamonn Confrey, the first secretary for information and communications policy at the Embassy of Ireland, will present the luncheon keynote at noon. Confrey will give an overview of Ireland’s efforts to collect data on broadband service through a comprehensive web site with availability, pricing and speed data about carriers.

Following Chong’s keynote address, the Broadband Census for America Conference – the first of its kind to unite academics, state regulators, and entities collecting broadband data – will hear from two distinguished panels.

One panel, “Does America Need a Broadband Census?” will contrast competing approaches to broadband mapping. Art Brodsky, communications director of the advocacy group Public Knowledge, will appear at the first public forum with Mark McElroy, the chief operating officer of Connected Nation, a Bell- and cable-industry-funded organization involved in broadband mapping.

Also participating on the panel will be Drew Clark, executive director of BroadbandCensus.com, a consumer-focused effort at broadband data collection; and Debbie Goldman, the coordinator of Speed Matters, which is run by the Communications Workers of America.

The second panel, “How Should America Conduct a Broadband Census?” will feature state experts, including Jane Smith Patterson, executive director of the e-NC authority; and Jeffrey Campbell, director of technology and communications policy for Cisco Systems. Campbell was actively involved in the California Broadband Task Force.

Others scheduled to speak include Professor Kenneth Flamm of the University of Texas at Austin; Dr. William Lehr of the Massachusetts Institute of Technology; Indiana Utility Regulatory Commissioner Larry Landis; and Jean Plymale of Virginia Tech’s eCorridors Program.

Keynote speaker Rachelle Chong has been engaged in broadband data collection as a federal regulator, as a telecommunications attorney, and since 2006 as a state official.

Chong was instrumental to the California Broadband Task Force, which mapped broadband availability in California. She will speak about broadband data collection from the mid-1990s to today.

The event will be held at the American Association for the Advancement of Science’s headquarters at 12th and H Streets NW (near Metro Center) in Washington.

For more information:
Drew Bennett, 202-580-8196
Bennett@broadbandcensus.com
Conference web site: http://broadbandcensus.com/conference/
Registration: http://broadbandcensus.eventbrite.com/


This week, I have been up at Harvard University participating in another meeting of the Internet Safety Technical Task Force (ISTTF), of which I am a member. The ISTTF was organized earlier this year pursuant to an agreement between 49 state attorneys general (AGs) and social networking giant MySpace.com. A group of experts from academia, non-profit organizations, and industry were appointed to the Task Force, which is charged with evaluating the market for online child safety tools and methods and issuing a report on the matter to the AGs at the end of this year.  ISTTF members have been meeting privately and publicly in both Cambridge, MA and Washington, D.C. The Task Force has been very ably chaired by John Palfrey, co-director of Harvard’s Berkman Center for Internet & Society.

Although the ISTTF is looking at a wide variety of tools and methods associated with online child protection (e.g., filters, monitoring tools, educational campaigns), many of the AGs who crafted the agreement with MySpace that led to the Task Force’s formation have made it clear that they are most interested in having the ISTTF evaluate age verification and online identity verification technologies.  In fact, at the start of this week’s session at Harvard Law School, AGs Martha Coakley of Massachusetts and Richard Blumenthal of Connecticut both spoke and made it abundantly clear they expect the Task Force to develop age- and identity-verification tools for social networking sites (SNS). AG Blumenthal said we need to deal with “the dangers of anonymity” and repeated his standard line about online age verification: “If we can put a man on the moon, we can make the Internet safe.”  [Of course, putting a man on the moon took hundreds of billions of dollars and a decade to accomplish, but never mind that fact! Moreover, one could also argue that if we can put a man on the moon we can cure hunger, AIDS, and the common cold, but some things are obviously easier said than done. Finally, putting a man on the moon didn’t require all Americans or their kids to give up their anonymity or privacy rights in order to accomplish the feat!]

On many occasions here before, I have outlined various questions and reservations about proposals to mandate online age verification.  Last year, I also published a lengthy white paper on the issue and hosted a lively debate on Capitol Hill [transcript here] about this.  I also have discussed age verification in my book on parental controls and online child safety. [Braden Cox also talked about his experiences up at Harvard this week here, and CNet’s Chris Soghoian had a brutal assessment of this week’s proposals on his “Surveillance State” blog.]

In this essay, I will discuss the new fault lines in the debate over online age verification and outline where I think we are heading next on this front.  I will argue:

  • There is now widespread understanding that it is extraordinarily difficult to verify the ages and identities of minors online using the methods we typically use to verify adults. Because of this, age verification proponents are increasingly proposing two alternative models of verifying kids before they go online or visit SNS…
  • First, for those who continue to believe that we must do whatever we can to verify kids themselves, schools and school records are increasingly being viewed as the primary mechanism to facilitate that. This raises two serious questions: Do we want schools to serve as DMVs for our children? And, do we want more school records or information about our kids being accessed or put online?
  • Second, for those who are uncomfortable with the idea of verifying kids, or with using schools and school records to accomplish that task, parental permission-based forms of authentication are becoming the preferred regulatory approach. Under this scheme, which might build upon the regulatory model found in the Children’s Online Privacy Protection Act of 1998 (COPPA), parents or guardians would be verified somehow and would then vouch for their children before the children were allowed on an SNS, however defined (see the sketch after this list).  But how do we establish a clear link between parents and kids?  Will parents be willing to surrender a great deal more information (about themselves and their kids) before their kids can go online? And is it sensible to use a law that was meant to protect the privacy and personal information of children to potentially gather a great deal more information about them, and their parents?
  • It remains very unclear how either of those two verification methods would make children safer online. Indeed, they could actually make kids less safe by compromising their personal information and creating a false sense of security online for them and their parents.
  • It is highly unlikely the Internet Safety Technical Task Force will be able to reach consensus on this complicated, controversial issue. A small camp will likely flock to the sort of proposals mentioned above. Another, larger camp (including me) will flock to education-based approaches to child safety as well as increased reliance on other parental empowerment tools and strategies, industry self-regulatory efforts, social norms, and better intervention strategies for troubled youth. But the age verification debate will go on and, as was the case over the past two years, the legal battleground will be state capitals across America, with AGs likely pushing for age verification mandates regardless of what the Task Force concludes.
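As promised above, here is a rough sketch of the parental-permission model from the second bullet. Every field name and step is hypothetical, not any vendor’s actual system; the point is simply how much personal information about both parent and child such a scheme would have to collect, and where the unresolved parent-child linkage problem sits.

    # Hypothetical sketch of a COPPA-style parental-permission flow.
    # All field names and steps are illustrative; note how much data about
    # both the parent and the child ends up being collected and stored.

    from dataclasses import dataclass

    @dataclass
    class ParentRecord:
        full_name: str
        address: str
        date_of_birth: str
        # Whatever the verifier checks against: public records, credit
        # header data, a small credit-card charge, etc.
        verification_method: str
        verified: bool = False

    @dataclass
    class ChildRecord:
        full_name: str
        date_of_birth: str
        claimed_parent: ParentRecord
        parental_consent_given: bool = False

    def verify_parent(parent):
        """Stand-in for an identity check against offline records."""
        parent.verified = True  # assume the check passes in this sketch
        return parent.verified

    def grant_consent(child):
        """The verified parent vouches for the child. Unsolved problem:
        nothing here proves the parent-child link, only the parent's own
        identity."""
        if child.claimed_parent.verified:
            child.parental_consent_given = True
        return child.parental_consent_given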

Continue reading if you are interested in the details.


“Bailout” Podcasts


My employer, The Cato Institute, has put together a couple of podcasts that make for interesting listening as we try to understand what has happened in mortgage finance and financial services – and what is to come.

Cato Chairman Bill Niskanen speaks to us in “The $700-Billion Bailout.” And Cato’s Vice President for Academic Affairs Jim Dorn tells us about “Socialized Risks, Private Reward.”

Dorn puts together Cato’s annual Monetary Conference, which draws some of the most knowledgeable analysts in the country and world. It happens November 19th this year. I’m guessing it will be well attended.

For the past day and a half, Harvard’s Berkman Center for Internet & Society has hosted a public meeting of the Internet Safety Technical Task Force. Discussions focused mostly on what technical solutions exist for addressing the perceived lack of online safety on social networking websites. But overall there’s still a need to connect the most important dot: do the proposed solutions actually make children safer?

Being at Harvard Law School, I was reminded of the movie The Paper Chase, in which Professor Charles Kingsfield wielded the Socratic Method to better train his students for the rigors of law practice. In that spirit, I think there are three main questions the task force must fully address when it issues its report later this year:

1. What are the perceived Internet safety problems? This should be a broad inquiry into all the safety-related issues (harassment, bullying, inappropriate content and contact, etc.) and not be limited to social networking websites. There should also be an attempt to distinguish problems that are unique to the Internet from those whose root causes lie offline.

2. What are the possible technical solutions to these problems? It’s important to recognize that some of the solutions will NOT primarily be technological (education in school classrooms, for example), and that even age verification would rely on offline information.

3. Do the solutions offered in #2 to the problems in #1 actually do anything to make children safer? It’s not whether the technology works that’s the salient inquiry. It’s whether the technology works to make children safer.

There were 16 or so companies that presented technology solutions based on age verification, identity verification, filtering/auditing, text analysis, and certificates/authentication tools. Some were better than others, and while most addressed questions one and two above, they were silent about number three.
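To see why questions two and three can come apart, here is a toy sketch of the kind of text-analysis flagging several vendors demonstrated. The watch list and the logic are invented and far cruder than any real product; the point is that even a filter that “works” in the narrow sense of flagging messages tells us nothing, by itself, about whether children end up safer.

    # Toy text-analysis filter: flags messages containing watch-list terms.
    # The watch list and logic are invented for illustration; real products
    # are far more sophisticated, but the evaluation question is the same:
    # does flagging messages actually make children safer?

    WATCH_LIST = {"meet up alone", "send a photo", "what school"}  # placeholders

    def flag_message(text):
        lowered = text.lower()
        return any(term in lowered for term in WATCH_LIST)

    if __name__ == "__main__":
        samples = [
            "what school do you go to?",
            "see you at soccer practice",
        ]
        for msg in samples:
            label = "FLAGGED" if flag_message(msg) else "ok"
            print(label, "-", msg)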

Lately the good folks at Bureaucrash have been giving us a lot of cool tech-related podcasts. Last week they brought us an interview with Cory Doctorow. This week, a guide to online privacy. Topics include:

Listen to it at Bureaucrash.com.

Britannica Concise Encyclopedia: def. monopoly

Exclusive possession of a market by a supplier of a product or service for which there is no substitute. In the absence of competition, the supplier usually restricts output and increases price in order to maximize profits.

How does this possibly apply to Google?  Google hasn’t decreased output, prices have not skyrocketed, and clearly there are plenty of substitutes.  Yet groups like the Association of National Advertisers are attacking Google, claiming Google has a monopoly because it “controls” 90% of search.
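To see what the textbook behavior in that definition looks like in practice, here is a toy worked example with invented linear demand and cost numbers (nothing to do with Google’s actual business): a profit-maximizing monopolist restricts output and raises price relative to a competitive market, which is precisely the pattern we do not observe here.

    # Toy illustration of the Britannica definition above: with linear demand
    # P = a - b*Q and constant marginal cost c, a monopolist maximizes profit
    # by restricting output and raising price relative to a competitive
    # market, where price is driven down to marginal cost.
    # All numbers are invented for illustration.

    a, b, c = 100.0, 1.0, 20.0  # demand intercept, slope, marginal cost

    # Monopoly: maximize (a - b*Q)*Q - c*Q  =>  Q_m = (a - c) / (2*b)
    q_monopoly = (a - c) / (2 * b)
    p_monopoly = a - b * q_monopoly

    # Competition: price falls to marginal cost  =>  Q_c = (a - c) / b
    q_competitive = (a - c) / b
    p_competitive = c

    print("monopoly:    Q = %.0f, P = %.0f" % (q_monopoly, p_monopoly))
    print("competitive: Q = %.0f, P = %.0f" % (q_competitive, p_competitive))
    # monopoly:    Q = 40, P = 60
    # competitive: Q = 80, P = 20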

Okay. Google gets the lion’s share of search engine traffic. But controlling search doesn’t amount to controlling online advertising, not by a long shot.

We haven’t seen prices go up because the time people spend on search engines every day is minimal, amounting to only a handful of minutes. Google has successfully turned these few minutes a day into a machine that generates billions of dollars a year.  Yet despite its powerful position in the search market, competition from outside of search is forcing Google to keep its rates low.
