Emotions ran high at this week’s Privacy Identity and Innovation conference in Seattle. They usually do when the topic of privacy and technology is raised, and to me that was the real take-away from the event.
As expected, the organizers did an excellent job providing attendees with provocative panels, presentations, and keynote talks, in particular a standout presentation from my former UC Berkeley colleague Marc Davis, who has just joined Microsoft.
Beginning several years ago, researchers at the University of Pennsylvania, led by criminologist Richard Berk, assembled a dataset of more than 60,000 crimes, including homicides. Using an algorithm they developed, they found a subset of people much more likely to commit homicide when released on parole or probation. Instead of finding one future murderer in 100, the UPenn researchers could identify eight out of 100.
Berk’s software examines roughly two dozen variables, from criminal record to geographic location. The type of crime, and more importantly, the age at which that crime was committed, were two of the most predictive variables.
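To make the mechanics a bit more concrete, here is a minimal sketch of how this kind of risk scoring works in general, not Berk's actual model: train a classifier on historical records, then rank new cases by predicted risk. The feature names, the random-forest choice, and the synthetic data are all illustrative assumptions, and because the data below is pure noise, the numbers it prints mean nothing; only the shape of the workflow matters.

```python
# Illustrative sketch of recidivism risk scoring. Synthetic data and
# hypothetical feature names; this is NOT Berk's actual model or dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 60_000  # roughly the size of the dataset described above

# Hypothetical predictors: age at earliest offense, prior offense count,
# offense-severity code, and a coarse geographic indicator.
X = np.column_stack([
    rng.integers(14, 60, n),   # age at earliest recorded offense
    rng.poisson(2.0, n),       # number of prior offenses
    rng.integers(0, 5, n),     # severity category of current offense
    rng.integers(0, 20, n),    # geographic region code
])
# Rare outcome: roughly 1% of cases involve a later homicide (synthetic).
y = rng.random(n) < 0.01

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Weight the rare positive class so the forest doesn't ignore it.
model = RandomForestClassifier(
    n_estimators=200, class_weight="balanced", random_state=0)
model.fit(X_train, y_train)

# Rank held-out cases by predicted risk and inspect the top slice; the
# practical question is how concentrated the true positives are among
# the highest-scoring individuals compared to the base rate.
scores = model.predict_proba(X_test)[:, 1]
top_100 = np.argsort(scores)[::-1][:100]
print("true positives in the 100 highest-risk cases:", int(y_test[top_100].sum()))
print("base rate per 100 cases:", round(100 * y_test.mean(), 2))
```

The interesting question, taken up in the next two paragraphs, is not the statistics but what gets done with the people who land at the top of that ranking.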
Unlike applying data mining to the detection of terrorism planning or preparation, which is exceedingly rare and therefore offers almost no examples to learn from, using tens of thousands of examples of recidivism to discover predictive factors is a good way to focus supervision resources where they are most likely to be effective.
The article describes use of this software for monitoring parolees and probationers. Using data mining to justify anything approaching extra punishment would be a misuse, and many far more difficult issues would arise if it were used on the general population.
In the podcast this week, Kevin King, a recent law school graduate now clerking for a federal court of appeals, discusses his recent paper, Geolocation and Federalism on the Internet: Cutting Internet Gambling’s Gordian Knot. In the paper, King uses the online gambling industry to examine the conflict between federalism and the Internet, whose borderless nature resists traditional models of state jurisdiction. He discusses previous attempts to regulate online gambling, the conflict between Internet gambling providers and the Kentucky horse betting sector, Congress’ current online gambling bill, and a solution that utilizes geolocation technology.
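King's argument is legal rather than technical, but the mechanism he leans on is easy to sketch: resolve where a request appears to originate and apply that state's rule. The sketch below is a hypothetical illustration only; the lookup function stands in for a commercial IP-geolocation service, and the "permitted" state list is invented for the example, not a statement of any state's law.

```python
# Illustrative sketch only: gate an online-gambling feature by the
# requester's apparent state. The lookup is a placeholder for a real
# geolocation provider; the permitted list is invented for the example.
from typing import Optional

PERMITTED_STATES = {"NJ", "NV", "DE"}  # hypothetical example set


def lookup_state(ip_address: str) -> Optional[str]:
    """Return the two-letter state code the IP appears to be in.

    Placeholder: a real deployment would query a geolocation service
    and handle VPNs, proxies, and ambiguous results.
    """
    fake_db = {"203.0.113.7": "NJ", "198.51.100.9": "UT"}
    return fake_db.get(ip_address)


def may_place_wager(ip_address: str) -> bool:
    """Allow the wager only if the request resolves to a permitted state."""
    state = lookup_state(ip_address)
    if state is None:
        return False  # fail closed when location can't be determined
    return state in PERMITTED_STATES


if __name__ == "__main__":
    for ip in ("203.0.113.7", "198.51.100.9", "192.0.2.1"):
        print(ip, "->", "allowed" if may_place_wager(ip) else "blocked")
```

In practice the hard problems are accuracy near state borders, VPNs and proxies, and what to do when a location cannot be resolved at all, which is why the sketch fails closed.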
Two articles of interest in today’s Wall Street Journal have an indirect impact on the debate over the future of Internet policy. First, there’s a front-page story (“Facing Budget Gaps, Cities Sell Parking, Airports, Zoo”) documenting how many cities are privatizing various services, including some considered “public utilities,” in order to help balance budgets. The article worries about “fire-sale” prices and the loss of long-term revenue resulting from the privatizations. But the author correctly notes that the more important rationale for privatization is that, “In many cases, the private takeover of government-controlled industry or services can result in more efficient and profitable operations.” Moreover, any concerns about “fire-sale” prices and long-term revenue losses have to be weighed against the massive inefficiencies and costs associated with ongoing government management of resources and networks.
Of course, what’s so ironic about this latest privatization wave is that it comes at a time when some regulatory activists are clamoring for more regulation of the Internet and calling for broadband to be converted into a plain-vanilla public utility. For example, Free Press founder Robert McChesney has argued that “What we want to have in the U.S. and in every society is an Internet that is not private property, but a public utility.” That certainly doesn’t seem wise in light of the track record of past experiments with government-owned or regulated utilities. And the fact that we are talking about something as complex and fast-moving as the Internet and digital networks makes the task even more daunting.
Government mismanagement of complex technology projects was on display in a second article in today’s Journal (“U.S. Reviews Tech Spending.”) Amy Schatz notes that “Obama administration officials are considering overhauling 26 troubled federal technology projects valued at as much as $30 billion as part of a broader effort by White House budget officials to cut spending. Projects on the list are either over budget, haven’t worked as expected or both, say Office of Management and Budget officials.” I’m pleased to hear that the Administration is taking steps to rectify such waste and mismanagement, but let’s not lose sight of the fact that this is the same government that the Free Press folks want to run the Internet. Not smart.
I’ve noted here before that Gordon Crovitz is my favorite technology policy columnist and that everything he pens for his “Information Age” column for The Wall Street Journal is well worth reading. His latest might be his best ever. It touches upon the great debate between Internet optimists and pessimists regarding the impact of digital technology on our culture and economy. His title is just perfect: “Is Technology Good or Bad? Yes.” His point is that you can find evidence that technological change has both beneficial and detrimental impacts, and plenty of people on both sides of the debate to cite it for you.
This is a subject I’ve spent a lot of time noodling over here through the years and, most recently, I compiled all my random thoughts into a mega-post asking, “Are You an Internet Optimist or Pessimist?” That post tracks all the leading texts on both sides of this debate. I was tickled, therefore, when Gordon contacted me and asked for comment for his story after seeing my piece. [See, people really do still read blogs!]
The Progress and Freedom Foundation has just published a white paper I wrote for them titled “The Seven Deadly Sins of Title II Reclassification (NOI Remix).” This is an expanded and revised version of an earlier blog post that looks deeply into the FCC’s pending Notice of Inquiry regarding broadband Internet access. You can download a PDF here.
I point out that beyond the danger of subjecting broadband Internet to extensive new regulations under the so-called “Third Way” approach outlined by FCC Chairman Julius Genachowski, a number of other troubling features in the Notice indicate an even broader agenda for the agency with regard to the Internet.
Earlier this week, The Daily Show’s Jon Stewart summed up the debate over net neutrality by stating, “On one side [are] those who want the marketplace to remain a wide open market of ideas, and on the other side [is] a larger group who have no idea what net neutrality means.”
Stewart may have been joking, but he was right about one thing – many folks are confused about what net neutrality actually is and what it would mean for Internet users.
That’s why I decided to enter the America’s Got Net video contest, sponsored by the Open Internet Coalition, a pro-net neutrality trade association. In a short video entitled, “The Open Internet and Lessons from the Ma Bell Era,” I explain how mandating net neutrality would endanger the networks of tomorrow and insulate entrenched firms from competition. Enjoy!
Back in March, the Motion Picture Association of America re-launched its film-rating website, filmratings.com. While this may be old news to some, I just learned about it from a post on BoingBoing which makes fun of the rationales given for the ratings, which are available on the new website. Example: The movie “3 Ninjas Knuckle Up” was “rated PG-13 for non-stop ninja action.”
Recent revelations about Microsoft’s internal debate over Internet Explorer’s handling of tracking cookies, as chronicled by The Wall Street Journal earlier this month, have prompted harsh criticism from self-described privacy groups, who’ve called on Congress to investigate Microsoft’s actions. But as Jim Harper pointed out in an excellent WSJ essay, Web users stand to lose a great deal if online tracking is squelched by the hand of government. Data gathering on the Internet is largely harmless, and individually targeted advertising coexists with robust privacy safeguards.
Over on AOLNews.com, my colleague Carolyn Homer discusses these privacy tradeoffs, arguing that Microsoft and other Internet firms have a strong incentive to set privacy defaults that align with their users’ preferences. She points out that most consumers are, in practice, quite willing to live with allegedly “pervasive” tracking in exchange for the enormous benefits that targeted advertising makes possible. While many surveys and polls indicate consumers are very worried about their privacy, the actual decisions that consumers make every day tell a very different story (as documented extensively by Berin Szoka). From Carolyn’s piece:
A body of research reveals a sizable disparity between how much people say they value privacy and how willing they are to actually protect it. In a 2003 Duke Law Journal article, Michael Staten and Fred Cate found that fewer than 10 percent of users exercise their right to opt out and share less. Conversely, if given the opposite choice, fewer than 10 percent of users elect to opt in and share more. The vast middle is apparently indifferent.
If consumers were required to affirmatively opt in before sharing data, the Internet’s prevailing advertising-based business model would be decimated. The effectiveness of online advertising in Europe, for example, fell 65 percent after the European Union in 2002 required a blanket opt-in system. For more than a decade, the Internet has thrived on the assumption that most people believe it is a fair trade to receive free content in exchange for viewing ads. Mere advertisements shouldn’t be equated with gross privacy violations.
She goes on to discuss how privacy settings are evolving as consumer preferences adapt to new technologies and firms experiment with new ways to use and collect data. You can read the rest over at the AOL News website.
A favorite PR maven pitched me (and probably many of you) Senator Al Franken’s (D-MN) email suggesting that WiFi is threatened by the Google-Verizon “deal.”
“The Google-Verizon ‘framework’ was written so as not to apply to wireless Internet services,” says Franken. “If you use wi-fi or access the Internet on your phone, this is a serious problem.”
Kindamaybenotsomuch. WiFi is wireless, yes, but it’s not what they’re talking about when they say “wireless”; the framework’s wireless carve-out refers to mobile broadband networks, not your home WiFi router.
But what caught my eye is Senator Franken’s somewhat inverted take on power arrangements in the federal government: “This evening, I’ll be speaking at an FCC hearing in Minneapolis. I’ll urge the commissioners to reject the Google-Verizon framework, stop the Comcast/NBC merger, and take action to keep the Internet free and open.”
Folks, Article I, section 1 of the United States Constitution creates the United States Senate, with section 3 describing the Senate’s makeup and some procedures.
The Federal Communications Commission is not a constitutional body. The best view is that Congress has no authority to establish an FCC like we have today. The better view is that Congress should not maintain the sprawling FCC we have today. And the only correct view is that the FCC is a creation of Congress, beneath it in every relevant respect.
Senator Franken is supposed to be the boss of the FCC, not a supplicant “urging” the FCC to do x, y, and z.
Does it matter a lot? No. Senator Franken is mostly making a symbolic appeal to gin up constituent support. But he’s also symbolizing the abasement of the legislative branch to an independent agency that has no constitutional pedigree.