Sage advice from Brooke.
I’m not going to name names, but I find it particularly disturbing when people who work in tech policy refer to individual blog posts as “blogs.” The blog is the medium, not the message; calling a post a “blog” is the equivalent of calling an article in the Washington Post a newspaper, as in, “Hey, did you read that newspaper in the Washington Post this morning about new FDA regulations on over-the-counter pain relievers? Boy, that Matthew Perrone sure can write a newspaper!”
I’m glad she wrote that blog to make sure no one was confused.
Related to my previous post, I think it’s no coincidence that the Samba team has taken the lead in criticizing the Microsoft-Novell deal. Some commentators have argued that the free software movement objects to the deal because they want to prevent interoperability between free and proprietary software, thereby forcing vendors to choose sides. But that clearly can’t be right, because Samba’s raison d’être is (as their slogan says) “opening Windows to a wider world.” If the free software movement were trying to prevent compatibility with proprietary software, you would expect the Samba team to be on the other side, urging restraint and cooperation with Microsoft. That clearly hasn’t happened.
I suspect that what is happening is that the Samba guys are terrified that Microsoft will use patent law to put them out of business. They’re particularly vulnerable to patent claims because their software is designed to interoperate with Windows, which of necessity means that they have to mimic many features of Microsoft’s own software in order to achieve compatibility.
Naturally, Microsoft has never liked the fact that people could interoperate with Windows without paying Microsoft for the privilege. The Samba guys know this. So they expect they’d be among the first targets should Microsoft make a concerted effort to use the patent system against the free software movement.
Update: My software patent series will be taking the week off in observance of the holidays.
Another bit of fallout from Novell’s patent agreement with Microsoft, as Samba developer Jeremy Allison quits Novell. He was quickly snapped up by Google. I’ve never heard of the guy, but Ars calls him “prominent,” and Groklaw calls him “legendary.” His letter said, in part:
As many of you will guess, this is due to the Microsoft/Novell patent agreement, which I believe is a mistake and will be damaging to Novell’s success in the future. But my main issue with this deal is I believe that even if it does not violate the letter of the licence it violates the intent of the GPL licence the Samba code is released under, which is to treat all recipients of the code equally…
The Microsoft patent agreement has put us outside the community, and there is no positive aspect to that fact, and no way to make it so. Until the patent provision is revoked, we are pariahs.
Given that the ability to recruit and retain talent is crucial to the success of software firms, this sort of defection is likely to prove an effective way to enforce the GPL without the need to resort to the courts. Indeed, it’s a more powerful mechanism than the courts, because efforts like Novell’s to squeak by on a technicality aren’t going to fly. Being perceived as violating the spirit of the GPL is just as damaging as violating the letter of it.
I’d like to call out an interesting development from the past week that is a great example of how the Internet can do an end run around traditional regulation–in this case, federal broadcast indecency rules.
As described very well in this NY Times article in yesterday’s Arts section, Saturday Night Live had a decently funny skit (my friends found it either hilarious or just plain stupid) involving a parody of two boy band singers, one played by Justin Timberlake. The skit was called “Special Treat in a Box” and involved a song about giving a holiday present to their girlfriends–their male anatomy, wrapped up in a box.
Over the air, NBC had to bleep out the 16 references to the anatomy (think other name for Richard)–but, SNL simultaneously released an uncensored version that made its way to YouTube. Over 2 million people had viewed it on YouTube alone, according to the article.
Lorne Michaels, SNL’s producer, predicted that other shows might more actively offer material online that isn’t suitable for prime-time broadcast. But in a telling sign of the regulatory climate, and its chilling effect on the distribution of content (the easily offended think this is a good thing), according to the article:
[Michaels] cautioned in an interview that the strategy of treating Internet users to the equivalent of an authorized “director’s cut” of his late-night show “will be the exception” going forward.
Don’t want to piss off anyone with power in Washington, DC, or else Internet content could one day receive a not-so-special regulatory treat from the FCC.
Over at Catallarchy, Sean Lynch has a tirade against Wikipedia:
Wikipedia is an excellent example of when crowds are not wise; one’s actual knowledge has no correlation whatsoever with how much effort they’re willing to put out to keep Wikipedia accurate, and some of my recent experience there seems to indicate exactly the opposite, that people who know what they’re talking about have better things to do with their time than sit around all day fixing incorrect information in Wikipedia, whereas know-it-alls will spend lots of time “fixing” correct information that they disagree with. The other group who may not be know-it-alls are the rules nazis who care more about form than accuracy. These are the people who show up at all the HOA meetings to complain that your curtains are the wrong color when meanwhile the pipes are leaking and about to burst.
Recently I went back to the Wikipedia page on hydrogen peroxide out of curiosity to see if some fixes I’d made to dangerously inaccurate information on the page had survived. They had not. The same bullshit that I’d originally corrected (bullshit that could kill someone) had been returned. Rather than attempt to fix it again because the bullshit was now scattered throughout the article, I simply added a notation under “hazards” warning people that the article could be edited by anyone and that they should consult a source with actual accountability for safety information. My warning was reverted within minutes, with the notation on the edit referring me to a page entitled “What Wikipedia is Not” and suggesting that I use my knowledge to fix the information on the page. Well, I already had fixed the page, and some moron with far more certainty than knowledge had gone and screwed it up again. In addition, the “What Wikipedia Is Not” page mentions nothing about safety or accountability.
I’ve gone from merely thinking Wikipedia doesn’t live up to its name to believing that it is a complete joke.
I’ve responded to this general argument on several occasions, so I won’t rehash those arguments here. But one of the interesting things about Lynch’s post is the attitude of entitlement it seems to reflect.
Hoping to discover Universal Truths, I have been reading Law in Imperial China and The Law of Primitive Man among other things. One never knows when one might stumble across the Law of Nature. But it’s all downhill after Hobbes and Locke. History is quite determined to make a mockery of it all. (This ultimately has bearing on some of the arguments made concerning copyright and patent rules, particularly by my old, old friend Tom Bell (not that Tom is old, just that I’ve known him for ages) and by a younger version of myself, but I don’t make all those connections here).
The FCC got a wake-up call yesterday in the Second Circuit Court of Appeals in New York City. The agency was there in court defending its recent actions in various indecency enforcement cases against Fox Television. Specifically, the question at hand was whether or not the use of a fleeting expletive should be categorically barred from the airwaves and punishable by massive fines when one is uttered. (You can find the video of the oral argument on C-Span’s website).
The 3-judge panel showed very little patience with the FCC and asked some sharp questions about its stepped-up crusade to regulate broadcast speech. (The case is Fox Television v. FCC and, as I mentioned here before, I filed a joint amicus brief in the case along with my friends at the Center for Democracy and Technology.)
Before a packed courtroom, FCC attorney Eric Miller was grilled by Second Circuit Judges Rosemary Pooler, Pierre Leval and Peter Hall on numerous issues. Here are a few highlights:
Related to my last post, it occurs to me that there are a lot of businesses that drink from one fire hose or another, and then sell the resulting expertise to people who are too busy to drink from the fire hose themselves. Free software firms and college professors are two such examples. It occurs to me that our friends at TechDirt are an example of the same phenomenon.
Their “fire hose” is the world of tech news. Between formal news sites like CNet and ZDNet and the blogosphere, keeping up with the conversation about technology, business, policy, and the like is more than a full-time job. I spend a couple of hours a day reading blogs that focus on tech policy, and I’m nowhere near keeping up with all the worthwhile tech policy blogs out there. And I don’t even try to keep up with sites that focus on tech business and Silicon Valley gossip. You can get a sense of the size of this particular fire hose by perusing TechMeme, a site that aggregates the most popular stories in the tech blogosphere at any given moment. It would be a full-time job just to read every post that gets linked to from TechMeme.
David Robinson, managing editor of The American, has a great article arguing that soaring spending on higher education is something to celebrate:
Modern academics often liken their work to drinking from a fire hose. Historians, philosophers, and physicists all find it impossible to keep up with every potentially relevant paper or study. It’s not just a matter of catching up to the state of the art–one couldn’t even read the research materials in an academic field as fast as they are being produced. Inevitably, this leads scholars to retreat further and further into sub-specialization, narrowing the horizon of what counts as “relevant,” of what their fields consist in. But the side effect of this constant, fractal division of the range of human knowledge is that more and more scholars are needed to cover the same range of topics. A hundred years ago, a biologist could plausibly aspire to know all the important theories and facts contained within the field of biology. But today, there are people working on genetics, proteomics, virology, ecology, and a host of other fields, each of which is a full-time, fully mind-absorbing pursuit in its own right.
This all makes sense once one recognizes that professors are the conduits carrying our accumulated knowledge into the present. Having access to something that is written in a book is not the same thing as knowing it. In order for knowledge to be available and useful here and now, someone must be practically familiar with it. And the more knowledge there is to “cover,” as it were, with practical familiarity, the greater the number of scholars needed to complete a university. This means both more professors now and a greater number of those honors undergrads, training for the professoriate. A greater throughput of accumulated knowledge among successive generations requires an ever-increasing number of conduits.
I think this observation applies equally well to the software world. As software simultaneously gets more complex and cheaper, getting access to a piece of software will be a less and less important part of the overall cost of using it; the expertise required to deploy and maintain it will matter far more. That was certainly true when I worked as a webmaster in college–keeping up with all the changes in web technology was a full-time job.