Every week, I look at a software patent that’s been in the news. You can see previous installments in the series here. This week, I consider Patent #6,988,138, “Internet-based education support system and methods,” issued in January to Blackboard Inc. According to CNet, the Software Freedom Law Center is challenging the patent’s validity. And it’s a good thing somebody is. Here’s what the patent purports to cover:
A course-based system for providing to an educational community of users access to a plurality of online courses, comprising: a) a plurality of user computers, with each user computer being associated with a user of the system and with each user being capable of having predefined characteristics indicative of multiple predetermined roles in the system, each role providing a level of access to a plurality of data files associated with a particular course and a level of control over the data files associated with the course with the multiple predetermined user roles comprising at least two user’s predetermined roles selected from the group consisting of a student role in one or more course associated with a student user, an instructor role in one or more courses associated with an instructor user and an administrator role associated with an administrator user…
And it goes on in that vein. In a nutshell, they’re trying to patent the concept of distributing course information (assignments, announcements, class discussions, grades, etc.) via the web, with different access permissions for different users.
With most of the patents I’ve analyzed in this series, I’ve had to make my best guess about whether someone of ordinary programming skill could have developed the type of software described in the patent. But I don’t have to make any guesses in this case, because I’ve personally worked on software that does most of the things this patent describes.
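To see just how ordinary the claimed invention is, consider that the core of the claim is per-course role-based access control: the same user can be a student in one course and an instructor in another, with each role granting a different level of access to the course’s files. Here’s a minimal sketch of that idea (all names and permission sets are hypothetical, invented for illustration, not taken from Blackboard’s system or the patent):

```python
from enum import Enum, auto

class Role(Enum):
    STUDENT = auto()
    INSTRUCTOR = auto()
    ADMIN = auto()

# Hypothetical access levels per role for a course's data files.
PERMISSIONS = {
    Role.STUDENT: {"read"},
    Role.INSTRUCTOR: {"read", "write"},
    Role.ADMIN: {"read", "write", "manage"},
}

class Course:
    """A course that tracks each user's role within it."""
    def __init__(self, name):
        self.name = name
        self.roles = {}  # user -> Role in this course

    def enroll(self, user, role):
        self.roles[user] = role

    def can(self, user, action):
        # A user with no role in this course gets no access at all.
        role = self.roles.get(user)
        return role is not None and action in PERMISSIONS[role]
```

Because roles are stored per course, a user who is an instructor in `CS101` and a student in `MATH200` automatically gets different permissions in each, which is exactly the “multiple predetermined roles” language of the claim. This is a few dozen lines of unremarkable code, which is rather the point.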
Joe at Techdirt makes an excellent point about government and monopolies:
Here’s a story that hits on some of today’s themes of monopolistic behavior and keeping stuff off the internet. The Department of Justice has been given the go-ahead to proceed with a lawsuit against the National Association of Realtors, alleging that the group colluded to prevent listings from appearing online, in a bid to give established brokers an advantage. Now, we’d be tempted to say that however backwards the organization’s thinking is, they have the right to distribute their data to whomever they want. But we should take a step back and ask why the NAR is in the position to monopolize this information in the first place. That fault rests with the government, which has put the NAR in charge of regulating its industry, and deciding who can and can’t be a broker. In other words, its monopoly has official legal blessing. Without this, anyone could go out and get listings, abide by whatever rules they wanted to, and offer to broker home sales as efficiently as possible. So instead of suing the NAR for doing what it’s intended to do (maximize profits for its members), why not get at the root of the problem and take away its monopoly status?
Quite so. We just published an article by my colleague Sarah Brodsky describing how the realtors’ lobby recently got a euphemistically named “Homeowners’ Bill of Rights” passed in Missouri that limits competition by outlawing discount real estate brokers. If you want to pay someone to list your house but do the rest of the legwork of selling the house yourself, that’s too bad. You have to go with a full-service real estate agent.
The state has a split personality when it comes to monopolies and cartels. Most of the time, our elected officials vigorously denounce them and take action to (supposedly) increase competition. However, if they’re created by the government, that’s a whole other ball game. In that case, only crazy right-wingers would suggest that more competition would be beneficial. And sometimes, the state does both at the same time: creating a cartel with its right hand, while its left hand simultaneously investigates the cartel for being anti-competitive. It’s very strange.
Yesterday, the Center for Democracy &amp; Technology and the Progress &amp; Freedom Foundation filed joint comments in both the Second Circuit Court of Appeals and the Third Circuit Court of Appeals, calling upon the courts to halt the Federal Communications Commission’s (FCC) recent overzealous indecency enforcement activities. The cases are Fox Television Stations v. FCC (the Second Circuit case) and CBS Corp. v. FCC (the Third Circuit case). (The filings we submitted to the courts were virtually identical, so I’m just posting the link for the Second Circuit brief, which you can find here.)
In our joint amicus briefs we argued that the status quo cannot stand for three primary reasons:
This is odd. Apparently, the CIA has recently decided that access to its entire website will henceforth be encrypted using SSL, the encryption standard used by websites accepting your credit card number.
They say this ensures that no one is able to impersonate the CIA website, but that doesn’t make a whole lot of sense. I can’t imagine why anyone would want to impersonate the CIA’s public website. And if they did, SSL is only an effective deterrent if the user manually examines the site certificate, which doesn’t seem very likely.
The other claimed benefit is to prevent eavesdropping on (or tampering with) people’s browsing. But that doesn’t make sense either. An eavesdropper can still see which site a user is visiting, even if SSL hides the particular pages requested. And since most of the site is publicly available, static content, encrypting it is kind of pointless. It’s certainly good to encrypt personal information submitted by users, but the site was already doing that before this announcement.
Technologically challenged institutions have an unfortunate habit of judging security by bulleted lists of features. Throwing more encryption at something doesn’t make it more secure. You have to think about who your attacker is and what he’s likely to be after before you start looking for solutions. In this case, it’s not clear there’s any attacker at all. As far as I can see, no one is trying to spoof the CIA’s public website or eavesdrop on people visiting it. So adding SSL is a solution in search of a problem.
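To make the certificate point concrete: SSL only prevents impersonation insofar as the client actually verifies that the presented certificate names the site being visited. Here is a toy version of just that name-matching step. The dict merely mimics the shape of the output of Python’s `ssl.getpeercert()`, and the certificate contents below are invented for illustration; real clients also verify the certificate’s signature chain against trusted authorities, which this sketch does not attempt:

```python
def cert_matches_host(cert, hostname):
    """Check whether a parsed certificate's names include the expected hostname.

    `cert` is a dict in the shape returned by ssl.getpeercert():
    a "subjectAltName" sequence of (type, value) pairs and a "subject"
    sequence of relative distinguished names.
    """
    # Collect every DNS name the certificate claims to cover.
    names = [value for kind, value in cert.get("subjectAltName", ())
             if kind == "DNS"]
    # Fall back on the subject's commonName fields as well.
    for rdn in cert.get("subject", ()):
        for key, value in rdn:
            if key == "commonName":
                names.append(value)
    return hostname in names
```

A certificate for `www.cia.gov` would pass this check for that hostname and fail for any other, which is the whole of SSL’s anti-impersonation guarantee: useful only if someone, or some software, performs the check.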
CNET reports that Google has been contacted by cell phone carriers who don’t want their customers accessing Google Maps from their cell phones. One Google executive claims: “we’ve been getting notes from some of the telco carriers who are saying ‘look, you need to stop our customers from downloading this thing’.” If the report is true, it says a lot about whether or not we need heavy-handed government regulation to protect basic Internet freedoms.
Google Maps is one of the best cell phone features I have ever used, and I would be angry if my cell phone carrier tried to take it away. They could, of course. They’re under no network neutrality-type obligations. Any cell phone carrier could block access to Google Maps tomorrow. But if the media report is true, some have decided to appeal to Google instead. Maybe they fear a customer backlash if they take action on their own. Dissatisfied customers could jump ship. There are four major cell phone carriers to choose from. But existing customers are locked into service agreements, so one would assume the carriers are in a strong position. What they fear, I suspect, is bad press and resulting damage to the brand. They also may be afraid of provoking Washington. Either way, they already seem to feel there are limitations on what they can do even in the absence of net neutrality regulation.
Mike Masnick notes that Venezuela is ahead of the United States when it comes to adopting voter-verified paper trails for their electronic voting machines. Several commenters objected that given the level of corruption in Venezuela’s government, this doesn’t really mean anything: corrupt government officials can mis-count paper voting records as easily as electronic ones.
I don’t know enough about Venezuelan politics to have a definite opinion on whether the election is likely to be rigged, but the general point is quite true. Voting security ultimately turns on human factors, not technological ones. If the people running your election system are systematically corrupt, your election results are going to be suspect no matter what technological safeguards you put in place. E-voting (with or without a voter-verified paper trail) can’t make dishonest officials follow the rules. It simply obfuscates the voting process, making it less likely that someone will spot foul play should it occur.
Last week, the EPA reversed course and said it will begin to regulate nanotechnology, specifically nanoparticles of silver used in washing machines. Now comes word that “Berkeley is proposing what a city official says would be the world’s first local regulation of nanomaterials,” according to the SF Chronicle. I love the rationale offered by the city official: “There have been a great number of attempts to regulate them, and they’ve all amounted to nothing because of the fear of upsetting industry, which leaves workers and the community at some unknown risk,” he said. “It’s the unknown that’s a concern to us.” Someone recently explained to me that when pasteurization first became prevalent, many opposed it because of possible unknown health risks. Nanotech is something I plan to keep an eye on and maybe shed some light on the consumer benefits as well as the risks.
This week I appeared on C-SPAN’s weekly program “The Communicators” and discussed a wide variety of communications and media policy issues including: the outlook for telecom & media legislation in the new Democratic Congress, the First Amendment treatment of new media technologies, Net neutrality regulation and the need for universal service and spectrum policy reform.
The video can be viewed here and I apologize in advance if I put you to sleep!
When are state and local lawmakers going to stop wasting taxpayer dollars with unnecessary regulatory enactments and fruitless lawsuits aimed at censoring video games? I ask because this week the video game industry added yet another slam dunk victory to its growing string of impressive First Amendment wins. For those of you keeping track at home, this brings the tally to 10 major court wins for the video game industry versus zero wins for would-be government regulators. With a track record like that you would think that government officials would get the point. But the censorial tendencies of public officials have once again trumped common sense.
This week’s win came in the 7th Circuit Court of Appeals in the case of Entertainment Software Association v. Blagojevich. (Full decision here.) The case dealt with an Illinois statute that would have required video game retailers to affix a 4-square-inch sticker with the numerals “18” on any “sexually explicit” game. It also would have imposed criminal penalties on any retailer who sold or rented a game with that designation to a minor. The statute also included signage and brochure requirements that would have forced retailers to place certain displays in their stores and provide all customers with brochures about game ratings.
The court’s decision overturning the law was written by Judge Ann Claire Williams and it echoed what every previous decision on this front has held, namely:
According to Congress Daily, DHS Secretary Michael Chertoff “said today his department will ensure that the highest-risk urban areas have interoperable [public safety] communications equipment by the end of next year, and that all states have it by the end of 2008.” DHS has been under pressure from the incoming Democratic majority to do something about the lack of communications among first responders. According to the article,
Without explicitly acknowledging the looming pressure for faster action, Chertoff told a conference of emergency response officials that metropolitan regions under his department’s Urban Areas Security Initiative grant program will have interoperable communications by the end of the 2007 calendar year, followed by all states by the end of 2008.
Chertoff said the department will give urban locations “interoperability scorecards” next month to help them decide how much money to seek in their upcoming grant applications. He did not provide additional details during his speech.
A Homeland Security Department aide would only add: “We will have further info at later date, as well as further info on the grant guidance.”
The whole speech is here, but it doesn’t really add much. I’m not sure what to make of this, but if the interoperability problem could be solved so simply, by just giving more money in federal grants to states and localities, then we would have fixed it a long time ago. As the Katrina Commission pointed out in its report, “Although some New Orleans and Louisiana state officials attribute the lack of true interoperability for first responders in the region to financial limitations, this explanation flies in the face of the massive amounts of federal grants to Louisiana.” Among other things, the interoperability problem is caused by a collective action problem, which in turn is caused by a spectrum policy that gives each of 50,000 public safety agencies its own (untradable) spectrum license and thus the impetus to build its own custom radio system. Coordination among these 50,000 actors is not easy, and I don’t see how more money will help.
Luckily, the Mercatus Center and Tom Hazlett’s Tech Center at GMU are putting on a symposium with the FCLJ on Friday, Dec. 8, that will try to offer some solutions to the interoperability problem. You’re invited. Presenting papers on the topic will be Gerald Faulhaber, Jon Peha, Phil Weiser, and yours truly.