James D. Miller, Associate Professor of Economics at Smith College and author of Singularity Rising: Surviving and Thriving in a Smarter, Richer, and More Dangerous World, discusses the economics of the singularity: the point in time at which we’ll either have computers that are smarter than people or will have significantly increased human intelligence.
According to Miller, brains are essentially organic computers, and thus applying Moore’s law suggests that we are moving toward the singularity. Since economic output is a product of the human brain, increased brainpower or the existence of computers smarter than humans could produce outputs we cannot even imagine.
Miller goes on to outline what the singularity could look like and what could derail our progress towards it.
In a New York Times op-ed this weekend entitled “You Can’t Say That on the Internet,” Evgeny Morozov, author of The Net Delusion, worries that Silicon Valley is imposing a “deeply conservative” “new prudishness” on modern society. The cause, he says, is “dour, one-dimensional algorithms, the mathematical constructs that automatically determine the limits of what is culturally acceptable.” He proposes that some form of external algorithmic auditing be undertaken to counter this supposed problem. Here’s how he puts it in the conclusion of his essay:
Quaint prudishness, excessive enforcement of copyright, unneeded damage to our reputations: algorithmic gatekeeping is exacting a high toll on our public life. Instead of treating algorithms as a natural, objective reflection of reality, we must take them apart and closely examine each line of code.
Can we do it without hurting Silicon Valley’s business model? The world of finance, facing a similar problem, offers a clue. After several disasters caused by algorithmic trading earlier this year, authorities in Hong Kong and Australia drafted proposals to establish regular independent audits of the design, development and modifications of computer systems used in such trades. Why couldn’t auditors do the same to Google?
Silicon Valley wouldn’t have to disclose its proprietary algorithms, only share them with the auditors. A drastic measure? Perhaps. But it’s one that is proportional to the growing clout technology companies have in reshaping not only our economy but also our culture.
It should be noted that in a Slate essay this past January, Morozov had also proposed that steps be taken to root out lies, deceptions, and conspiracy theories on the Internet. Morozov was particularly worried about “denialists of global warming or benefits of vaccination,” but he also wondered how we might deal with 9/11 conspiracy theorists, the anti-Darwinian intelligent design movement, and those who refuse to accept the link between HIV and AIDS.
To deal with that supposed problem, he recommended that Google “come up with a database of disputed claims” to weed out such things. The other option, he suggested, “is to nudge search engines to take more responsibility for their index and exercise a heavier curatorial control in presenting search results for issues” that someone (he never says who) determines to be conspiratorial or anti-scientific in nature.
Taken together, these essays can be viewed as a preliminary sketch of what could become a comprehensive information control apparatus instituted at the code layer of the Internet.
As you likely know by now, the Republican Study Committee published a briefing paper critical of copyright, but then later pulled it down claiming the memo had not received adequate review. Some have suggested that IP-industry pressure may have led to the reversal. I hope we will find out in due time whether the paper was indeed reviewed and approved (as I suspect it was), and why it was removed. That said, I think what this take-down likely shows is a generational gap between the old, captured, and pro-business parts of the Republican Party and its pro-market and pro-dynamism future.
I also hope that this dust-up sparks a debate within the “right” about our bloated copyright system, and so it’s propitious that in a couple of weeks the Mercatus Center will be publishing a new book I’ve edited making the case that libertarians and conservatives should be skeptical of our current copyright system. It’s called Copyright Unbalanced: From Incentive to Excess, and it is not a moral case for or against copyright; it is a pragmatic look at the excesses of the present copyright regime and of proposals to further expand it. The book features:
- Yours truly making the Hayekian and public choice case for reform
- Reihan Salam and Patrick Ruffini arguing that the GOP should take up the cause of reforming what is now a crony capitalist system
- David Post explaining why SOPA was so dangerous
- Tim Lee on the criminalization of copyright and the use of asset forfeiture in enforcing copyright
- Christina Mulligan explaining that the DMCA harms competition and free expression
- Eli Dourado calculating that the system we have today likely far exceeds what we need in order to offer authors an incentive to create
- Tom Bell suggesting five reforms for copyright, including returning to the Founders’ vision of what copyright should be
Conservatives and libertarians, who are naturally suspicious of big government, should be skeptical of an ever-expanding copyright system. They should be skeptical of the recent trend toward criminal prosecution of even minor copyright infringements, of the growing use of civil asset forfeiture in copyright enforcement, and of attempts to regulate the Internet and electronics in the name of piracy eradication. I think our movement is very close to seeing that copyright reform is not only completely compatible with respect for property rights, but is itself a limited-government project. We hope our book will help make the case.
Also, the Cato Institute will be hosting a lunchtime book forum on December 6. Tom Bell and I will present our views and Mitch Glazier of the Recording Industry Association of America will respond. Please RSVP to attend and tell your colleagues.
On Friday evening, I posted on CNET a detailed analysis of the most recent proposal to surface from the secretive upcoming World Conference on International Telecommunications (WCIT-12). The conference will discuss updates to a 1988 UN treaty administered by the International Telecommunication Union, and throughout the year there have been reports that both governmental and non-governmental members of the ITU have been trying to use the rewrite to put the ITU squarely in the Internet business.
The Russian Federation’s proposal, which was submitted to the ITU on Nov. 13, would explicitly bring “IP-based Networks” under the auspices of the ITU, and would substantially, if not completely, change the role of ICANN in overseeing domain names and IP addresses.
According to the proposal, “Member States shall have the sovereign right to manage the Internet within their national territory, as well as to manage national Internet domain names.” And a second revision, also aimed straight at the heart of today’s multi-stakeholder process, reads: “Member States shall have equal rights in the international allocation of Internet addressing and identification resources.”
Here’s a presentation I delivered on “The War on Vertical Integration in the Digital Economy” at the latest meeting of the Southern Economic Association this weekend. It outlines concerns about vertical integration in the tech economy and specifically addresses regulatory proposals set forth by Tim Wu (arguing for a “separations principle” for the tech economy) & Jonathan Zittrain (arguing for “API neutrality” for social media and digital platforms). This presentation is based on two papers published by the Mercatus Center at George Mason University: “Uncreative Destruction: The Misguided War on Vertical Integration in the Information Economy” (with Brent Skorup) & “The Perils of Classifying Social Media Platforms as Public Utilities.”
Here’s a presentation I’ve been using lately for various audiences about “Cronyism: History, Costs, Case Studies and Solutions.” In the talk, I offer a definition of cronyism, explain its origins, discuss how various academics have traditionally thought about it, outline a variety of case studies, and then propose a range of solutions. Readers of this blog might be interested because I briefly mention the rise of cronyism in the high-tech sector. Brent Skorup and I have a huge paper in the works on that topic, which should be out early next year.
Also, here’s a brief video of me discussing why corporate welfare doesn’t work, which was shot after I recently made this presentation at an event down in Florida.
By Berin Szoka and Ben Sperry
You’d think it would be harder for government to justify regulating the Internet than the offline world, right? Wrong—sadly. And Congress just missed a chance to fix that problem.
For decades, regulators have been required to conduct a cost-benefit analysis when issuing new regulations. Some agencies are specifically required to do so by statute, but for most agencies, the requirement comes from executive orders issued by each new President—varying somewhat but each continuing the general principle that regulators bear the burden of showing that each regulation’s benefits outweigh its costs.
But the FCC, FTC and many other regulatory agencies aren’t required to do cost-benefit analysis at all. Because these are “independent agencies”—creatures of Congress rather than part of the Executive Branch (like the Department of Justice)—only Congress can impose cost-benefit analysis on agencies. A bipartisan bill, the Independent Agency Regulatory Analysis Act (S. 3486), would have allowed the President to impose the same kind of cost-benefit analysis on independent regulatory agencies as on Executive Branch agencies, including review by the Office of Information and Regulatory Affairs (OIRA) for “significant” rulemakings (those with $100 million or more in economic impact, that adversely affect sectors of the economy in a material way, or that create “serious inconsistency” with other agencies’ actions).
Republican Senators Rob Portman and Susan Collins joined with Democrat Mark Warner in this important cause—yet the bill has apparently died during this lame-duck Congress. While some public interest groups have attempted to couch their objections in separation-of-powers terms, their ultimate objection seems to be to subjecting the regulatory state’s rulemaking process to systematic economic analysis—because, after all, rigor makes regulation harder. But what’s so wrong with a cost-benefit analysis?
Today the Reason Foundation publishes my policy brief on keys to successful state regulation of Internet gambling.
Thanks to the Department of Justice’s December 2011 memo on the parameters of the Wire Act, states can now license real-money intrastate online casino games. Earlier this year, Nevada became the first state to permit online wagering, and in August granted the first online operating license to South Point Poker LLC, which was to have launched trials last month. Since the Reason report went to press, South Point has disclosed that its software is still undergoing independent testing, but it hopes to have its site up by the end of the year.
Elsewhere, Delaware has enacted legislation to authorize online gambling under the auspices of the state lottery commission, and Illinois has begun selling lottery tickets online.
It goes without saying that U.S. citizens should be free to gamble online, just as they legally can in casinos throughout the country. The appropriate degree of regulation is subject to debate, but some regulation unfortunately remains a necessary element of policy. Lessons about taxation and regulation can be learned from experience in Europe, as well as from the regulation of brick-and-mortar casinos in the U.S. With a better understanding of usage trends, consumer game choices and operator cost models, legislators who want to offer constituents the freedom to play online can craft an environment that supports a robust online gaming climate, as opposed to one that drives legitimate operators away.
Regulation should derive from an enlightened approach that respects citizens’ responsibility and intelligence. Internet gambling can be a safe, secure pastime. Overall, the government’s only goal should be to protect users from theft or fraud. Gambling should not be approached as an activity that needs to be controlled or discouraged on the rationale that it is a “sin” (to moralists) or “destructive behavior” (to social utilitarians), and then hypocritically tolerated politically so it can be excessively taxed on those same rationales.
Although states will likely differ in the particulars of how they structure licensing and tax arrangements, a successful climate for legalized Internet gambling should be built on the following fundamental principles:
In the wake of the election, Matt Hindman, author of The Myth of Digital Democracy, analyzes the effect of the internet on electoral politics.
According to Hindman, the internet had a large—but indirect—effect on the 2012 elections. Particularly important was microtargeting to identify supporters and get out the vote, he says. Data and measurement—areas where the GOP was once ahead, but which it has ceded to the Democrats over the past eight years—played a key role in determining the winner of the presidential election.
Hindman also takes a critical look at the blogosphere, comparing it to the traditional media that some argue it is superseding, and he delineates the respective roles played by Facebook and Twitter within the electoral framework.
As some of you know, I’ve been closely following the World Conference on International Telecommunication, an international treaty conference in December that will revise rules, for example, on how billing for international phone calls is handled. Some participants are interested in broadening the scope of the current treaty to include rules about the Internet and services provided over the Internet.
I haven’t written much publicly about the WCIT lately because I am now officially a participant—I have joined the US delegation to the conference. My role is to help prepare the US government for the conference, and to travel to Dubai to advise the government on the issues that arise during negotiations.
To help the general public better understand what we can expect to happen at WCIT, Mercatus has organized an event next week that should be informative. Ambassador Terry Kramer, the head of the US delegation, will give a keynote address and take questions from the audience. This will be followed by what should be a lively panel discussion between me, Paul Brigner from the Internet Society, Milton Mueller from Syracuse University, and Gary Fowlie from the ITU, the UN agency organizing the conference. The event will be on Wednesday, November 14, at 2 pm at the W hotel in Washington.
If you’re in the DC area and are interested in getting a preview of the WCIT, I hope to see you at the event on Wednesday. Be sure to register now since we are expecting a large turnout.