Online Education – Technology Liberation Front
Keeping politicians' hands off the Net & everything else related to technology

Should All Kids Under 18 Be Banned from Social Media? (April 18, 2022)

This weekend, The Wall Street Journal ran my short letter to the editor entitled, “We Can Protect Children and Keep the Internet Free.” My letter was a response to columnist Peggy Noonan’s April 9 op-ed, “Can Anyone Tame Big Tech?” in which she proposed banning everyone under 18 from all social-media sites. She specifically singled out TikTok, YouTube, and Instagram and argued: “You’re not allowed to drink at 14 or drive at 12; you can’t vote at 15. Isn’t there a public interest here?”

I briefly explained why Noonan’s proposal is neither practical nor sensible, noting how it:

would turn every kid into an instant criminal for seeking access to information and culture on the dominant medium of their generation. I wonder how she would have felt about adults proposing to ban all kids from listening to TV or radio during her youth. Let’s work to empower parents to help them guide their children’s digital experiences. Better online-safety and media-literacy efforts can prepare kids for a hyperconnected future. We can find workable solutions that wouldn’t usher in unprecedented government control of speech.

Let me elaborate just a bit because this was the focus of much of my writing a decade ago, including my book, Parental Controls & Online Child Protection: A Survey of Tools & Methods, which spanned several editions. Online child safety is a matter I take seriously, and the concerns Noonan raised in her op-ed have been heard repeatedly since the earliest days of the Internet. Regulatory efforts came almost immediately. They focused on restricting underage access to objectionable online content (as well as video games), but they were quickly challenged and struck down as unconstitutionally overbroad restrictions on free speech in violation of the First Amendment of the U.S. Constitution.

But practically speaking, most of these efforts were never going to work anyway. There was almost no way to bottle up all the content flowing in the modern information ecosystem without highly repressive regulation, and it was going to be nearly impossible to keep kids off the Internet altogether when it was the dominant communications and entertainment medium of their generation. The first instinct during every moral-panic wave — from the waltz to comic books to rock and rap music to video games — has been to take the easy way out by proposing sweeping bans on kids’ access to the content or platforms of their generation. It never works.

Nor should it. There is a huge amount of entirely beneficial speech, content, and communications that kids would be denied by such sweeping bans, which would make any such ban highly counterproductive. But, again, such efforts usually were not practically enforceable anyway because kids are often better at the cat-and-mouse game than adults give them credit for. Moreover, imposing age limitations on speech or content is far more difficult than imposing age-related bans on specific tangible products, like tobacco or other dangerous physical goods.

Acknowledging these realities, most sensible people quickly move on from extreme proposals like flat bans on all kids using the popular media platforms and systems of the day. Over the past half century in the U.S., this has led to a flowering of more decentralized governance approaches to kids and media that I have referred to as the “3E approach.” That stands for empowerment (of parents), education (of youth), and enforcement (of existing laws). The 3E approach relies on a variety of mechanisms, including self-regulatory codes, private content-rating systems, a wide variety of parental-control technologies, and much more.

Over the past two decades, many multistakeholder initiatives and blue-ribbon commissions were created to address online safety issues in a holistic fashion. I summarized their conclusions in my 2009 report, “Five Online Safety Task Forces Agree: Education, Empowerment & Self-Regulation Are the Answer.” The crucial takeaway from all these task forces and commissions is that no silver-bullet solutions exist to hard problems. Child safety demands a vigilant but adaptive approach, rooted in a variety of best practices, educational approaches, and technological empowerment solutions to address various safety concerns. Digital literacy is particularly crucial to building wiser, more resilient kids and adults, who can work together to find constructive approaches to hard problems.

Importantly, our task here is never done. This is an ongoing and evolving process. Issues like underage access to pornography or violent content have been with us for a very long time and will never be completely “solved.” We must constantly work to improve existing online safety mechanisms while also devising new solutions for our rapidly evolving information ecosystem. Nothing should be off the table except the one solution that Noonan suggested in her essay. Proposing outright bans on kids using social media or any other new media platform (VR will be next) is an unworkable and illogical response that we should dismiss fairly quickly. No matter how well-intentioned such proposals may be, moral panic-induced prohibitions on kids and media ultimately are not going to help them learn to live better, safer, and more enriching lives in the new media ecosystems of today or the future. We can do better.

 

Problematic “Do Not Track Kids” Bill Reintroduced (November 14, 2013)

Sen. Edward J. Markey (D-Mass.) and Rep. Joe Barton (R-Texas) have reintroduced their “Do Not Track Kids Act,” which, according to this press release, “amends the historic Children’s Online Privacy Protection Act of 1998 (COPPA)” to extend, enhance, and update the provisions relating to the collection, use, and disclosure of children’s personal information and to establish new protections for the personal information of children and teens. I quickly scanned the new bill and it looks very similar to their previous bill of the same name, which they introduced in 2011 and which I wrote about here and then critiqued at much greater length in a subsequent Mercatus Center working paper (“Kids, Privacy, Free Speech & the Internet: Finding The Right Balance”).

Since not much appears to have changed, I would just encourage you to check out my old working paper for a discussion of why this legislation raises a variety of technical and constitutional issues. But I remain perplexed by how supporters of this bill think they can devise age-stratified online privacy protections without requiring full-blown age verification for all Internet users. And once you go down that path, you open up a huge Pandora’s box of problems that we have already grappled with for many years now. The real irony, as I noted in my paper, is that expanding COPPA “would require the collection of more personal information about kids and parents. For age verification to be effective at the scale of the Internet, the collection of massive amounts of additional data is necessary.”

But that’s hardly the only problem. How about the free speech rights of teens? They do have some, after all, but this bill could create new limitations on their ability to freely surf the Internet, gather information, and communicate with others.

In the end, I don’t expect this bill to pass; it’s mostly just political grandstanding “for the children.” But it’s a real shame that smart people waste their time with counter-productive and constitutionally suspect measures such as these instead of focusing their energy on more constructive educational efforts and awareness-building approaches to online safety and privacy concerns. Again, read my paper for more details on that alternative approach to these issues.

California Eraser Button Passes (September 26, 2013)

California’s continuing effort to make the Internet its own digital fiefdom advanced this week when Gov. Jerry Brown signed legislation that creates an online “Eraser Button” just for minors. The law isn’t quite as sweeping as the seriously misguided “right to be forgotten” notion I’ve critiqued here (1, 2, 3, 4) and elsewhere (5, 6) before. In any event, the new California law will:

require the operator of an Internet Web site, online service, online application, or mobile application to permit a minor, who is a registered user of the operator’s Internet Web site, online service, online application, or mobile application, to remove, or to request and obtain removal of, content or information posted on the operator’s Internet Web site, service, or application by the minor, unless the content or information was posted by a 3rd party, any other provision of state or federal law requires the operator or 3rd party to maintain the content or information, or the operator anonymizes the content or information. The bill would require the operator to provide notice to a minor that the minor may remove the content or information, as specified.

As always, the very best of intentions motivate this proposal. There’s no doubt that some digital footprints left online by minors could come back to haunt them in the future, and that concern for their future reputation and privacy is the primary motivation for the measure. Alas, noble-minded laws like these often lead to many unintended consequences, and even some thorny constitutional issues. I’d be hard-pressed to do a better job of itemizing those potential problems than Eric Goldman, of Santa Clara University School of Law, and Stephen Balkam, Founder and CEO of the Family Online Safety Institute, have done in recent essays on the issue.

Goldman’s latest essay in Forbes argues that “California’s New ‘Online Eraser’ Law Should Be Erased” and meticulously documents the many problems with the law. “The law is riddled with ambiguities,” Goldman argues, including the fact that:

First, it may not be clear when a website/app is “directed” to teens rather than adults. The federal law protecting kids’ privacy (Children’s Online Privacy Protection Act, or COPPA) only applies to pre-teens, so this will be a new legal analysis for most websites and apps. Second, the law is unclear about when the minor can exercise the removal right. Must the choice be made while the user is still a minor, or can a centenarian decide to remove posts that are over 8 decades old? I think the more natural reading of the statute is that the removal right only applies while the user is still a minor. If that’s right, the law would counterproductively require kids to make an “adult” decision (what content do they want to stand behind for the rest of their lives) when they are still kids. Third, the removal right doesn’t apply if the kids were paid or received “other consideration” for their content. What does “other consideration” mean in this context? If the marketing and distribution inherently provided by a user-generated content (UGC) website is enough, the law will almost never apply. Perhaps we’ll see websites/apps offering nominal compensation to users to bypass the law.

Goldman also notes that it is unclear why California should even have the right to regulate the Internet in this fashion. It is his opinion that “states categorically lack authority to regulate the Internet because the Internet is a borderless electronic network, and websites/apps typically cannot make their electronic packets honor state borders.” I’ve been moving in that direction for the past decade myself, since patchwork policies for the Internet — regardless of the issue — can really muck up the free flow of both speech and commerce. I teased out my own concerns about this in my January essay on “The Perils of Parochial Privacy Policies” and argued that a world of “50 state Internet Bureaus isn’t likely to help the digital economy or serve the long-term interests of consumers.” Sadly, some privacy advocates seem to be cheering on this sort of parochial regulation anyway without thinking through those consequences. They are probably just happy to have another privacy law on the books, but as I always try to point out — not just in this context but also in debates over online child safety, cybersecurity, and digital copyright protection — the ends rarely justify the means. I just don’t understand why more people who care about true Internet freedom aren’t railing against these stepped-up state efforts (especially the flurry of California activity) and calling them out for the threat that they are.

In an essay over on LinkedIn entitled, “Let’s Delete The ‘Eraser Button,'” Stephen Balkam points out another mystery about the new California law: “It’s unclear why this law was even proposed when there exists a range of robust reporting mechanisms across the Internet landscape.” Indeed, in this particular case it seems like much of the law is redundant and unnecessary. “What this bill should have been about is education and awareness, about taking responsibility for our actions and using the tools that already exist across the social media landscape,” Balkam says. “Here are three key actions that can already be taken:

  • Delete – you can take down or delete postings, comments and photos that you have put up on Facebook, Twitter, YouTube and most of the other platforms.
  • Report – anyone can report abusive comments or inappropriate content by others about you or other people and, in many cases, have them removed.
  • Request – you can ask that you be untagged from a photo or that a posting or photo be removed that has been uploaded by someone else.

In addition there are in-line privacy settings on many of the leading social media sites, so that you or your teen can choose who sees what.”

Balkam is exactly right. The tools are already there; it’s the education and awareness that are lacking. As I have pointed out countless times here before, there is no need for preemptive regulatory approaches when less-restrictive and potentially equally effective remedies already exist. We just need to do a better job informing users about the existence of those tools and methods and then explain how to take advantage of them. Just adding more layers of law — especially parochial regulation — is not going to make that happen magically. Worse yet, in the process, such laws open the barn door to far more creative and meddlesome forms of state-based Internet regulation that should concern us all.

And now for the really interesting question that I have no answer to: Will anyone step up and challenge this law in court?

Ethan Zuckerman on the connected world (June 11, 2013)

Are we as globalized and interconnected as we think we are? Ethan Zuckerman, director of the MIT Center for Civic Media and author of the new book, Rewire: Digital Cosmopolitans in the Age of Connection, argues that America was likely more globalized before World War I than it is today. Zuckerman discusses how we’re more focused on what’s going on in our own backyards; how this affects creativity; the role the Internet plays in making us less connected with the rest of the world; and, how we can broaden our information universe to consume a more healthy “media diet.”

Alex Tabarrok on innovation (April 30, 2013)

Alex Tabarrok, author of the ebook Launching The Innovation Renaissance: A New Path to Bring Smart Ideas to Market Fast, discusses America’s declining growth rate in total factor productivity, what this means for the future of innovation, and what can be done to improve the situation.

According to Tabarrok, patents, which were designed to promote the progress of science and the useful arts, have instead become weapons in a war for competitive advantage, with innovation as collateral damage. College, once a foundation for innovation, has been oversold. And regulations, passed with the best of intentions, have spread like kudzu and now impede progress to everyone’s detriment. Tabarrok puts forth simple reforms in each of these areas and also explains the role immigration plays in innovation and national productivity.

On the Pursuit of Happiness… and Privacy (March 31, 2013)

Defining “privacy” is a legal and philosophical nightmare. Few concepts engender more definitional controversies and catfights. As someone who is passionate about his own personal privacy — but also highly skeptical of top-down governmental attempts to regulate and/or protect it — I continue to be captivated by the intellectual wrangling that has taken place over the definition of privacy. Here are some thoughts from a wide variety of scholars that make it clear just how frustrating this endeavor can be:

  • “Perhaps the most striking thing about the right to privacy is that nobody seems to have any very clear idea what it is.” – Judith Jarvis Thomson, “The Right to Privacy,” in Philosophical Dimensions of Privacy: An Anthology 272, 272 (Ferdinand David Schoeman ed., 1984).
  • Privacy is “exasperatingly vague and evanescent.” – Arthur Miller, The Assault on Privacy: Computers, Data Banks, and Dossiers 25 (1971).
  • “[T]he concept of privacy is infected with pernicious ambiguities.” – Hyman Gross, The Concept of Privacy, 42 N.Y.U. L. REV. 34, 35 (1967).
  • “Attempts to define the concept of ‘privacy’ have generally not met with any success.” – Colin Bennett, Regulating Privacy: Data Protection and Public Policy in Europe and the United States 25 (1992).
  • “When it comes to privacy, there are many inductive rules, but very few universally accepted axioms.” – David Brin, The Transparent Society: Will Technology Force Us To Choose Between Privacy and Freedom? 77 (1998).
  • “Privacy is a value so complex, so entangled in competing and contradictory dimensions, so engorged with various and distinct meanings, that I sometimes despair whether it can be usefully addressed at all.” – Robert C. Post, Three Concepts of Privacy, 89 GEO. L.J. 2087, 2087 (2001).
  • “[Privacy] can mean almost anything to anybody.” – Fred H. Cate & Robert Litan, Constitutional Issues in Information Privacy, 9 Mich. Telecomm. & Tech. L. Rev. 35, 37 (2002).
  • Privacy has long been a “conceptual jungle” and a “concept in disarray”: “[T]he attempt to locate the ‘essential’ or ‘core’ characteristics of privacy has led to failure.” – Daniel J. Solove, Understanding Privacy 196, 8 (2008).
  • “Privacy has really ceased to be helpful as a term to guide policy in the United States.” – Woodrow Hartzog, quoted in Cord Jefferson, Spies Like Us: We’re All Big Brother Now, Gizmodo, Sept. 27, 2012.
  • “[F]or most consumers and policymakers, privacy is not a rational topic. It’s a visceral subject, one on which logical arguments are largely wasted.” – Larry Downes, A Rational Response to the Privacy “Crisis,” Cato Institute Policy Analysis No. 716 (Jan. 7, 2013), at 6.

In my new Harvard Journal of Law & Public Policy article, “The Pursuit of Privacy in a World Where Information Control is Failing,” I build on these insights to argue that:

  1. precisely because privacy has always been a highly subjective philosophical concept;
  2. and is also a constantly morphing notion that evolves as societal attitudes adjust to new cultural and technological realities;
  3. America may never be able to achieve a coherent fixed definition of the term or determine when it constitutes a formal right outside of some narrow contexts.

That doesn’t mean that privacy isn’t profoundly important to many of us, but privacy is, first and foremost, an exercise of personal determination and personal responsibility. To some extent, we have to make our own privacy in this world. In this sense, we can liken it to our right to pursue happiness. Here’s how I put it in Part I of my Harvard JLPP article:

Even if agreement over the scope of privacy rights proves elusive, however, everyone would likely agree that citizens have the right to pursue privacy. In this sense, we might think about the pursuit of privacy the same way we think about the pursuit of happiness. Recall the memorable line from America’s Declaration of Independence: “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.” Consider the importance of that qualifying phrase—“and the pursuit of”—before the mention of the normative value of happiness. America’s Founders obviously felt happiness was an important value, but they did not elevate it to a formal positive right alongside life, liberty, physical property, or even freedom of speech.

This framework provides a useful way of thinking about privacy. Even if we cannot agree whether we have a right to privacy, or what the scope of any particular privacy right should be, the right to pursue it should be as uncontroversial as the right to pursue happiness. In fact, pursuing privacy is probably an important element of achieving happiness for most citizens. Almost everyone needs some time and space to be free with their own thoughts or to control personal information or secrets that they value. But that does not make it any easier to define the nature of privacy as a formal legal right, or any easier to enforce it, even if a satisfactory conception of privacy could be crafted to suit every context.

The most stable and widely accepted privacy rights in the United States have long been those that are tethered to unambiguous tangible or physical rights, such as rights in body and property, especially the sanctity of the home. Moreover, these rights have been focused on limiting the power of state actors, not private parties. By contrast, privacy claims premised on intangible or psychological harms have found far less support, and those claims have been particularly limited for private actors relative to the government. All this will likely remain the case for online privacy. Importantly, if privacy is enshrined as a positive right even in narrowly drawn contexts, it imposes obligations on the government to secure that right. These obligations create corresponding commitments and costs that must be taken into account since government regulation always entails tradeoffs.

Therefore, even as America struggles to reach political consensus over the scope of privacy rights in the information age, it makes sense to find methods and mechanisms—most of which will lie outside of the law—that can help citizens cope with social and technological changes that affect their privacy. Part III will outline some of the ways citizens can pursue and achieve greater personal privacy.

I fully realize that this way of thinking about privacy leaves many challenging questions at the margin, and I also understand that it will be unsatisfactory to those who view privacy as a “dignity right” that trumps all other values and considerations. But, to reiterate, what I am suggesting here is that we will likely never be able to achieve a coherent fixed definition of the term or determine when it constitutes a formal right outside of some narrow contexts (such as for sensitive health or financial information, where the potential harms of collection, sharing, and use are more tangible). The primary reason is that privacy claims mostly come down to assertions about “harms” that are psychological in character. But precisely because such asserted harms (1) lack a tangible, physical, or monetary nature and (2) can come into conflict with other liberty rights (especially the right to freely gather information and speak about it; i.e., First Amendment rights), it becomes difficult to classify psychological “harms” as harms at all.

I feel the same way about concepts like “safety” and “security.” Who among us doubts these values and goals are important? As the father of two young digital natives, I am engaged in a constant struggle to mentor my kids and ensure they have safe and healthy online interactions. But that doesn’t mean I think anyone in this world — including my own children — has an amorphous “right to safety.” What they do have a right to is not to be harmed by others in their online interactions. Where things become sticky, however, is when some child safety advocates adopt an extremely expansive view of what constitutes “harm” in this context and suggest that hearing a single dirty word or seeing a fleeting dirty image somehow irrevocably “harms” a child’s mental well-being and development, or perhaps just their personal morality. (I have written about this here in dozens of essays through the years, such as this one on “The Problem of Proportionality in Debates about Online Privacy and Child Safety,” as well as in longer papers, such as my recent law review article, “Technopanics, Threat Inflation, and the Danger of an Information Technology Precautionary Principle.”)

While I appreciate the diverse beliefs and values that drive sensitivities about potentially objectionable online content, it is an entirely different matter when one claims “rights” and actionable “harms” in this context. It means that politics will essentially answer what are fundamentally deeply personal, “eye of the beholder” questions. It is better, I believe, to educate and empower citizens about safe and sensible online interactions and then let them determine what works best for them. Again, whether we are talking about safety or privacy, this model relies upon a certain amount of personal (and parental) responsibility.

To be sure, real harms exist and, at times, law will need to be brought in to right certain wrongs. For example, in the online safety context I favor strong penalties for anyone attempting predatory behavior or extreme forms of incessant harassment. In the privacy context, we’ll still need laws to deal with identity/data theft and certain uses of highly sensitive health and financial information. Outside of those narrow contexts, however, it is better to let people define their own online experiences free of top-down, one-size-fits-all regulatory enactments that attempt to make those determinations for all of us. To reiterate, we all have the right to pursue the objectives we care about–safety, privacy, or just happiness more generally–according to our own value systems. But we should be careful about elevating such amorphous concepts to the level of “rights” and then expecting the State to enforce one set of values and choices on a diverse citizenry.

