The FTC today announced it has reached a settlement with Google concerning privacy complaints about how the company launched its Buzz social networking service last year. The consent decree runs for a standard twenty-year term and provides that Google shall (i) follow certain privacy procedures in developing products involving user information, subject to regular auditing by an independent third party, and (ii) obtain opt-in consent before sharing certain personal information. Here’s my initial media comment on this:
> For years, many privacy advocates have insisted that only stringent new regulations can protect consumer privacy online. But today’s settlement should remind us that the FTC already has sweeping powers to punish unfair or deceptive trade practices. The FTC can, and should, use its existing enforcement powers to build a common law of privacy focused on real problems, rather than phantom concerns. Such an evolving body of law is much more likely to keep up with technological change than legislation or prophylactic regulation would be, and is less likely to fall prey to regulatory capture by incumbents.
I’ve written in the past about how the FTC can develop such a common law. If the agency needs more resources to play this role effectively, that is what we should be talking about before we rush to assume that new regulation is necessary. Anyway, a few points about Part III of the consent decree, regarding the procedures the company has to follow:
- The company has to assess the privacy risks raised by new and existing products, much as data security assessments work today. Google must assess, document, and address those risks, and then open its records to inspection by the independent auditor, who will determine whether the company has adequately studied and dealt with them.
- Google is agreeing to implement a version of Privacy by Design, in that the company will do even more to bake privacy features into its offerings.
- This is intended to prevent privacy blunders that stem from inadequate internal processes for vetting new offerings, as well as simple innocent mistakes, such as Google’s inadvertent collection of content sent over unsecured Wi-Fi hotspots: the engineer designing its Wi-Fi mapping program mistakenly left that collection code in the system, even though it wasn’t necessary for what Google was doing. I wrote more on that here.
As to Part II of the consent decree, which requires express affirmative consent for changes in the sharing of “identified information”: it is well worth reading Commissioner Rosch’s concurring statement.