The United States Federal Trade Commission announced a settlement with Facebook, Inc. on July 24 regarding violations of a 2011 agreement between Facebook and the FTC. The settlement includes a fine of USD 5 billion, as well as several structural changes to how Facebook handles privacy matters across its platforms.
The settlement and fine were brought under the Federal Trade Commission Act of 1914, as there is currently no federal privacy statute that the FTC can rely on to enforce privacy matters. The case has provided further motivation for proponents of a U.S. federal regulation on data management, including the FTC itself, which would most likely gain greater authority under such a law to levy fines and impose structural changes.
The FTC hailed the settlement as a watershed moment in privacy regulation. The settlement, the FTC stated in a press release, sends a message to the tech industry that privacy concerns are very serious and that the FTC and U.S. regulators are prepared to enforce privacy and security laws. Critics of the settlement, including representatives from the Electronic Privacy Information Center, the International Association of Privacy Professionals and the two dissenting members of the FTC, Commissioners Rohit Chopra and Rebecca Slaughter, argue that the fine is minimal, that the structural changes represent internally policed rather than publicly policed controls, and that the fundamental structure of the company and its business model remain unaffected.
“The settlement imposes no meaningful changes to the company’s structure or financial incentives, which led to these violations,” wrote Commissioner Chopra in his dissenting statement. “Nor does it include any restrictions on the company’s mass surveillance or advertising tactics. Instead, the order allows Facebook to decide for itself how much information it can harvest from users and what it can do with that information, as long as it creates a paper trail.”
Chopra also writes that the fine marks a departure from previous penalties: in a 2012 action against Google, the fine was up to five times the company's unjust gains, whereas here it is far smaller relative to Facebook's gains. He also argues that the settlement provides the company with a shield, and its executives with immunity, for both disclosed and undisclosed violations, which Facebook now has no incentive to remedy or report.
“Given the many public reports of problems at Facebook, it is hard to know how wide the range of conduct left unaddressed in the proposed Complaint or settlement may be,” Commissioner Chopra states. “This shield is good for Facebook, but leaves the public in the dark as to how the company violated the law, and what violations, if any, are not remedied.”
Facebook also responded to the settlement order in both a blog post and a statement from CEO and Chairman Mark Zuckerberg:
“The accountability required by this agreement surpasses current US law and we hope will be a model for the industry,” the company stated in a blog post. “It introduces more stringent processes to identify privacy risks, more documentation of those risks, and more sweeping measures to ensure that we meet these new requirements. Going forward, our approach to privacy controls will parallel our approach to financial controls, with a rigorous design process and individual certifications intended to ensure that our controls are working — and that we find and fix them when they are not.”
“We expect it will take hundreds of engineers and more than a thousand people across our company to do this important work. And we expect it will take longer to build new products following this process going forward,” added Zuckerberg. “Overall, these changes go beyond anything required under US law today. The reason I support them is that I believe they will reduce the number of mistakes we make and help us deliver stronger privacy protections for everyone.”
Elements of the order
The structural changes ordered by the settlement agreement include:
Prohibition against misrepresentations
Facebook cannot claim that it protects personal data (defined as “Covered Information” in the order), that it keeps personal data out of the hands of third parties, or that it has verified third-party privacy and security measures, nor can it make any other claim that is untrue or misleading. This provision stems from Facebook’s repeated violations of the 2011 Consent Agreement, and from the company’s repeated attempts to mislead users and the authorities as to the extent of its culpability in privacy breaches, the extent to which it sells personal data, and the extent to which third parties have access to users’ personal data.
Changes to sharing of nonpublic user information
Facebook must disclose to users, prior to sharing any personal data, “(1) the categories of Nonpublic User Information that will be disclosed to such Covered Third Parties, (2) the identity or specific categories of such Covered Third Parties, and (3) that such sharing exceeds the restrictions imposed by the Privacy Setting(s) in effect for the User; and [o]btain the User’s affirmative express consent.”
Deletion of information
Facebook “must ensure that Covered Information cannot be accessed by any Covered Third Party from servers under [Facebook’s] control after a reasonable period of time, not to exceed thirty (30) days, from the time that the User has deleted such information or deleted or terminated his or her account, except as required by law or where necessary to protect the Facebook website or its Users from fraud or illegal activity ....” Facebook must also ensure that any such data is deleted or de-identified within 120 days. There are exceptions to this order, such as to prevent fraud and ensure safety and security, and it also seems as if third parties may have access to the data within the 30-day time period.
The order also prohibits Facebook from using telephone numbers acquired through two-factor authentication or password recovery for advertising purposes. Facebook is further ordered to delete all facial recognition templates and not to create any new ones unless it clearly and conspicuously discloses the practice to the user and obtains the user’s affirmative express consent. More importantly, the order dictates that the company implement and maintain “a comprehensive information security program that is designed to protect the security of Covered Information” and “a comprehensive privacy program.” The order goes into some detail regarding what constitutes an effective privacy program.
The structural change that has made the most headlines is the addition of an independent privacy committee within the board of directors that must assess the company’s efforts to protect privacy and manage its data, and report back to the FTC and the U.S. Department of Justice on a regular basis. The settlement lays out the structure of the committee, its authority and duties, and also creates an independent nominating committee to oversee members of the board of directors.
The settlement is indeed without precedent in the U.S., and forces Facebook to make changes and document procedures that should enhance privacy protection going forward.
Takeaways
- The U.S. Federal Trade Commission fined Facebook USD 5 billion and ordered the company to change various aspects of its privacy program, including submitting to oversight by an independent privacy committee that will report regularly to the Commission.
- The measures are record-breaking in the U.S., but do not fundamentally change the way Facebook does business. The settlement may, however, spur the U.S. further down the road toward federal privacy regulation.