The U.S. Federal Trade Commission is alleging Facebook “repeatedly violated its privacy promises” and is proposing a “blanket prohibition” on parent company Meta’s monetization of data of users under 18.
The company, meanwhile, called the move “a political stunt.”
The FTC on Wednesday moved to expand its 2020 USD 5 billion privacy order with the company, then known as Facebook, claiming the company failed to comply with the order and the Children’s Online Privacy Protection Act Rule, misrepresented the access to private user data it provided app developers, and misled parents about their ability to control children’s communications through the Messenger Kids app.
“Facebook has repeatedly violated its privacy promises,” FTC Bureau of Consumer Protection Director Samuel Levine said. “The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures.”
The FTC voted 3-0 to issue an “Order to Show Cause,” which states an independent assessor identified “several gaps and weaknesses in Facebook’s privacy program,” the “breadth and significance” of which “pose substantial risks to the public.”
The agency’s proposed changes to the 2020 order include restricting how Meta’s Facebook, Instagram, WhatsApp and Oculus services use data collected from children and teens — prohibiting the company from monetizing data of users under age 18, “or otherwise using it for commercial gain” after users turn 18.
Meta would also be banned from launching “new or modified products, services, or features” without written confirmation from the independent assessor that its privacy program is fully compliant with the FTC order, and would be required to “disclose and obtain users’ affirmative consent for any future uses of facial recognition technology.”
The FTC said privacy program provisions in the 2020 order would also be strengthened, including those involving “privacy review, third-party monitoring, data inventory and access controls, and employee training.”
The order would extend to any companies Meta merges with or acquires, the FTC said.
Despite the 3-0 vote, FTC Commissioner Alvaro Bedoya raised concerns in a statement that “there are limits to the Commission’s order modification authority.”
“Here, the relevant question is not what I would support as a matter of policy. Rather, when the Commission determines how to modify an order, it must identify a nexus between the original order, the intervening violations, and the modified order,” he said. “Based on the record before me today, I have concerns about whether such a nexus exists.”
Meta said it will “vigorously fight this action” and expects to prevail.
The company said the FTC’s assessment “looked at just the first six months of a 20-year agreement,” noting it was found in compliance with all order requirements following an initial assessment and has “only continued to invest and improve on them since then.” The company also said it has not had an opportunity to address the concerns raised in the proposal.
“None of these issues warrant the drastic changes the FTC is seeking just three years into our decades-long agreement — and that the FTC lacks unilateral authority to impose. We have not violated the agreement and operate an industry-leading privacy program,” Meta said.
“Let’s be clear about what the FTC is trying to do: usurp the authority of Congress to set industry-wide standards and instead single out one American company while allowing Chinese companies, like TikTok, to operate without constraint on American soil. FTC Chair Lina Khan’s insistence on using any measure — however baseless — to antagonize American business has reached a new low.”
IAPP Managing Director, Washington, D.C., Cobun Zweifel-Keegan, CIPP/US, CIPM, said the FTC has “made clear that it will pursue every available avenue to correct privacy violations.”
“We don’t get to choose our regulators any more than we choose our parents,” Zweifel-Keegan said. “When you enter into a consent order, the FTC expects you to follow it. Here we see the agency deploying another uncommon tool in its toolbelt. Opening up this public process means we will soon learn a lot about how future consent orders can be modified when the underlying facts change.”
Privacy advocates applauded the FTC’s action.
Common Sense Media CEO James Steyer said he hopes the move “serves as a warning to all social media companies that kids’ private information should not be used to line Big Tech’s pockets.” He also called on Congress to pass legislation to ensure “all companies are held to the highest standards when it comes to protecting kids online.”
“Meta continues to target kids with new products and then uses them to capture kids’ data for their own profit. It must stop, and Meta must be held accountable for violating its privacy promises,” Steyer said. “But Meta is not alone in utilizing invasive data and advertising practices toward kids. That’s why we look forward to the FTC issuing its commercial surveillance rule, and we also call on Congress to properly fund the agency to oversee compliance by the tech industry.”
EPIC Executive Director Alan Butler said it is “essential” that the FTC “vigorously enforce its orders and hold these companies to account when they fail to protect the privacy of their users.”
“We have seen time and time again that Meta and other Big Tech companies have broken their privacy promises and failed to meet their obligations to users, including children who are especially at risk of manipulation and abuse,” he said.
Butler added, “even incredibly large fines are not enough to stop data abuses where there are profits to be made.”
Center for Digital Democracy Executive Director Jeff Chester said the FTC’s proposed limitations on Meta’s data use “will bring critical protections to both children and teens.”
“It will require Meta/Facebook to engage in a proper ‘due diligence’ process when launching new products targeting young people — rather than its current ‘release first and address problems later’ approach,” he said.
The FTC has asked Meta to respond to the independent assessor’s findings within 30 days and said, “After carefully considering the facts and any arguments by the parties, the Commission will ultimately determine whether modification of the 2020 order is in the public interest or justified by changed conditions of fact or law.”