
FTC wants to prevent Facebook parent Meta from commercialising the data of young users.

After finding that Meta had violated a 2020 privacy order, the FTC is considering taking action against the firm over its handling of young users' data.

On Wednesday, the FTC announced that it was considering amending the 2020 privacy order to make it illegal for Facebook's parent company, Meta, to profit from the data of users under 18. The move follows an assessment that revealed "several gaps and weaknesses" in the company's privacy programme that "pose substantial risks to the public."

Samuel Levine, director of the FTC's Bureau of Consumer Protection, stated in a news release that "Facebook has repeatedly broken its privacy promises" and that the company needs to take responsibility for its mistakes after its carelessness put young users at risk.

Thomas Richards, a spokesman for Meta, said the company has "spent vast resources building and implementing an industry-leading privacy programme under the terms of our FTC agreement."

What is stated in the FTC proposal?

The proposed amendments to the 2020 order would apply to Meta brands including Facebook, Instagram, WhatsApp, and Oculus, and would:

Prevent Meta and its affiliates from commercialising data from minors.
Bar Meta from launching new products and services until an assessor certifies in writing that its privacy programme fully complies with the order.
Extend the order's requirements to any businesses Meta acquires or merges with.
Require Meta to get user approval before using facial recognition technology in the future.
Strengthen the 2020 order’s privacy programme clauses.

Additionally, the FTC has asked Meta to respond to claims that between late 2017 and mid-2019, the company misled parents about who their children could communicate with through the Messenger Kids app.

The FTC asserts that, despite representations that children using Messenger Kids could only contact people their guardian had approved, users were able to communicate with unapproved contacts via group text messages and group video calls. According to the FTC, this misrepresentation violates a 2012 FTC order, the FTC Act, and the Children's Online Privacy Protection Act, which requires that operators of websites and online services directed at children under 13 obtain parental consent before collecting children's personal information.

History of Meta and the FTC

The FTC has taken legal action against Meta three times for allegedly failing to safeguard consumers’ information.

In 2012, the commission ordered the company to stop misrepresenting its privacy practices.
In 2019, following the FTC's investigation into the Cambridge Analytica data scandal, Meta consented to a second FTC order. The company was required to pay a record-breaking $5 billion fine to settle allegations that it violated the 2012 order by misleading customers about their ability to control the privacy of their personal information. That order went into effect in 2020.
In its most recent action, the FTC accuses Meta of violating both the COPPA Rule and the 2020 order.

Response from Meta

Richards, a representative for Meta, described the proposal as a political gimmick.

Despite three years of ongoing communication about the agreement, Richards said, the FTC never gave Meta the chance to address this novel, completely unprecedented theory. "Let's be clear about what the FTC is trying to do: usurp Congress' authority to establish industry-wide standards and instead single out one American company while allowing Chinese companies, like TikTok, to operate freely on American soil."
