Facebook has been accused of misleading parents and failing to safeguard children's privacy on its Messenger Kids app.
The Federal Trade Commission found that Meta — Facebook's parent company — had "misrepresented the access" granted to app developers for users' private data and "misled parents about their ability to control with whom their children communicated."
As a result of its findings, on Wednesday the FTC proposed significant changes to a 2020 privacy order with Meta that would restrict the company from profiting from data it gathers on users under 18, including via its virtual-reality products.
Additionally, Meta would have to comply with other restrictions, such as being required to disclose and obtain users' consent to use facial-recognition technology and offer further protections for its users.
Meta would also be "prohibited" from releasing new products or services without "written confirmation from the assessor that its privacy program is in full compliance" with the order. In addition, the company would need to ensure that any companies it acquires or merges with comply with the FTC order and uphold those companies' previous privacy commitments.
This is the third time the FTC has acted against Meta over allegations of not protecting users' privacy.
"Facebook has repeatedly violated its privacy promises," said Samuel Levine, Director of the FTC's Bureau of Consumer Protection. "The company's recklessness has put young users at risk, and Facebook needs to answer for its failures."
The 2020 order was put in place after Facebook violated a 2012 order barring the company from misrepresenting its privacy practices. The FTC said "today's action alleges" that Facebook failed to fully comply with the 2020 order and violated the Children's Online Privacy Protection Act Rule.
Under the 2020 privacy order, Facebook was required to pay a $5 billion fine and expand its privacy program, including third-party assessments of its effectiveness, but according to the FTC, the assessor identified gaps.
"The independent assessor, tasked with reviewing whether the company's privacy program satisfied the 2020 order's requirements, identified several gaps and weaknesses in Facebook's privacy program, according to the Order to Show Cause, in which the Commission notes that the breadth and significance of these deficiencies pose substantial risks to the public," said the FTC.
In 2017, Facebook introduced Messenger Kids as a means for children to communicate with family members and parent-approved friends, but according to the FTC, some children were able to communicate with unapproved contacts in group text messages and video calls.
The proposed changes to the 2020 order would apply to Facebook, Instagram, WhatsApp, and Oculus.
Meta has 30 days to respond to the FTC's findings.