The European Union has preliminarily found that both Meta and TikTok failed to properly protect children, with Meta additionally making it difficult to report child sexual abuse material (CSAM) in its apps.
Separately, Meta has faced a setback in its defense of lawsuits filed by multiple US states that accuse the company of deliberately making its apps addictive despite knowing they were harmful to teenagers …
Meta and TikTok failed to protect children
Both Meta and TikTok violated child protection rules within the Digital Services Act (DSA), according to what the EU described as preliminary findings.
Specifically, both companies were found to have unlawfully placed barriers in the way of researchers seeking data on whether children are exposed to illegal or harmful content.
Today, the European Commission preliminarily found both TikTok and Meta in breach of their obligation to grant researchers adequate access to public data under the Digital Services Act (DSA) […]
The Commission’s preliminary findings show that Facebook, Instagram and TikTok may have put in place burdensome procedures and tools for researchers to request access to public data. This often leaves them with partial or unreliable data, impacting their ability to conduct research, such as whether users, including minors, are exposed to illegal or harmful content.
Meta was additionally found to have made it difficult for users to report illegal content, such as CSAM.
Neither Facebook nor Instagram appear to provide a user-friendly and easily accessible ‘Notice and Action’ mechanism for users to flag illegal content, such as child sexual abuse material and terrorist content.
Worse than this, Meta was accused of using so-called dark patterns to deliberately make filing such reports both complex and confusing.
Facebook and Instagram appear to use so-called ‘dark patterns’, or deceptive interface designs, when it comes to the ‘Notice and Action’ mechanisms.
Both companies will now have an opportunity to examine the findings and file responses. Should those responses fail to address the Commission's concerns, each company can be fined up to 6% of its total worldwide annual turnover.
Accusations that Meta hid reports of teen harm
Back in 2021, Meta was accused of suppressing internal research showing that Instagram was toxic for teen girls. When the reports were made public following a Wall Street Journal investigation, the company claimed that the findings were taken out of context.
Multiple US states filed lawsuits against the company, accusing it of deliberately seeking to make its apps addictive despite knowing that they were harmful to teenagers.
It was alleged that the company’s lawyers advised Meta to keep the findings quiet, and the social network argued that this advice could not be used in court because it was protected by attorney-client privilege.
Bloomberg Law (via Engadget) reports that the judge has dismissed this argument.
Judge Yvonne Williams ruled that Meta can’t use attorney client privilege to block the DC attorney general from using Meta’s internal documents in the District’s suit over teen mental health harms.
Williams, ruling for the Superior Court of the District of Columbia, found the communication in the documents falls under the crime-fraud exception to attorney client privilege because Meta sought legal advice to obfuscate potential liability or engage in fraud.
“The Court notes that Meta’s counsel explicitly advised Meta researchers to ‘remove,’ ‘block,’ ‘button up,’ ‘limit,’ and ‘update’ their research,” Williams said. “Meta’s counsel offered such legal advice to specifically limit Meta’s potential liability.”
The first of the lawsuits is set to be heard next year.