The European Union has found that both Meta and TikTok failed to properly protect children, including making it difficult to report the presence of child sexual abuse material (CSAM) in their apps.
Separately, Meta has suffered a setback in its defense against lawsuits filed by multiple US states, which accuse the company of deliberately making its apps addictive despite knowing they were harmful to teenagers …
Meta and TikTok failed to protect children
Both Meta and TikTok violated child protection rules under the Digital Services Act (DSA), according to what the EU described as preliminary findings.
Specifically, both companies were found to have unlawfully placed barriers in the way of researchers seeking data on whether children are exposed to illegal or harmful content.
Today, the European Commission preliminarily found both TikTok and Meta in breach of their obligation to grant researchers adequate access to public data under the Digital Services Act (DSA) […] The Commission’s preliminary findings show that Facebook, Instagram and TikTok may have put in place burdensome procedures and tools for researchers to request access to public data. This often leaves them with partial or unreliable data, impacting their ability to conduct research, such as whether users, including minors, are exposed to illegal or harmful content.
Meta was additionally found to have made it difficult for users to report illegal content, such as CSAM.
Neither Facebook nor Instagram appear to provide a user-friendly and easily accessible ‘Notice and Action’ mechanism for users to flag illegal content, such as child sexual abuse material and terrorist content.
Worse still, Meta was accused of using so-called dark patterns to deliberately make filing such reports complex and confusing.
Facebook and Instagram appear to use so-called ‘dark patterns’, or deceptive interface designs, when it comes to the ‘Notice and Action’ mechanisms.