Tech News

UK probes Telegram, teen chat sites over CSAM sharing concerns

Why This Matters

The UK's investigations into Telegram and teen-focused chat sites highlight ongoing concerns about illegal content sharing and online safety, and signal stricter enforcement and accountability for the tech industry. These probes could lead to significant regulatory action, affecting how social media and messaging services handle harmful content and protect users, especially minors. For consumers, they underscore the importance of digital safety and the responsibility platforms bear for safeguarding vulnerable users.

Key Takeaways

Ofcom, the United Kingdom's independent communications regulator, has launched an investigation into Telegram based on evidence suggesting it's being used to share child sexual abuse material (CSAM).

The investigation was launched under the UK's Online Safety Act to examine whether the social media and instant messaging (IM) service is complying with its illegal content safety duties, which require it to prevent CSAM from being shared.

Ofcom says it received evidence regarding the alleged presence and sharing of CSAM on Telegram from the Canadian Centre for Child Protection, and that it had also conducted its own assessment of the platform.

"In light of this, we have decided to open an investigation to examine whether Telegram has failed, or is failing, to comply with its duties in relation to illegal content," Ofcom said.

However, Telegram denied Ofcom's accusations, saying that it has "virtually eliminated the public spread of CSAM" on its platform since 2018.

"We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy," Telegram said.

Ofcom has also launched formal investigations into two teen chat sites (Teen Chat and Chat Avenue) over concerns that predators are using them to groom children, and to determine whether the two services are taking all required steps to assess and mitigate these risks.

The UK's independent online safety watchdog is also probing X under the Online Safety Act over nonconsensual sexually explicit content generated using the Grok AI chatbot.

If it identifies compliance failures, Ofcom can impose fines of up to £18 million or 10% of qualifying worldwide revenue (whichever is greater). Additionally, in serious cases of non-compliance, it can request a court order effectively banning the offending platform in the United Kingdom.

"In the most serious cases of non-compliance, and where appropriate given risks of harm to individuals in the UK, we can seek a court order to require third parties to take action to disrupt the business of the provider," Ofcom noted.
