Tech News

Social media giants urged to tighten child safety after UK rejects blanket ban for teens

Why This Matters

The push for stricter child safety measures on social media highlights the ongoing challenge for regulators and tech companies to balance innovation with online safety. While some countries consider bans, others are urging platforms to implement more robust protections, emphasizing the importance of safeguarding minors in digital spaces. This development signals a growing industry focus on accountability and responsible design to protect young users.

Key Takeaways

Governments across Europe are weighing stricter regulations to limit teens' use of social media after Australia became the first country to enforce a sweeping ban for under-16s in December. Spain, France, and Denmark are among the countries considering similar measures.

The U.K. government has launched a consultation on children's social media use to gather the views of parents and young people on whether a social media ban would be effective.

It comes after U.K. lawmakers voted against a proposal to include a social media ban for under-16s in a piece of child welfare legislation debated earlier this month.

U.K. regulators Ofcom and the Information Commissioner's Office (ICO) said they had written to YouTube, TikTok, Facebook, Instagram, and Snapchat on Thursday, urging them to tackle a broad range of child safety issues, from implementing stringent age verification measures to combating child grooming on their platforms.

U.K. regulators are calling on social media giants to enforce stricter protection for children on their platforms after a blanket ban for under-16s was rejected by lawmakers.

Ofcom said it had written to social media platforms calling on them to report on what they're doing to keep children off their platforms, with a deadline of April 30 for them to respond.

Its demands included better enforcement of minimum age requirements, preventing strangers from contacting children, safer content for teens, and an end to testing products, such as AI features, on children.

Tech giants are "failing to put children's safety at the heart of their products" and are falling short on promises to keep children safe online, said Ofcom CEO Melanie Dawes.

"Without the right protections, like effective age checks, children have been routinely exposed to risks they didn't choose, on services they can't realistically avoid," Dawes said.

The ICO published an open letter on Thursday saying that social media platforms need to adopt methods such as facial age estimation, digital ID, or one-time photo matching to improve age verification.
