
Kids, Social Media and Safety: Why a Years-Long Battle Has No End in Sight

Why This Matters

This article examines the ongoing challenge of protecting children from online harms — exploitation, cyberbullying, and mental health damage — and how social media platforms often fall short in safeguarding young users. It underscores the need for stronger regulation, accountability, and collaboration among platforms, parents, and authorities to create safer digital environments. The case described below shows the devastating real-world consequences of online abuse and why proactive prevention matters.

Key Takeaways

John Doe was around 13 years old when he was tricked, blackmailed and threatened by sex traffickers on Snapchat into taking nude photos and videos of himself. Two years later, he learned from his high school classmates that his images were being shared as child sex abuse material on Twitter.

The social networking platform, later renamed X, initially dismissed the family's reports, responding: "We've reviewed the content, and didn't find a violation of our policies, so no action will be taken at this time."

John Doe's family filed multiple reports with Twitter, their local police department and ultimately with the US Department of Homeland Security before Twitter removed the sexually graphic material. While it was live, the illegal content racked up over 167,000 views on Twitter, and John Doe experienced "harassment, vicious bullying, and became suicidal," according to the family's initial complaint in its 2021 lawsuit against Twitter.

The creation and distribution of child sex abuse material, called CSAM, is one of the extreme dangers that children and teenagers face when using the internet. There's always been a spectrum of potential online harms, going back to the early days of MySpace.

Parents have long been concerned about the lasting psychological effects of screen addiction. Many young people say social media harms their mental health, fostering isolation, anxiety and depression. Researchers have found that teens who spend time scrolling through curated and edited content can develop unrealistic body images and eating disorders. Others might turn to suicide and self-harm or become vulnerable to predators.

Few issues in today's digital age spark as much fiery debate as online safety, regulation and policy for children and teens. The dilemma is determining who has the primary duty and role in safeguarding children: social media companies, the government, parents, educators — or a combination.

At the core of the debate is a belief shared by many: The internet should be safe for its youngest users. But nobody can agree on how, exactly, to make that a reality.

Existing programs have loopholes, and proposed legislation and initiatives — especially age-verification laws — are controversial. While policymakers and tech leaders endlessly debate the merits and pitfalls of each potential solution, younger generations, their parents and educators are forced to navigate an ever-changing terrain rife with landmines.

Social media's Big Tobacco moment

One mechanism for change is through the courts. On March 24, a New Mexico jury found Meta liable for misleading users about safety and allowing child exploitation on its platforms, ordering the company to pay $375 million to the state in penalties. The next day, a Los Angeles jury found Google and Meta liable for creating intentionally addictive platforms to hook young users, ordering the two companies to pay a combined $3 million in compensatory damages.
