
Instagram adds new protections for accounts that primarily feature children


Meta is introducing additional safeguards for Instagram accounts run by adults that primarily feature children, the company announced on Wednesday. These accounts will automatically be placed into the app’s strictest message settings to prevent unwanted messages, and will have the platform’s “Hidden Words” feature enabled to filter offensive comments. The company is also rolling out new safety features for teen accounts.

Accounts that will be placed into the new, stricter message settings include those run by adults who regularly share photos and videos of their children, along with accounts managed by parents or talent managers on behalf of children.

“While these accounts are overwhelmingly used in benign ways, unfortunately, there are people who may try to abuse them, leaving sexualized comments under their posts or asking for sexual images in DMs, in clear violation of our rules,” the company wrote in a blog post. “Today we’re announcing steps to help prevent this abuse.”

Meta says it will attempt to prevent potentially suspicious adults, such as people who have already been blocked by teens, from finding accounts that primarily feature children. Meta will avoid recommending suspicious adults to these accounts on Instagram, and vice versa, and make it harder for them to find each other in Instagram Search.

Today’s announcement comes as Meta and Instagram have taken steps over the past year to address mental health concerns tied to social media. These concerns have been raised by the U.S. Surgeon General and various states, some of which have even gone so far as to require parental consent for access to social media.

The changes will significantly impact the accounts of family vloggers and creators, as well as parents running accounts for “kidfluencers,” both of which have faced criticism over the risks of sharing children’s lives on social media. A New York Times investigation published last year found that parents are often aware of their child’s exploitation, or even participate in it, by selling photos or clothing their child wore. The Times’ examination of 5,000 parent-run accounts found 32 million connections to male followers.

The company says accounts placed into these stricter settings will see a notice at the top of their Instagram Feed informing them that the social network has updated their safety settings. The notice will also prompt them to review their account privacy settings.

Meta notes it has removed almost 135,000 Instagram accounts that were sexualizing accounts primarily featuring children, as well as 500,000 Instagram and Facebook accounts associated with those it removed.


Alongside today’s announcement, Meta is also bringing new safety features to DMs in Teen Accounts, its app experience for teens with built-in protections that are applied automatically.
