
Will you have to show your ID at the app store?


This is The Stepback, a weekly newsletter breaking down one essential story from the tech world. For more on the action (and inaction) of lawmakers seeking to rein in tech platforms, follow Lauren Feiner. The Stepback arrives in our subscribers’ inboxes at 8AM ET. Opt in for The Stepback here.

How it started

In the offline world, age verification is often as simple as flashing a cashier your driver’s license to buy a six-pack of beer or an adult magazine (for whoever still does that kind of thing). Advocates for stronger barriers preventing children from accessing online porn have long argued for an internet equivalent: online age verification. The idea comes with challenges that don’t exist in the physical world, like the possibility of that information getting hacked, which could be enough to chill consumers from accessing legal speech. In its 2004 ruling in Ashcroft v. ACLU, the Supreme Court found that age verification couldn’t be mandated on porn sites because the government had yet to show that less burdensome alternatives, like letting parents turn on content filters on their own computers to block kids from reaching inappropriate sites, would be less effective.

Still, activists and many legislators continued to focus their efforts on porn sites and other platforms they believed posed the greatest risk of harm to kids and teens, or exposed them to the very things the local corner store would have barred them from buying. Last year, the Supreme Court cracked open the door to some versions of online age verification. The court decided that the now-vast and highly accessible open internet required it to reconsider its earlier ruling, and that “adults have no First Amendment right to avoid age verification.”

At the same time, efforts across many states to require age verification to access social media platforms have largely been blocked in the courts. While courts generally recognize that minors don’t have a right to access porn, placing hurdles in front of both children and adults trying to reach a broad swath of other legal speech raises serious constitutional problems. Teens might encounter some content on social media that the state has a compelling interest in shielding them from, but they’re also likely to come across far more speech that is fully protected, making it trickier to impose age verification on these platforms. That’s led some advocates and policymakers to focus on a different kind of platform, one that may arguably be a closer equivalent to the local corner store.

How it’s going

App stores are the gateway to many of the platforms that users enjoy every day. While it’s possible to navigate to various websites from a mobile or desktop browser, most users choose to use apps for a richer and more streamlined experience on their favorite social media services and games.

That centralization has made mobile app stores like Apple’s and Google’s attractive targets for age gating. Rather than play whack-a-mole with millions of apps, proponents of app store age verification laws see the marketplaces as ideal checkpoints. Plus, users would only have to send their age information to one or two companies, one time, rather than to many companies with less-tested security protocols every time they wanted to download an app.

Parent advocates pushed for the first version of the law, which passed in Utah, with similar versions later passed in Texas and Louisiana. The method also gained backing from Meta, Snap, and X, all developers that would benefit from age verification responsibility falling largely on Apple’s and Google’s app stores rather than on their own services. That could shift more of the heat onto the app stores when young users come across harmful content or people on those social platforms. While Apple has remained critical of the approach and fought the laws, Google has taken a slightly different tack, recently backing a separate method passed in California. Google said that the law, which Meta also supports, protects consumer privacy and recognizes that keeping kids safe online is a “shared responsibility across the ecosystem.” The California model requires desktop and mobile operating systems to collect the age or date of birth of the account holder at signup, to be shared with the app store and relevant apps when they’re downloaded. But under other versions of the law, sufficiently motivated users could still reach some of the same sites through a browser rather than a mobile app.

What happens next
