Meta wants parents, app stores to keep teens off unsafe apps




The tech giant is calling for legislation requiring that parents approve teens’ downloads on app stores run by Google and Apple

By Cristiano Lima and Naomi Nix
November 15, 2023 at 9:00 a.m. EST

Facebook's head of global safety Antigone Davis speaks during a roundtable on cyberbullying. (Evan Vucci/AP)


Meta is pushing for rival tech giants such as Google and Apple to play a bigger role in keeping teens off potentially harmful sites, calling for the first time for legislation to require app stores to get parental approval when users age 13 to 15 download apps.

The proposal, which the Facebook and Instagram parent company is set to announce Wednesday, counters mounting calls by state and federal policymakers for individual sites to proactively screen kids to limit their use of social media platforms over safety concerns.

Antigone Davis, Meta’s global head of safety, will argue the “best way to help support parents and young people is a simple, industry-wide solution where all apps are held to the same, consistent standard,” according to a blog post shared exclusively with The Washington Post.

“With this solution, when a teen wants to download an app, app stores would be required to notify their parents, much like when parents are notified if their teen attempts to make a purchase,” Davis writes. “Parents can decide if they want to approve the download.”

Meta’s newest position comes as policymakers debate how much responsibility various Silicon Valley giants should take to protect youth on internet platforms. In recent years, lawmakers and children’s safety advocates have largely focused their attention on combating the damaging experiences of kids on social media apps such as Instagram, Snapchat and TikTok.

While Davis’s blog post does not name any specific companies, if implemented, the company’s proposal would shift much of the onus of verifying children’s ages to Google’s Play Store and the Apple App Store.

“What we’re really trying to do is create something that’s simple and consistent for parents ... If parents are approving apps, what you don’t want is for parents to chase every single app,” Davis said in an interview Tuesday.

Davis’s comments arrive as states take up sweeping new measures to restrict kids from accessing social media amid renewed concern that such products may compromise the mental health and well-being of younger users. States including Arkansas and Utah this year passed laws requiring minors get parental consent to create accounts on platforms including Meta-owned Instagram and TikTok. Some of these proposals mandate that tech companies attempt to verify users’ ages.
Federal lawmakers have proposed similar bills to create an age minimum for social media, citing concerns that the sites are contributing to teen mental health issues like anxiety and depression.

But the proposals face significant implementation hurdles, as companies have struggled to develop nonintrusive and effective ways to vet users’ ages.

Industry groups and digital rights advocates have criticized the efforts, arguing that such laws will force companies to collect more data from younger users, undermining children’s privacy. Several of the laws have also run into major legal hurdles, with federal judges halting the measures in Arkansas and Utah over concerns they may be unconstitutional.

In the blog post, Davis argues that letting parents “verify the age of their teen when setting up their phone” would negate “the need for everyone to verify their age multiple times across multiple apps.”

“Teens move interchangeably between many websites and apps, and social media laws that hold different platforms to different standards in different states will mean teens are inconsistently protected,” Davis writes.

Both Google and Apple offer optional services that allow parents to manage or block their children’s app downloads. Meta’s proposal would make it a federal requirement that app stores obtain parental consent before some teens can download apps.
Meta has faced growing scrutiny in recent years over its efforts to protect kids, which reached new heights in 2021 after Facebook whistleblower Frances Haugen disclosed internal research showing the company’s products at times worsened body image issues among teen girls.

Last week, another former Facebook staffer testified to Congress that the company ignored internal warnings that it was failing to devote adequate resources and staff to safeguarding its most vulnerable users, particularly kids. At the same time, the company has worked to attract younger users as it competes with rival apps such as Snapchat and TikTok, both popular with young users.

This isn’t the first time Meta has subtly taken a swipe at Apple. In 2021, Meta ran a marketing campaign making the case that targeted advertising on the internet helped small businesses, following the phone-maker’s decision to cut down on the practice with new privacy rules.

Meta’s leadership has suggested as far back as 2021 that companies such as Apple and Google should play a larger role in vetting users’ ages, but Wednesday’s announcement marks the first time the company has publicly called for federal legislation mandating such a system.

Davis told The Post that the company is “actively engaging” with its Silicon Valley peers to “find an industry-wide solution,” as well as with government officials.

“We’ve done some discussions at the state level with state legislators as various pieces of legislation have been passing,” as well as with federal policymakers, she said.


Great. I agree with them:ohhh:

Can we start with the instagram app? :pachaha:

Ayo , that page is wild :ohhh:



This is already on iOS - set up Family Sharing and you can set child accounts to ask for parent approval before downloading ANY app.