Friday, 19 March 2021

Facebook is developing an Instagram app for children under the age of 13.

"We have identified youth work as a priority for Instagram and have added it to our H1 priority list, Instagram vice president of product Vishal Shah said in an internal memo, according to BuzzFeed.

"We will be building a new youth pillar within the Community Product Group to focus on two things: (a) accelerating our integrity and privacy work to ensure the safest possible experience for teens and (b) building a version of Instagram that allows people under the age of 13 to safely use Instagram for the first time," Shah added, according to BuzzFeed.

Currently, Instagram's policies prohibit children under 13 from using the app, though an account can be run on a child's behalf by a parent or manager.

BuzzFeed News reported the kid-focused version will be overseen by Instagram head Adam Mosseri and led by Pavni Diwanji, a Facebook vice president who previously led YouTube Kids and other child-focused products at the Google subsidiary.

"Increasingly kids are asking their parents if they can join apps that help them keep up with their friends. Right now there aren't many options for parents, so we're working on building additional products - like we did with Messenger Kids - that are suitable for kids, managed by parents," a Facebook spokesperson told Insider in a statement.

"We're exploring bringing a parent-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests, and more," they added.

But Facebook's push to bring young children into its app ecosystem is likely to draw scrutiny given its track record on privacy, its struggles to prevent abuse and harassment, and past scandals involving its Messenger Kids app.

Facebook's stepped-up efforts to protect children follow years of reports that rampant bullying, child sex abuse material, and child exploitation exist on its platforms, and some research suggests the problem may be getting worse.

A November report by the UK-based National Society for the Prevention of Cruelty to Children found Instagram was the platform most widely used in child grooming cases in the early months of the pandemic, appearing in 37% of cases, up from 29% over the previous three years. The US-based National Center for Missing and Exploited Children said that in 2020, Facebook and its family of apps reported 20.3 million instances of possible child abuse on their platforms.

Facebook said in January that its AI systems "proactively" catch 99% of child exploitation content before it's reported by users or researchers - however, that number doesn't account for content that goes unreported.

In 2019, a privacy flaw in Facebook's Messenger Kids app allowed thousands of children to enter chats with strangers. Also that year, Facebook was found to have secretly built an app that paid teens to give it extensive access to their phone and internet usage data, before Apple forced Facebook to shut the app down for violating its App Store policies.

That same year, the Federal Trade Commission hit Facebook with a $5 billion fine over privacy violations - though privacy advocates have argued the settlement did little to prevent Facebook from scooping up user data.

https://ift.tt/3raqwYh
