GREENSBURG (TNS) — Accounts for any Instagram users under 16 will now be made private, Meta announced Tuesday.
Users under 16 will now need a parent’s approval to change the restricted settings of the new “Teen Accounts,” which filter out offensive words and limit who can contact the account, NPR reported.
The changes are being made in an effort to “shield kids from harm,” according to NPR, and they will affect millions of teenagers’ Instagram accounts.
“It’s addressing the same three concerns we’re hearing from parents around unwanted contact, inappropriate content and time spent,” said Naomi Gleit, head of product at Meta. “So teen accounts is really focused on addressing those three concerns.”
Because the accounts are being switched to private, NPR said, teenagers can be messaged or tagged only by people they follow on the social media platform.
The teen accounts will be private by default. Private messages are restricted so teens can only receive them from people they follow or are already connected to. “Sensitive content,” such as videos of people fighting or those promoting cosmetic procedures, will be limited, Meta said. Teens will also get notifications if they are on Instagram for more than 60 minutes and a “sleep mode” will be enabled that turns off notifications and sends auto-replies to direct messages from 10 p.m. until 7 a.m.
Instagram has faced “intensifying scrutiny” due to its failure to adequately address the platform’s role in fueling the youth mental health crisis and the promotion of child sexualization, according to NPR.
During a congressional hearing in January, Meta CEO Mark Zuckerberg apologized to parents of children whose deaths were linked to social media, including those who died by suicide after online harassment.
Around the same time, Instagram blocked content involving self-harm, eating disorders and nudity for teen users of the platform.
However, fewer than 10% of teens on Instagram had enabled the parental supervision setting by the end of 2022, the Washington Post reported. Snapchat, TikTok, Google and Discord all have rolled out parental controls in recent years, the Post said.
“The dirty secret about parental controls is that the vast majority of parents don’t use them,” said Zvika Krieger, the former director of Meta’s responsible innovation team who now works as a consultant for technology companies, the Post reported. “So unless the defaults are set to restrictive settings, which most are not, they do little to protect users.”
Another concern is people who may not be parents might try to gain oversight of a teen’s account on Instagram, NPR reported.
“If we determine a parent or guardian is not eligible, they are blocked from the supervision experience,” Meta wrote in a white paper about Tuesday’s new child safety measures, NPR said.
U.S. Surgeon General Vivek Murthy said last year that tech companies place too much of the burden on parents when it comes to keeping children safe on social media.
“We’re asking parents to manage a technology that’s rapidly evolving that fundamentally changes how their kids think about themselves, how they build friendships, how they experience the world — and technology, by the way, that prior generations never had to manage,” Murthy said in May 2023.