Instagram Enhances Teen Safety with New Parental Controls and Privacy Protections

Riddhi Doshi 17/09/2024

Instagram is set to overhaul its platform with significant updates aimed at enhancing privacy and parental control for teenage users.

These changes are part of a broader initiative to ensure that young people have a safer experience online while giving parents more oversight of their children’s activities on the platform. The new “teen accounts” will be introduced for users aged 13 to 15, and they will include several built-in protections and controls by default, rather than relying on teenagers to opt in manually.

Enhanced Privacy Settings for Teen Accounts

One of the major changes Instagram is rolling out involves default privacy settings for teen accounts. Once the update goes live, teenagers’ posts will automatically be set to private, meaning only approved followers will be able to view their content. New followers will also need to be manually approved, giving teenagers more control over who can interact with them online. This is a significant step in limiting exposure to unwanted or potentially harmful interactions.

In addition, these privacy settings cannot be changed unless a parent or guardian supervises the account; alternatively, teenagers will be able to modify them once they turn 16. By switching these protections on by default and tying any changes to parental supervision, Instagram is giving parents and guardians a more direct role in overseeing their children’s online activity, which could reassure those concerned about the dangers teens face in the digital world.

Growing Pressure for Safer Social Media Platforms

These changes come amid growing pressure on social media companies to make their platforms safer for younger users. Over the past few years, concerns about teenagers’ exposure to harmful content, including bullying, sexual exploitation, and mental health issues, have escalated. Meta, Instagram’s parent company, has been at the center of this conversation, facing scrutiny over whether it is doing enough to protect vulnerable users.

UK-based children’s charity the NSPCC welcomed the new measures as a “step in the right direction”, but it also raised concerns that Meta’s new system still places too much responsibility on children and their parents to keep themselves safe. Rani Govender, the NSPCC’s online child safety policy manager, said that while the changes are positive, more proactive measures are needed to prevent harmful content from spreading on Instagram in the first place. “All children deserve comprehensive protections on the products they use,” Govender emphasized.

Rollout and Availability of New Teen Accounts

The new teen accounts will initially be launched in the UK, US, Canada, and Australia, with plans to expand to the European Union later this year. Meta describes this update as a “new experience for teens, guided by parents,” aimed at giving parents peace of mind while offering young users the necessary protections as they navigate the platform.

However, despite these changes, there are still questions about how much parents will engage with the new tools available to them. Media regulator Ofcom raised concerns earlier this year over the apparent reluctance of many parents to intervene and take action to keep their children safe online. According to Sir Nick Clegg, a senior Meta executive, “One of the things we find… is that even when we build these controls, parents don’t use them.”

Concerns Over Effectiveness

Some parents, like Ian Russell, whose 14-year-old daughter Molly took her own life after being exposed to harmful content on Instagram, are cautiously optimistic about the changes but remain concerned about their effectiveness. “Whether it works or not we’ll only find out when the measures come into place,” Russell told the BBC. “Meta is very good at drumming up PR and making big announcements, but they also need to be transparent and share how well these measures are working.”

Indeed, the effectiveness of these changes will largely depend on how well Instagram enforces them. According to social media industry analyst Matt Navarra, teenagers are often resourceful when it comes to bypassing restrictions. “As we’ve seen with teens throughout history, they will find a way around blocks if they can,” Navarra said. He emphasized that Instagram will need to ensure its safeguards cannot be easily circumvented by tech-savvy teens.

How the New System Will Work

For teenagers aged 13 to 15, Instagram will automatically enable strict controls on sensitive content. These controls will prevent potentially harmful material from being recommended to young users. Notifications will also be muted overnight to minimize screen time during sleeping hours, helping teens to maintain healthier habits. Additionally, teenagers’ accounts will be set to private by default, meaning users must actively accept new followers, and their content will not be visible to people who do not follow them.

Parents who choose to supervise their child’s account will have access to a range of oversight tools. They will be able to see who their child messages and the topics their child has expressed interest in. However, parents will not be able to view the content of messages, maintaining a balance between privacy and supervision.

Instagram will begin migrating millions of existing teen accounts into the new system within 60 days of notifying users of the changes.

Age Verification and AI Detection

While Instagram’s new system will rely on users being honest about their ages, the platform already has tools in place to verify a user’s age if there are doubts. Starting in January 2025, Instagram will begin using artificial intelligence (AI) to proactively detect teens using adult accounts and place them back into the appropriate teen account settings.

This follows the UK’s Online Safety Act, passed in 2023, which requires online platforms to take action to protect children or face significant fines. Ofcom, the regulatory authority, has warned that social media platforms could be named and shamed, and potentially banned for under-18s, if they fail to comply with the new safety rules.

Moving Forward: Challenges and Opportunities

While the new changes represent a significant step forward for Instagram’s teen safety measures, they also raise several questions. With platforms such as Snapchat and YouTube also introducing family-centric control tools, including limits on content recommendations, there is a clear industry-wide shift toward more stringent oversight for young users.

Still, the central issue remains whether these measures will be enough to protect teenagers from harmful content. An Ofcom study found that every child it interviewed had encountered violent material online, with Instagram, WhatsApp, and Snapchat named as the most frequently mentioned platforms. Although the largest social media platforms are taking steps to address these problems, the persistence of harmful content indicates that much work remains.

Under the Online Safety Act, platforms will be required to show they are committed to removing illegal content, including child sexual abuse material and content that promotes self-harm or suicide. However, the rules may not fully come into force until 2025, leaving a gap where teens are still at risk.

Final Thoughts

Instagram’s latest tools are placing more control in the hands of parents, allowing them to decide how much freedom their children should have on the platform and providing oversight of their interactions. However, as social media expert Paolo Pescatore notes, the root of the issue lies in how the platform operates and what content is recommended to young users. More still needs to be done to ensure that children are protected from disinformation and inappropriate content, starting with giving control back to parents while also holding tech companies accountable for the safety of their platforms.
