Instagram, the immensely popular photo-sharing app, faces increasing scrutiny regarding its impact on the mental health of its youngest users.
In response to pressure from experts and authorities, the platform has announced significant updates to enhance teen account safety.
Antigone Davis, Meta’s vice president for safety issues, emphasized that these changes aim to give parents peace of mind.
New Safety Features for Teen Accounts
Under the new policy, users aged 13 to 15 will automatically be given private accounts, with tighter controls over who can contact them and what content they can view.
Teens who want a public profile—often motivated by aspirations to become influencers—must now obtain parental permission.
These rules apply to both existing and new users, marking a substantial shift in how Instagram manages younger audiences.
However, it remains unclear whether these measures will sufficiently address concerns from governments and online safety advocates about Instagram’s addictive nature and its potential harm to young people’s mental health.
The Dark Side of Instagram
Research consistently links excessive Instagram use with negative mental health outcomes among adolescents.
For instance, a report from Facebook revealed that 32% of teen girls felt worse about their bodies after using the platform.
Critics argue that Instagram fosters unrealistic beauty standards, leading to body dissatisfaction, anxiety, and depression.
Matthew Bergman, founder of the Social Media Victims Law Center, raised concerns about Instagram’s role in promoting harmful content.
He noted that many young users have been led down dangerous paths by algorithmically recommended videos. This exposure contributes to severe mental health crises, including suicidal ideation.
Ongoing Legal and Regulatory Pressure
The scrutiny on Instagram has intensified globally. Last October, forty U.S. states filed a complaint against Meta, accusing its platforms of endangering the mental and physical health of young people through addictive features and harmful content exposure.
Meanwhile, Australia is considering new regulations to raise the minimum age for social media use to between 14 and 16 years.
Despite this pressure, Meta has resisted implementing widespread age verification measures due to privacy concerns.
Davis suggested that age checks could be managed more effectively at the operating-system level by companies like Google and Apple, which already hold reliable user data.
While Meta has introduced some measures—such as restricting the promotion of extreme diets—advocates argue that these actions are merely “baby steps.”
Experts believe that more comprehensive changes are necessary to reduce the addictive nature of social media platforms like Instagram.
Bergman advocates for a fundamental shift in how social media operates: