Technology News

Instagram Serves More Eating Disorder Content to Vulnerable Teens, Confidential Meta Study Reveals


Key Highlights

  • Meta study shows teens with body dissatisfaction see 3x more eating disorder-related content on Instagram.
  • Content includes body-focused images and judgmental messaging around body types.
  • Meta’s detection tools missed 98.5% of sensitive content shown to teens.
  • Experts warn that the algorithm may be profiling vulnerable users and reinforcing harmful patterns.

A confidential internal study by Meta has revealed that Instagram’s algorithm shows significantly more eating disorder-adjacent content to teens who already feel bad about their bodies. The research, conducted during the 2023–2024 academic year, involved 1,149 teens and a three-month analysis of their Instagram feeds.

Teens with Body Image Issues See More Harmful Content

The study found that the 223 teens who frequently reported body dissatisfaction saw more than triple the amount of harmful content: it made up 10.5% of their feeds, compared with 3.3% for other teens. This material included images highlighting specific body parts and posts passing explicit judgment on body types.

Researchers emphasized that this content did not necessarily violate Instagram’s rules but could still contribute to harmful self-perception and disordered eating behaviors.


Provocative and Disturbing Posts Target Vulnerable Users

In addition to body-related content, teens who felt worst about themselves also encountered more content involving risky behavior, suffering, and mature themes. These categories made up 27% of what they saw, twice the rate of their peers.

Some content examples included women in lingerie, violent imagery, and artwork featuring phrases like “how could I ever compare?” and “make it all end.” Though these posts didn’t break platform rules, Meta’s own researchers found them disturbing enough to add “sensitive content” warnings in their internal report.

Detection Tools Fail to Flag Content

The study also revealed a significant gap in Meta’s safety mechanisms: its existing moderation tools failed to flag 98.5% of the sensitive content identified during the study. The researchers noted that this was largely expected, since Meta was still in the process of building systems to detect such content.

The company’s internal report conceded that while the results do not prove Instagram is harmful, they do indicate a strong link between exposure to beauty, fitness, and fashion content and negative body image.

Expert Concerns: Algorithm May Be Profiling Teens

Dr. Jenny Radesky, an Associate Professor of Pediatrics at the University of Michigan, reviewed the study and called it disturbing. She cautioned that Instagram’s recommendation algorithms are likely identifying the most susceptible adolescents and pulling them deeper into harmful content.

“This supports the idea that teens with psychological vulnerabilities are being profiled by Instagram and fed more harmful content,” Radesky said. “We know that a lot of what people consume on social media comes from the feed, not from search.”

What Meta Has to Say

Meta spokesperson Andy Stone said the research demonstrates the company’s commitment to improving safety on its platforms.

“This research is further proof we remain committed to understanding young people’s experiences and using those insights to build safer, more supportive platforms for teens,” he said. 

Stone added that Meta aims to halve the volume of age-restricted content shown to teenagers by July 2025, and that the company intends to hold content shown to teens to a PG-13-equivalent standard.

Meta under Ongoing Scrutiny from Regulators

This study comes at a time of mounting legal and regulatory pressure. Meta is under investigation by U.S. authorities and faces lawsuits from school districts claiming its platforms harm students and were deceptively marketed as safe.

Past leaks have also revealed internal warnings from Meta’s own researchers about Instagram’s effect on teen mental health, especially in connection with body image issues.

Arshiya Kunwar
Arshiya Kunwar is an experienced tech writer with 8 years of experience. She specializes in demystifying emerging technologies like AI, cloud computing, data, digital transformation, and more. Her knack for making complex topics accessible has made her a go-to source for tech enthusiasts worldwide. With a passion for unraveling the latest tech trends and a talent for clear, concise communication, she brings a unique blend of expertise and accessibility to every piece she creates. Arshiya’s dedication to keeping her finger on the pulse of innovation ensures that her readers are always one step ahead in the constantly shifting technological landscape.