Meta apologizes after Instagram users subjected to graphic content


Summary

  • Many Instagram users were shown unwanted violent or graphic content on their Reels page. Some realized that others had similar experiences after comparing notes on other social media platforms.
  • Meta apologized for the mistake.
  • The company says it has more than 15,000 reviewers and advanced technology to help detect, prioritize and remove disturbing content.

Full Story

Instagram users recently received more than they signed up for when logging into the app. Meta, which operates Instagram, admitted that an error subjected users to violent and graphic content on their Reels page.

How has Meta responded to the complaints?

Meta said it has fixed an error that caused some users to see explicit, unwanted content on their Instagram Reels feed. The social media company apologized for the mistake.

Instagram users compared notes on other social media platforms about an uptick in violent content and recommendations dubbed “not safe for work.”

The explicit Reels showed up even though some users had the Sensitive Content Control feature set to its highest level.

What is Meta’s policy about graphic content?

According to Meta’s own policy, the company works to protect users from disturbing images and removes especially graphic content. This includes videos that show dismemberment, innards and charred bodies, as well as sadistic remarks about images that depict human or animal suffering.

What did Instagram users specifically see?

CNBC reported that the images Instagram users unexpectedly received included dead people, graphic injuries and violent attacks.

Meta said it has a team of more than 15,000 reviewers and advanced technology to help detect, prioritize and remove disturbing content before it reaches users.

The latest error comes less than two months after the company announced plans to improve its policies to promote free expression.

Bias comparison

  • Media outlets on the left use neutral language, framing the situation as an "error" and downplaying the severity of users' exposure to inappropriate content.
  • Media outlets in the center adopt a more dramatic tone, detailing specific instances of extreme violence and the types of graphic content involved, conveying a heightened sense of urgency.
  • Media outlets on the right highlight persistent issues with disturbing material, suggesting a systemic problem beyond an isolated incident.

Media landscape

23 total sources

Key points from the Left

  • Meta apologized on Feb. 27 after an error exposed Instagram users to graphic and violent content on their Reels page.
  • Instagram users reported seeing inappropriate content despite having the Sensitive Content Control set to its highest level.
  • Meta's policy aims to remove violent content to protect users, but this error led to disturbing imagery being shown.
  • A Meta spokesperson stated, "We apologize for the mistake," as reported by CNBC.

Key points from the Center

  • Meta admitted an "error" caused Instagram users to see violent and pornographic content, including footage of school shootings and murders, on their Reels pages.
  • The company apologized for the issue and stated that it has been fixed, though details were not provided on the cause of the error.
  • Many users reported seeing extreme content that violated Meta's policies despite having activated Sensitive Content Control.
  • Meta will remove graphic content from the platform and has implemented measures to protect underage accounts.

Key points from the Right

  • Meta apologized for an "error" that caused Instagram to recommend disturbing and violent videos to users’ Reels feeds, including those of minors.
  • Even after Meta claimed the problem was fixed, a Wall Street Journal reporter still saw disturbing videos late Wednesday night.
  • According to a company spokesperson, Meta removed over 10 million pieces of violent content from Instagram from July to September of last year.
  • An Instagram spokesman stated, "We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended."

Powered by Ground News™