
Meta Apologizes for Violent Videos Shown to Instagram Users
Meta has apologized for and fixed a glitch that caused Instagram’s short-form video system, Reels, to show graphic and violent content to users. The disturbing videos appeared in the recommended section for nearly all users for about a week in late February.
The company finally issued an apology and moved to fix the mistake on February 26. “We have fixed an error that caused some users to see content in their Instagram Reels feed that should not have been recommended,” a Meta spokesperson said in a statement. “We apologize for the mistake.”
Meta explicitly prohibits the dissemination of graphic content, including “dismemberment” and “visible innards,” according to its own content policy.
The glitch came shortly after Meta publicly announced it would loosen its content moderation and fact-checking programs to allow “free expression.” However, the company maintains that the incident is unrelated to the recent policy changes.
Reports of the issue quickly spread across social media, particularly in the Instagram subreddit on Reddit. The forum was flooded with complaints from users describing the severity of what they were seeing.
“I’ve been having THE MOST VIOLENT posts on my [Reels feed]. People being [sliced, chopped, thrown, crushed,] you name it,” one Reddit user wrote. “Should I try to log out and then log back in or should I just wait it out?”
Another user, commenting under a post titled “Meta needs to face consequences for this…”, added, “I just got traumatized 10 different times today[,] I’m balling my eyes out. Ain’t no way they’re just gonna [fix it] and walk away like nothing ever happened.”
Most of these videos were hidden behind a “sensitive content” warning, but they still appeared in the recommended section of Reels, regardless of the type of content users had previously interacted with.
Content moderation remains a sensitive issue on Instagram, particularly given that a significant portion of its user base is made up of minors and young adults. Last year, a judge allowed a lawsuit to proceed that claimed Instagram exploited children with addictive design features.