Oct 17, 2024
Instagram has introduced new safety features designed to protect teenagers from the growing threat of “sextortion” scams, as concerns continue to rise over the role social media plays in child safety.
Key Facts
- Instagram will now block screenshots and screen recordings of disappearing messages sent in direct messages. When a user attempts to capture a message intended to be viewed once, a black screen will appear instead.
- The platform is testing safety notifications that alert teenagers when they are messaging someone located in another country.
- Accounts showing signs of “scammy” behavior, such as newly created profiles, will be restricted from viewing a user’s following and followers lists—a tactic scammers often use to find victims’ friends and family for blackmail.
- Images flagged for nudity will automatically be blurred for users under the age of 18.
- These updates come after Instagram rolled out designated Teen accounts last month, which included built-in restrictions like making accounts private by default and allowing parents to monitor their children’s messaging activity.
What Is “Sextortion”?
Sextortion is a scam in which predators coerce or trick minors into sending explicit images or videos and then threaten to release the material unless the victim provides more content or money. According to the National Center for Missing and Exploited Children, financially motivated sextortion cases are on the rise, with teenage boys being the primary targets. A staggering 79% of predators now seek money rather than more explicit imagery. From October 2021 to March 2023, the FBI and Homeland Security Investigations received over 13,000 reports of financial sextortion involving minors, leading to at least 20 suicides. The FBI reports a 20% increase in such scams involving minors between October 2022 and March 2023 compared to the same period the previous year. Most predators are based outside the United States, according to the FBI.
Key Background
These updates are part of Meta’s broader effort to address growing concerns that platforms like Instagram aren’t doing enough to protect children online. Meta has faced intense scrutiny over its handling of child sexual abuse material. A June 2023 report from the Stanford Internet Observatory revealed large networks of accounts advertising and selling such content on Meta’s platforms. While Instagram was identified as the primary platform, the problem spans multiple online services. Meta CEO Mark Zuckerberg apologized to parents during a Senate hearing in January after accusations that Instagram played a role in their children’s exploitation and suicides. Meta spokesperson Sophie Vogel said, “Child exploitation is a horrific crime,” adding that Meta actively works with law enforcement to fight exploitation and arrest those responsible. She also noted that Meta has improved its reporting systems and taken steps to limit the visibility of abusive content.
Chief Critic
Annie Seifullah, a lawyer with experience in sextortion cases involving Meta apps, criticized the company for not acting sooner, telling The Washington Post, “This feels like too little, too late.” Seifullah highlighted that automated tools have made it easier for predators to contact more victims at once. Meta responded, stating that it continues to develop new technologies to combat these evolving tactics.
Tangent
Last month, Snapchat was hit with a lawsuit from New Mexico’s attorney general, accusing the company of failing to address rampant sextortion and abuse involving minors. The lawsuit claims that Snapchat prioritized growth over safety, citing an internal analysis finding that the platform received roughly 10,000 sextortion complaints per month in late 2022.