Instagram Announces New Measures to Combat Sextortion

May 2, 2024 Criminal Defense, News & Announcements, Sex Crimes, Social Media

Just weeks after Florida Gov. Ron DeSantis signed HB 3 to prevent certain minors from obtaining social media accounts, the platform Instagram announced new methods and tools to combat online predators. Since the measures under HB 3 won’t go into effect until January of next year, Instagram’s steps to prevent online sextortion will apply in the meantime to minors in Florida who still use the app.

This page will provide information on Instagram’s recent announcement, how the program Lantern works, and the criminal offenses a person in Florida can face for targeting minors through social media or other online platforms.

Instagram’s Announcement

On April 11, 2024, Instagram published an announcement on how the social media platform is taking steps to prevent sextortion online. The newest tools being implemented through Instagram include:

Nudity protection in DMs

To help address and prevent predators from using the messaging/DM feature to send or request illegal content from minors, Instagram is testing a new nudity protection feature that would blur detected images containing nudity.

This tool will be turned on by default for Instagram users under 18 across the globe, and adult users will receive a notification encouraging them to turn the feature on. When the feature is enabled on a person’s account, anyone sending an image or video that may contain nudity will see a message “reminding them to be cautious when sending sensitive photos, and that they can unsend these photos if they’ve changed their mind.”

The image or video itself will be blurred underneath a warning, giving the recipient the option to choose whether or not to view it, or to block the sender from sending further messages. Additionally, a person who attempts to forward a nude image they’ve received will get a message asking them to reconsider. They will also receive links to victim resources such as Meta’s Safety Center, support helplines, and Take It Down.

How will this work? Instagram explained that the nudity protection tool uses on-device machine learning to analyze whether an image contains potential nudity. Because the analysis happens on the device itself, the feature works even in end-to-end encrypted chats, and Meta will not have access to the images unless someone reports them.
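As an illustration, the on-device flow described above can be sketched roughly as follows. The classifier, confidence threshold, and warning text here are purely hypothetical stand-ins, not Instagram’s actual implementation; the point is that detection runs locally, so the encrypted message contents never leave the device for analysis.

```python
# Hypothetical sketch of an on-device nudity-protection flow.
# The classifier, threshold, and warning text are illustrative
# assumptions, not Instagram's actual implementation.
from dataclasses import dataclass, field

NUDITY_THRESHOLD = 0.8  # assumed confidence cutoff


@dataclass
class IncomingImage:
    pixels: bytes
    blurred: bool = False
    warnings: list = field(default_factory=list)


def classify_nudity(pixels: bytes) -> float:
    """Stand-in for an on-device ML model; returns a score in [0, 1]."""
    # Toy heuristic for demonstration only.
    return 0.9 if pixels.startswith(b"NSFW") else 0.1


def screen_message(image: IncomingImage) -> IncomingImage:
    """Run detection locally, preserving end-to-end encryption:
    the image is never sent to a server for analysis."""
    if classify_nudity(image.pixels) >= NUDITY_THRESHOLD:
        image.blurred = True
        image.warnings.append(
            "This photo may contain nudity. View it, or block the sender."
        )
    return image


flagged = screen_message(IncomingImage(b"NSFW...bytes"))
ordinary = screen_message(IncomingImage(b"cat photo"))
```

In this sketch, a flagged image is delivered blurred with a warning attached, while an ordinary image passes through untouched, mirroring the recipient-side choice Instagram describes.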

Preventing scammers from connecting with minors

To prevent potential sextortion accounts from connecting with young people online, Instagram now automatically sends any message request from a potential sextortion account to the recipient’s hidden requests folder. The recipient won’t receive a notification about the message and can choose to never open it. If an Instagram user is already messaging with a potential sextortion account, Instagram will display a safety notice urging the user to report any threats they may have received to share private images or information.

Additionally, a stricter messaging default began for Instagram users under 16, in which they can only receive messages from users they are already connected to—regardless of the other user’s age. A potential sextortion account will not be able to click the “message” button on a teenager’s Instagram account, even if they have already connected. Instagram says they are also considering hiding teen accounts altogether from other people’s followers and making it more difficult to find teens in the Search results. 

New resources for those who may have interacted with scammers

To help individuals who may have already fallen victim to alleged sextortion, Instagram announced it is testing new pop-up messages directed at those who have interacted with accounts removed for evidence of sextortion. The pop-up message will provide links to resources like the Stop Sextortion Hub and support helplines. Instagram also plans to add global reporting options that direct potential victims of exploitation or solicitation to local child safety helplines.

Important: In its announcement, Instagram states: “While these signals aren’t necessarily evidence that an account has broken our rules, we’re taking precautionary steps to help prevent these accounts from finding and interacting with teen accounts.” That means a signal itself is not a definite violation of law. However, these new policies will allow the site to investigate potential violations, which could in turn result in more allegations of unlawful content online.

To read Instagram’s announcement in its entirety, visit the company’s official announcement page.

What is the Lantern program?

Established by the Tech Coalition, Lantern is a program that “enables technology companies to share signals about accounts and behaviors that violate their child safety policies.”

In its announcement, the Tech Coalition addressed how predators often seek out victims on multiple platforms, which means the company behind each platform can “only see a fragment of the harm facing a victim.” Through Lantern, multiple platforms, including Instagram, Snapchat, Twitch, and Meta, can share signals securely and responsibly when there is activity that potentially violates a company’s policies on child sexual exploitation and abuse (CSEA).

Signals can include information such as violating URLs or keywords tied to accounts and behaviors that break a platform’s child safety policies.

According to the Tech Coalition, “signals are not definitive proof of abuse.” Instead, they merely “offer clues for further investigation and can be the crucial piece of the puzzle that enables a company to uncover a real-time threat to a child’s safety.”

Lantern is designed to work across participating platforms as follows:

  1. A participating company detects a potential violation on its platform;
  2. The company takes action according to its policies (if the activity appears to involve a criminal offense, the company reports it to the authorities);
  3. Company adds appropriate signals to Lantern (such as violating URLs or keywords);
  4. Another participating company can review the signals in Lantern to see if they help to surface violating content on its platform;
  5. The participating company reviews the surfaced content and activity from the signals against its policies;
  6. The company acts to report such content or activity according to its policies; and
  7. The company provides feedback to Lantern about the signals used.
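The seven-step loop above can be sketched as a toy program. The class, method names, and in-memory store below are illustrative assumptions, since Lantern’s actual data model is not public in this detail:

```python
# Hypothetical sketch of the Lantern signal-sharing loop.
# The shared store, signal fields, and platform names are
# illustrative assumptions, not the real program's design.
class Lantern:
    """Toy shared signal store standing in for the Lantern database."""

    def __init__(self):
        self.signals = []   # steps 1-3: signals contributed by platforms
        self.feedback = []  # step 7: reports on how useful a signal was

    def add_signal(self, platform, kind, value):
        self.signals.append({"platform": platform, "kind": kind, "value": value})

    def matching(self, content):
        """Step 4: surface shared signals that match a piece of content."""
        return [s for s in self.signals if s["value"] in content]

    def record_feedback(self, platform, signal, useful):
        self.feedback.append({"platform": platform, "signal": signal, "useful": useful})


# Steps 1-3: Platform A detects a violating URL, enforces its own
# policies, and contributes the URL to Lantern as a signal.
lantern = Lantern()
lantern.add_signal("PlatformA", "url", "bad-site.example")

# Steps 4-5: Platform B checks content on its own service against
# the shared signals and reviews anything that surfaces.
post = "check out bad-site.example for more"
hits = lantern.matching(post)

# Steps 6-7: Platform B acts per its policies and reports back on
# whether the signal was useful.
if hits:
    lantern.record_feedback("PlatformB", hits[0], useful=True)
```

The design point the sketch captures is the feedback loop: each company enforces only its own policies, but the shared signals let a violation surfaced on one platform seed investigations on another.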

The initial platforms joining Lantern’s first phase include Discord, Google, Meta, Quora, Roblox, Snap, and Twitch.

Internet Crimes Against Children in Florida

Instagram’s announcement addresses the variety of ways children can be targeted online. According to the Florida Department of Law Enforcement (FDLE), cybercrimes can take many forms.

Child sexual abuse material (CSAM) is considered “any visual (imagery or video) depiction of sexually explicit conduct involving a minor.” A person accused of possessing, distributing, transmitting, or manufacturing alleged CSAM can face harsh penalties under Florida law.

Contact a Criminal Defense Attorney in Tallahassee, Florida

If you have any questions about a criminal charge against you or a loved one, contact the defense team at Pumphrey Law Firm. Our attorneys have a broad understanding of Florida’s laws. Depending on the charge and the surrounding details, a conviction can result in steep consequences, including expensive fines, imprisonment, and possible lifelong registration as a sexual offender.

Pumphrey Law can help defend your case. Contact our office at (850) 681-7777 or fill out the form on our website to schedule a free consultation.
