What is the STOP CSAM Act?

November 22, 2023 Criminal Defense, News & Announcements

In April 2023, U.S. Senate Majority Whip Dick Durbin introduced a bill titled the Strengthening Transparency and Obligation to Protect Children Suffering from Abuse and Mistreatment Act of 2023 (STOP CSAM Act). The Act was created to stop the proliferation of child sexual abuse material (CSAM) online.

The U.S. Senate Committee on the Judiciary reported that the number of child victims identified in CSAM rose from 2,172 in March 2009 to 21,413 in February 2022. The National Center for Missing & Exploited Children (NCMEC) stated that the volume of child sexual exploitation reports to its CyberTipline increased from 415,650 to over 32 million over the same period.

Under Durbin’s proposed bill, the Act would require mandatory reporting of child abuse by child-serving organizations that receive federal funding, expand protections for child victims and witnesses in court, provide restitution for child victims of exploitation and other crimes of violence, allow victims to request that tech companies remove CSAM content, and impose a penalty on tech companies that fail to do so.

As there are multiple elements to the STOP CSAM Act, it is not surprising that the proposed legislation is being met with both approval and opposition. Those in favor of the bill believe it will help crack down on crimes against children; however, those against the bill believe it will result in over-reporting, suppressed lawful speech, and jeopardized issues of privacy and security for online users.

This page provides information on existing federal law, the changes the STOP CSAM Act would make, and the bill’s potential downsides, along with responses from both sides.

Existing Requirements Under Federal Law

18 U.S. Code § 2258A requires online service providers to report knowledge of an apparent CSAM violation on their service. Under federal law, any provider who knowingly and willfully fails to report apparent CSAM shall be fined up to $150,000 for the initial failure to report. For a second or subsequent failure to report, the provider can be fined up to $300,000.

Changes Made Under STOP CSAM Act

The following summarizes the STOP CSAM Act and its proposed changes to federal law:

  • Mandatory Child Abuse Reporting – Proposed expansion of federal law to require certain youth athletic programs to report child abuse, specifically all programs that receive federal grants over $10,000 per year for serving children. A supplemental grant program would also be created to add funds to the Internet Crimes Against Children (ICAC) Task Force.
  • Protection for Children in Federal Court – Enhances special privacy protections for child victims and witnesses during federal criminal prosecutions. Section 3509 would be amended to include “exploitation, or kidnapping, including intentional parental kidnapping,” removing “physical or mental injury” to instead include “physical injury, psychological abuse.” The term “psychological abuse” includes:
    • A pattern of acts, threats of acts, or coercive tactics intended to degrade, humiliate, intimidate, or terrorize a child; and
    • The infliction of trauma on a child through:
      • Isolation;
      • The withholding of food or other necessities to control behavior;
      • Physical restraint; or
      • The confinement of the child without the child’s consent and in degrading conditions.
  • Restitution for Certain Child Victims – Creates a statutory framework permitting courts to appoint a trustee to manage restitution for certain victims. Section 3664 would be amended to state the following:

“When the court issues an order of restitution…for a [child] victim [of CSAM], the court, at its own discretion or upon motion by the Government, may appoint a trustee or other fiduciary to hold any amount paid for restitution in a trust or other official account for the benefit of the victim.”

  • Report & Remove CSAM – Would create an easier way for victims to report CSAM to tech companies and to ask them to remove the content, with an administrative penalty if the tech company fails to remove it. Section 2258A would be amended to state the following:

“In order to reduce the proliferation of online child exploitation and to prevent the online sexual exploitation of children, as soon as reasonably possible after obtaining actual knowledge of any facts or circumstances described in paragraph (2) or any apparent child pornography on the provider’s service, and in any event not later than 60 days after obtaining such knowledge, a provider shall submit to the CyberTipline of NCMEC, or any successor to the CyberTipline by NCMEC, a report containing:

  1. The mailing address, telephone number, facsimile number, electronic mailing address of, and individual point of contact for, such provider; and
  2. Information described in subsection (b) concerning such facts or circumstances or apparent child pornography.”

Section (b), titled “Contents of Report,” states that each report shall include, to the extent applicable and available:

  • Identifying information regarding any individual who is the subject of the report, including the name, address, electronic mail address, user or account identification, Internet Protocol address, and uniform resource locator;
  • The terms of service in effect at the time of the apparent violation or the detection of the apparent child pornography or imminent violation;
  • A copy of any apparent child pornography that is the subject of the report that was identified in a publicly available location;
  • For each item of apparent child pornography included in the report, information indicating whether:
    • The apparent child pornography was publicly available; or
    • The provider, in its sole discretion, viewed the apparent child pornography, or any copy thereof, at any point concurrent with or prior to the submission of the report; and
  • For each item of apparent child pornography, which is the subject of the report, an indication as to whether the apparent child pornography:
    • Has previously been the subject of another report; or
    • Is the subject of multiple contemporaneous reports due to its rapid and widespread distribution.

Section (F) explains that the report of any apparent child pornography shall indicate whether the depicted sexually explicit conduct involves:

  • Genital, oral, or anal sexual intercourse;
  • Bestiality;
  • Masturbation;
  • Sadistic or masochistic abuse; or
  • Lascivious exhibition of the anus, genitals, or pubic area of any person; and whether the depicted minor is:
    • An infant or toddler;
    • Prepubescent;
    • Pubescent;
    • Post-pubescent; or
    • Of an indeterminate age or developmental stage.
  • Permits Civil Lawsuits – The Act would expand Section 2255 to allow victims who suffered sexual abuse or sexual exploitation as children to file civil lawsuits against online service providers and app stores for the “intentional, knowing, reckless, or negligent promotion or facilitation” of CSAM or child trafficking.
  • CyberTipline Revisions – The National Center for Missing & Exploited Children (NCMEC) operates the CyberTipline as a tool for reporting suspected CSAM online. The Act would require tech companies to report child exploitation offenses and would require certain basic information to be included in the CyberTip.
  • Provider Transparency – Would require an annual report to be submitted by large tech companies to describe their efforts to promote a culture of safety for children on their platform.
  • Accountability Measures – Would provide tools to promote compliance with the CyberTipline statute’s mandates and would create a new criminal provision prohibiting the use of tech platforms to promote or facilitate online child sexual exploitation. Section 2260B states it is unlawful for a provider of an interactive computer service, which operates using any facility or means of interstate or foreign commerce, through such service to knowingly:
    • Host or store child pornography or make child pornography available to any person; or
    • Otherwise knowingly promote or facilitate a violation of section 2251, 2251A, 2252, 2252A, or 2422(b).

A provider of an interactive computer service who violates the above section can be fined up to $1,000,000. If the violation involves a conscious or reckless risk of serious personal injury, or if an individual is harmed as a direct and proximate result of the violation, the fine can be up to $5,000,000.

Penalties Under STOP CSAM Act

To ensure compliance, the STOP CSAM Act establishes potential penalties for any provider who fails to meet its requirements. The Act includes both criminal and civil penalties.

Under the proposed Act, it would be unlawful for a provider to knowingly fail to report any apparent CSAM or to preserve such material for reporting purposes. A provider who violates the Act by failing to preserve or report faces the following criminal penalties:

  • Initial violation of reporting failure – Up to a $150,000 fine
  • Second or subsequent reporting failure – Up to a $300,000 fine

Important: If an individual is harmed as a direct and proximate result of a reporting violation, the maximum fine under the criminal penalties shall be tripled.

A provider will be held liable to the U.S. Government for a civil penalty between $50,000 and $100,000 if the provider has knowingly:

  1. Failed to submit a report of CSAM within the required timeframe;
  2. Failed to preserve material as required; or
  3. Submitted a report that either:
    1. Contains materially false or fraudulent information; or
    2. Omits identifying information described in the contents of reporting.

Further, a provider will be held liable to the U.S. Government for a civil penalty between $100,000 and $1,000,000 if the provider has knowingly:

  1. Failed to submit an annual report as required; or
  2. Submitted an annual report that either:
    1. Contains materially false, fraudulent, or misleading statements; or
    2. Omits information in the report that is reasonably available.

Important: If an individual is harmed as a direct and proximate result of the applicable violation, the amount of the civil penalty shall be tripled.

Causing More Harm Than Good?

Although the STOP CSAM Act of 2023 has substantial support, there is also opposition to its passage. The key issues critics identify in the proposed bill include the following:

  • Increased incentives to over-report and remove – By requiring annual reporting and increasing the incentives for providers to report and remove online content, the Act risks overwhelming both NCMEC and law enforcement with reports. Over-reporting could produce reports that do not even contain illicit content, potentially diverting resources that should be directed toward combating genuine child exploitation. Further, the bill’s expansion of reporting requirements could create an incentive for online service providers to be “overbroad” in removing user-generated content. For example, the amendment to Section 2258A expands reporting requirements from “apparent child pornography” to include “any facts and circumstances indicating an apparent, planned, or imminent violation” of child exploitation crimes.
  • Mandatory content filtering – The Report and Remove section contains several provisions that effectively require providers to employ content filters. If a child victim or their representative files a notification to remove CSAM or non-CSAM content relating to the victim, and the provider fails to remove it or engages in “recidivist hosting,” the victim can file a claim with the Child Online Protection Board. The issue is that this opens the door to legal claims against the provider alleging negligence in failing to scan and monitor for indications of grooming or child trafficking. A provider offering an end-to-end encrypted service may not even be able to perform such filtering.
  • Suppression of lawful speech – Section 2260B creates a new federal crime of knowingly promoting or facilitating a violation of a variety of federal child exploitation statutes, going beyond mere knowledge of CSAM distribution. Additionally, the section covering the “grooming” of a child is unclear as applied to online service providers. Out of fear of criminal prosecution, providers may stop offering end-to-end encrypted services that prevent content filtering, or stop allowing pseudonyms that let users communicate anonymously.
  • Endangers strong encryption and raises privacy concerns – The Act could threaten strong encryption, which is extremely important for privacy and security online. End-to-end encryption is used to protect users, promote commerce, and ensure cybersecurity. The Act contains a provision allowing suits against providers for “the intentional, knowing, or reckless hosting” of CSAM. If a provider’s use of end-to-end encryption can be offered as evidence against it in such a suit, prosecutors could go after companies that were unaware they were hosting CSAM, simply because they provided end-to-end encryption and someone else used their service to host or distribute the illicit content.

Responses

There are mixed opinions in response to the proposed STOP CSAM Act of 2023. Those in favor of the bill believe that it ultimately supports child victims and increases the accountability and transparency of online platforms.

The following is a statement provided by Senator Richard Durbin, who introduced the bill:

“In almost every aspect of the real world, child safety is a top priority. But in the virtual world, criminals and bullies don’t need to pick a lock or wait outside the playground to cause harm. They can harass, intimidate, addict, or sexually exploit our kids without anyone leaving home. The system is failing our children and we, as lawmakers, need to address this head-on. The STOP CSAM Act is a comprehensive approach to close gaps in the law and crack down on the proliferation of child sex abuse material online. We need to protect our children and I look forward to working with my colleagues on this effort.”

Michelle DeLaune, President and CEO of NCMEC, provided the following statement regarding Durbin’s proposed bill:

“NCMEC is proud to support Senator Durbin’s STOP CSAM Act, a comprehensive child protection bill that will fundamentally change how our nation combats online child sexual exploitation. In 2022, NCMEC’s CyberTipline received over 32 million reports relating to child sexual exploitation. The STOP CSAM Act will update the CyberTipline process to ensure reports are more comprehensive, timely, and include a broader range of child sexual exploitation offenses, including planned and imminent crimes. These improvements will enable NCMEC to prioritize reports more quickly and provide law enforcement with key information to help recover and safeguard victimized children and investigate their offenders. The STOP CSAM Act also provides essential, innovative remedies for children whose images are circulated online. The Report and Remove program will provide victims with legal remedies against online platforms that are notified they are hosting CSAM and sexually exploitative content of a child but refuse to remove the content.”

In response to the proposed bill, the Center for Democracy & Technology stated the following:

“The STOP CSAM Act jeopardizes children and adults’ constitutional rights to privacy and freedom of expression and risks overwhelming law enforcement with bogus reports transmitted by risk-averse tech companies. Legal mandates to filter and block speech without a court order will not pass constitutional scrutiny and are thus not actually tools in Congress’ toolbox. Instead, Congress should look to addressing the barriers that currently stand in the way of the fight against online child exploitation. It could seek to understand, for example, what limits smaller online service providers face in incorporating tools like PhotoDNA into their content moderation systems, and charge NCMEC with providing technical and other resources to help services be able to voluntarily implement different technical tools.”

The Electronic Privacy Information Center (EPIC), wrote the following statement as their opposition to STOP CSAM Act of 2023:

“Even under the current reporting regime—which only requires providers to send a CyberTip if they know of an apparent violation of anti-CSAM laws—the STOP CSAM Act’s attempt to lower the bar for liability to reckless acts would likely violate the Fourth Amendment. Creating a duty to search for CSAM, combined with a duty to report the person who possesses that CSAM, would transform providers into government agents. This would endanger prosecutions for possessing or distributing actual CSAM, as such prosecutions currently rely on the theory that providers are acting voluntarily in searching for CSAM on their services. The STOP CSAM Act threatens encryption and imposes serious privacy and speech harms on internet users.”

Contact Pumphrey Law Firm to Represent Your Case

While the proposed STOP CSAM Act is intended to prevent child exploitation, the bill’s passage could harm ordinary citizens who have not actually committed a crime. Due to its vague language, the bill could result in confusion and penalties. Requiring online service providers to constantly search for and report CSAM, along with expanding the language to include content not necessarily considered CSAM, could result in a huge increase in investigations. If you or someone you know was arrested for an alleged CSAM crime, it is important that you consult with a defense attorney. Both the federal government and the State of Florida take a harsh stance against crimes against children. In addition to federal penalties, a conviction for a criminal offense could lead to imprisonment and mandatory registration as a sex offender.

If you are in need of legal representation, contact Pumphrey Law Firm. Our attorneys have years of experience defending those who were wrongfully accused of a crime. To receive a free consultation, call our office today at (850) 681-7777 or leave us a message on our site.

Written by Karissa Key
