A majority of the companies that operate major online platforms used throughout the world were founded and are based in the United States. While each of these companies has a global user base, the regulatory, legal, and legislative environment of the United States has a significant impact on regulations, legislation, reporting practices, age limits, mandatory detection, legal definitions, data access, and many other issues relating to combatting online child sexual exploitation in countries throughout the world. In some instances, this gives U.S. laws an outsized impact on how online platforms moderate content, how they detect child sexual exploitation online, and the extent to which they can be held responsible for the exploitation of children on their platforms.

Regulatory Authority

Unlike the United Kingdom, Australia, and several other jurisdictions, the U.S. does not have a national-level entity with sole regulatory authority over tech companies for purposes of online safety. Multiple legislative proposals have been considered in the U.S. that would create such an entity, but none have passed into law.

Without a U.S.-based regulatory authority to address online safety issues where these platforms are headquartered, tech companies instead respond to a patchwork of regulations imposed elsewhere around the world, sometimes adjusting their services only for users in specific jurisdictions.

While the U.S. lacks a singular regulatory authority with oversight over online safety, the U.S. legal and legislative systems are among the most active in the world in efforts to improve online safety for children and combat online child sexual exploitation. Under U.S. law, it is common to initiate lawsuits for damages and other judicial relief against corporate entities that facilitate or otherwise are responsible in some way for harm caused to an individual. While online platforms currently are entitled to sweeping legal immunity for activities that occur on their products and services (47 U.S.C. § 230), dozens of lawsuits have been filed against online platforms on behalf of child victims seeking justice for the exploitation they suffered online. Similarly, there are consistent legislative efforts to address the overbroad immunity accorded to online platforms under current law. The U.S. is also extremely active in legislative efforts at the federal and state levels to create new legal responsibilities for online platforms to better protect children online; detect offenders who seek to exploit children online; and provide stronger remedies for child victims.

Related Video: Professor Mary Graw Leary on the Communications Decency Act

Reporting Practices

U.S. law (18 U.S.C. § 2258A) requires certain companies, including operators of major online platforms, to report to NCMEC's CyberTipline apparent violations of U.S. laws related to CSAM, child sex trafficking, and online enticement of children for sexual acts. A separate law (34 U.S.C. § 11293) lists additional types of child sexual exploitation about which the CyberTipline may receive reports. Because many of the largest online platforms are headquartered in the U.S. and subject to this reporting requirement, and because the requirement applies to platforms under U.S. jurisdiction regardless of where the reported conduct occurred, the CyberTipline is the world's largest reporting mechanism for concerns of child sexual exploitation.

Age Limits

The Children's Online Privacy Protection Act of 1998 ("COPPA," 15 U.S.C. §§ 6501-6505) is the law from which many online platforms derive policies prohibiting use by children under the age of 13. The law does not ban children under 13 from using platforms; rather, it bars platforms from collecting personal information from children under 13 without verifiable parental consent. U.S.-based platforms therefore ask new users to affirm that they are at least 13 years old, so that any collection of personal information does not violate COPPA and place the platform at legal risk.

Yet neither COPPA nor any other U.S. federal law requires online platforms to verify the age of their users. As a result, many platforms rely on self-reported age alone when approving or rejecting new user accounts. To gain access to platforms that lack age verification measures, younger children routinely misrepresent their age. A 2022 study by Ofcom, the United Kingdom's online safety regulator, found that one-third of children ages five to seven and 60% of children ages eight to eleven had social media accounts in violation of platforms' policies.

Mandatory Searching

The U.S. Constitution contains provisions (i.e., the Fourth Amendment) that have influenced legislation and online platform policies designed to avoid actual or apparent government involvement in directing or influencing the searches that online platforms conduct for exploitative content. Most online platforms cite independent business interests in searching for and removing certain content: users can enjoy a safe experience, and companies can guard against reputational harm by providing "family friendly" online experiences. Current U.S. law (18 U.S.C. § 2258A) requires online platforms to report apparent violations of laws against CSAM, online enticement of children, and sex trafficking of children, but it does not require or direct online platforms to search for such violations. Rather, the law specifically disclaims any such requirement in a subsection titled "Protection of Privacy."

Even when operating in other jurisdictions, U.S.-based online platforms have been cautious about, or avoided, collaborating with law enforcement or responding to requests for information outside official legal process (such as a subpoena or court order) so as not to become, or appear to be, an "agent of the government." These precautions are intended to prevent a platform's search of a user's content from being ruled a violation of the Fourth Amendment under U.S. law.

Legal Definitions

Despite nearly a decade of global advocacy to abandon the term, most U.S. laws still refer to visual depictions of a child engaged in sexually explicit conduct as "child pornography." Even in jurisdictions that have adopted other terminology, CyberTipline reports about CSAM possession, manufacture, or distribution still use variations of and references to "child pornography" because it is the official term used in relevant U.S. laws. NCMEC, among other advocates, supports legislative proposals to eliminate the term "child pornography" when referring to such material and has already adopted the term "child sexual abuse material" or "CSAM" for internal and external use whenever possible.

Related Video: Dr. Mary Anne Franks on appropriate terminology

Under 18 U.S.C. § 1591, a commercial sex act, defined as "any sex act, on account of which anything of value is given to or received by any person," is a required element of the crime of sex trafficking, even when the victim is a child. This differs from global and non-U.S. definitions (e.g., the "Palermo Protocol"), which, in addition to the exploitation of children through commercial sex, treat any form of sexual exploitation of a child as a type of trafficking. While other jurisdictions might consider the production of child sexual abuse material (CSAM) a trafficking offense, such conduct does not constitute sex trafficking of children in the U.S. The impact of this U.S. legal distinction is apparent in how CyberTipline reports are categorized. Every year, the overwhelming majority of all CyberTipline reports submitted (99.2% in 2023) are categorized as relating to "child pornography" and made available as such to law enforcement agencies around the world. Some of these "child pornography" reports could be understood, in jurisdictions with different legal definitions, to be child sex trafficking reports. CyberTipline reports categorized as "child sex trafficking" constitute a small fraction of all reports, at least in part because U.S. legal definitions govern the reporting process.

Data Access

Generally, U.S.-based online platforms provide user information to law enforcement upon official requests that conform to relevant laws. Non-content data, such as user-provided profile information, may be accessible through an official law enforcement request on agency letterhead or a subpoena (depending on local laws and company policies). However, content data, such as messages, images, and other substantive information, is generally available only pursuant to a search warrant or appropriate court order. U.S.-based companies typically disclose content data to non-U.S. government officials following procedures detailed in a relevant Mutual Legal Assistance Treaty (MLAT). The MLAT process can be very slow, as it typically requires engagement from both the diplomatic and justice systems of both countries. When no MLAT exists, non-U.S. government officials may not be able to access content data held by U.S.-based companies at all. As an alternative to the MLAT process, the Clarifying Lawful Overseas Use of Data ("CLOUD") Act allows the U.S. Government to negotiate executive agreements with other governments that streamline access to content data. As of 2024, the U.S. is negotiating or has executed agreements with Canada, the United Kingdom, the European Union, and Australia.

U.S. Legislative Proposals Endorsed by NCMEC

During the 118th Congress (2023-2024), NCMEC worked with survivor consultants to examine various legislative proposals, ultimately identifying five child protection bills to support.

  1. REPORT Act (became law on May 7, 2024)
  2. STOP CSAM Act
  3. EARN IT Act / COSMA
  4. SHIELD Act
  5. Project Safe Childhood Act

Read more about these proposals and what survivors have said about them on NCMEC's blog, "Survivors Speak Out in Support of Critical Child Protection Legislation." Under the U.S. legislative process, any pending bill not enacted before the end of 2024 expires and must be re-introduced in 2025 to be considered again.