European Union

Legal definitions

The Council of Europe Convention on the Protection of Children against Sexual Exploitation and Sexual Abuse (Lanzarote Convention), ratified by all EU Member States, establishes binding minimum criminal-law standards across Europe. Although not EU legislation, it directly informs and underpins EU criminal law in this area.

EU law consistently defines a “child” as any person under the age of 18. This definition appears across the relevant EU Directives addressing trafficking, sexual abuse, sexual exploitation, victims’ rights, and procedural safeguards, and it aligns with the Lanzarote Convention. While EU legislation does not adopt a single universal definition of “minor,” where that term is used it generally refers to a person below the age of 18. The EU does not establish a uniform age of consent for sexual activity. Instead, the “age of sexual consent” is defined as “the age below which, in accordance with national law, it is prohibited to engage in sexual activities with a child.” Accordingly, the legal age of consent for sexual activity varies among Member States.

EU legislation does not provide a standalone definition of “sexually explicit conduct,” but the phrase is used within the definitions of other terms, including “child pornography.” The definition of “online child sexual abuse material” (CSAM) directly references the definition of child pornography, which includes depictions of a child engaged in “real or simulated sexually explicit conduct” and “realistic images” of a child engaged in sexually explicit conduct, among other elements.

“Child sexual abuse” is addressed through the criminalization of specific conduct under the Lanzarote Convention and Directive 2011/93/EU (CSAD), including engaging in sexual activities with a child below the nationally defined age of sexual consent, as well as conduct involving coercion, abuse of authority, or exploitation of vulnerability.

Conduct commonly described as “enticement” or “grooming” is addressed through the offense of “solicitation of children.”

EU law does not use the term “sextortion” expressly, but related conduct is captured under offenses involving coercion, threats, sexual exploitation, and misuse of sexual images of children under the Lanzarote Convention and CSAD.

Regulatory requirements/recommendations

EU law does not impose a general obligation for online platforms to proactively monitor all content. However, it establishes structured notice-and-action, reporting, removal, and risk-mitigation obligations under multiple instruments.

The Digital Services Act (DSA) requires hosting providers to implement notice-and-action mechanisms allowing any individual or entity to notify them of illegal content, including CSAM. Upon gaining awareness of illegal content, providers must act expeditiously to remove or disable access to it or risk losing liability protection. Platforms must notify law enforcement authorities of suspected criminal offenses involving threats to life or safety. Very Large Online Platforms (VLOPs), defined as platforms with 45 million or more average monthly active users in the EU, must conduct annual systemic risk assessments and implement reasonable, proportionate mitigation measures, including child-protection tools such as age verification and parental controls.

The Audiovisual Media Services Directive requires Video-Sharing Platforms (VSPs) to take “appropriate measures” to protect users from content whose dissemination constitutes a criminal offense under EU law, including CSAM.

The Interim Regulation (EU) 2021/1232 permits the voluntary use of detection technologies (including hash-matching and AI) to detect, report, and remove online CSAM, by way of a temporary derogation from certain privacy protections that would otherwise apply under the ePrivacy Directive. Suspected illegal content must undergo human review and confirmation before providers fulfill their reporting obligations. The Regulation has been extended until 3 April 2026.

A proposed CSAM Regulation would establish a long-term framework to replace the Interim Regulation, including permanent rules on the voluntary use of detection technologies, among other elements. The details of the proposed Regulation remain subject to negotiation, and final adoption is pending.

Age verification requirements/recommendations

EU law does not impose a universal age verification requirement for access to online platforms.

Under Article 28 of the DSA, platforms accessible to minors must adopt appropriate and proportionate measures to ensure a high level of privacy, safety, and security. Age verification may be required where appropriate and proportionate, particularly for higher-risk services such as pornography websites.

VLOPs must incorporate age verification and parental control tools as part of their risk-mitigation measures.

The proposed CSAM Regulation does not establish a general age verification requirement for platform access, but risk-based age-related measures may be required under certain circumstances for particular platforms.

Parental consent requirements/recommendations

There is no universal requirement for parental consent before a child uses an online platform.

Under Article 8 of the General Data Protection Regulation (GDPR), where consent is relied upon as the legal basis for processing a child’s personal data, parental consent is required for children below the “age of digital consent”: 16 by default, although a Member State may set a lower age, provided it is not below 13.

The DSA does not mandate parental consent but requires VLOPs to provide parental control tools; it applies without prejudice to other applicable law, including the GDPR.

Legal remedies for child victims

EU law provides criminal, civil, takedown, and data protection remedies for child victims of online sexual exploitation. Directive 2011/93/EU requires Member States to criminalize grooming and CSAM-related offenses, enabling investigation and prosecution. The DSA requires online platforms to remove illegal content upon notice and permits binding removal orders, while Member States must ensure prompt removal or blocking of CSAM.

Victims are entitled to information, support, protection, and access to compensation under the Victims’ Rights Directive and, where applicable, the Anti-Trafficking Directive. Under the GDPR, victims may invoke the right to erasure and object to unlawful processing of personal data, including CSAM imagery.

"Safety by Design" requirements

Although EU law does not use the term “Safety by Design” explicitly, related obligations exist. Article 25 of the GDPR requires data protection by design and by default, mandating the integration of appropriate technical and organizational measures into systems from the outset and on an ongoing basis. Article 28 of the DSA requires platforms accessible to minors to implement appropriate and proportionate measures ensuring a high level of safety, and VLOPs must integrate child-protection measures into their systemic risk assessments and mitigation processes. These obligations apply continuously and must be incorporated before deploying new functionalities that may affect systemic risks.

Compliance with GDPR obligations is supervised by national Data Protection Authorities, with potential fines of up to €20 million or 4% of global annual turnover, whichever is higher. DSA obligations are supervised by national Digital Services Coordinators and, for VLOPs, directly by the European Commission, with potential fines of up to 6% of global annual turnover.
