This legislation is needed for 3 main reasons: developments in the use and misuse of the Internet, EU countries not fully managing to tackle child sexual abuse alone, and companies not fully managing to detect child sexual abuse material on their services.
- The internet has given offenders a new way of approaching children.
Predators contact children on social media, gaming platforms and chats to lure them into producing compromising images of themselves or into offline meetings. Children are spending more time online than ever before, increasing the risk of coming into contact with online predators. Yet most services do not distinguish between child and adult users, treating children as if they were internet-savvy grown-ups able to protect themselves – though evidently, they are not. Many service providers also turn a blind eye when perpetrators misuse their platforms to share child sexual abuse images and videos.
- EU countries need to work together.
A single EU country cannot prevent child sexual abuse material from circulating on the internet without cooperating with online service providers operating in several EU countries. We need a coherent system for the whole EU.
- Voluntary reporting by companies of child sexual abuse material is not enough.
Voluntary reporting of child sexual abuse by online service providers already exists today, but it varies from company to company. Some service providers take comprehensive action; some take no action at all. These gaps in reporting mean that abuse continues undetected. Because detection is voluntary, companies may change their policies at any time, hindering authorities’ efforts to fight child sexual abuse.
Finally, on 3 August 2024, the EU law that allows service providers to continue voluntary detection and reporting of online child sexual abuse and removal of child sexual abuse material will expire. If this happens, tech companies will no longer be able to detect, report and remove illegal content in communication services, which are today the most relevant way to disseminate child sexual abuse material and to groom children: 80% of reports last year came from electronic communications. This will make it easier for predators to sexually abuse children and go unpunished.
The Commission’s proposal would make it mandatory for the relevant companies offering their services in the EU to prevent the dissemination of child sexual abuse material and the grooming of children. When prevention is not enough, the relevant companies could be required to detect and report child sexual abuse online to the authorities. With this proposal, the European Commission aims to ensure effective public−private cooperation across the EU to keep children safe from online predators and to stop the spread of child sexual abuse material online.
Companies, or more precisely online service providers, make available the infrastructure that facilitates:
- The spreading of child sexual abuse images and videos
- The solicitation of children into abuse.
As owners of the infrastructure, providers are often the only ones able to detect abuse, which takes place hidden from public view, among networks of offenders that can include hundreds of thousands of users. If service providers ignore this, it is likely that no-one will be able to identify and help the victims. Already today in some EU countries, up to 80% of investigations are launched only because of reports from service providers.
Companies might receive a detection order requiring them to detect child sexual abuse material or grooming on their services. The Commission’s legislative proposal imposes case-by-case detection obligations that are as targeted as possible. Detection orders will be issued by a judge or an independent administrative authority and are justified by the need to retrieve, report and remove clearly illegal content.
The actual detection of child sexual abuse will closely follow existing data protection rules and rules on privacy of communication.
- What does the order concern?
Detection concerns clearly illegal content: child sexual abuse material.
- When is the order imposed?
Detection is imposed on service providers only as a measure of last resort. A detection order is imposed only after it is determined that the service provider’s risk assessment and mitigating measures are not sufficient to protect the fundamental rights of children.
- On whom is the order imposed?
Detection is not imposed on the whole platform, but only on those services at risk of being misused for online child sexual abuse.
- For how long does the order last?
Detection orders are limited in time and subject to reviews.
- How will the detection take place?
The detection is conducted without any possibility for the technology (or the provider that runs it) to understand the content of conversations or to gain any knowledge beyond the existence of a possible match.
- How will data be processed?
The data processed to detect child sexual abuse online is limited to what is strictly necessary. The data is in principle deleted immediately and permanently, unless strictly necessary for the purposes listed in the Regulation.
The EU Centre to prevent and combat child sexual abuse is a key safeguard that will ensure the transparency of the detection process and facilitate access to the least privacy intrusive detection technology.
- The EU Centre will ensure accountability and transparency in the process.
This includes the collection of data for transparency reports, providing clear information about the use of tools and their effects, and supporting audits of data and processes.
- All reports would be reviewed at the level of the EU Centre.
This means that providers are given feedback on inaccurate detection and can further refine their detection tools. The Centre will help to ensure that there is no erroneous takedown of legitimate content, or abuse of the search tools to report legitimate content, including misuse of the tools for purposes other than the fight against child sexual abuse.
- The EU Centre will also support users who feel that their content was mistakenly removed.
Therefore, the proposed law ensures the respect of all the fundamental rights at stake, including the right to privacy and the rights of children to be protected from abuse.
The companies that receive a detection order from a court will be required to use state-of-the-art technologies that are the least privacy-intrusive, limiting the rate of false positives to the maximum extent possible.
Technologies that can ensure effective detection without substantially undermining the privacy of electronic communication exist. There are 3 types of child sexual abuse to be detected, and examples for each are listed below: known material, new material and grooming.
- Detection of known child sexual abuse material relies on PhotoDNA technology.
PhotoDNA converts a previously flagged image into a unique (and non-recoverable) identifier. This process builds on the same digital-fingerprinting concept originally developed to detect malware and copyrighted content.
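PhotoDNA itself is proprietary, but the general idea of matching against non-recoverable fingerprints can be sketched. The snippet below is an illustrative assumption, not PhotoDNA’s actual algorithm: it uses a simple “average hash” to show how an image is reduced to a compact fingerprint, and how a new image is flagged only when its fingerprint is very close to one already in a database of flagged hashes.

```python
# Illustrative sketch only (PhotoDNA is proprietary): a basic perceptual
# hash. The fingerprint reveals nothing about non-matching images.

def average_hash(pixels):
    """Compute a bit fingerprint from a grid of grayscale values.

    pixels: flat list of grayscale intensities (0-255), e.g. an 8x8 downscale.
    Each bit records whether a pixel is brighter than the image's mean,
    so the hash survives small changes like re-compression or resizing.
    """
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming_distance(h1, h2):
    """Number of differing bits between two fingerprints."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

def matches_flagged(candidate, flagged_hashes, max_distance=4):
    """True if the candidate fingerprint is near any flagged fingerprint.

    A small tolerance (max_distance) catches near-duplicates such as
    re-compressed copies, without matching unrelated images.
    """
    return any(hamming_distance(candidate, h) <= max_distance
               for h in flagged_hashes)

# Toy example: a slightly altered copy of a flagged image still matches,
# while an unrelated image does not.
flagged = average_hash([10, 200, 30, 220, 15, 210, 25, 230])
altered = average_hash([12, 198, 33, 218, 14, 212, 24, 228])   # re-compressed copy
unrelated = average_hash([200, 10, 220, 30, 210, 15, 230, 25])

print(matches_flagged(altered, [flagged]))    # True: near-duplicate detected
print(matches_flagged(unrelated, [flagged]))  # False: different image ignored
```

Because the fingerprint is not reversible, a match confirms only that the image corresponds to previously flagged material; nothing else about the content can be reconstructed from it.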
- Detection of new child sexual abuse material and grooming relies on artificial intelligence classifiers.
These classifiers are trained on databases of known child sexual abuse material and confirmed grooming conversations, and develop the capacity to identify analogous images, videos or conversations. Artificial intelligence classifiers are set to detect material that corresponds to new child sexual abuse material or grooming with a predetermined rate of likelihood. It is therefore possible to instruct them to flag only material that has an extremely high chance of being child sexual abuse material.
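As a rough sketch of how a predetermined rate of likelihood works in practice (the threshold value, item names and scores below are invented for illustration, and the classifier itself is assumed): the classifier assigns each piece of content a probability, and only content above a very high confidence threshold is ever flagged for review.

```python
# Illustrative sketch: enforcing a "predetermined rate of likelihood"
# on top of an AI classifier. Scores are stand-ins for classifier output.

THRESHOLD = 0.99  # hypothetical operating point chosen for very high precision

def flag_for_review(items, threshold=THRESHOLD):
    """Return only the item IDs whose classifier score meets the threshold.

    items: list of (item_id, score) pairs, where score is the classifier's
    estimated probability that the content is abusive. Everything below
    the threshold is ignored and never reported.
    """
    return [item_id for item_id, score in items if score >= threshold]

# Stand-in scores from a hypothetical classifier:
scores = [("msg-1", 0.12), ("msg-2", 0.995), ("msg-3", 0.80), ("msg-4", 0.999)]
print(flag_for_review(scores))  # ['msg-2', 'msg-4']: only near-certain matches
```

Raising the threshold trades recall for precision: fewer items are flagged, but each flagged item is far more likely to be a true positive, which is the operating regime the proposal envisages.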
- The flagging of potential grooming conversations would occur based on artificial intelligence classifiers trained on confirmed grooming.
Grooming detection already exists and has proven reliable. It will be subject to strict performance safeguards, in particular regarding accuracy rates. Such technologies already underpin content moderation and are widely used, for example on gaming sites, where adult offenders often target children.
All reports sent by service providers would be reviewed at the level of the EU Centre, so that providers are given feedback on inaccurate detection and can further refine their detection tools.
- If certain technologies are not yet fully developed, the proposal will constitute a powerful incentive for research and innovation to develop them.
- The EU Centre shall be supported by the Technology Committee, consisting of independent technical experts appointed by the Management Board to act in the public interest, to make sure that the technologies are effective and meet their objectives.
- The EU Centre will make available technologies that providers may acquire, install and operate free of charge. It will also help small companies fulfil their obligations.
The EU Centre will work with companies and law enforcement to help them exchange information and best practices, providing oversight, transparency and accountability.
The Centre will support 3 main actors: companies, law enforcement and countries. The EU Centre will support companies by providing them with a database of indicators to detect online child sexual abuse. It will support law enforcement, so that they can act on reports and save children. The EU Centre will support countries in preventing online child sexual abuse and helping victims.
- In more detail concerning companies:
- The EU Centre will work with service providers to ensure that victims receive timely support and offenders are brought to justice.
- The Centre will maintain a database of indicators of known child sexual abuse material, new material and grooming to help companies detect child abuse in their systems in line with EU data protection rules.
- In more detail concerning law enforcement:
- The Centre will facilitate the exchange of best practices in the EU and beyond.
- Drawing on law enforcement’s final responses, the Centre will support victims in having their images and videos removed to protect their privacy. The Centre will also set up an online platform where victims can find helpful resources.
- The Centre will encourage dialogue between relevant stakeholders, and help develop state-of-the-art research and knowledge.
- In more detail concerning governments:
- The Centre will support and facilitate EU countries in their prevention efforts (both those focused on children and those on potential offenders).
- The Centre will work closely with national authorities and global experts to ensure that victims receive support.
- In cooperation with national administrations, the Centre will carry out research to support evidence-based policy on assisting victims.
The European Commission will work together with relevant stakeholders from the public and private sector to increase cooperation and exchange of best practices under the prevention network of practitioners and researchers.
- The EU Centre to prevent and combat child sexual abuse will have a key role in boosting prevention efforts. The Centre will cooperate with the prevention network but also act as a counterpart for similar centres around the world.
- The Commission is also working to improve the protection of children from sexual abuse globally by cooperating with the WeProtect Global Alliance.
- In addition, the Commission will continue to fund initiatives on enhancing prevention.
Together with the legislative proposal, the Commission has also adopted a renewed strategy for a better internet for children to further support and protect children online.
The Digital Services Act aims to create a harmonised baseline for addressing all illegal content in general. However, due to its general and horizontal nature, it addresses the issue of child sexual abuse only partially. Preventing the dissemination and circulation of known child sexual abuse material requires a more systematic and targeted approach.
The Directive harmonises the criminal legislation of EU countries, while the proposed Regulation will define the responsibilities of digital service providers. In this respect, there is no overlap between the two instruments.
The new proposal supports the implementation of prevention and victim-support measures already included in the Directive. This will be one of the main roles of the EU Centre to prevent and combat child sexual abuse.