Together with MEP Evin Incir, I would like to invite you to our event on “How to effectively fight child sexual exploitation and abuse online?” on June 1st from 13h00 to 14h00 in the European Parliament and virtually.
On May 11th, EU Commissioner Johansson presented a new legislative proposal to prevent and combat child sexual exploitation and abuse.
This proposal succeeds the temporary derogation on combating online child sexual abuse, which the co-legislators adopted in July 2021. The new law would formalise the temporary derogation, but also go further, obliging online communication services to detect, report and remove child sexual abuse material.
Following a keynote from EU Commissioner Ylva Johansson, panellists Linda Wijkström (ECPAT Sweden), Ella Jakubowska (EDRi), Prof. Hany Farid (UC Berkeley) and Arda Gerkens (Expertise Centre Online Child Abuse) will share their views in a panel debate.
The event is over. Thank you for attending and/or for your interest!
Because of technical issues, the first part of the event wasn’t recorded. Nonetheless, we are very grateful that Commissioner Johansson was willing to share a transcript of her speech with us. Please find below the recording of the panel debate (the second part of the programme), followed by Commissioner Johansson’s keynote speech.
Thank you, Evin and Paul, for hosting this.
I know that we all agree that child sexual abuse is a terrible and horrific crime. And we see the reports all the time. There was a recent case in Germany, as you probably know, where a man abused 12 children, half of them under the age of three – abusing, raping, torturing them. And the police could also identify other suspects and other victims in this case.
So fighting child sexual abuse is my top priority. One of the first proposals I put on the table was the strategy to fight child sexual abuse – one of the first things I did as a Commissioner – and that is a comprehensive strategy with a lot of different angles.
Today, we are focusing on the online component, and that was also the focus of the proposal I put forward three weeks ago.
Why is it important to focus on that?
First, of course, we know that a lot of perpetrators actually reach their victims online. We spoke to survivors, and they told us this is very often how it happens.
Another part of the online component is that we know that most of these horrible crimes are committed by somebody the child trusts or trusted – in the family, close friends, a football coach, things like that. It could be a priest, actually. And very often the only way to find out is when the perpetrator posts it online or sends it to somebody else. Otherwise this could go on for years without anybody knowing.
The third component is that those who share child sexual abuse material online – even if it’s old material, so to say, that has existed for many years, because as you know it is illegal to look at and to share this material – when the police search for that person by the IP address, they unfortunately very often find out that this person also commits crimes in the real world.

So this is very often the trace to find ongoing child sexual abuse – abuse that is maybe not posted online – because the perpetrators share other content online.
So that’s why the online component is essential. In many of the Member States, more than 80% of the reports that police can work on come from internet companies. So that’s the main source to go after, to rescue the victims, children and to go after the perpetrators.
The other reason why it’s important to do this is that the detection of child sexual abuse material online has been ongoing for more than 10 years.
Last year, that resulted in 85 million videos and pictures being reported globally. This has been ongoing for more than 10 years as I said.
What happened in EU legislation – which is the strongest in the world for the protection of personal data – is that only detection of malware is allowed. You are allowed to do all the detection you want if you are searching for malware. But under our legislation, companies are not allowed to do any detection of child sexual abuse material.
To deal with that, I put forward temporary emergency legislation to allow voluntary detection to continue. That is the situation we have now, but this legislation will expire soon – in two years – and if we don’t have new legislation in place, it will be totally forbidden to look for this material in the European Union, and that will probably have a global impact as well.
So that is why we need to start on the new legislation now. And with my new legislation, we learned lessons. When we went through the process of the emergency legislation, I realised that we have to have much better safeguards: a balance between being effective in going after child sexual abuse material and making sure that we don’t go further than absolutely necessary, because we also need to protect the privacy of users. So this is the balance to be found.
So this proposal that I have now put in place is based on the GDPR – that is the basis, and the GDPR always applies. Then we have the Digital Services Act, which also applies at all times. And then we have this proposal on top, so it is built on these two pieces of legislation. In some parts my legislation even goes further than the GDPR: we say that you have to make impact assessments in more cases than the GDPR strictly requires. So that is the basis.
And then we also said that the most important thing is to make companies act to prevent the sharing of child sexual abuse material from happening in the first place.
So that’s why the focus is prevention. So we say that companies have to do risk assessments. All companies have to do that. A risk assessment: Can my service as a service provider or a hosting provider could this be abused, used for child sexual abuse material? If so, then they have to also describe what kind of mitigating measures are you taking to avoid this from happening?
So that’s the process to make companies do the prevention, the safety by design already from the beginning. So they have to present this to a national coordinator and a national authority and they make an assessment if these mitigation measures and the risk assessment together, is that enough.
Or is it necessary to do something more? If something more is necessary – for example, some detection, or other measures that are not normally legal – then they can’t do that without a specific order to do so. That means they cannot do any detection without this decision.
And the decision is not taken by the coordinating authority but by another independent authority.
It is probably a court that will have to take this decision. It’s the same approach we use elsewhere when balancing, for example, what law enforcement can do against the privacy you are entitled to: you need a court decision, or something similar, to be able to do it. But then I also make detection mandatory, because today only a few companies are producing these 85 million reported videos and photos.
And this order is for a specific time period. It has to be consulted with the data protection authorities, and companies can come in on it, as can the centre I am also proposing.
So I think what we are putting in place here is inspired by how we deal with the rule of law: putting processes in place that we can rely on.
So there are different steps that need to be taken before you are allowed to detect – and then you are not only allowed, you are also obliged. Companies cannot say no.
And we can hear companies are going to complain because of course they don’t want to be regulated. That’s always the case.
And I know that some are criticising, asking: why do you not exclude some specific technologies? Or: what kinds of technologies will be possible under this legislation?
And I think this is really important, because the proposal is technology neutral. And it has to be, because technology is developing so extremely quickly.
So if we put in specific technology or exclude specific technologies in the legislation, they will probably be obsolete before we have ended the Trilogue on this.
So I think the processes are the most important ones. So we have the right processes in place to make sure that we use the technologies that are the least intrusive, but also effective, so that we can rely on good processes here.
That’s what we do.
I was also visiting Europol yesterday, and those who work on cybercrime told me that they compared how organised criminal groups – which, as you know, are very professional – act online with how the perpetrators who share child sexual abuse material or groom children act online. Europol’s assessment was that the perpetrators who go after children or share this content are approximately two years ahead, technology-wise, of the organised criminal groups. So they are quite advanced.
So of course, everything that we do on prevention – helping children with how to behave online, and involving parents, teachers, all of that – is absolutely necessary.
But it’s important that we don’t leave children alone with the perpetrators online or that we do not use the possibility to go after the perpetrators that have not shared this specific crime online, but they are sharing other child sexual abuse material online.
The other part of my proposal is an independent centre to fight child sexual abuse – an EU centre – that we will set up, and I’m really proud of this. And I must also say that in the proposal we are making it mandatory for companies to remove and report child sexual abuse wherever they find it on their services.
And we also give a lot of power to victims and to national hotlines – hotlines helping victims to get rid of content that is circulating online.
Now, the hotlines can go to this coordinating authority, which can issue a removal order. So this is also how we strengthen the means to get rid of the content and to help the victims.
And the centre will also help companies with what kind of technology is available – which could be especially important for smaller companies – and share best practice on how to do this, for example how to determine whether a user is a child or not. That kind of support they will get from the centre.
So they will work with the companies. And if a national authority does not succeed in getting rid of the content, it can go to the EU centre and be helped by them.
So they will work on the online part. But they will also work on supporting Member States, especially when it comes to prevention. And when it comes to supporting victims.
And I must tell you, when I see how Member States work, there’s a lot of good things going on. But so much more can be done on prevention and on supporting victims.
So really, I think there’s a big need for sharing best practice – helping, maybe also supporting with specific material or other resources – to support what’s going on in the Member States.