Amid a surging crisis of online child sexual exploitation, the European Union has simply closed its eyes.
The passage of the EU's ePrivacy Directive was meant to be a glorious victory for privacy rights – the rule barred technology corporations from monitoring email, messaging apps, and other online platforms in the EU.
But the rule was overly broad: it also prevented technology companies from proactively identifying, reporting, and removing child sexual exploitation online – including child sexual abuse materials (CSAM, commonly called child pornography) and grooming.
The impact of this directive was not theoretical.
Since the rule passed, EU-related reports of child sexual exploitation online have dropped by 58 percent.
Let’s pause and think about this number, because it represents real children whose chance of being identified or removed from an abusive situation was dramatically reduced.
And the EU was not ignorant of the risks it was taking with children's lives.
Advocates and experts around the world, including organisations that identify CSAM online such as the National Center for Missing & Exploited Children and Thorn, called on the EU to create a carve-out.
A simple carve-out could have preserved existing practices, and the development of even better tools, to detect both known and novel CSAM as well as patterns of online grooming.
In fact, as reported by the New York Times, even Facebook’s global head of safety bluntly shared the company was “concerned that the new [ePrivacy] rules as written today would limit our ability to prevent, detect and respond to harm.”
Privacy absolutism
But the march for privacy absolutism was too strong.
And while some small last-minute concessions were made in the ePrivacy language, it still resulted in a 58 percent decrease in reports of child sexual abuse online.
All of this happened at a time when reports of CSAM had surged by more than 106 percent during Covid-19.
Researchers have found that online grooming of children for sexual abuse can take less than 20 minutes.
In small focus groups held by contacts of the National Center on Sexual Exploitation, youth aged 16-18 all reported that they had been solicited for sex and sent nude images online by strangers.
Finally, last month, four months after the ePrivacy Directive was passed, EU legislators reached a temporary agreement allowing tech companies to continue scanning for, reporting, and removing online child sexual abuse.
This is only a temporary fix, and it still imposes oversight measures that may or may not hinder the protection of children online – that remains to be determined.
This case has been a clear example of how the online privacy debate too often leaves child protection as an afterthought.
Of course, online privacy is an incredibly important issue that deserves support.
But child sexual abuse victims can’t be sacrificed on the altar of absolute adult privacy. Yes, there are important and robust debates to be had about where the line should be drawn between privacy rights and enforcing the laws of the offline world in the online ecosystem. These debates are ever evolving and incredibly complex.
But let me propose this: the line we draw should – at minimum – allow the removal of child sexual abuse materials, and the identification of those who groom or sexually exploit children.
That isn’t too much to ask. This basic idea should be assumed and accepted before we even begin the debate about online privacy – it should not be an afterthought that’s only given a temporary fix just four months after seeing unprecedented drops in reports that could have saved children’s lives.
We can’t close our eyes to online child sexual exploitation anymore.