Safety, Security, Privacy, and the Victimization of Children Lost in it All
September 1, 2021
In 2018, The New York Times reported that more than 45 million images and videos of child sexual abuse material (CSAM), federally known as child pornography, existed on the internet. This number has grown exponentially since the beginning, and subsequent boom, of the internet age. With nearly 48% of the global population now owning smartphones, and even more having the access and ability to carry a connected device in their pockets, perpetrators of sexual abuse of children can more easily photograph or film their assaults. Furthermore, children who are being manipulated or threatened are more often coerced into generating their own images of sexual exploitation and abuse. As access to these devices and the internet increases, the volume of child sexual abuse and exploitation material published online continues to grow.
Warnings and reports from the FBI and the National Center for Missing and Exploited Children (NCMEC) assert that the impacts of COVID-19, and the resulting push of children into online spaces, have increased the risk of online sexual exploitation of children. At Children’s Cove we saw firsthand an increase in these online cases, as did the other regional Child Advocacy Centers in Massachusetts. And indeed, among our community of professionals we saw not only that children are spending more time online, but that perpetrators of abuse and exploitation are as well.
Imagine the very worst thing that has ever happened to you was recorded and then it was shared repeatedly for other people’s pleasure
The New York Times article also stated that in 2008 the issue of CSAM was already an epidemic (a term the public now understands all too well), and that by 2018 it was a crisis at the breaking point. So where are we in 2021? Much like the discussions surrounding the climate crisis, the opioid epidemic, and the COVID-19 pandemic: we are at a point where urgent and decisive action needs to take place. One such action was recently taken, in a controversial move, by technology giant Apple.
Apple recently announced an update to its iOS operating system that will contain a tool capable of scanning for and identifying child sexual abuse material on its platforms. Working in partnership with NCMEC, which functions as the national partner to investigators working on cases involving internet crimes against children, Apple will have an automated system scan individual devices to recognize what are known as hashes of images. Each image and video created and circulated on the internet can be assigned a “hash,” a unique digital fingerprint. When CSAM images and videos are identified, their hashes are stored in the NCMEC database and flagged. The Apple system will work in tandem with the NCMEC database to trigger an alert for human review if it detects 30 or more known hashes on an Apple device.
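For readers curious about the mechanics, the sketch below illustrates the general idea of threshold-based hash matching in plain Python. The function names, the hash function, and the threshold handling here are assumptions made purely for illustration; Apple’s actual system relies on its own perceptual hashing and on-device matching protocol, which this sketch does not reproduce.

```python
# Minimal illustrative sketch of threshold-based hash matching.
# Names, the hash function, and the review threshold are assumptions
# for illustration only; Apple's system uses proprietary perceptual
# hashing and cryptographic matching not shown here.

import hashlib
from pathlib import Path

MATCH_THRESHOLD = 30  # number of known-hash matches before review is triggered


def image_hash(path: Path) -> str:
    """Hash an image file's bytes (illustrative; not a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_known_matches(image_paths: list[Path], known_hashes: set[str]) -> int:
    """Count how many of a device's images match the flagged-hash database."""
    return sum(1 for p in image_paths if image_hash(p) in known_hashes)


def should_flag_for_review(image_paths: list[Path], known_hashes: set[str]) -> bool:
    """Trigger a human review only once the match count reaches the threshold."""
    return count_known_matches(image_paths, known_hashes) >= MATCH_THRESHOLD
```

The key point the sketch captures is that matching is done against a database of already-identified material, and no single match triggers anything: only an accumulation of 30 or more known hashes prompts a review.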
Seems straightforward, so where’s the controversy?
The controversy, as raised by privacy advocates and NSA whistleblower Edward Snowden, is this: where is the limit of what is monitored on individuals’ devices? The action taken by Apple effectively installs a system to monitor what you do on a private device. And while the scope of the monitoring is limited to those strings of numbers and letters, the concern isn’t the process as designed and intended, but what else could be done with the same infrastructure. Even as Apple has steadfastly stated it will not allow this process to be abused, the discussion of privacy in our society, as technology increasingly monitors our devices, is very worthwhile. But what costs are we willing to pay?
On one hand, we are talking about individual property, liberty, and privacy. It is, in a certain manner of thinking, a violation of constitutional privacy without due process; a potential violation of the Fourth Amendment, even if the system is simply looking for a series of hashes. And while the allowances for digital surveillance under the Patriot Act cleared the path for this type of autonomous screening, challenges to these types of programs have matched the pace at which the technology grows and expands.
On the other hand, as described by the mother of one victim of CSAM, “Imagine the very worst thing that has ever happened to you was recorded and then shared repeatedly for other people’s pleasure.” And this repetitious sharing continues in perpetuity: the act is relived and reshared for days, months, years, and decades by those who consume the content of exploitation, and the survivor continues to be revictimized. There are more than 45 million of these images and videos in existence, and the agencies that hold accountable the perpetrators of these crimes, and those who share them, are outmatched and underfunded.
We are far into an epic journey fraught with the challenges of the digital age, rapidly expanding our use of technologies whose inherent dangers are akin to the Wild West. What actions are we willing to take? What sacrifices should be considered to protect children from exploitation and victimization?
Consider this: we are aware that private companies follow our movements online and advertise to us based on that history. We know social media companies suggest new connections based on our current connections and interests. We know that cars, phones, and GPS systems monitor our travel and upload traffic data in real time. We know all this information is already stored, sold, shared, and used by private businesses. And, for the most part, everyone accepts this as part of the agreement to use these devices. Why, then, shouldn’t the work to protect children and prevent their exploitation be part of the agreement too?
UPDATE:
Shortly after the writing of this piece, on September 3rd, Apple made the decision to delay the rollout of its iOS update following backlash from privacy and cybersecurity advocates claiming there were inherent safety risks. While Apple indicated the backlash was driven primarily by how information is handled and how the process works, rather than by the functionality itself, it still chose to delay. The National Center for Missing and Exploited Children offered a statement in reply: “Apple’s decision to delay their proactive measures to detect, report and remove child sexual exploitation material is disheartening. At the National Center for Missing & Exploited Children, we will continue to be a voice for children and support decisions that prioritize child safety. We hope that Apple’s conviction to implement meaningful child protection features does not waver in the months to come.”