Apple also addressed the hypothetical possibility of a particular region deciding to corrupt a safety organization in an attempt to abuse the system, noting that the system’s first layer of protection is an undisclosed threshold of matches before a user is flagged for having inappropriate imagery. Even if the threshold were exceeded, Apple said, its manual review process would serve as an additional barrier and would confirm the absence of known CSAM imagery. In that case, Apple said, it would ultimately not report the flagged user to NCMEC or law enforcement agencies, and the system would still be working exactly as designed.
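The two-layer process Apple describes can be sketched as a simple decision function. Everything here is an illustrative assumption, not Apple's implementation: the function name, the structure, and the threshold value (the real threshold is undisclosed) are all hypothetical.

```python
MATCH_THRESHOLD = 30  # assumption: the real threshold is undisclosed

def review_account(match_count: int, review_confirms_csam: bool) -> str:
    """Hypothetical sketch of the flagging pipeline described above."""
    if match_count <= MATCH_THRESHOLD:
        return "no action"       # first barrier: account is never flagged
    if not review_confirms_csam:
        return "no report"       # second barrier: manual review finds no known CSAM
    return "report to NCMEC"     # both barriers passed

# A corrupted hash database could push an innocent account past the
# threshold, but on this model the manual review would still stop a report.
print(review_account(40, review_confirms_csam=False))  # -> no report
```

This is only the logic as Apple's statement presents it; the criticisms that follow are precisely about whether the second barrier can be trusted in practice.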
After yesterday’s news and today’s responses from experts, here’s a recap: Apple is going to scan all photos on every iPhone to see if any of them match against a dataset of photos – that Apple itself hasn’t verified – given to them by the authorities of countries in which this is rolled out, with final checks being done by (third party) reviewers who are most likely traumatized, overworked, underpaid, and easily infiltrated.
What could possibly go wrong?
Today, Apple sent out an internal memo to employees about this new scanning system. In it, they included a statement by Marita Rodriguez, executive director of strategic partnerships at the National Center for Missing and Exploited Children. One of the choice quotes:
I know it’s been a long day and that many of you probably haven’t slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority.
Apple signed off on that quote. They think those of us worried about invasive technologies like this and the power backdoors like this would give to totalitarian regimes all over the world are the “screeching voices of the minority”.
No wonder this company enjoys working with the most brutal regimes in the world.
In the UK there is law governing this area, and thresholds are in place; they are not a state secret. The official view of the police is that they do not encourage vigilante action, and even the Internet Watch Foundation is only allowed to operate on the basis of an understanding. The training and support requirements for staff monitoring reports are a non-trivial exercise. Before we even begin, Apple is constrained by child abuse law, the GDPR, and other rights regarding the information it holds on individuals.
As for the topic itself, one of the first thoughts that crossed my mind was: what if an innocent human rights activist was in a photo standing next to the wrong person? Would that person then be targeted by state oppression? Would they wind up dead?
Putting the law and the real world aside, what if Apple’s system just made us more complacent?
That is simply arrogant, patronising, and insulting.