Apple has announced a plan to scan iPhones for images of child pornography and report possible matches to authorities.
On the one hand, Apple’s child porn scanning plan sounds like a phenomenal idea. Automated scans will search for what authorities now call child sexual abuse material, or CSAM, the new name they have given to child pornography. The term, they say, better indicates that the children are not willing participants.
I think most of us already know that much.
When the scans turn up images that might legitimately be criminal in nature, the images go to a human reviewer. If that reviewer confirms the images may show criminal behavior, they are reported to the National Center for Missing & Exploited Children.
But no good deed goes unpunished.
Privacy advocates quickly raised red flags.
The New York Times reported that privacy advocates worry the plan could mean a new level of government surveillance.
A Johns Hopkins University cryptography professor told the paper the plan could set a dangerous precedent “by creating surveillance technology that law enforcement or governments could exploit.”
The Washington Post, meanwhile, said the new software will scan devices without users’ “knowledge or explicit consent, and potentially put innocent users in legal jeopardy.”
However, Apple says it will compare images users attempt to upload to its iCloud service against known CSAM images. That means two things.
First, the child porn scanning technology will likely miss some actual CSAM images. After all, the system can only catch images that authorities have already identified and cataloged, so there is something to compare uploads against. (A simplified sketch of that kind of matching follows after the next point.)
Second, Apple claims that because the comparison is only against known CSAM images, parents who keep innocent photos of their own unclothed children shouldn’t be flagged. Given the growing concerns about child predators, though, I’m not sure why parents would want to keep nude photos of their own children in the first place.
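To make the mechanism a little more concrete, here is a deliberately simplified sketch of how matching uploads against a list of known image hashes works in principle. This is not Apple’s actual system, which reportedly uses a perceptual “NeuralHash” designed to tolerate resizing and re-encoding; this toy version, with made-up hash values and hypothetical function names, only catches exact byte-for-byte copies. But it illustrates the basic idea: only images that are already in the catalog can ever produce a match.

```python
# Toy illustration only, not Apple's implementation: real systems use
# perceptual hashes that survive cropping, resizing, and re-encoding,
# while this sketch only matches exact byte-for-byte copies.
import hashlib
from pathlib import Path

# Hypothetical database of SHA-256 hashes of known, already-cataloged images.
# (Placeholder value; in reality such a list is maintained by child-safety
# organizations, not by users or by this script.)
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def needs_human_review(path: Path) -> bool:
    """True only if the upload matches a known hash; a brand-new image
    produces no match and passes through unflagged."""
    return file_hash(path) in KNOWN_IMAGE_HASHES
```

The last line is the point: a photo that isn’t already in the catalog simply never matches, which is exactly why the first concern above, that some real CSAM will slip through, is baked into the design.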
Let’s suppose the software doesn’t flag those family photos. One still wonders what would happen if the account got hacked: those images could still be seen by others.
If the software does happen to mistakenly flag innocent images, I imagine the fallout will be huge.
Scans will also monitor chat messages from children.
This part of the plan seems a little frightening. Yet it also seems to be necessary in this day and age. The software will alert parents if it detects their children sending nude images via chat.
If I were a parent, I’d definitely want to know if there were something fishy going on. Even though it’s the last thing I’d want to think about.
National Center for Missing & Exploited Children CEO and President John Clark called the expanded protection for children “a game changer.”
The new features will debut later in 2021 in updates to iOS 15, iPadOS 15, macOS Monterey, and watchOS 8, Apple Insider reported.
I find the privacy concerns understandable on one hand. But on the other, I ask myself how many times people’s photos on cloud services have already been viewed by someone. I would imagine it happens more often than anyone would admit.
Any effort designed to protect children from predators seems worth a reasonable amount of risk.