Apple may be working on iOS feature to detect child abuse photos: Report
Apple is reportedly planning to announce photo identification
tools that would detect child abuse images in iOS photo libraries.
Apple has previously removed individual apps from the App Store
over child pornography concerns, but it is now reportedly preparing to
introduce such a detection system system-wide. Using photo hashing, iPhones
could identify Child Sexual Abuse Material (CSAM) on the device.
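The report does not describe the matching technique in detail. As a rough illustration of the general idea of hash-based matching, the Swift sketch below checks an image's hash against a set of hashes of known images; the function name, the use of SHA-256, and the hash-set format are assumptions made here for illustration, not Apple's actual implementation, which would presumably use a perceptual hash robust to resizing and re-encoding.

```swift
import CryptoKit
import Foundation

// Hypothetical sketch of on-device hash matching. A production system
// would use a perceptual hash; SHA-256 is used here only to keep the
// example self-contained, and it matches exact bytes only.
func matchesKnownHash(imageData: Data, knownHashes: Set<String>) -> Bool {
    // Compute the image's digest and render it as lowercase hex.
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    // Report whether it appears in the database of known-image hashes.
    return knownHashes.contains(hex)
}
```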
Apple has not confirmed this, and so far the sole source is
security expert Matthew Green, a cryptographer and associate professor at the
Johns Hopkins Information Security Institute, AppleInsider reported on Thursday.
According to Green, the plan is initially for detection to be client-side;
that is, all of it would be done on a user's iPhone. He argues, however,
that this could be the start of a process leading to surveillance of the
data traffic sent to and from the phone.
"Eventually it could be a key ingredient in adding
surveillance to encrypted messaging systems," Green said.
"The ability to add scanning systems like this to E2E [end to
end encryption] messaging systems has been a major 'ask' by law enforcement the
world over," he added.
According to Green, this sort of tool can be a boon for finding
child pornography on people's phones.
Green and Johns Hopkins University have also previously worked
with Apple to fix a security bug in Messages.