Apple said yesterday that it will report child sexual abuse images uploaded to its US iCloud service to law enforcement.
Apple’s new system monitors for Child Sexual Abuse Material ("CSAM") using a process called "hashing", in which an algorithm transforms an input of any length into a fixed-length output; each image is converted into a unique number that corresponds to it.
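As a rough illustration of the idea rather than of Apple's actual algorithm, the Swift sketch below reduces an image file of any size to a fixed-length digest using an ordinary cryptographic hash (SHA-256). Apple's system reportedly uses a perceptual hash so that visually similar images map to the same value, which a plain cryptographic hash does not do, and the file path here is hypothetical.

```swift
import CryptoKit
import Foundation

// Reduce an arbitrary-length input to a fixed-length digest.
// SHA-256 is used only to illustrate the general concept of hashing;
// it is not the perceptual hash Apple's system uses.
func fingerprint(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    let digest = SHA256.hash(data: data)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Example: hash a local photo (the path is a placeholder).
if let hex = try? fingerprint(of: URL(fileURLWithPath: "photo.jpg")) {
    print(hex)   // 64 hex characters, regardless of the image's size
}
```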
How "hashing" works
Before images are stored in Apple's iCloud service, Apple compares their hashes against a database of hashes provided by the National Center for Missing and Exploited Children (hereinafter "NCMEC").
Apple said this database will be activated when the company releases the iOS 15 update and will be distributed as part of the iOS code. The matching process takes place on the user's iPhone, not in the cloud.
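A minimal sketch of that on-device comparison is below, assuming for illustration that the database is simply a set of hash strings bundled with the OS (the entries are invented placeholders). Apple's real design reportedly uses blinded hashes and private set intersection, so the device never learns whether a match occurred; a plain set lookup only illustrates the "match before upload" ordering.

```swift
// Hypothetical, simplified view of on-device matching.
// The real database is not a plain set of readable hashes.
let knownHashes: Set<String> = [
    "3f2a9c…",  // placeholder entry, not a real hash
    "b7e104…"   // placeholder entry, not a real hash
]

/// Runs on the user's iPhone before an image is uploaded to iCloud.
func matchesKnownDatabase(_ imageHash: String) -> Bool {
    knownHashes.contains(imageHash)
}
```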
If Apple subsequently detects a certain number of offending files in an iCloud account, the system will upload a file that allows Apple to decrypt and view the images on that account, and a human reviewer will then check the images to confirm whether they match.
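The threshold step could be pictured as follows. The threshold value, the SafetyVoucher type, and the counting logic are all assumptions made for illustration; Apple has said only that a "certain number" of matches is required, and its published design uses threshold secret sharing rather than a simple counter.

```swift
import Foundation

/// Hypothetical stand-in for the encrypted data uploaded alongside
/// each matching image.
struct SafetyVoucher {
    let encryptedImageReference: Data
}

/// Assumed threshold value, chosen only for this sketch.
let reviewThreshold = 30

/// Below the threshold, nothing can be decrypted or reviewed;
/// at or above it, the flagged images go to human review.
func vouchersEligibleForHumanReview(_ vouchers: [SafetyVoucher]) -> [SafetyVoucher] {
    vouchers.count >= reviewThreshold ? vouchers : []
}
```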
Apple will only review images that match content already known to and reported to these databases.
If the human reviewer concludes that the system made no error, Apple will disable the iCloud account of the user who uploaded the offending images, send a report to NCMEC and, if necessary, notify law enforcement. An Apple representative said that users who believe their account has been incorrectly flagged can appeal.
Apple stated that the system applies only to images uploaded to the iCloud service (users can choose to turn the feature off) and not to photos or other images on a device that are never uploaded to the company's servers.
A test of privacy concerns
Apple began testing the system yesterday, but said it will not reach most US iPhone users until the iOS 15 update later this year.
At the same time, the system is a test for Apple itself. Apple said the hashing approach is more private for users than previous methods of rooting out illegal images of child sexual abuse, because it relies on sophisticated cryptography on both Apple's servers and users' devices and scans only the hashes, never the actual images. But many privacy-conscious users do not want the software they use to report the contents of their devices or cloud services to the government, and they may react negatively to Apple's move, especially because Apple has vocally defended device encryption and operates in countries with weaker speech protections than the United States.
Law enforcement officials around the world have also pressured Apple to weaken the encryption of other software services such as iMessage and iCloud in order to investigate acts such as child sexual abuse or terrorism.
With the new system, Apple is trying to address two competing imperatives: meeting law enforcement agencies' requests for help in curbing child sexual abuse, and upholding its core values of privacy and security.
Will not be used to identify other images
Some security researchers worry that the technology could eventually be used to identify other kinds of images, such as photos of political protests. Apple said the system applies only to images catalogued by NCMEC or other child safety organizations, and that the way it has constructed the cryptographic system prevents it from being used for other purposes.
Apple stated that it cannot add additional hashes to the database. The company said it is demonstrating the system to cryptography experts to show that it can detect illegal child sexual abuse images without compromising user privacy.
Alongside this feature on Thursday, Apple also announced other features designed to protect children from sexual predators. One uses machine learning on children's iPhones linked to a family account to blur images that may contain nudity, and parents can opt into an alert that notifies them when a child under 13 receives sexually explicit content in iMessage. Apple has also updated Siri to provide information on how to report child sexual abuse content.