MADRID, 3 Sep. (Portaltic/EP) –

Apple has delayed by a few months the rollout of its technology for detecting and combating the distribution of child sexual exploitation images through company services such as iCloud, and will introduce changes before its launch.

The American company had planned to launch its new technology for iCloud family accounts alongside iOS 15, iPadOS 15 and macOS Monterey, but it will now be delayed for a few months due to “feedback from customers, advocacy groups, researchers, and others”, as Apple acknowledged in a statement sent to 9to5Mac.

“We have decided to take more time over the next few months to gather information and make improvements before launching these child safety features,” said a company spokesperson.

This technology, which Apple announced in August but had not yet deployed, was intended to protect children from sexual predators who use the company’s communication tools to contact and exploit minors, as well as to prevent the dissemination of such content.

These “new applications of cryptography” make it possible to detect images of this type stored in iCloud. The method does not scan images in the cloud; instead, it performs an on-device comparison against known images provided by child safety organizations before the photos are uploaded to iCloud.
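Conceptually, the pre-upload check can be pictured as computing a fingerprint on the device and testing it against a database of known fingerprints. The Swift sketch below is a loose illustration only: Apple’s real system uses a perceptual hash (known as NeuralHash) rather than a cryptographic one, and the SHA-256 stand-in, placeholder values and function names here are assumptions, not Apple’s implementation.

```swift
import Foundation
import CryptoKit

// Loose illustration of an on-device pre-upload check. Apple's real
// system uses a perceptual hash ("NeuralHash"); SHA-256 stands in here
// purely so the sketch is self-contained and runnable.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// In the real system the database comes from child safety organizations;
// these entries are placeholders.
let knownFingerprints: Set<String> = ["placeholder-hash-1", "placeholder-hash-2"]

// Test a photo against the known database before it leaves the device.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```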

What is compared is not the image itself, but the ‘hashes’ of the images, a kind of digital fingerprint. A cryptographic technique called ‘private set intersection’ determines whether there is a match without revealing the result, which is attached to the image once it is uploaded to iCloud.
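Apple has not published its protocol in this form, but the general idea of private set intersection can be sketched with commutative, Diffie-Hellman-style blinding: each party raises its elements to a secret exponent, and because exponentiation commutes, doubly blinded values collide exactly when the underlying elements match, without either side seeing the other’s raw set. The toy Swift sketch below uses a deliberately small prime and made-up values; real protocols work over large elliptic-curve groups and hash elements into the group first.

```swift
import Foundation

// Toy private set intersection via commutative (Diffie-Hellman-style)
// blinding. The tiny prime and hard-coded values are illustrative only.
let p: UInt64 = 2_147_483_647 // Mersenne prime 2^31 - 1, toy-sized

func powMod(_ base: UInt64, _ exp: UInt64, _ modulus: UInt64) -> UInt64 {
    var result: UInt64 = 1
    var b = base % modulus
    var e = exp
    while e > 0 {
        if e & 1 == 1 { result = result * b % modulus }
        b = b * b % modulus
        e >>= 1
    }
    return result
}

// Each side blinds every element with its own secret exponent. Since
// (x^a)^b == (x^b)^a mod p, doubly blinded values match exactly when
// the underlying elements match, yet neither side sees the other's raw set.
let serverSecret: UInt64 = 123_457
let deviceSecret: UInt64 = 987_643

let serverSet: [UInt64] = [11, 22, 33] // known database (toy values)
let deviceSet: [UInt64] = [22, 44]     // on-device fingerprints (toy values)

let serverBlinded = serverSet.map { powMod($0, serverSecret, p) }
let deviceBlinded = deviceSet.map { powMod($0, deviceSecret, p) }

// Each side re-blinds the other's already-blinded values; only these
// doubly blinded values are ever compared.
let serverDouble = Set(deviceBlinded.map { powMod($0, serverSecret, p) })
let deviceDouble = serverBlinded.map { powMod($0, deviceSecret, p) }

let matches = deviceDouble.filter { serverDouble.contains($0) }.count
print("Matching elements: \(matches)") // 1 (the shared value 22)
```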

The ‘threshold secret sharing’ technology ensures a high number of matches before Apple receives an alert for its human teams to review. If the match is confirmed, the user’s account is deactivated and a report is sent to the relevant organizations and law enforcement.
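‘Threshold secret sharing’ usually refers to constructions like Shamir’s scheme, where a secret is split into shares and can only be reconstructed once at least t of them are available, which is how a reviewable alert can be gated on a minimum number of matches. The following Swift sketch shows the bare mechanism over a toy prime field; the field size, parameters and secret are illustrative and do not reflect Apple’s actual construction.

```swift
import Foundation

// Toy Shamir secret sharing over a small prime field: the secret can be
// reconstructed from any t shares, and from fewer than t it cannot.
let q: UInt64 = 7_919 // small prime, toy-sized

func mod(_ x: Int64) -> UInt64 { UInt64(((x % Int64(q)) + Int64(q)) % Int64(q)) }

func powMod(_ base: UInt64, _ exp: UInt64) -> UInt64 {
    var r: UInt64 = 1
    var b = base % q
    var e = exp
    while e > 0 {
        if e & 1 == 1 { r = r * b % q }
        b = b * b % q
        e >>= 1
    }
    return r
}

// Modular inverse via Fermat's little theorem (q is prime).
func inv(_ x: UInt64) -> UInt64 { powMod(x, q - 2) }

// Split `secret` into n shares so that any `t` of them reconstruct it:
// the secret is the constant term of a random degree t-1 polynomial.
func split(secret: UInt64, shares n: UInt64, threshold t: Int) -> [(x: UInt64, y: UInt64)] {
    let coeffs = [secret] + (1..<t).map { _ in UInt64.random(in: 1..<q) }
    return (1...n).map { x in
        var y: UInt64 = 0
        for (i, c) in coeffs.enumerated() { y = (y + c * powMod(x, UInt64(i))) % q }
        return (x: x, y: y)
    }
}

// Lagrange interpolation at x = 0 recovers the constant term (the secret).
func reconstruct(_ shares: [(x: UInt64, y: UInt64)]) -> UInt64 {
    var secret: UInt64 = 0
    for (i, si) in shares.enumerated() {
        var num: UInt64 = 1
        var den: UInt64 = 1
        for (j, sj) in shares.enumerated() where i != j {
            num = num * sj.x % q
            den = den * mod(Int64(sj.x) - Int64(si.x)) % q
        }
        secret = (secret + si.y * num % q * inv(den)) % q
    }
    return secret
}

let shares = split(secret: 4_242, shares: 5, threshold: 3)
print(reconstruct(Array(shares.prefix(3)))) // prints 4242 with any 3 shares
```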

Likewise, through a second technology implemented in its Messages app, parents or guardians can optionally receive a notification each time a child sends or receives a sexually explicit image, but this only happens after the child has been shown a warning explaining that, if they proceed to view the image, their parents will be notified.
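The flow described above can be summarized as a small decision routine: warn first, let the child decide, and notify parents only if they opted in and the child proceeds. The types and function names in this Swift sketch are hypothetical, not Apple’s actual API.

```swift
// Hypothetical sketch of the opt-in warning flow described above.
// The types and names are illustrative, not Apple's actual API.
struct ChildSafetySettings {
    var parentalNotificationsEnabled: Bool // opted in by parents or guardians
}

enum ViewDecision { case blocked, viewedAfterWarning }

func handleFlaggedImage(settings: ChildSafetySettings,
                        childChoosesToView: Bool) -> ViewDecision {
    // Step 1: the child is warned first and decides whether to view.
    guard childChoosesToView else { return .blocked }
    // Step 2: only if the child proceeds, and only when the feature is
    // enabled, are the parents or guardians notified.
    if settings.parentalNotificationsEnabled {
        notifyParentsOrGuardians()
    }
    return .viewedAfterWarning
}

func notifyParentsOrGuardians() {
    print("Parent/guardian notified that the image was viewed.")
}
```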

For now, Apple has not specified when this technology will launch after the delay, or which aspects of it the company intends to change.

This decision comes after criticism from experts and organizations, who denounced that the system violated users’ privacy and could be used as a back door to spy on people.
