There is no sacrifice of privacy in Apple’s fight against child pornography. “There has been a lot of confusion surrounding our communication,” Craig Federighi, the American electronics giant’s software chief, said in an interview with the Wall Street Journal.

The Apple executive defended the company’s new software to combat child pornography even as the new service raised privacy concerns for the iPhone. In the WSJ interview, Federighi stressed that the new system will be verifiable, while admitting errors in the presentation of the two new tools last week.

The first is intended to identify sexually explicit images of children stored in the company’s cloud storage service, and the second will allow parents to better monitor which images are shared with and by their children through text messages.

“Many messages have been confused in terms of how they were understood,” Federighi said. “We would have liked the news to come out more clearly for everyone, because we feel confident in what we are doing.” The Cupertino group has built a reputation for defending user privacy, and the company has framed the new tools as a way to continue this effort while also protecting children. Apple and other tech companies have faced pressure from governments around the world demanding access to user data to eradicate child pornography. Apple’s efforts have drawn praise from some quarters but also numerous criticisms. An open letter against these technologies was signed by various NGOs and over 7,700 people, including former CIA computer scientist and whistleblower Edward Snowden.

The main concern is whether Apple can use software that identifies illegal material without the system being exploited by others, such as governments, a scenario that Apple strongly denies. Federighi explained that the new systems will be protected by “multiple levels of verifiability”.

“We consider ourselves to be absolutely at the forefront of privacy; we see what we are doing as an advancement of the state of the art in privacy, to enable a more private world,” Federighi added. One of the tools announced last week raises an alarm when someone uploads pornographic photos of children to the company’s cloud storage service, known as iCloud. Unlike other cloud providers, Apple does not scan everything in a user’s online account. Instead, Apple has devised a means of identifying images on the iPhone that correspond to a database of known illegal images. If a person never uploads images to iCloud, then Apple is not notified. If enough offending images are uploaded to iCloud, the system alerts Apple, and the company checks those specific images to confirm that they are prohibited before reporting them to the National Center for Missing and Exploited Children, the center for reporting child abuse.
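The mechanics described above, matching uploaded images against a database of known hashes and alerting only past a threshold, can be sketched in a few lines. This is a simplified illustration, not Apple’s actual system (which relies on its NeuralHash algorithm and cryptographic threshold secret sharing); the hash values, threshold, and function names here are all invented for the example.

```python
# Illustrative sketch only: real systems use perceptual hashes and
# cryptographic safeguards, not plain string comparison.

KNOWN_HASHES = {"a3f1", "9bc2", "77d0"}  # stand-in for the known-image database
MATCH_THRESHOLD = 3  # alerts fire only after this many matches accumulate

def matches_on_upload(uploaded_hashes):
    """Count uploaded images whose hash appears in the known database.

    Images that are never uploaded never reach this check, mirroring the
    point that Apple is not notified if nothing goes to iCloud.
    """
    return sum(1 for h in uploaded_hashes if h in KNOWN_HASHES)

def should_alert(uploaded_hashes):
    """Flag the account for human review only once the threshold is met."""
    return matches_on_upload(uploaded_hashes) >= MATCH_THRESHOLD

print(should_alert([]))                        # False: no uploads, no report
print(should_alert(["a3f1"]))                  # False: below threshold
print(should_alert(["a3f1", "9bc2", "77d0"]))  # True: threshold reached
```

The threshold is what Federighi’s “multiple levels of verifiability” gestures at: a single chance match never triggers a report, and flagged images are still reviewed by a person before anything is forwarded.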

By Editor