MADRID, 7 Sep. (Portaltic/EP) –
Facebook has maintained for years that no one can read the content of WhatsApp messages because they are protected by end-to-end encryption, a claim that an investigation has called into question by revealing that the company employs dedicated workers to review the content of individual conversations.
The investigation, conducted by ProPublica, draws on data, documents and dozens of interviews with current and former employees and subcontractors, as well as a confidential complaint filed in 2020 with the United States Securities and Exchange Commission and obtained by the outlet, which alleges that Facebook's privacy statements about WhatsApp are false and that workers use special programs to access user messages.
According to the report, WhatsApp has at least a thousand subcontracted workers in its offices in Austin, Texas (United States), Dublin (Ireland) and Singapore, who examine millions of pieces of content shared by users through the messaging service every week.
Messages are private, the technology company assures on its website, because they are protected by end-to-end encryption, which prevents anyone other than the sender and the recipient from reading the content of a conversation.
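The guarantee described above can be illustrated with a deliberately simplified sketch. This is a toy XOR stream cipher built from a hash function, not WhatsApp's actual implementation (which uses the Signal protocol); it only shows the core idea that a relaying server sees ciphertext, while parties holding the shared key can recover the plaintext:

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the shared key and a per-message
    # nonce. Toy construction for illustration, NOT secure cryptography.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with the keystream.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))

decrypt = encrypt  # XOR stream ciphers are their own inverse

# Sender and recipient share a key; the server only relays the ciphertext.
shared_key = b"sender-recipient-secret"
nonce = b"msg-001"
ciphertext = encrypt(shared_key, nonce, b"hello")  # all the server ever sees
assert ciphertext != b"hello"
assert decrypt(shared_key, nonce, ciphertext) == b"hello"
```

The point of the sketch is structural: the decryption key never leaves the endpoints, so an intermediary that stores or forwards `ciphertext` cannot read the message.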
However, these subcontracted workers use special Facebook software to examine messages and other chat content that other users have reported for violating the platform's policies, after an initial screening by an artificial intelligence system.
They must then make a judgment on that content, although, unlike on the Facebook and Instagram platforms, they cannot delete individual pieces of content. Through this moderation (which WhatsApp does not describe as such), workers review messages to identify and remove "the worst abusers," as the company's communications director, Carl Woog, acknowledged to ProPublica.
A company spokesperson reiterated this message in statements to Europa Press: "WhatsApp provides a way for people to report spam or abuse, including sharing the most recent messages in a chat. This feature is important for preventing the worst abuse on the internet. We strongly disagree with the notion that accepting reports that a user chooses to send us is incompatible with end-to-end encryption."
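The mechanism the spokesperson describes is compatible with end-to-end encryption because the report originates on a participant's device, which already holds the decrypted messages. A minimal sketch of that idea (class and method names are hypothetical, not WhatsApp's actual code):

```python
from collections import deque

class Chat:
    """Client-side view of a chat: a participant already holds plaintext."""

    def __init__(self, recent_limit: int = 5):
        # Keep only the most recent messages, mirroring how a report
        # shares just the latest messages in a chat.
        self.recent = deque(maxlen=recent_limit)

    def receive(self, sender: str, plaintext: str) -> None:
        # Messages arrive encrypted but are decrypted locally by design;
        # this sketch stores them already in plaintext.
        self.recent.append((sender, plaintext))

    def report_abuse(self) -> list:
        # The reporting user chooses to forward these plaintexts to the
        # moderation service; the server never decrypted anything itself.
        return list(self.recent)

chat = Chat(recent_limit=5)
for i in range(8):
    chat.receive("alice", f"message {i}")
print(chat.report_abuse())  # only the last five messages are shared
```

The design point is that no key material moves: the endpoint, which could always read its own conversation, voluntarily re-transmits a bounded slice of it.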
Facebook acquired the messaging service WhatsApp in 2014 and incorporated end-to-end encryption two years later. In 2019, CEO Mark Zuckerberg announced his intention to extend this protection to the company's other communication services, Instagram and Messenger. A year earlier, he had assured the United States Senate that the company did not see any WhatsApp content.
For now, only WhatsApp has this encryption, and it is also the only one of the three services that does not offer periodic transparency reports, in which the company details the actions taken against content that is not allowed on Facebook or Instagram.
The investigation also examines the scope of the user data that the company shares with law enforcement authorities when required to do so. ProPublica points out that WhatsApp hands over metadata: unencrypted records that can end up revealing a user's activity. To avoid precisely this, Signal, one of its competitors, collects less metadata so as not to compromise users' privacy.
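"Metadata" here means the routing information a server handles even when message bodies are encrypted. A hypothetical illustration of what such an unencrypted record might contain (field names are assumptions for the sketch, not WhatsApp's actual schema):

```python
import json
import time

# Even with end-to-end encryption, a relaying server can log who talked
# to whom, when, and how much, without ever reading the message body.
metadata_record = {
    "sender": "+1555000111",
    "recipient": "+1555000222",
    "timestamp": int(time.time()),
    "ciphertext_bytes": 1432,   # size only; the content itself stays encrypted
}
print(json.dumps(metadata_record))
assert "content" not in metadata_record  # no plaintext is ever stored
```

Records like this are what can be produced for law enforcement, and patterns across many of them (who messages whom, how often, at what times) can reveal a user's activity even though no single message is readable.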