Last week, a series of investigations into Facebook by the Wall Street Journal made waves around the world. The gist: Facebook knows, in full detail, that its platforms are riddled with flaws that cause harm, often in ways that only the company fully understands. This finding is based on internal Facebook documents, including research reports, employee discussions and drafts presented to senior management.
Again and again, the documents show, Facebook researchers have identified the platform's harmful effects. And again and again, despite congressional hearings, its own commitments and numerous statements to the media, the company has not corrected them. The documents offer perhaps the clearest picture so far of how well known Facebook's problems are within the company, all the way up to the CEO himself.
1. Facebook says its rules apply to everyone; company documents reveal that elites are exempt
Mark Zuckerberg has said that Facebook allows its users to speak on equal footing with the elites of politics, culture and the press, and that its standards apply to everyone. Privately, the company has built a system that exempts high-profile users from some or all of its rules. The program, known as “CrossCheck” or “XCheck”, was designed as a quality-control measure for high-profile accounts. Today it shields millions of VIP users from the regular enforcement of company rules, the documents show. Many abuse the privilege, posting material that includes harassment and incitement to violence, content that would otherwise have led to sanctions. Facebook says the criticism of the program is fair, that it was created for a good cause, and that the company is working to fix it.
2. Facebook knows that Instagram is toxic for many teenage girls, company documents reveal
Researchers at Facebook have spent years studying how Instagram, the photo-sharing app the company owns, affects its millions of young users. The company has repeatedly found that Instagram harms a significant percentage of them, especially teenage girls, more than other social media platforms do. Publicly, Facebook has consistently downplayed the app's negative effects, including in congressional testimony, and has not made its research public or available to the academics and lawmakers who have requested it. In response, Facebook said the negative effects are uncommon, that its mental health research is valuable, and that some of the harmful aspects are not easy to address.
3. Facebook tried to make its platform a healthier place – and made it an angrier one instead
In 2018 Facebook changed its algorithm in a move designed to improve its platform and halt signs of declining user engagement. Zuckerberg stated that his goal was to strengthen bonds between users and improve their well-being by fostering interactions between friends and family. Within the company, the documents show, staffers warned that the change was having the opposite effect: it was making Facebook, and those who used it, angrier. Zuckerberg resisted some of the fixes proposed by his team, according to the documents, because he feared they would drive people away from Facebook and reduce their use of and engagement with the platform. In response, Facebook says that any algorithm can promote harmful or objectionable content and that the company is doing its best to solve the problem.
4. Facebook employees warn that drug cartels and human traffickers use the platform, but the company's response is insufficient
Internal reports filed by Facebook employees, obtained by the Wall Street Journal, show that they warned about the misuse of Facebook's platforms in developing countries, where its user base is vast and expanding. Employees noted that human traffickers in the Middle East used the site to lure women into exploitative sex work. They also warned that armed groups in Ethiopia had used the site to incite violence against ethnic minorities. According to the documents, they alerted their executives to organ selling, pornography, and actions taken by governments against political opposition. The company's response to these cases was often inadequate. A Facebook spokesman said the company had deployed global teams, formed local partnerships and hired fact checkers to keep users safe.
Probably not by chance, in the wake of these damning investigations, Facebook announced yesterday (Thursday) a new policy that will allow it to disable accounts engaged in “social harm.” The company said the change could help the platform combat harmful behavior that it could not otherwise address under existing rules. The “social harm” designation will give the company a framework for acting against harmful activity by accounts that are today considered legitimate.