Operation “Guardian of the Walls”: this is how Facebook behaved during the fighting

Social networks, including Facebook and Instagram, have faced heavy criticism in recent years over how they filter the content uploaded to their platforms and ensure that nothing violating their rules gets through. The criticism concerns, among other things, inflammatory or false material and content on explosive topics that spreads misinformation, for example about the coronavirus vaccines. It also extends to security matters, such as wars and military conflicts around the world.

One of the cases in which the social networks faced such criticism was Operation “Guardian of the Walls” in May 2021. In response to this criticism, Meta published this morning (Thursday) an independent due-diligence report, produced by the organization Business for Social Responsibility (BSR), which examines the company’s impact on the human rights of Israelis and Palestinians during the operation in May of last year. The company emphasized that the report covers all of its platforms, WhatsApp, Facebook and Instagram, and that the review was conducted between September 2021 and April 2022.

Based on the information BSR reviewed, the report finds that Meta’s actions during the operation had a negative impact on the human rights of Palestinian users: on their freedom of expression, freedom of assembly, political participation and freedom from discrimination, and in practice also on their ability to share information and their experiences in real time. Many told the organization that they felt Meta had suppressed their voices. The report cites extensive data pointing to over-enforcement alongside under-enforcement during this period, with the over-enforcement concentrated on Arabic-language content.

Artillery firing at targets in Gaza during Operation “Guardian of the Walls” (Photo: Yonatan Zindel, Flash 90)

The data reviewed by BSR also showed that the rates of proactive detection of potentially violating Arabic content were significantly higher than the rates of proactive detection of potentially violating Hebrew content. This can likely be attributed to company policy that incorporates certain legal obligations concerning designated foreign terrorist organizations, and to the fact that certain phrases were flagged as hostile in Arabic but not in Hebrew.

At the same time, under-enforcement was also found, for example in cases of incitement to violence and praise of Hamas, including by Palestinian authorities. The materials also show that at times Hebrew content was subject to greater under-enforcement, largely because of the lack of a Hebrew classifier and the departure of Hebrew-speaking employees in the period preceding the operation, although there were also cases of over-enforcement of Hebrew content.

The organization noted that it was able to identify possible causes of the over-enforcement, which Meta must continue to investigate. These include the possibility that the algorithms that identify and classify Arabic-language content make more mistakes in Arabic than the equivalent systems do in Hebrew, and the possibility that potentially violating Arabic content is not routed to reviewers who understand the language. In addition, according to stakeholders quoted in the document, the company did not employ enough Hebrew- and Arabic-speaking content reviewers to handle the sharp increase in the volume of content during this period.

Based on the review and the feedback received, it emerges that the over-enforcement was significant: users accumulated “false strikes” that reduced the visibility and reach of their content after their posts were mistakenly removed for policy violations. The impact on human rights was especially severe for activists and journalists. The report also found a lack of oversight that allowed policy-enforcement errors to go uncorrected.

One example: an employee of one of Meta’s outsourcing contractors added “Al Aqsa” to the list of blocked hashtags after drawing on an updated list of terms from the US Treasury Department that included the Al Aqsa Brigades. As a result, the hashtag was removed from searches, even though it was widely used in posts referring to the Al Aqsa Mosque, one of the holiest sites in Islam.

The military operation also raised the question of where the line runs between praise and glorification of violence, and the company must consider, according to the organization, whether its policy deals adequately with praise and glorification of indiscriminate violence.

Another issue stakeholders raised in the report is concern about anti-Semitic content, most of which falls under the hate speech policy, though that policy does not cover every type of anti-Semitic content and does not clearly distinguish between the categories. Because Meta has no complete definition of anti-Semitism, it also has no metrics for understanding the prevalence of anti-Semitic content or determining whether it increased in May 2021.

External stakeholders interviewed by the organization also reported cases in which right-wing Israelis used WhatsApp to incite violence and coordinate attacks against Arab Israelis, as well as against journalists. At the same time, there were journalists and academics whose accounts were mistakenly disabled as a result of enforcement actions against terrorist organizations. According to sources cited in the report, users may find it difficult to understand what counts as praise for a terrorist organization and what counts as incitement to violence.

Despite these findings, the organization stated that it did not detect deliberate bias by Meta or any of its employees in favor of or against any particular group. It did, however, identify cases of unintentional bias that had varied effects on the human rights of Palestinian and Arabic-speaking users.

It is important to emphasize that Palestinians are more likely to violate Meta’s policy in this area simply because Hamas is the organization that controls the Gaza Strip. Palestinians are also more likely to face severe consequences from both correct and incorrect enforcement of the policy, such as being prevented from sharing certain political content.

By Editor