MADRID, 9 Jul. (Portaltic/EP) –
Facebook and Instagram have failed since 2018 to apply one of their content standards, which establishes an exception allowing discussion of the human rights of people or organizations designated as dangerous, according to the Facebook Content Advisory Council, the body of independent experts that advises Facebook on complex moderation cases.
This warning came in the context of an Advisory Council resolution on the deletion of an Instagram post from January of this year that encouraged discussion of the solitary confinement in a Turkish prison of Abdullah Öcalan, a founding member of the Kurdistan Workers' Party (PKK).
Due to the PKK's use of violence, both the party and Öcalan are designated as dangerous entities under Facebook's Dangerous Individuals and Organizations policy; the post was therefore deleted, and Facebook rejected its reinstatement.
However, in a resolution published this Thursday, the Advisory Council warned that "by accident" a part of Facebook's dangerous individuals policy, created in 2017, was never transferred to the review system used by its moderators since 2018.
This guideline, which "allows discussion of the conditions of detention of persons designated as dangerous", led to the content being restored on Instagram on April 23.
Likewise, Facebook informed the Advisory Council that it was "working on an update of its policies to allow users to discuss the human rights of designated dangerous individuals". The deleted post did not support Öcalan or the PKK; it only encouraged debate about their human rights in light of their imprisonment.
In its conclusions, the Council stated that Facebook's decision to remove the content "was not consistent" with its own rules, and expressed concern "that Facebook has overlooked a specific guideline on a major policy exception for three years".
The Council considers that, because of Facebook's error, many other posts may have been mistakenly removed since 2018, and that Facebook's transparency reports are not sufficient to assess whether this type of error reflects a systemic problem.
Therefore, the independent experts have asked Facebook to restore the missing 2017 guideline immediately, to evaluate its review processes for dangerous individuals and organizations and publish the results, and to ensure that its moderators receive adequate training.
The Council has also asked Facebook to explain more clearly in its policies when it considers a post to support a dangerous leader, and how users can make their intentions clear.