In Bulgaria, Russian Trolls Are Winning the Information War


A former Meta employee who worked on its content moderation systems and policy, and who spoke to WIRED on the condition of anonymity, says, however, that mass reporting could at least get certain pieces of content or accounts flagged for review. And the more frequently a certain type of content is flagged, the more likely the algorithm is to flag it in the future. However, for languages like Bulgarian, where there is less material to train the algorithm and AI might be less accurate, the former employee says it is more likely that a human moderator would make the final call about whether or not to remove a piece of content.

Meta spokesperson Ben Walters told WIRED that Meta does not remove content based on the number of reports. “If a piece of content does not violate our Community Standards, no matter how high the number of reports is, it won’t lead to content removal,” he says. 

Some moderation issues could be the result of human error. “There are going to be error rates, there are going to be things that get taken down that Meta did not mean to take down. This happens,” the former employee says. And these errors are even more likely in non-English languages. Content moderators are often given only seconds to review posts before deciding whether they stay online, a speed that is itself a metric by which their job performance is measured.

There is also a real possibility of bias among human moderators. “The majority of the population actually supports Russia even after the war in Ukraine,” says Galev. He says it’s not unreasonable to think that some moderators might also hold these views, particularly in a country with limited independent media.

“There’s a lack of transparency around who is deciding, who is making the decision,” says Ivan Radev, a board member of the Association of European Journalists Bulgaria, a nonprofit, which put out a statement condemning Bird.bg’s posting of employee information. “This sentiment is feeding dissatisfaction in Bulgaria.” This opacity can breed confusion.

The imbalance between the ability of coordinated campaigns to get content flagged, and that of individuals or small civil society organizations, whose reports go to human moderators, has helped to create an impression in Bulgaria that Meta is prioritizing pro-Russian content over pro-Ukrainian content.

Just over half of Bulgaria’s 6.87 million people use Facebook, which is the dominant social platform in the country. Bulgaria has long been a target of Russian trolls and pro-Russian propaganda, particularly since the beginning of the war in Ukraine. Both sympathetic local media and Russian disinformation operations have pushed a pro-Russia narrative, blaming the conflict on NATO.

Ezekiev, the BOEC member, told WIRED that he was never given an explanation for why his content was removed or how the decision was made. “If you raise your voice against propaganda and say something about the war in Ukraine, your account can be suspended,” he says. Meta’s own lack of transparency about its moderation processes, says Ezekiev, makes the entire situation murkier.

It is this frustration that drove BOEC to protest at Telus’ Sofia office, and led to employees—themselves largely powerless—being doxed and harassed, though there is no evidence that any Telus moderator deviated from Meta’s own instructions.

In February, Bulgarian media reported that Telus would be closing its operations in the country and moving the work to Germany. “As part of a consolidation of operations, the work Telus International does for Meta in Sofia will be moving to another of our sites,” says Telus spokesperson Michelle Brodovich. “Telus International continues to work successfully with Meta, ensuring the highest level of professional standards.” The company did not address whether or not the inquiries into its work in Bulgaria contributed to this decision.
