Tue 22 October 2019:
In the summer of 2019, hundreds of thousands of Sudanese protesters braved live fire from an embattled military regime that had recently imposed itself upon the country.
The military’s use of lethal force and terrorizing violence drew condemnations from across the globe. But as the massacres continued and Khartoum’s streets ran red, a mass outpouring of support for the military regime surfaced on Sudanese social media.
Accounts praising the soldiers for restoring order to the capital and advancing a conspiracy theory that the Muslim Brotherhood was behind the demonstrations saturated Facebook, Twitter, Instagram and Telegram. The posts were then widely reshared by members of the Sudanese military regime.
An investigative team found that the pages and accounts creating the content were fake: orchestrated by a shadowy Egyptian company called New Waves, which was linked to the Egyptian government, and part of a collaborative effort between Egypt and the U.A.E. to further their shared geopolitical interests on social media.
New Waves and its U.A.E. counterpart created over 300 fake pages, reached over 14 million Facebook users and spent almost $200,000 on ads in an effort to garner support for the Sudanese military regime.
It was a disinformation campaign.
And though the military regime in Sudan has ended, the disinformation campaign that temporarily sowed discord in the country highlights a growing problem around the world. Disinformation campaigns are supercharging a brutal crackdown on drug dealers in the Philippines and were among the first tools used to incite a genocide against Muslims in Myanmar.
To understand the anatomy of a disinformation campaign and what can be done to effectively respond to one, Al Bawaba spoke with Ben Decker, a disinformation specialist who was part of the investigation that revealed Egypt’s role in constructing a false narrative in Sudan.
Decker is the founder and CEO of Memetica, a digital investigations consultancy firm, and was previously with Harvard University’s Shorenstein Center, which studies the relationship between media and democracy.
“One of the big things that has really transformed information warfare is the emergence of these hybrid threat actors,” Decker says, “which include state actors, private influence operators, grassroots trolls, and pure rent-seekers who are decentralized and doing things for financial motivations.”
According to Decker, disinformation campaigns can combine several kinds of actors: trolls whose message aligns with a particular regime, for example, give that regime content it can platform in an effort to legitimize its own propaganda.
“It’s very clear once you start to look at the motivations behind these that there is some sort of link between the state and non-state actors,” Decker explains, referring to the links between the Egyptian company New Waves and the Egyptian state itself.
In countries with restrictive media landscapes or “news deserts” that have little domestic media presence, state-backed disinformation campaigns are able to freely operate, offering carefully cultivated messages to give disenchanted locals a reason to support the campaign’s ends.
“People are looking for a framing that fits their worldview and a means of interpreting what’s around them. And clearly there are a lot more à la carte options for them than there ever have been.”
To listen to the full discussion, including an in-depth explanation of how disinformation mutates and spreads, and what can be done to respond, click here:
Courtesy: Al Bawaba