Governments around the world are enlisting “cyber troops” who manipulate Facebook, Twitter and other social media outlets to steer public opinion, spread misinformation and undermine critics, according to a new report from the University of Oxford.
Adding to growing evidence of government-sponsored efforts to use online tools to influence politics, researchers found 29 countries using social media to shape opinion either domestically or among foreign audiences. The tactics are deployed not only by authoritarian regimes but also by democratically elected governments, the authors said.
“Social media makes propaganda campaigns much stronger and potentially more effective than in the past,” said Samantha Bradshaw, the report’s lead author and a researcher at Oxford’s Computational Propaganda Research Project. “I don’t think people realize how much governments are using these tools to reach them. It’s a lot more hidden.”
Online behavior of the government-backed groups varies widely, from commenting on Facebook and Twitter posts to targeting people individually. Journalists are harassed by government groups in Mexico and Russia, while cyber troops in Saudi Arabia flood negative Twitter posts about the regime with unrelated content and hashtags to make the offending posts harder to find. In the Czech Republic, the government is more likely to post a fact-check response to content it sees as inaccurate, the report said.
Governments also use fake accounts to mask where the material is coming from. In Serbia, fake accounts are used to promote the government’s agenda, and bloggers in Vietnam spread favorable information.
Meanwhile, government actors in Argentina, Mexico, the Philippines, Russia, Turkey, Venezuela and elsewhere use automation software – known as “bots” – to spread social media posts in ways that mimic human users.
“Cyber troops are a pervasive and global phenomenon,” said the report, published by the Oxford group, which studies how digital tools are used to manipulate public opinion.
Propaganda has long been a dark art used by governments, but digital tools are making the techniques more sophisticated, according to Bradshaw. She said governments over the past several years have taken note of the way activists have used social media to spread a message and build support, and are adopting some of the same methods. Online tools such as data-analytics software allow governments to more effectively tailor a message for specific groups of people, maximising its impact.
Bradshaw said that while Russia and authoritarian regimes get most of the attention for manipulating social media, Western democracies have been using similar techniques. In the UK, the British Army created the 77th Brigade in 2015, in part for psychological operations using social media. Bradshaw said democratic governments aren’t forthcoming about their digital propaganda efforts.
“They are using the same tools and techniques as the authoritarian regimes,” she said. “Maybe the motivations are different, but it’s hard to tell without the transparency.”

Following the US election, Facebook and Twitter have been criticised for not doing enough to filter out fake news and offensive content. Facebook, which had no immediate comment on the report, has hired more human curators and partnered with fact-check organisations in an attempt to keep misinformation out of people’s feeds.
Twitter spokesman Ian Plunkett referred to a June blog post that said the company “should not be the arbiter of truth,” and that others on the site do a better job of highlighting wrongdoing. The company has taken steps to crack down on the use of bots. Bradshaw said there isn’t an easy solution when balancing the benefits of sharing information across the Internet against the problems of spreading propaganda. She said one improvement would be tools that make it clearer when a government is involved. “There’s a fine line,” she said, “between free speech and censorship.”