The Computational Propaganda Research Project (COMPROP) investigates the interaction of algorithms, automation and politics. This work includes analysis of how tools like social media bots are used to manipulate public opinion by amplifying or repressing political content, disinformation, hate speech, junk or fake news.
In their most recent report, COMPROP have identified how organisations, often with public money, have created a system to help ‘define and manage what is in the best interest of the public.’
COMPROP have compared such organisations across 28 countries, created an inventory system and logged the kinds of messages, valences (positive or negative messaging) and communication strategies used. They have also catalogued organisational forms and evaluated their capacities in terms of budgets and staffing.
This article focuses on the use of cyber troops.
Cyber troops are identified as government, military or political party teams committed to manipulating public opinion over social media. The report finds that the use of cyber troops is now a pervasive and global phenomenon. Many different countries employ significant numbers of people and resources to manage and manipulate public opinion online, sometimes targeting domestic audiences and sometimes targeting foreign publics.
The basic findings include:
The earliest reports of organised social media manipulation emerged in 2010, and by 2017 there were details on such organisations in 28 countries, including the US and UK.
Looking across the 28 countries, every authoritarian regime has social media campaigns targeting their own populations, while only a few of them target foreign publics. In contrast, almost every democracy in this sample has organised social media campaigns that target foreign publics, while political-party-supported campaigns target domestic voters.
Authoritarian regimes are not the only actors, or even the most adept, at organised social media manipulation. The earliest reports of government involvement in nudging public opinion involve democracies, and new innovations in political communication technologies often come from political parties and arise during high-profile elections.
Over time, the primary mode for organising cyber troops has shifted from military units that experiment with manipulating public opinion over social media networks to strategic communication firms that take contracts from governments for social media campaigns.
The report mentions that “In January 2015, the British Army announced that its 77th Brigade would ‘focus on non-lethal psychological operations using social networks like Facebook and Twitter to fight enemies by gaining control of the narrative in the information age’.
The primary task of this unit is to shape public behaviour through the use of ‘dynamic narratives’ to combat the political propaganda disseminated by terrorist organisations.
The United Kingdom is not alone in allocating troops and funding for influencing online political discourse. Instead, this is part of a larger phenomenon whereby governments are turning to Internet platforms to exert influence over information flows and communication channels to shape public opinion.”
Of particular concern in the report is that cyber troops use a variety of strategies, tools and techniques for social media manipulation. Generally speaking, teams have an overarching communications strategy that involves creating official government applications, websites or platforms for disseminating content; using accounts—either real, fake or automated—to interact with users on social media; or creating substantive content such as images, videos or blog posts.
These teams engage in sending pro‐government, positive or nationalistic messages when engaging with the public online. Other teams will harass, troll or threaten users who express dissenting positions.
Another, more common form of individual targeting is harassment. This generally involves verbal abuse, hate speech, discrimination and/or trolling directed at the values, beliefs or identity of a user or a group of users online. Some governments use this type of harassment during important political events, particularly elections.
In addition to official government accounts, many cyber troop teams run fake accounts to mask their identity and interests. This phenomenon has sometimes been referred to as “astroturfing”, whereby the identity of a sponsor or organisation is made to appear as grassroots activism (Howard, 2003). In many cases, these fake accounts are “bots”—or bits of code designed to interact with and mimic human users.
According to media reports, bots have been deployed by government actors in Argentina (Rueda, 2012), Azerbaijan (Geybulla, 2016), Iran (BBC News, 2016), Mexico (O’Carrol, 2017), the Philippines (Williams, 2017), Russia (Duncan, 2016), South Korea (Sang-Hun, 2013), Syria (York, 2011), Turkey (Shearlaw, 2016) and Venezuela (VOA News, 2015).
These bots are often used to flood social media networks with spam and fake news. They can also amplify marginal voices and ideas by inflating the number of likes, shares and retweets they receive, creating an artificial sense of popularity, momentum or relevance.
Some cyber troop teams create content to spread certain political messages. This content creation amounts to more than just a comment on a blog or social media feed, but instead includes the creation of content such as blog posts, YouTube videos, fake news stories, pictures or memes that help promote the government’s political agenda. In the United Kingdom, cyber troops have been known to create and upload YouTube videos that “contain persuasive messages” under online aliases (Benedictus, 2016).
Government‐based cyber troops are public servants tasked with influencing public opinion. These individuals are directly employed by the state as civil servants, and often form a small part of a larger government administration. The report finds that “cyber troops can be found across a variety of government ministries and functions.” GCHQ is one such department.
During its 2013 campaign, the Australian Coalition Party used fake social media accounts to artificially inflate the number of followers, likes, shares or retweets its candidates received, creating a false sense of popularity.
In Israel, the government actively works with student volunteers from Jewish organisations or other pro-Israel groups around the world (Stern-Hoffman, 2013). In many cases, these top-performing volunteers are awarded scholarships for their work (Stern-Hoffman, 2013).
The report concludes:
“There is no doubt that individual social media users can spread hate speech, troll other users, or set up automated political communication campaigns. Unfortunately, this is also an organised phenomenon, with major governments and political parties dedicating significant resources towards the use of social media for public opinion manipulation.”
“I don’t think people realise how much governments are using these tools to reach them. It’s a lot more hidden,” Samantha Bradshaw, the report’s lead author, told Bloomberg, noting the prominence of social media manipulation among democratic governments.
“They are using the same tools and techniques as the authoritarian regimes,” Bradshaw said. “Maybe the motivations are different, but it’s hard to tell without the transparency.”
In the meantime, it should not be forgotten that while governments around the world, including Britain, are actively engaging in online public manipulation, Theresa May, the prime minister, has already asked governments to unite to regulate what tech companies like Google, Facebook and Twitter allow to be posted on their networks. The EU has already clamped down, amid claims that it is effectively shutting down free speech rather than curtailing hate speech, whilst itself engaging in exactly that: hate speech.
Whilst you might expect some governments around the world, such as Azerbaijan, China, Israel and North Korea, to be deploying cyber troops to manipulate public opinion, you would not expect western democracies such as the USA, UK or Germany to be doing so. But then again, these very same countries have built massive 360-degree mass surveillance systems without any public debate at all.
By TruePublica
The 4th Media