This activity has become professionalised, with private firms offering disinformation-for-hire services. Credit: Shutterstock

Social media manipulation by political actors an industrial-scale problem - Oxford report

Social media manipulation of public opinion is a growing threat to democracies around the world, according to the 2020 media manipulation survey from the Oxford Internet Institute, which found evidence of it in each of the 81 countries surveyed.

Organised social media manipulation campaigns were found in each of the 81 surveyed countries, up 15% in one year, from 70 countries in 2019. Governments, public relations firms and political parties are producing misinformation on an industrial scale, according to the report.  It shows disinformation has become a common strategy, with more than 93% of the countries (76 out of 81) seeing disinformation deployed as part of political communication. 
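The headline percentages follow directly from the counts above; the short sketch below (in Python, purely for illustration) reproduces the arithmetic using only the figures reported in the survey.

```python
# Reproduce the percentages quoted above from the reported country counts.
countries_2020 = 81  # countries with organised manipulation campaigns (2020)
countries_2019 = 70  # countries with such campaigns (2019)
with_disinfo = 76    # countries where disinformation featured in political communication

growth = (countries_2020 - countries_2019) / countries_2019 * 100
share = with_disinfo / countries_2020 * 100

print(f"Year-on-year growth: {growth:.1f}%")            # ~15.7%, reported as 'up 15%'
print(f"Share deploying disinformation: {share:.1f}%")  # ~93.8%, reported as 'more than 93%'
```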

Professor Philip Howard, Director of the Oxford Internet Institute and co-author of the report, says, ‘Our report shows misinformation has become more professionalised and is now produced on an industrial scale. Now, more than ever, the public needs to be able to rely on trustworthy information about government policy and activity. Social media companies need to raise their game by increasing their efforts to flag misinformation and close fake accounts without the need for government intervention, so the public has access to high-quality information.’

The OII team warns that the level of social media manipulation has soared, with governments and political parties spending millions on private-sector ‘cyber troops’ who drown out other voices on social media. Citizen influencers, including volunteers, youth groups and civil society organisations that support their ideologies, are also used to spread manipulated messages.

OII alumna Dr Samantha Bradshaw, the report’s lead author, says, ‘Our 2020 report highlights the way in which government agencies, political parties and private firms continue to use social media to spread political propaganda, polluting the digital information ecosystem and suppressing freedom of speech and freedom of the press. A large part of this activity has become professionalised, with private firms offering disinformation-for-hire services.’

Key findings the OII researchers identified include:

  • Private ‘strategic communications’ firms are playing an increasing role in spreading computational propaganda, with researchers identifying state actors working with such firms in 48 countries.
  • Almost $60 million has been spent on firms that use bots and other amplification strategies to create the impression of trending political messaging.
  • Social media has become a major battleground, with firms such as Facebook and Twitter taking steps to combat ‘cyber troops’. Some $10 million has been spent on social media political advertisements, and the platforms removed more than 317,000 accounts and pages from ‘cyber troop’ actors between January 2019 and November 2020.

Cyber troops are frequently directly linked to state agencies. According to the report, ‘In 62 countries, we found evidence of a government agency using computational propaganda to shape public attitudes.’

Established political parties were also found to be using social media to ‘spread disinformation, suppress political participation, and undermine oppositional parties’, say the Oxford researchers.  

According to the report, ‘In 61 countries, we found evidence of political parties or politicians running for office who have used the tools and techniques of computational propaganda as part of their political campaigns. Indeed, social media has become a critical component of digital campaigning.’

Dr Bradshaw adds, ‘Cyber troop activity can look different in democracies compared to authoritarian regimes. Electoral authorities need to consider the broader ecosystem of disinformation and computational propaganda, including private firms and paid influencers, who are increasingly prominent actors in this space.’

The report explores the tools and techniques of computational propaganda, including the use of fake accounts – bots, humans and hacked accounts – to spread disinformation. It finds:

  • 79 countries used human accounts,
  • 57 countries used bot accounts, and
  • 14 countries used hacked or stolen accounts.

Researchers examined how cyber troops use different communication strategies to manipulate public opinion, such as creating disinformation or manipulated media, data-driven targeting and employing abusive strategies such as mounting smear campaigns or online harassment. The report finds:

  • 76 countries used disinformation and media manipulation as part of their campaigns,
  • 30 countries used data-driven strategies to target specific users with political advertisements,
  • 59 countries used state-sponsored trolls to attack political opponents or activists in 2020, up from 47 countries in 2019.

The 2020 report draws on a four-step methodology that the Oxford researchers used to identify evidence of organised manipulation campaigns worldwide: a systematic content analysis of news articles on cyber troop activity, a secondary literature review of public archives and scientific reports, country-specific case studies, and expert consultations.
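The first of those steps, systematic content analysis, lends itself to a simple illustration. The sketch below is not the OII team’s instrument; it is a minimal, hypothetical example of keyword-based coding of news articles by country, with made-up keywords and sample text, intended only to make the general approach concrete.

```python
# Illustrative only: a toy version of the first step (systematic content
# analysis of news coverage), coding articles for indicators of cyber troop
# activity by country. The keywords and sample articles are hypothetical
# and are not taken from the OII report or its coding scheme.
from collections import defaultdict

CYBER_TROOP_KEYWORDS = {
    "troll farm",
    "bot network",
    "fake accounts",
    "disinformation campaign",
    "paid influencers",
}

def code_article(text):
    """Return the set of cyber troop indicators mentioned in one article."""
    lowered = text.lower()
    return {kw for kw in CYBER_TROOP_KEYWORDS if kw in lowered}

def tally_evidence(articles):
    """Aggregate indicators per country over (country, article_text) pairs."""
    evidence = defaultdict(set)
    for country, text in articles:
        evidence[country] |= code_article(text)
    return evidence

# Hypothetical mini-corpus for demonstration
sample = [
    ("Country A", "Officials deny running a troll farm ahead of the election."),
    ("Country B", "Researchers traced the fake accounts to a local PR firm."),
]
print(dict(tally_evidence(sample)))
```

In practice, a coding pass like this would only flag candidate coverage; the report’s other steps (literature review, case studies and expert consultation) are what turn such signals into evidence.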

The research work was carried out by Oxford researchers between 2019 and 2020. Computational Propaganda project research studies are published at https://comprop.oii.ox.ac.uk/publications/