Posts from bot-like accounts spreading disinformation and hate viewed more than 150 million times ahead of the UK election

Published:

The conversation on X ahead of the UK election is being influenced by accounts that appear to be bots, a new investigation by Global Witness has found. 

Ten accounts have shared more than 60,000 tweets that have been viewed an estimated 150 million or more times in the last few weeks. Many of these tweets contain extreme and violent hate speech, disinformation and conspiracy theories, and praise Putin.

The bot-like accounts have an outsized influence given how prolifically they tweet, and we found them within a very small sample of accounts. 

Tweets amplified by the bot-like accounts have spread virulent Islamophobia, antisemitism, homophobia and transphobia, and claimed that climate change is a “hoax”, that vaccines have created a “genocide” and that President Putin is “the greatest president ever.”  

All of the bot-like accounts we found have had days when they’ve shared more than 200 tweets, and four of them have had days when they’ve shared an extraordinary 500+ tweets. Together, they have spread more than 60,000 tweets since the election was called.

Ava Lee, Campaign Leader at Global Witness, said:

“Political discussion online is often toxic – we all know that. But when we go on social media, we believe we’re seeing what real people think. While we might not agree with it, we trust that what we see are genuine views held by other voters. When that’s not true, when the conversation may have been influenced by someone who has paid for bots to spread division or to get a particular party into power, our democracy is in jeopardy.

“The UK is going to the polls in under a week. The US in four months. Half the world’s population this year. X, and all social media companies, need to clean up their platforms and put our democracies before profit.”

We wrote to X to give them the opportunity to comment on these findings, but they did not respond.

/ ENDS

Notes to editor:

[1] Bot accounts often post enormous numbers of tweets per day, rarely write any of their own content, have handles that end in a long string of numbers, do not have a profile picture that purports to be of the person running the account, and do not have many followers. We identified accounts as bot-like if they showed at least three of these red flags, one of which had to be prolific tweeting.

[2] We found the bot-like accounts by searching among hashtags on migration and climate change. The hashtags we used deliberately covered a wide spectrum of views, from #welcomerefugees through #migration to #stoptheboats, and from #climatecrisis to #netzero to #endnetzero. Because bots are often set up to work in a coordinated way, we also looked at the top political hashtag used by a shortlist of potential bot-like accounts that we had drawn up early in the investigation: #labourlosing.

[3] Two of the three accounts we found using #stoptheboats encourage people to vote for Reform UK. The one bot-like account we found using #climatecrisis encouraged people not to vote Conservative by, for example, including the hashtag #GetTheToriesOut in their account bio. All five of the accounts using #labourlosing promoted Reform UK.

[4] The red flags we used to detect accounts that look like they might be a bot are:

  • The account has tweeted more than 200 times a day in the last year
  • The account has tweeted more than 60 times a day on average over the lifetime of the account
  • The account retweets other accounts’ tweets more than 90% of the time
  • The account’s handle ends in a long string of apparently random numbers, indicating that the account holder used the default account name provided by X instead of creating their own unique account name
  • The account has no profile picture that appears to be of the person running it, or its profile picture shows signs of being generated by an AI tool or having been stolen from elsewhere on the web
  • The account has fewer than 1,000 followers
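
As a rough illustration, the detection rule described in notes [1] and [4] can be sketched as a simple classifier. This is not Global Witness's actual tooling; the `Account` fields below are hypothetical names for the observed behaviours, and the thresholds are taken directly from the red-flag list above:

```python
from dataclasses import dataclass

@dataclass
class Account:
    # Hypothetical fields summarising an account's observable behaviour
    max_tweets_in_a_day: int      # highest single-day tweet count in the last year
    avg_tweets_per_day: float     # average over the account's lifetime
    retweet_share: float          # fraction of posts that are retweets (0.0-1.0)
    handle_ends_in_numbers: bool  # default X handle ending in random digits
    suspect_profile_picture: bool # missing, AI-generated or stolen picture
    followers: int

# The two flags that count as "prolific tweeting"
PROLIFIC = {"200+ tweets in a day", "60+ tweets/day on average"}

def red_flags(a: Account) -> list[str]:
    """Return the red flags from note [4] that this account triggers."""
    flags = []
    if a.max_tweets_in_a_day > 200:
        flags.append("200+ tweets in a day")
    if a.avg_tweets_per_day > 60:
        flags.append("60+ tweets/day on average")
    if a.retweet_share > 0.9:
        flags.append("retweets >90% of the time")
    if a.handle_ends_in_numbers:
        flags.append("handle ends in random numbers")
    if a.suspect_profile_picture:
        flags.append("suspect profile picture")
    if a.followers < 1000:
        flags.append("fewer than 1,000 followers")
    return flags

def looks_bot_like(a: Account) -> bool:
    """Note [1]: at least three red flags, one of which is prolific tweeting."""
    flags = red_flags(a)
    return len(flags) >= 3 and any(f in PROLIFIC for f in flags)
```

Under this rule, a heavily retweeting account with a default numeric handle but a normal posting volume would not qualify, because none of its flags concern prolific tweeting.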

[5] In addition to the hashtags above, we searched #migrantcrisis, #smallboatscrisis, #ltn and #climatescam; none of those turned up examples of accounts with three or more red flags for bot-like activity.

[6] Nine of the bot-like accounts we found focus on UK politics; one focuses on US politics and the UK royal family but was uncovered through our searches of accounts using UK-related hashtags.

[7] One of the accounts we investigated had MBGA (Make Britain Great Again) alongside MAGA (Make America Great Again) and anti-migrant hashtags in its bio. On occasion it has posted nearly 600 tweets in a day, and it has increased its posting rate by more than 50% since the election was called. It reposts anti-migrant, transphobic and far-right content and supports Reform UK. Another account described itself in its bio as being pro-Palestine, the environment and the NHS and is focused on stopping votes for the Conservatives. It regularly posts anti-Farage memes.

[8] There is no evidence that any UK political party is paying for, using or promoting bots as part of their election campaigns.