TikTok is still failing to catch blatant election disinformation ads in Ireland, as voters prepare to head to the polls for the general election
In May 2024, Global Witness investigated TikTok’s ability to detect election disinformation around the EU parliamentary elections.
We found their content moderation to be so poor that we submitted a complaint to the EU regulator asking them to investigate whether the platform breaches the Digital Services Act.
As a result of our investigation, TikTok said they “instituted new practices for moderating ads that may be political in nature to help prevent this type of error from happening in the future”.
Six months later, Ireland is going to the polls again in a snap general election. We tested TikTok’s ability to detect election disinformation again, this time in both English and Irish.
We included Irish because TikTok reported this year that it does not have any content moderators dedicated to this official EU language.
We found significant failings in TikTok’s ability to moderate content in Irish and an improvement in their ability to moderate content in English from our last test.
Neither result is good enough: both show major weaknesses in what should have been a very easy test for them to pass.
TikTok approved more than half the election disinformation ads we submitted to them in Irish, and more than 20% of the election disinformation ads in English.
Tracking TikTok’s ability to detect election disinformation
We developed 14 short pieces of election disinformation content relevant to the upcoming general election in Ireland and translated them into Irish.
We turned each statement into a short video, with black text clearly displayed over a plain coloured background. We used a simple form of “algospeak” in the text, such as substituting zeros and ones for “o”s and “i”s and adding special characters instead of certain letters.
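The kind of simple “algospeak” substitution described above can be sketched in a few lines of Python. Only the zero-for-“o” and one-for-“i” swaps come from our description; the other look-alike mappings (a hypothetical `algospeak` helper with “3” for “e” and “@” for “a”) are illustrative assumptions, not the exact substitutions used in the ads.

```python
# Minimal sketch of a simple "algospeak" character substitution.
# The zero-for-o and one-for-i swaps reflect the method described above;
# the "e" -> "3" and "a" -> "@" mappings are illustrative assumptions.
def algospeak(text: str) -> str:
    table = str.maketrans({"o": "0", "i": "1", "e": "3", "a": "@"})
    return text.translate(table)

print(algospeak("vote online"))  # prints "v0t3 0nl1n3"
```

Substitutions like these leave the text perfectly readable to a human reviewer while evading naive keyword matching, which is why they are such a basic test of a moderation system.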
We submitted all 28 videos (14 in English, 14 in Irish) to TikTok as adverts and allowed the platform at least 48 hours to complete their review process. TikTok states that typically most ads are reviewed within a day.
We deleted the ads before they could go live to ensure they were not shown to anyone using TikTok.
The type of disinformation that we submitted violates TikTok’s community guidelines and stated commitment to protecting the integrity of elections.
Not only does TikTok ban all political ads, they also ban misinformation about electoral processes in organic posts.
TikTok’s weak spot in Ireland
TikTok approved three of the 14 election disinformation ads for publication in both English and Irish. These ads falsely stated that:
- You need to provide proof of two COVID-19 vaccinations to be allowed to vote
- You can vote by post after the polls have closed on the day of the election
- You can cast your vote on Facebook
TikTok also approved a further five of the 14 Irish language election disinformation ads for publication, making a total of eight Irish language ads that were approved.
Platforms sometimes tell us that further checks are carried out once an ad goes live, implying that in practice they would have performed better than our test results indicate.
To test the extent to which this might be true, we carried out a further test.
We submitted to TikTok a political ad in English and Irish stating that the general election would take place on 29 November. This statement is true but nonetheless breaches TikTok’s policies, which ban all political ads, including those referencing an election.
Despite this, TikTok approved the Irish language ad for publication and allowed it to go live. This finding suggests there are also gaps in TikTok’s review process at the point of an ad going live.
During our previous test for the EU parliamentary elections in May, TikTok failed to detect any of the 16 adverts we submitted to them that contained blatant election disinformation.
What TikTok needs to do next to tackle election disinformation
TikTok must ensure that its systems protect human rights and democracy, including by assessing how it manages content on the platform.
Very large online platforms such as TikTok are legally required under the EU’s Digital Services Act to mitigate the risk of election interference.
Indeed, the EU’s guidance to platforms says they should make sure they “are able to react rapidly to manipulation of their service aimed at undermining the electoral process and attempts to use disinformation and information manipulation to suppress voters.”
Yet, our findings come at a time when TikTok is reportedly planning to lay off staff responsible for keeping the platform safe, including at least 125 people who work in the UK.
The test we set for TikTok should have been easy for them to pass as their policies ban all political ads, not just ones containing disinformation.
The disinformation and calls to violence that we included should have been immediately clear to any human reviewer. In real life, disinformation is often much more subtle and well-disguised than this.
To effectively tackle election disinformation, we call on TikTok to:
- Properly resource efforts to uphold election integrity in all the countries they operate in around the world, including paying content moderators a fair wage, allowing them to unionise and providing them with psychological support
- Robustly enforce policies on election-related disinformation, for both organic and purchased content, before, during and after elections
- Publish information on the steps they've taken to ensure election integrity, broken down by country and language
In response to our findings, a TikTok spokesperson confirmed that all of the ads we submitted violated TikTok’s advertising policies and said that they have conducted an investigation into why some of the ads were not rejected.
They said that ads may go through additional stages of review as certain conditions are met, such as reaching certain impression thresholds or being reported by users once the ad has gone live.
They stated that they are “focused on keeping people safe and working to ensure that TikTok is not used to spread harmful misinformation that reduces the integrity of civic processes or institutions” and said that they do this by, among other things, “enforcing robust policies to prevent the spread of harmful misinformation.”
This investigation, however, suggests that they do not enforce their policies well enough.