We are submitting complaints to the Equality and Human Rights Commission (EHRC) and the Information Commissioner’s Office (ICO), calling for investigations into whether the algorithms used to promote job ads infringe the Equality Act 2010 and data protection law under the GDPR.
Facebook users in the UK may be excluded from viewing job ads based on protected characteristics, such as gender and age, according to our new investigation. These findings form the basis of submissions to the Equality and Human Rights Commission (EHRC), authored by leading human rights and employment lawyer Schona Jolly QC [1], and to the Information Commissioner’s Office (ICO), authored by data rights specialists AWO Legal [2]. Both submissions call on the regulators to investigate and report on the possibility that the social media giant is:
a) Failing to prevent discriminatory targeting by employers posting job ads; and
b) Discriminating against users through its algorithms, which target ads at certain users based on data the company collects and processes.
As part of the investigation, we found that posts advertising jobs historically considered more suited to one gender than the other, e.g. nursery nurse or mechanic, were viewed highly disproportionately by either women or men. This was not due to manual targeting (i.e. the person submitting the advert opting to show it to one type of user over another), but to the algorithms the social media platform itself applies through its ‘optimisation for ad delivery’ system.
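The mechanism at stake can be illustrated with a minimal toy sketch. This is our illustration only, not Facebook’s actual system, which is not public: if a delivery algorithm hands out impressions in proportion to click-through rates predicted from historically biased data, even a completely untargeted ad ends up concentrated on one gender. All figures in the snippet are hypothetical.

```python
# Toy sketch (hypothetical figures, not Facebook's actual code): how
# engagement-optimised delivery can skew an untargeted ad. If historical
# data says men clicked mechanic ads far more often, serving impressions
# in proportion to predicted click-through rate concentrates the ad on men.
import random

random.seed(0)

# Hypothetical predicted click-through rates learned from biased historical data.
predicted_ctr = {"man": 0.09, "woman": 0.01}

# An untargeted audience that is exactly half men, half women.
audience = ["man"] * 500 + ["woman"] * 500

# Serve 10,000 impressions, each weighted by the viewer's predicted CTR.
impressions = random.choices(
    audience,
    weights=[predicted_ctr[g] for g in audience],
    k=10_000,
)

men_share = impressions.count("man") / len(impressions)
print(f"Share of impressions shown to men: {men_share:.0%}")  # roughly 90%
```

Even though the advertiser targeted no one, the optimiser’s pursuit of predicted clicks reproduces the historical bias in who sees the ad.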
On posting real job adverts in this way, we found that [3]:
- 96% of the people shown an ad for mechanic jobs were men;
- 95% of those shown an ad for nursery nurse jobs were women;
- 75% of those shown an ad for pilot jobs were men;
- 77% of those shown an ad for psychologist jobs were women.
Naomi Hirst, Head of the Digital Threats Campaign at Global Witness, said:
“It’s really shocking that Facebook’s own algorithm appears to target job ads in such a discriminatory way. Targeting adverts for nursery workers at women and mechanic jobs at men – what century does Facebook think we’re living in?!”
We first began looking at the possibility of discriminatory targeting of job ads in April 2019, when data from the Facebook Ad Library showed that only 3% of those who had viewed an advert for job openings at Facebook itself were 55 years or older, despite nearly 20% of Facebook users in the UK being in this age bracket at the time. Further investigation showed that 62% of those who were most likely to have seen the ad were male, and that it was most frequently shown to men between 25 and 34 years of age.
Ms Hirst continued:
"The strapline for Facebook's workplace culture is 'Move Fast. Be Bold. Be Yourself.' But clearly that only applies if 'being yourself' means being a man aged between 25 and 34, as that's the person most likely to have been shown job ads from the social media company. This seems very unfair to over 40% of Facebook users who are women and the majority who fall outside that age bracket.
“Equalities legislation is designed to try to prevent this kind of discrimination and ensure more opportunities are open to all.”
We also found that Facebook approved job ads that deliberately excluded women, or those over the age of 55, from seeing them. We pulled the ads from Facebook before they were published, so no discriminatory adverts actually ran. Although we were asked to confirm compliance with Facebook’s non-discrimination policy, it appears no checks were in place to prevent discriminatory targeting choices.
This is particularly striking given that civil rights organisations sued Facebook in the US over this issue and that, as part of the settlement of five of those lawsuits, Facebook now prevents advertisers from targeting housing, employment and credit offers to people by age, gender or ZIP code - but only in the US and, later, Canada.
Ms Hirst concluded:
“It seems clear that something is going badly wrong in the way Facebook is both allowing job ads to be targeted through their lax oversight, and in the way their algorithms are working to channel this content at certain users, based on their gender and age. In pursuit of clicks Facebook appears to be replicating the biases we see in the world, riding roughshod over legislation designed to protect users from discrimination.
“The algorithms that dictate the content we see on our Facebook feeds affect billions of people’s lives but remain far too opaque. We need much better regulation of the way they are used, including a legal requirement for tech companies to assess the risk that their algorithms discriminate and for these assessments to be made public with mitigation efforts overseen by an independent regulator.”
In her assessment of this evidence, Schona Jolly QC, who authored our submission to the UK Equality and Human Rights Commission, said, “Facebook’s system itself may, and does appear to, lead to discriminatory outcomes.”
She added that “the facts as found and collated by Global Witness…give rise to a strong suspicion that Facebook has acted, and continues to act, in violation of the Equality Act 2010”.
This is not an issue exclusive to the UK. AlgorithmWatch and academics have shown that Facebook’s ad delivery algorithm is highly discriminatory in delivering job ads in France, Germany, Switzerland and the US.
A Facebook spokesperson said: “Our system takes into account different kinds of information to try and serve people ads they will be most interested in, and we are reviewing the findings within this report. We’ve been exploring expanding limitations on targeting options for job, housing and credit ads to other regions beyond the US and Canada, and plan to have an update in the coming weeks.”
We are calling for:
- The UK Equality and Human Rights Commission to investigate whether Facebook’s targeting and ad delivery practices breach the Equality Act 2010.
- The UK Information Commissioner’s Office to investigate whether Facebook’s ad delivery practices breach the GDPR.
- Governments to require tech companies that use algorithms with the potential to discriminate against users to assess and mitigate those risks until they are negligible. The risk assessments should be made public, and mitigation efforts overseen by an independent regulator with the power to conduct its own audits.
- Governments to require tech companies to make the criteria used to target online ads transparent, at the same level of detail that the advertisers themselves use.
- Facebook to roll out the changes it has made to housing, job and credit ads in the US and Canada to the rest of the world. While these changes are not sufficient to address any discrimination caused by Facebook’s own algorithms, they are a start in addressing discrimination by advertisers.