Meta is engaging in prohibited discrimination based on gender, according to a ruling from the Netherlands Institute for Human Rights announced today. The independent Dutch human rights body found that Meta Platforms Ireland Ltd., which is responsible for providing Facebook advertisements in Europe, is not fulfilling its duty of care to its Dutch users because it employs an advertising algorithm with a discriminatory effect.
The ruling responds to a complaint filed by women’s rights organisation Bureau Clara Wichmann in collaboration with the investigative campaigning organisation Global Witness. Investigations in 2022 and 2023 found a clear gender bias in Meta’s algorithm, which decides which users are shown job adverts. In the Netherlands, adverts for certain jobs, such as electrician, hairdresser, mechanic and primary school teacher, were shown overwhelmingly to either men or women: an advert for a receptionist position was shown to female Facebook users in 96% of cases in 2022 and 97% in 2023, while an advert for a mechanic position was shown to male users in 96% of cases in both years. This unjustifiably perpetuates and reinforces existing inequalities and stereotypes.
Berty Bannor, staff member at Bureau Clara Wichmann, said: “Today is a great day for Dutch Facebook users, who have an accessible mechanism to hold multinational tech companies such as Meta accountable and ensure the rights they enjoy offline are upheld in the digital space.”
Rosie Sharpe, Senior Campaigner on Digital Threats at Global Witness, said: “This ruling marks an important step towards holding Big Tech accountable for how they design their services and the discriminatory impact their algorithms can have on people. We hope this ruling can be used as a springboard for further action, in Europe and beyond.”
Thomas van der Sommen, lawyer at Prakken d’Oliveira representing the organisations, added: “Meta has failed to convince the Institute that its advertising algorithm does not discriminate based on gender. Its algorithms harm women in particular and this needs to stop. This decision sets a clear and convincing precedent for how algorithms are regulated by anti-discrimination laws.”
Algorithm engages in prohibited discrimination based on gender
This is likely the first time an official European human rights body has ruled that a social media platform’s algorithm discriminates by gender. The ruling comes shortly after Meta’s announcement that it would end its internal Diversity, Equity and Inclusion program and relax its rules on what counts as hate speech and discrimination on its platform, including speech relating to gender. It follows an earlier case in the US in which Meta agreed to change the way its algorithms work. Together with Fondation Des Femmes, Global Witness has also filed a similar complaint with the French equalities regulator.
Under existing anti-discrimination legislation in the Netherlands and Europe, it is unlawful to make a distinction based on gender when displaying job adverts. The prohibition is intended to prevent gender inequality and gender discrimination in the labour market from being perpetuated.
The Institute ruled that Meta has a duty to monitor the degree to which its algorithm promotes stereotypes and to take measures to neutralise this effect. Meta failed to demonstrate that it is fulfilling this duty of care, and must now change its advertising algorithm to prevent further discrimination.
Bureau Clara Wichmann and Global Witness are considering possible next steps. They are also open to a constructive dialogue with Meta about creating a better and fairer system.