Artificial Intelligence and Inherited Prejudice

Artificial Intelligence (AI) has long promised to be the industry-defining tool in recruitment. It can automate tedious and time-consuming administrative tasks, monitor the training of new employees and assist with employee networking.

One of the biggest and most discussed promises of using AI in recruitment is that it will not only reduce but eradicate unconscious bias in hiring processes. AI is said to be able to ‘diversify the workplace’ if programmed correctly, selling the idea of an inclusive utopia where no one has to question their own biases, because a machine does it for them.

A worthy discussion is whether streamlining recruitment through this technology will ultimately fail if the unconscious biases of AI programmers are left unaddressed. If the programmer is prejudiced, the algorithm used to assess candidates will more than likely learn that same prejudice.

In 2017, Amazon introduced an AI system in the hope of making more effective hiring decisions. Though the technology had been in development since 2014, its designers failed to account for the lack of context AI considers when it is learning.

For example, individuals who have addressed their unconscious bias may look at the past ten years of Amazon’s hiring trends and see that most hires have been white men. They may attribute this trend to men’s disproportionate participation in the tech industry, the unconscious bias of previous hiring managers, a lack of flexible working options, or bias in job advertisements, which are sometimes written to appeal to men.

AI technology, however, does not see the all-male trend in context. Instead, the biased decisions of the past caused the machine to learn to associate male resumes, and particularly those using traditionally ‘masculine’ language, as superior.
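
To see how this can happen, consider the minimal sketch below. It is not Amazon’s actual system; the four toy resumes, the hiring labels and the resulting word weights are all invented for illustration. A standard text classifier (here, scikit-learn’s logistic regression) trained on biased historical outcomes simply learns whatever patterns separate past hires from past rejections:

```python
# Minimal sketch (not Amazon's system): a toy classifier trained on
# historically biased hiring outcomes. The resumes and labels below
# are invented for illustration only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Historical data: past hires (label 1) skew toward traditionally
# 'masculine' language; past rejections (label 0) happen to contain
# the word "women's" (e.g. "women's chess club captain").
resumes = [
    "executed aggressive growth strategy, captain of rugby team",
    "led competitive sales team, dominated quarterly targets",
    "women's chess club captain, collaborative project support",
    "supported community outreach, women's coding society member",
]
hired = [1, 1, 0, 0]  # biased historical outcomes

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect what the model learned: words correlated with past (biased)
# hires get positive weights, while the token "women" (how the
# vectorizer tokenises "women's") gets a negative weight.
for word, weight in zip(vectorizer.get_feature_names_out(),
                        model.coef_[0]):
    print(f"{word:15s} {weight:+.2f}")
```

On data like this, the model reproduces the bias baked into its training labels rather than correcting it.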

We often make the mistake of assuming computers have the ability to be objective. This can partly be attributed to men being the typical designers of this technology, and to the fact that men’s experiences are generally considered to be universal. Instead of learning objectivity, Amazon’s AI learned the prejudice that we failed to account for in the years before, and then perpetuated it. Had this not been identified, and the technology mainstreamed, women in the workplace may have seen years of hard work undone.

It has been argued that if AI can be programmed to be prejudiced, it can also be programmed to be fair and ethical. Suggestions have been put forward that auditing the algorithms AI produces can mitigate bias. There has also been a call to arms for diversity in AI and tech, in order to mitigate the blind spots of an otherwise homogenous industry.
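
One widely used audit, sketched below with invented data, compares a model’s selection rates across groups. The ‘four-fifths rule’ from US employment guidelines flags possible adverse impact when one group’s selection rate falls below 80% of another’s; the decision lists here are hypothetical and illustrative only:

```python
# Minimal sketch of one kind of bias audit: comparing a model's
# selection rates across groups. The decisions below are invented;
# a real audit would run over actual model outputs.

def selection_rate(decisions):
    """Fraction of candidates the model selected (1 = selected)."""
    return sum(decisions) / len(decisions)

# Hypothetical model decisions for two applicant groups.
decisions_men = [1, 1, 0, 1, 1, 0, 1, 1, 0, 1]    # 70% selected
decisions_women = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]  # 30% selected

rate_men = selection_rate(decisions_men)
rate_women = selection_rate(decisions_women)

# The 'four-fifths rule' flags adverse impact when the disadvantaged
# group's selection rate is below 80% of the advantaged group's.
impact_ratio = rate_women / rate_men
print(f"selection rate (men):   {rate_men:.0%}")
print(f"selection rate (women): {rate_women:.0%}")
verdict = "adverse impact flagged" if impact_ratio < 0.8 else "passes"
print(f"impact ratio: {impact_ratio:.2f} -> {verdict}")
```

An audit like this only detects disparity after the fact; it does not by itself explain or remove the bias the model has learned.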

The need to mitigate algorithmic bias by increasing diversity in the AI industry is clear, but how deep do these blind spots run when gender bias is embedded in language itself?

For example, in much literature, female characters are written as submissive, sexualised and financially dependent. Women are also referred to as ‘Miss’ or ‘Mrs’, which suggests they are defined by their relationship to men. This may seem inconsequential in any average person’s life; however, text like this forms part of the data AI learns from, and therefore influences decisions AI may make concerning recruitment.
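
To make this concrete, the sketch below shows one standard way researchers measure such bias in word embeddings (as in Bolukbasi et al., 2016): projecting occupation words onto a ‘he’–‘she’ direction. The tiny hand-made vectors are invented stand-ins for real pretrained embeddings, and the scores are illustrative only:

```python
# Minimal sketch of how gender bias surfaces in word embeddings.
# Real studies use large pretrained vectors; the tiny 2-D vectors
# below are invented to illustrate the measurement technique.
import numpy as np

# Toy embedding: dimension 0 loosely tracks a 'gender' direction.
vectors = {
    "he":       np.array([ 1.0, 0.1]),
    "she":      np.array([-1.0, 0.1]),
    "engineer": np.array([ 0.7, 0.9]),  # skews toward 'he' in biased text
    "nurse":    np.array([-0.7, 0.9]),  # skews toward 'she'
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Project occupation words onto the he-she direction. A score far
# from zero means the text the model learned from has taught it a
# gendered association for that occupation.
gender_direction = vectors["he"] - vectors["she"]
for word in ("engineer", "nurse"):
    score = cosine(vectors[word], gender_direction)
    print(f"{word:9s} gender score: {score:+.2f}")
```

A recruitment model built on embeddings like these inherits those associations unless they are deliberately measured and removed.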

Whether you’re adopting AI or not, unconscious bias has a unique, often detrimental and unfair effect on the lives of women. Not only this, but organisations are likely to suffer when they are missing the perspective of half the population. Discovering and addressing implicit associations is essential for recruiters and programmers alike, and the future of AI will remain unstable until organisations and leaders can unpack and address their own unconscious biases.

Cognicity have developed an implicit association test that you can currently trial for free for up to five employees to see where you sit. Contact us today to get started on your organisation’s journey to being adaptive, diverse and inclusive, or read more about what we offer.


Article written by Cognicity.

Sign up to our mailing list for blog and newsletter updates: