Ditch the term unconscious bias and use technology to combat the ‘bugs in our brain’

Most bias is not invisible and we all have it, argues psychologist Dr Julia Shaw. By understanding this and using technology where relevant, our organisations can be the best they can be

Sian Harrington

Dr Julia Shaw on bias at work

Our brain is like an operating system that is full of bugs. But this doesn’t mean we can absolve ourselves from responsibility when it comes to bias.

So says Dr Julia Shaw, a psychologist, honorary research associate at University College London and co-founder of artificial intelligence-based reporting tool Spot.

“Unconscious bias, it’s the term of the moment. It allows us to say, don’t blame me, blame my brain. It lets us off the hook. Most bias is not that invisible – we see it in the data, in the pipelines, in the boardroom,” she told delegates at the CogX Festival of AI and Emerging Technology in London.

Hundreds of biases play out in the workplace every day, because we often make decisions based on incomplete or erroneous information. Take ‘in-group’ and ‘out-group’ bias: we prefer to work with people who are like us and tend to view those who are not through a stereotypical lens, which can result in a hiring decision based on the feeling that ‘people like this are not going to fit’.

Or take confirmation bias, where we disproportionately remember information that confirms our existing views.

The point is that bias exists and it is impossible to eliminate it entirely in humans. All you can do is try to reduce its negative effects.

“We need to be ready for the manifestation of bias. It’s about making sure you are prepared for it and how you tackle it,” she said.

With bias in artificial intelligence and data a big issue, Shaw argues that technology can help, not hinder, the removal of bias. By replacing the human at the most relevant point in a process, in recruitment for example, you can remove the inherent human bias.

But she warns against the perfect solution fallacy: the thinking that if you can’t get technology 100% right then you shouldn’t use it at all. After all, humans are certainly not 100% perfect either.

As co-founder of Spot, Shaw is using technology to improve on a human process. Spot is an anonymous chatbot that uses AI to help employees remember and document details of what happened when they were harassed or discriminated against in the workplace. It is based on cognitive interviewing, a technique shown to be the most reliable way of extracting accurate information about important emotional events.

Employees feel safer communicating with a non-judgemental bot, says Shaw. “How many times have you heard employers say they have a hotline for reporting, and then follow this statement with… and nobody uses it?

“In over 70% of cases reported to Spot it is the manager who is doing the harassing, and yet often this is the first point of contact for reporting an issue. Also, more than 60% of cases have at least one witness. When asked, these witnesses say they don’t report as everyone knows about it.”

Everyone aside from HR, it seems.

Published 12 June 2019
