Long Read | Jul 6 2023

The financial industry must combat data bias


At the heart of the issue is data bias: the idea that bias is inherently built into the machine, because computers and databases are not programmed in a vacuum, nor by some fictional, unbiased human.

This is particularly alarming when we switch off to human error and societal bias on the assumption that it could not possibly be coming from a machine. Could it?

The problem is it can. And it does. An article from tech magazine Mashable describes how ChatGPT is unable to say whether Bessie Smith influenced Mahalia Jackson without being given extra information.

Bessie Smith is widely regarded as “one of the most important Blues singers in American history … said to have influenced hundreds of artists, including the likes of Elvis Presley, Billie Holiday, and Janis Joplin”.

Data gaps like this are not the only way bias creeps into machine learning. 

In fact, it starts before the computer is even involved. Databases are created with fields chosen by humans and filled with data collected by humans, and humans have biases.

Which fields we choose, where we source our data, and historical prejudices in our data collection methods are all ways in which bias can find itself replicated by programmes designed to be unbiased.

If we are unconscious of our biases, how can we expect computer systems to be conscious of and remove that very bias?
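
To make that concrete, here is a minimal sketch in Python, using hypothetical field names and an assumed census-style benchmark, of the kind of check that can run before any model sees the data: comparing how groups are represented in a collected dataset against the population it is meant to describe.

```python
import pandas as pd

# Hypothetical example: a collected dataset skewed towards one gender.
collected = pd.DataFrame({"gender": ["M"] * 780 + ["F"] * 220})

# Assumed census-style benchmark for the population being modelled.
reference = {"M": 0.49, "F": 0.51}

# Compare the dataset's group shares against the benchmark.
observed = collected["gender"].value_counts(normalize=True)
for group, expected in reference.items():
    print(f"{group}: dataset {observed.get(group, 0):.0%} vs population {expected:.0%}")
```

A gap like the 78/22 split above is exactly the kind of thing a team can catch, and justify or correct, at collection time rather than after deployment.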

The impacts of data bias are being felt all over the world. The very nature of these programmes is that they are designed to improve efficiency and are therefore very likely to be centralised.

An insurer headquartered in the UK or a New York-based bank might have pricing algorithms that are employed for customers based all over the globe, and people are increasingly becoming aware of the impacts.

Whatever our background and wherever we are in the world, data bias is a challenge that is impacting us, whether we realise it or not.

What are the impacts?

The impacts of data bias are coming through in a range of ways. An algorithm created by Google to detect hate speech was found to discriminate against black Americans, flagging their posts as hate speech more frequently.
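
Disparities like this can be surfaced without opening up the model itself. The sketch below, on entirely made-up toy data rather than Google's, compares false-positive rates (benign posts wrongly flagged) across two groups; a persistent gap between groups is the statistical signature of the discrimination described above.

```python
import numpy as np

def false_positive_rate(flagged, is_hate):
    # Share of genuinely benign posts that the model wrongly flagged.
    benign = ~is_hate
    return (flagged & benign).sum() / max(benign.sum(), 1)

rng = np.random.default_rng(1)

# Toy stand-in for a moderation model's output on two groups' posts,
# where benign posts from group 2 are wrongly flagged more often.
for group, fp_bias in [("group 1", 0.05), ("group 2", 0.20)]:
    is_hate = rng.random(1000) < 0.1
    flagged = is_hate | (rng.random(1000) < fp_bias)
    print(group, "false-positive rate:", round(false_positive_rate(flagged, is_hate), 3))
```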

In my personal experience, when searching for "professional hairstyles for women", I was confronted by a series of images of Caucasian women (it is worth noting that this has since been fixed following a major user backlash on social media).

Google’s history on this is in fact a fantastic example of how prevalent bias is in some of the largest algorithms in existence. This article in Time magazine from 2019 has some astounding findings on the representation of black women in Google searches.

The examples are enough to fill a book. Literally: in 2019, Caroline Criado Perez wrote Invisible Women: Exposing Data Bias in a World Designed for Men.

The book is jam-packed with examples of concrete ways data bias negatively impacts women. It includes examples as mundane as voice recognition software being trained on recordings of mainly male voices and therefore being less effective at recognising female voices.
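
The mechanism behind that example is easy to reproduce. The sketch below uses synthetic numbers (not real voice recordings) to show how a classifier trained on a 90/10 split between two groups can perform noticeably worse on the under-represented one.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Synthetic binary task whose decision boundary moves with the group,
    # loosely mimicking how acoustic features differ between voices.
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

# Training data dominated by group A, as with the male-voice recordings above.
Xa, ya = make_group(900, shift=0.0)
Xb, yb = make_group(100, shift=2.0)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Scoring each group separately exposes the gap that an average would hide.
for name, shift in [("group A", 0.0), ("group B", 2.0)]:
    X_test, y_test = make_group(2000, shift)
    print(name, "accuracy:", round(model.score(X_test, y_test), 2))
```

The overall accuracy can look perfectly healthy; the failure only shows up when the groups are scored separately, which is why aggregate metrics alone are not enough.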

At the other end of the scale, she devotes a section to how the lack of sex-disaggregated data in the aftermath of natural disasters leads to higher levels of sexual assault and rape of women.

Those same prejudices within data can be felt in financial services, where there is mounting pressure to deal with the issue as companies across a variety of industries increasingly explore AI to streamline their processes. They include London-based law firm Allen & Overy, which earlier this year outsourced memo-writing to an AI named Harvey.

In my role at Equisoft, I advise a range of financial services companies, many of which are closely monitoring the use of AI amid concerns over potential data bias.

The last thing fintechs need is an embedded, non-removable programmed bias that cannot be sent on an HR course to understand the impacts of its comments. If you leave the computers to self-regulate, you risk embedding this bias permanently. 

How can we fix it?

If we act now, we can make a significant difference and stop data bias in its tracks. 

To eliminate (or at least minimise) data bias, the fundamental change that needs to take place is a centring of diversity, equity and inclusion principles in the workplace.

We need diversity to challenge processes that exist just because "that is how we have always done it".

We need equity to remove obstacles put in place by historic prejudice and lack of representation.

And we need inclusion to shift the dial on these statistics and start to collect new data that will reduce future occurrences of bias. Only then will we be able to seek, collect and treat data in a less biased way. 

Beyond the implementation of DEI, we need to be conscious of how we collect data. Nobody is exempt from unconscious bias, which is why diversity matters.

If we work collaboratively within teams and involve a trusted third party to justify data-collection decisions and methods, we can troubleshoot the common pitfalls and proactively eliminate data bias before it has a chance to establish itself.

On a larger scale, processes and initiatives need to be reconsidered with a view to analysing how far they may perpetuate the issue of data bias.

For example, the idea of linking executives’ bonuses to customer satisfaction survey results would inevitably have an impact on the survey questions signed off by said executives.

If you are systematically discarding data throughout your processes, have you questioned why that is the case? Is the data discarded because we do not need it or because we think we do not need it?
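
One lightweight discipline here, sketched below with hypothetical record fields, is never to drop a record silently: tag every discard with a reason, so the pattern of what is thrown away can itself be questioned.

```python
from collections import Counter

discard_log = Counter()  # running tally of why records were dropped

def keep(record):
    # Each filter rule records its reason instead of discarding silently.
    if record.get("income") is None:
        discard_log["missing income"] += 1
        return False
    if record["age"] < 18:
        discard_log["under 18"] += 1
        return False
    return True

records = [
    {"age": 25, "income": 40_000},
    {"age": 17, "income": 12_000},
    {"age": 31, "income": None},
]
kept = [r for r in records if keep(r)]
print(f"kept {len(kept)} of {len(records)}; discards: {dict(discard_log)}")
```

Reviewing that tally periodically turns "we think we do not need it" into a decision that has to be justified.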

Within the financial services industry we have some reflecting of our own to do. The culture of maximising efficiency and being constantly driven by tight deadlines has made us lose track of the critical element: fairness.

We are driven by customers' interests, and if customers come to realise that their insurer, their accountant or their lawyer is using systems that maximise efficiency but negatively impact their outcomes, where will that leave us?

The theme of International Women’s Day this year was "embrace equity". Channel that: sometimes speed is not everything, and it is better to take the time to prioritise fairness. 

Grace Ata is assistant vice-president, product development, at Equisoft