Open banking is widely hailed for its ability to revolutionise financial services at a technological level, but it could also be used to tackle one of the industry’s most deep-rooted and insidious problems – that of data bias.
Women have long been financially excluded because traditional forms of credit rating and decision-making, and the data on which they are based, have prioritised men – who are more likely to earn higher salaries and less likely to take a career break due to childrearing.
As a result, women are less likely to be approved for mortgages and other retail credit, despite often having near-identical FICO scores – the rating lenders use to assess credit risk. Men also tend to carry more debt, according to Experian.
“Whether people are aware of it or not, there is a fundamental data bias when it comes to credit. Men are likely to get higher credit limits and this data bias exists,” says Emma Steeley, CEO of AccountScore.
This imbalance is reflected in the world of business lending too. Oliver Wyman’s 2020 Women in Financial Services report found that while women make up 40% of entrepreneurs worldwide, they are 30% less likely than men to have access to sufficient funding for their businesses.
And so long as banks and other financial services providers continue to make lending decisions based on decades-old methods and data sets – such as by gender or whether someone is on the electoral roll – this discrimination is only likely to continue, regardless of technological innovation.
A disturbing recent example is the story of Jamie Heinemeier Hansson, who was offered a credit limit on her Apple Card 20 times lower than her husband David’s. This was despite her having a better credit score, as well as the couple filing a joint tax return and having an equal share in their property.
The Apple Card incident highlighted that computers are not impartial. Artificial intelligence may well be able to digest vast amounts of information and identify patterns far beyond the capability of humans, but the historical data from which such systems “learn” in order to draw conclusions can be biased, even if it is unintentional.
So a system can make a discriminatory decision about a woman’s credit rating because of inherent bias in its training data – for example, because women were historically less likely to be granted credit, the algorithm continues that pattern – without ever being told her gender.
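The mechanism is easy to reproduce. The toy sketch below – a hypothetical, synthetic example, not any lender’s actual model – trains a naive scorer on historical decisions where a career-break flag (a feature correlated with gender) was penalised. Gender is never an input, yet the learned scores still disadvantage women on average. All names and numbers here are illustrative assumptions.

```python
import random

random.seed(0)

# Synthetic "historical" lending data. In this toy world, career breaks
# are more common among women, and the old (biased) approval rule
# heavily penalised career breaks.
def make_history(n=10_000):
    rows = []
    for _ in range(n):
        gender = random.choice(["f", "m"])
        career_break = random.random() < (0.4 if gender == "f" else 0.1)
        income = random.gauss(30_000, 8_000)
        approved = income > 25_000 and not career_break  # biased legacy rule
        rows.append({"career_break": career_break, "income": income,
                     "approved": approved, "gender": gender})
    return rows

history = make_history()

# A naive "model": learn the historical approval rate for each value of
# the career_break feature. Gender is never used as an input.
def fit(rows):
    rates = {}
    for flag in (True, False):
        subset = [r for r in rows if r["career_break"] is flag]
        rates[flag] = sum(r["approved"] for r in subset) / len(subset)
    return rates

model = fit(history)

# Average predicted approval score per gender, even though the model
# only ever sees the career_break proxy.
def approval_rate_by_gender(rows, model):
    out = {}
    for g in ("f", "m"):
        subset = [r for r in rows if r["gender"] == g]
        out[g] = sum(model[r["career_break"]] for r in subset) / len(subset)
    return out

print(approval_rate_by_gender(history, model))
# Women score lower on average: the bias survives via the proxy feature.
```

The point is not the specific numbers but the pattern: because the historical approvals encoded the bias, any model that faithfully fits that history reproduces it through correlated features.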
However, many believe that while technology can perpetuate these biases, it could also be used to address them, particularly in the open banking era. “I genuinely believe technology can level the playing field fundamentally,” says Sam Seaton, CEO of Moneyhub.
She believes that up-to-date and granular financial information based on a person’s transactions and live account data is a far more appropriate way to determine risk than traditional methods, such as providing three months’ worth of bank statements.
Steeley agrees with her fellow Open51 co-founder. “We can now start to address this issue by using open banking and open finance technologies to start to close that gender data gap and change that data bias,” she says. “Open finance APIs becoming available is what solves this. But data has to be modelled appropriately and understood. It can’t be a data dump.”
One way in which her own company has sought to address gender data bias is through last year’s launch of the Financial Health Index, a joint initiative with Equifax. AccountScore’s credit risk index for the lending sector is based upon transactional data found within a consumer’s bank account, a move made possible through open banking.
The index can then be used alone or combined with traditional credit risk metrics in order to obtain a fuller understanding of a consumer’s affordability or creditworthiness – to the benefit of more than just women. “Just because somebody is thin on bureau data does not mean they’re thin on transaction data and they are not creditworthy and they cannot afford products,” she explains.
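The idea of blending the two signals can be sketched as follows. This is a minimal, hypothetical illustration – the function names, weights and thresholds are assumptions for the example, not AccountScore’s or Equifax’s actual methodology.

```python
# Hypothetical blend of a transaction-based affordability index with a
# traditional bureau score. Weights and scales are illustrative only.

def transaction_index(monthly_income, monthly_outgoings):
    """Score 0-100 based on disposable income as a share of income."""
    if monthly_income <= 0:
        return 0.0
    disposable = max(monthly_income - monthly_outgoings, 0.0)
    return min(disposable / monthly_income, 1.0) * 100

def blended_score(bureau_score=None, *, monthly_income, monthly_outgoings):
    """Combine both signals; fall back to transaction data alone for
    thin-file applicants with no bureau history."""
    tx = transaction_index(monthly_income, monthly_outgoings)
    if bureau_score is None:
        return tx  # thin-file applicant: transactions still tell a story
    # Rescale a 300-850 bureau score to 0-100 and average the two signals.
    bureau = (bureau_score - 300) / (850 - 300) * 100
    return 0.5 * tx + 0.5 * bureau

# A thin-file applicant with healthy cash flow is no longer invisible:
print(blended_score(monthly_income=2_400, monthly_outgoings=1_500))
```

The design point matches Steeley’s remark: an applicant with no bureau record still produces a meaningful score from transaction data, rather than being rejected by default.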
However, while the vast amounts of data enabled by open banking can undeniably better inform financial decision-making, experts urge caution around its collection. Luke Scanlon, head of fintech propositions at Pinsent Masons LLP, warns: “Bias can creep in, particularly as a result of data collection practices.
“If the data is not representative, that can be an issue,” he explains. “There are three related concerns here. One is that you want accurate data – you want the data to say what it is. Then you have your privacy concerns, which can mean the data may not be as accurate as it could be if you didn’t have those privacy concerns.
“Then you have the need to collect a lot of data to prevent bias. Because if a certain group is being discriminated against – and you don’t have details of their gender or their age – and it so happens you’re making decisions against a particular group, that can also be an issue arising from your data collection practices.”
So while more data acquired through open banking can address the issue, institutions and their tech teams need to put processes in place to safeguard against each of these risks, adds Scanlon.
Such guidance was echoed by a UK Finance spokesperson: “Under GDPR and FCA rules, firms must ensure that decisions made with algorithms achieve a high standard of fairness, transparency and accuracy for customers. Firms thoroughly test models before they are deployed and monitor them over time to ensure they are performing correctly.”
Whatever the challenges, tackling gender bias is a fight worth fighting, because levelling the playing field will enable companies to build better businesses, better products and better services – to everyone’s benefit.