Unconscious bias in algorithms, AI and machine learning could become one of the world’s biggest drivers of inequality, warns fintech Finastra, as it urges all financial providers to address the problem and ensure a level playing field for everyone when it comes to borrowing money.
Financial services providers increasingly rely on algorithms – sequences of instructions for performing a computation or solving a problem – to make quicker decisions on how to price a policy or whether to extend credit and at what rate, with the level of human involvement in these decisions shrinking as a result.
The use of algorithms in financial services includes credit scoring, rate setting and insurance pricing, in addition to back-office functions, such as credit risk and fraud risk management.
A new report by consultancy KPMG, commissioned by global fintech Finastra, has examined the size of global consumer lending markets and the potential impact of algorithmic bias in society.
Last year, consumer lending and transactions across key financial products, including credit card and mortgage/home lending, reached more than $6,110billion in the US. According to Finastra’s research, even if just one per cent of the US lending market were affected by algorithmic bias, that would equate to some $61billion.
Understanding algorithms
Algorithms run off data sets with thousands of variables and billions of records aggregated from individual usage patterns, habits and transactions.
In the past decade, the financial world has been digitalised through the introduction of artificial intelligence, particularly forms of machine learning, boosting efficiency and automating processes, with the result that many decisions in banking, lending and insurance are now made by algorithms.
The Covid-19 pandemic has also accelerated the use of these technologies.
The bias problem
Algorithms can only be as ‘fair’ and unbiased as the data sets that are used to build them.
But the historic data they process, and even the programmers who create them, can be biased, often unintentionally. Equally, machines can draw conclusions without asking explicit questions, for instance discriminating between men and women despite never being given gender information.
If left unchecked, biased algorithms could lead to decisions with a disparate impact on certain groups of people, even without any intention to discriminate, meaning access to fair and equitable financial solutions, such as lending, is not equally available to people of all genders, races and cultures.
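To illustrate the mechanism, the sketch below trains a simple credit-approval model on synthetic data in which a protected attribute (gender) is withheld from the model but leaks through a correlated proxy feature. Every feature name and number here is an illustrative assumption for demonstration, not a figure from the KPMG report.

```python
# Illustrative only: synthetic data showing how bias can survive even when
# a protected attribute is excluded from a model's inputs.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

gender = rng.integers(0, 2, n)           # protected attribute, 0 or 1
proxy = gender + rng.normal(0, 0.5, n)   # hypothetical correlated feature
income = rng.normal(5, 1, n)             # synthetic income score

# Historic approvals that were biased against group 1: the bias is in the data
approved = (income - 2 * gender + rng.normal(0, 1, n) > 4).astype(int)

# Train WITHOUT the gender column; only the proxy and income are used
X = np.column_stack([proxy, income])
model = LogisticRegression().fit(X, approved)
pred = model.predict(X)

print("approval rate, group 0:", pred[gender == 0].mean())
print("approval rate, group 1:", pred[gender == 1].mean())
# The gap persists: the model reconstructs gender from the proxy feature.
```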
According to Finastra and KPMG’s research, women are less likely to be insured, approved for a mortgage or granted business funding, and more likely to be charged higher interest rates than men, because the algorithms used to set limits may be inherently biased against women.
Dr Leanne Allen, director at KPMG Financial Services Tech Consulting, said: “Consumer and public trust are critical success factors for financial services. The findings in our report for Finastra make it clear that providers need to take care when designing, building and implementing these algorithms to ensure innovation can continue to advance in a safe and ethical way.
“The report brings together recent thinking on algorithmic bias, with specific applications to financial services and the potential for biased decision-making. Mitigating bias is vitally important in our digital and data-led world. Not doing so could run the risk of serious financial harm to the consumers who use these services.”
Finastra’s plan
Finastra argues that the industry must check if the biases that exist in society are being repeated through the design and deployment of these technologies.
The fintech firm, which supplies technology to financial institutions including 90 of the world’s top 100 banks, has outlined a five-step plan to tackle algorithm bias.
1) Reforming Finastra’s developer agreement
Finastra has released updated developer terms and conditions for FusionFabric.cloud, its open platform and marketplace for developers. Developers and partners will be expected to account for algorithmic bias, and Finastra can inspect any new application for bias.
2) Creating new proof of concept technologies
FinEqual is a digital tool designed to spot potential bias in credit decisions. Currently at the proof-of-concept stage, the tool is expected to be made available to customers within the next 18 months.
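Finastra has not published FinEqual’s methodology, but a common first screen that bias-detection tooling of this kind might run is the ‘four-fifths’ disparate impact check, which compares approval rates across groups. The sketch below is a minimal, hypothetical illustration of that check, not Finastra’s actual tool.

```python
# Hypothetical illustration of a disparate-impact check; not FinEqual itself.
from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group_label, approved_bool) pairs."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        approvals[group] += int(approved)
    return {g: approvals[g] / totals[g] for g in totals}

def disparate_impact_ratio(decisions):
    """Lowest group approval rate divided by the highest."""
    rates = approval_rates(decisions)
    return min(rates.values()) / max(rates.values())

# Synthetic decisions: group A approved 80%, group B approved 55%
decisions = ([("A", True)] * 80 + [("A", False)] * 20
             + [("B", True)] * 55 + [("B", False)] * 45)

ratio = disparate_impact_ratio(decisions)
print(f"disparate impact ratio: {ratio:.2f}")  # 0.55 / 0.80 = 0.69
if ratio < 0.8:  # the common 'four-fifths' screening threshold
    print("flag: potential adverse impact - review the model")
```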
3) Hacking for good
Finastra has launched its annual global hacking competition for 2021, which invites female-led teams to develop AI and machine learning initiatives.
4) Workplace equality
It has committed to reaching a 50:50 male to female ratio across all its teams. This includes increasing the proportion of women among its top 200 leaders and engineers from 30 per cent to 40 per cent by 2025 and to 50 per cent by 2030.
5) Work with regulators
The fintech says it is working with regulators in multiple markets and has urged the financial industry to ‘come together to take action and build a fairer society’. It will work with its partners and ecosystem to ‘drive the change the industry needs to make – collectively and collaboratively’ to ‘redefine finance for good and open it up to all’.
Shuki Licht, SVP and senior chief innovation officer at Finastra, told The Fintech Times: “You can’t use attributes like gender, race, ethnicity, and certain pieces of demographic information in your algorithms as something that you can use to identify whether someone can get a particular credit score. This is something that the regulators already ask us to ensure.
“However, if something happened in the past that influenced the fairness of a decision in your data and you continue to keep it in the database, then any algorithm that is trained on it later will include this biased information; this bias will grow and grow, and eventually move into more automated systems.
“KPMG’s research suggests there can be bias in data and we want to ensure there is always human intervention to identify these trends in the data and flag any anomalies. We want to make sure that we are not creating any bias in anything that can influence data and future decisions. Furthermore, for anyone coming in and partnering with us and using our APIs, we also ask them to respect the basic needs of data care around bias and fairness to make finance more accessible to more people.”
Students, fintech enthusiasts, financial institution developers and fintech founders have until 4 April to submit projects to Finastra’s Hack to the Future.