
Commonwealth Bank Publicly Shares AI Model in Bid to Tackle Technology-Facilitated Abuse

Technology-facilitated abuse remains a problem across the globe. Against this backdrop, new research revealing that more than one in four Australians aged 65 and over (28 per cent) have experienced financial abuse, or know someone who has, comes as no surprise.

In response, Australia-based multinational bank Commonwealth Bank is taking steps to help reduce technology-facilitated abuse internationally by making its artificial intelligence (AI) and machine learning (ML) techniques available for free to any bank in the world.

The bank’s AI model helps to identify digital payment transactions that include harassing, threatening or offensive messages – referred to as technology-facilitated abuse. It currently detects around 1,500 high-risk cases annually.

Angela MacMillan, group customer advocate at Commonwealth Bank

Angela MacMillan, group customer advocate at Commonwealth Bank, said: “Financial abuse occurs when money is used to gain control over a partner and is one of the most powerful ways to keep someone trapped in an abusive relationship. Sadly, we see that perpetrators use all kinds of ways to circumvent existing measures, such as using the messaging field to send offensive or threatening messages when making a digital transaction.

“We developed this technology because we noticed that some customers were using transaction descriptions as a way to harass or threaten others. By using this model we can scan unusual transactional activity and identify patterns and instances deemed to be high risk so that the bank can investigate these and take action.

“By sharing our source code and model with any bank in the world, it will help financial institutions have better visibility of technology-facilitated abuse. This can help to inform action the bank may choose to take to help protect customers.”
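The bank has not published the details of its model in this article, but the pattern MacMillan describes, unusual transactional activity whose message field is being used as a messaging channel, can be illustrated with a minimal, hypothetical sketch: flagging sender-to-recipient pairs that send repeated low-value payments carrying non-empty descriptions. All names, thresholds and sample data below are illustrative assumptions, not Commonwealth Bank's actual method.

```python
from collections import defaultdict

# Hypothetical transaction records: (sender, recipient, amount, message).
transactions = [
    ("acct_A", "acct_B", 0.01, "you can't hide from me"),
    ("acct_A", "acct_B", 0.01, "answer your phone"),
    ("acct_A", "acct_B", 0.05, "I know where you are"),
    ("acct_C", "acct_D", 850.00, "October rent"),
]

def flag_high_risk(transactions, min_count=3, max_amount=1.00):
    """Flag sender->recipient pairs with repeated low-value payments
    that carry messages -- one pattern associated with using the
    transaction description field as a harassment channel."""
    counts = defaultdict(int)
    for sender, recipient, amount, message in transactions:
        if amount <= max_amount and message.strip():
            counts[(sender, recipient)] += 1
    return [pair for pair, n in counts.items() if n >= min_count]

print(flag_high_risk(transactions))  # -> [('acct_A', 'acct_B')]
```

A production system would combine signals like this with text classification and human review before any action is taken, since a simple count-based rule alone would produce false positives.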

Supporting vulnerable customers

Commonwealth Bank research also found that nine in 10 of those surveyed believe there are barriers to seeking support for people experiencing financial abuse, and most (six in 10) would not be confident that they would know how to help someone experiencing it.

Of those who are experiencing financial abuse, more than half (52 per cent) believe they can deal with the behaviours of financial abuse themselves.

By using AI, Commonwealth Bank hopes to create a safer banking experience for all customers, especially those in vulnerable circumstances.

The model and source code are available this week through the bank’s partnership with AI firm H2O.ai on GitHub, the world’s largest platform for hosting source code.

As part of its extensive efforts to combat technology-facilitated abuse, the bank has implemented an automatic filter that blocks abusive, threatening or offensive words in digital payment transactions. So far it has blocked nearly 1 million transactions since it was implemented in 2020.
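The automatic filter described above can be sketched as a simple blocklist check on the payment description. The word list and matching rule below are illustrative assumptions only; the bank's actual filter and its vocabulary are not public.

```python
import re

# Illustrative placeholder blocklist -- a real filter would use a
# curated, regularly updated list of abusive and threatening terms.
BLOCKLIST = {"worthless", "threat"}

def is_blocked(description: str) -> bool:
    """Return True if a digital payment description contains a
    blocklisted word (case-insensitive, whole-word match)."""
    words = re.findall(r"[a-z']+", description.lower())
    return any(word in BLOCKLIST for word in words)

print(is_blocked("You are WORTHLESS"))  # -> True
print(is_blocked("October rent"))       # -> False
```

Whole-word matching avoids blocking innocent descriptions that merely contain a flagged substring, which is why the sketch tokenises the text rather than searching for raw substrings.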

This announcement follows the bank’s pilot with the NSW Police earlier this year to refer perpetrators of financial abuse to the police, with customer consent.

Author

  • Tom joined The Fintech Times in 2022 as part of the operations team; later joining the editorial team as a journalist.
