UCD Smurfit School Professor fights elder financial exploitation with artificial intelligence
Financial exploitation robs older people in the US of anywhere from $2.9 billion to more than $36 billion each year. In Ireland, elder financial exploitation complaints made to the Health Service Executive have risen by 18% in the past two years. However, as the majority of cases go unreported, the exact scale of the issue is difficult to measure.
New federal regulations have been introduced in the US to encourage financial institutions to report suspected fraud and financial abuse against older customers. The Bank of Ireland recently launched a new unit to support vulnerable account holders. Nonetheless, financial institutions still walk a tightrope between protecting customers and respecting their rights to privacy.
“If banks fail to tackle elder financial exploitation, they can face prosecution and financial penalties. However, if they act on suspicions, which later prove to be false, they can upset customers and risk reputational damage. They may even face legal action,” says Cal Muckley, Professor of Operational Risk in Banking and Finance at UCD Michael Smurfit Graduate Business School.
Many banks now use artificial intelligence (AI) to detect financial crimes such as money laundering and credit card fraud. These machine learning algorithms look for patterns in past data to predict future outcomes and can often spot suspicious activity human fraud analysts cannot. However, the same technology is not yet used to detect elder financial exploitation.
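The idea of learning patterns from past labelled data can be illustrated with a minimal sketch. The features, labelling rule, and synthetic data below are purely hypothetical and not drawn from the study; they simply show how a classifier trained on historical transactions can score a new one.

```python
# Hypothetical sketch: train a classifier on labelled past transactions,
# then score a new transaction. All feature names and thresholds here are
# illustrative assumptions, not details from the research described above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Synthetic history: [amount, hour_of_day, days_since_payee_added]
n = 1000
amount = rng.uniform(10, 5000, n)
hour = rng.integers(0, 24, n).astype(float)
recency = rng.uniform(0, 30, n)
X = np.column_stack([amount, hour, recency])

# Toy labelling rule: large transfers soon after a new payee is added
# are marked "suspicious" in this synthetic data set.
y = ((amount > 3000) & (recency < 7)).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Score a new transaction: large amount sent to a payee added yesterday.
new_tx = np.array([[4500.0, 3.0, 1.0]])
flagged = int(model.predict(new_tx)[0])
print(flagged)
```

In a real deployment the labels would come from analyst-confirmed cases rather than a hand-written rule, and the feature set would be far richer, but the training-then-scoring loop is the same.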
“Banks still rely on systems which are crude and inefficient by comparison,” Muckley argues.
Professor Muckley and UCD Smurfit School colleagues Gaurav Kumar, Linh Pham and Darragh Ryan programmed an algorithm to recognise key indicators of elder financial exploitation and put it to the test against a major US bank’s existing fraud detection system. In a sample of more than 250 million transactions involving customers aged 70 and over, the bank’s system flagged 19,000 instances of suspicious activity. Of these, just 74 were determined to be potentially fraudulent by human analysis. By comparison, the AI instantly recognised that 57% of the original red flags were not suspicious. It also identified 66 of the same suspected cases as the bank’s system and detected three others the bank’s system had missed.
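A quick back-of-envelope tally of the figures reported above shows what those numbers mean for analyst workload. The derived quantities (alerts cleared, share of confirmed cases recovered) are simple arithmetic on the article's figures, not statistics from the study itself.

```python
# Back-of-envelope tally of the detection figures reported above.
bank_flags = 19_000      # alerts raised by the bank's existing system
confirmed = 74           # alerts human analysts judged potentially fraudulent

ai_dismissed = round(bank_flags * 0.57)  # red flags the AI cleared instantly
ai_matched = 66          # confirmed cases the AI also identified
ai_extra = 3             # suspected cases the bank's system missed

workload_saved = ai_dismissed            # alerts analysts need not re-review
overlap_rate = ai_matched / confirmed    # share of confirmed cases recovered

print(workload_saved)           # 10830 alerts cleared
print(round(overlap_rate, 2))   # 0.89, i.e. ~89% of confirmed cases
```

In other words, the AI cleared roughly 10,800 of the 19,000 alerts while still recovering about 89% of the analyst-confirmed cases, plus three the bank's system never raised.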
“This technology is well-established and no more complicated than that used in search engines or predictive text. However, it has the potential to improve the detection of elder financial exploitation significantly,” says Muckley.
The bank is already using the researchers’ AI alongside its existing systems to provide an additional level of scrutiny. If adopted as a primary fraud detection system, Muckley believes it could give human analysts more time to investigate and report suspicious activity. However, he says data holds the key to realising the technology’s full potential to fight elder financial exploitation.
“Sharing information about suspicious customer account activity between financial institutions is an obvious first step. Access to medical records could also help to identify older customers who are more vulnerable to fraud due to cognitive decline and poor health,” he explains. “However, this would have to be strictly regulated to ensure individuals’ data privacy is respected.”