
"Around 40 years ago, a bank branch manager probably knew the name of every customer and was able to offer personalised advice and guidance. But as Ranil Boteju, chief data and analytics officer at Lloyds Banking Group, points out, in today's world, that model cannot scale. "In the world of financial planning, most people in the UK cannot afford to see a financial planner," he says."
"None of these things are acceptable in a sector regulated by the Financial Conduct Authority (FCA). What excites Boteju is the ability to scale the 40-year-old model of a bank manager to meet current demand by using artificial intelligence in a way that provides the bank with confidence that the AI is able to understand what people need and give them the right guidance in a way that can be assessed and meets FCA guidelines."
Around 40 years ago a bank branch manager knew every customer's name and offered personalised advice and guidance. That model cannot scale in today's world. Most people in the UK cannot afford to see a financial planner and there are insufficient trained advisers to meet demand. Financial institutions are exploring generative AI to support customers directly and replicate personalised guidance at scale. Large language models from hyperscalers behave like black boxes and can produce incorrect responses known as hallucinations, which are unacceptable under Financial Conduct Authority regulation. Banks have long used machine learning for credit risk, fraud screening, and chatbots, but generative and agentic AI introduce new capabilities and regulatory challenges.
Read at ComputerWeekly.com