Artificial Intelligence

Generative AI and the impact on financial services

Sector-specific large language models perform better but have an issue with the transparency of training data, says Michael O'Reilly

9 October 2023

In association with NTT Data

In 2020, I investigated the potential impact of AI on financial services, charting the meteoric evolution of AI and machine learning. The investigation culminated in an article I published on Forbes.com outlining multiple use cases across the financial services ecosystem. I went one step further, postulating that financial services organisations had the essential ingredients for a successful AI initiative: “lots of sample data, well-defined problem space and the ability to know what ‘good’ looks like”. The article also argued that the evolution would continue apace, presenting many opportunities to “enhance decision-making, automate everyday tasks and develop personalised financial products”.

With the latest advances in generative AI, we have seen an explosion of interest in the potential of this technology. The possibilities appear endless with the introduction of commercial large language models (LLMs) and more effective ways to train them and integrate them into products and services.

Intelligent apps that have learned from thousands of similar journeys can guide your customers, offering the right advice and direction when needed, whether they are making the right choices on a mortgage application or seeking a quick settlement on a claim. The technology can support the external customer in achieving a great experience and the internal user in making better decisions. Even in its current form, a platform like ChatGPT will give you what appears to be reasonable advice, but it is not a guided experience. To achieve that, you must train a model on your industry-specific data and embed the technology in the appropriate customer journeys.

The one thing the financial services industry has is lots of data about its products, services and customers. One downside is that much of this data is unstructured; in fact, the industry estimates that over 80% of the data in an organisation is unstructured.

Unstructured data represents a potentially untapped wealth of information you can harness to fuel your generative AI model. The good news is that you can use the same technology behind generative AI models to analyse your unstructured data. Bloomberg recently published a research paper describing how it successfully developed an LLM for finance that leveraged its vast library of structured and unstructured data. Through a blend of public and domain-specific curated data, it trained a model that performs better than equivalent benchmark models. More importantly, Bloomberg’s model can handle numerous tasks and questions relevant to its domain.

As such, learning from Bloomberg’s experience, a financial organisation could pool its vast data resources with suitable public datasets to generate domain-specific models. Coupling these models with the right applications in products and service journeys could transform how your workforce makes decisions and customers experience your products.

Caution advised

A word of caution: the adoption of this technology has been accompanied by well-publicised misuse of LLMs and significant debate over the use of copyrighted material. There is also genuine concern about the potential leaking of sensitive data through the training data used by LLMs. While the industry is taking steps to self-regulate with initiatives like the Frontier Model Forum, an umbrella group for the generative AI industry, we can expect more comprehensive actions.

The EU published its regulatory framework on AI in 2021, based on the principle “that AI systems that can be used in different applications are analysed and classified according to the risk they pose to users”. Depending on the use case, this could mean more or less regulation applies. In the case of generative AI, the key theme of the regulation will be transparency: disclosing that content was generated by AI, designing models that prevent the creation of illegal content, and acknowledging the use of copyrighted material in training.

The EU AI Act is expected to come into force at the end of 2023 and will be the world’s first law on AI. In typical regulatory fashion, there will be a period of time in which to comply, and failure to comply will have serious consequences. The financial services industry is all too familiar with complying with regulations and managing risk. This familiarity could give banks and insurance companies an advantage in adopting AI and coping with the regulatory requirements.

The debate on AI will continue, and new regulations will help focus the discussion. The challenge is that AI is evolving at an ever-increasing pace, so the question is whether industry and the law can keep up.

Michael O’Reilly is CTO, international business, with NTT Data


TechCentral.ie