Arjun Tiwari

Big Banks to Deploy LLMs Trained on Internal Data

Updated: Jun 21



The BFSI sector is undergoing a paradigm shift as it recognizes the importance of innovations like Large Language Models (LLMs) and Generative AI and aligns its operations with these evolving technologies. Even the biggest players in the sector now rely on technology to develop customer-relationship moats that competitors cannot easily replicate and that build a sustainable competitive advantage.


Some of India's biggest commercial banks, such as HDFC Bank and its primary rival Axis Bank, are in the advanced stages of developing private large language models trained specifically on their internal knowledge repositories, so that in-house teams can access data on the go, build more responsive and intuitive customer experiences, and drive efficiencies.


Large language models are the foundation on which Gen-AI applications like OpenAI's ChatGPT run. LLMs are time- and cost-intensive to build, but the payoff lies in how they sharpen communication and improve information clarity.


Banking in-house LLMs: banks like HDFC Bank and Axis Bank are deploying private LLM tools

HDFC Bank is improving internal operations and customer experience with an LLM


Retail-focused HDFC Bank is set to launch its private LLM-powered website in the coming quarters. Ramesh Lakshminarayanan, chief information officer and group head of IT at HDFC Bank, confirmed the implementation; the website is currently in beta.


“LLM brings the ability to comb through a lot of information that organizations have and throw up intuitive write-ups that a human would do. That’s fundamentally where the real big advantage is,” Lakshminarayanan said.

The website will enable customers to find product information through simple prompts, simplifying their journey and making it easy to access the information they are looking for, whether that is bank statement details or other bank-related queries.


Later, the bank will also leverage a private LLM for its analysts and research teams. It will be used to draft business requirement documents, credit assessment models, and other outputs that analysts generally have to dig through large volumes of data to produce.
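At a high level, these private LLM tools typically follow a retrieval-augmented pattern: internal documents are indexed, the snippets most relevant to a prompt are retrieved, and the model answers from that context alone. The sketch below illustrates the idea against an OpenAI-compatible endpoint (a private deployment would point the same client at an internal model); the documents, model names, and prompt are illustrative assumptions, not HDFC Bank's implementation.

```python
# Minimal retrieval-augmented sketch: embed internal documents, find the most
# relevant one for a question, and ask the model to answer from it only.
# Model names, documents, and prompts are placeholders, not a bank's real setup.
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

internal_docs = [
    "Savings account statements can be downloaded from NetBanking under 'Accounts'.",
    "Home loan pre-approval requires income proof and a recent credit report.",
]

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

doc_vecs = embed(internal_docs)

def answer(question: str) -> str:
    q_vec = embed([question])[0]
    # Cosine similarity picks the most relevant internal snippet.
    scores = doc_vecs @ q_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec))
    context = internal_docs[int(scores.argmax())]
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context: {context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content

print(answer("How do I download my account statement?"))
```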


Executives say private LLMs have the potential to transform both customer experience and internal operations.


Axis Bank is set to use Gen-AI-based assistants to enhance customer experience


HDFC Bank's private-sector rival Axis Bank is rolling out Generative AI-based virtual assistants to guide customers throughout their journey, and is using the technology's inference capabilities to automate daily operations and cut down the time and effort involved.


“We are targeting to implement private LLM for specific use cases by the end of FY24. The bank is looking at private LLMs to propose new solutions and improve existing solutions,” said Avinash Raghavendra, president and head of IT at Axis Bank.

Generating content, whether code or marketing copy, using LLMs, as well as automating decision-making across operational streams, will help deliver better business outcomes at the same cost, Raghavendra said. He added that the bank is actively collaborating with some of the top software-as-a-service (SaaS) providers and cloud service providers (CSPs) to explore options.


“For both untrained and pre-trained models, the compute requirements do not make financial sense to be running in one’s own data center,” he added. “On the other hand, the cloud provides state-of-the-art computing when needed, and that’s how we see ourselves progressing.”

Technology partners helping banks deploy LLMs


IT companies like Tech Mahindra are helping the industry deploy private LLMs. “The financial industry’s vast and diverse data highlights the value of LLMs for businesses. By refining existing LLMs, we are assisting our customers with on-premise solutions that address data privacy concerns,” said Nikhil Malhotra, Global Head of Makers Lab at Tech Mahindra.


Banks with private LLMs could be in a position to perform accurate predictive analytics, disseminate knowledge, and detect and prevent fraud, which leads to better risk and fraud management. With newer technologies and greater data availability, this is a great opportunity for banks and their technology partners to work with account aggregators, data aggregators, and financial institutions at large to elevate their business.


Big Global Banks That Have Deployed LLMs

Morgan Stanley’s generative AI assists financial advisors 


Morgan Stanley, one of the world’s largest wealth management and investment firms, has unveiled an internal generative AI assistant for financial advisors and support staff. Built on OpenAI’s GPT-4, it lets financial advisors and researchers find answers to their queries across the firm’s massive internal knowledge repositories. It also condenses client meeting content and generates follow-up emails immediately.


Called the AI@Morgan Stanley Assistant, the model gives financial advisors easy and speedy access to the bank’s “intellectual capital,” a database of more than 100,000 research documents and reports. 


By cutting the time financial advisors and customer-facing employees spend on queries about markets, internal processes, and recommendations, the assistant frees them to engage more with clients.



J.P. Morgan introduces DocLLM for working with complex documents


J.P. Morgan introduced DocLLM, a generative language model designed to read and understand multimodal documents. DocLLM stands out as a lightweight extension of standard LLMs for analyzing an organization’s repositories, reports, invoices, and forms, documents whose complicated semantics sit at the intersection of spatial and textual modalities.


DocLLM handles the unstructured and irregular content in visual documents by using a pre-training objective that focuses on infilling text segments. Unlike many multimodal LLMs, DocLLM avoids expensive image encoders and relies only on bounding-box information.
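The core idea, pairing each OCR token with its bounding box instead of a rendered page image, can be shown with a toy sketch. This is not the authors' architecture, only a minimal illustration of combining a token embedding with a learned layout embedding; the dimensions and names are assumptions.

```python
# Toy sketch (not the DocLLM implementation): add a learned embedding of each
# token's bounding box to its text embedding, so layout is modeled without an
# image encoder. Dimensions and names are illustrative assumptions.
import torch
import torch.nn as nn

class TokenWithLayoutEmbedding(nn.Module):
    def __init__(self, vocab_size: int = 32000, d_model: int = 512):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model)
        # Each token carries a normalized bounding box (x0, y0, x1, y1) from OCR.
        self.bbox_proj = nn.Linear(4, d_model)

    def forward(self, token_ids: torch.Tensor, bboxes: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len); bboxes: (batch, seq_len, 4) scaled to [0, 1]
        return self.token_emb(token_ids) + self.bbox_proj(bboxes)

emb = TokenWithLayoutEmbedding()
ids = torch.randint(0, 32000, (1, 6))   # six OCR tokens
boxes = torch.rand(1, 6, 4)             # their normalized bounding boxes
print(emb(ids, boxes).shape)            # torch.Size([1, 6, 512])
```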


The researchers highlight the major contributions of their work, including:


  • A lightweight extension to standard LLMs that represents document layout through bounding boxes alone, avoiding expensive image encoders.

  • A disentangled spatial attention mechanism that captures the alignment between the textual and layout modalities.

  • An infilling pre-training objective suited to the irregular layouts and fragmented text common in visual documents, followed by instruction tuning on document intelligence tasks.


Citigroup reads 1,089 pages of capital rules using Gen-AI


Citigroup Inc. is another of the largest financial firms known to leverage Gen-AI to make its in-house teams more productive. The bank uses Gen-AI technology to generate essays and sentences based on users’ prompts, with the model first trained on diverse pre-existing knowledge repositories.


When federal regulators published 1,089 pages of capital rules for the US banking sector, Citigroup Inc. thoroughly read the document word by word using Gen-AI. 


Citigroup has also been using private large language models to summarize regulations and legislation in every country where it operates, in order to comply with those rules.


ABN AMRO’s Gen-AI answers customers’ queries in real time


ABN AMRO is the first Dutch bank to deploy Generative AI to supercharge its call center agents with live call suggestions and automated call summarization.


Traditionally, bank agents had to take notes during a customer call and write up a summary afterwards. Now the bank uses ChatGPT to generate post-call summaries, with agents reviewing them for accuracy.
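A minimal sketch of what such post-call summarization could look like is below, assuming an OpenAI-compatible API; the model name, prompt, and transcript are placeholders rather than ABN AMRO's actual pipeline.

```python
# Hypothetical post-call summarization sketch; the model, prompt, and transcript
# are placeholders, not ABN AMRO's production setup.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

transcript = """Agent: Good morning, how can I help you?
Customer: I'd like to raise my daily transfer limit for next week."""

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system",
         "content": "Summarize the call in three bullet points: customer intent, "
                    "actions taken, and any follow-up required."},
        {"role": "user", "content": transcript},
    ],
)
print(response.choices[0].message.content)  # the agent reviews this before filing
```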


Agents can also answer customer queries instantly by searching internal knowledge documents with GPT. Feedback from the agents has been impressive, indicating they are now more focused on helping customers through their journey and find their jobs more enjoyable.


Goldman Sachs uses AI to assist developers with writing code


Goldman Sachs is experimenting with Gen-AI tools to assist its developers in automatically generating and testing code.


In some cases, the developers have been able to write as much as 40% of their code using generative AI. Developers are also using the solution to generate and test new code. 


“If you actually have a GPT-like technology that tests the code, or you generate the tests for the code, you’re creating this dualism where you test the machine and get the machine to test your work,” said Marco Argenti, Goldman Sachs’ chief information officer.
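The test-generation half of that loop is straightforward to sketch: hand the model an existing function and ask it to draft unit tests, which a developer then reviews. The snippet below is a hypothetical illustration assuming an OpenAI-compatible API; the function, model name, and prompt are not Goldman Sachs' tooling.

```python
# Hypothetical sketch of asking an LLM to draft pytest cases for existing code.
# The function, model, and prompt are placeholders, not Goldman Sachs' tooling.
from openai import OpenAI

client = OpenAI()

source = '''
def daily_limit_remaining(limit: float, spent: float) -> float:
    """Return how much of a daily spending limit is left, never below zero."""
    return max(limit - spent, 0.0)
'''

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{
        "role": "user",
        "content": "Write pytest unit tests for this function, covering normal "
                   "and edge cases:\n" + source,
    }],
)
print(response.choices[0].message.content)  # review before committing the generated tests
```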

SouthState Bank uses an LLM to empower knowledge sharing


SouthState Bank adopted a large language model solution trained specifically on the bank’s internal documents and data, enabling employees to access internal data and records on the go. With this solution, employees are becoming far more proficient at handling data.


“My team uses it for about 50% of the searches it used to use Google for. The solution is far superior in a lot of ways. The difference is that you can drill down and refine your prompt, which you cannot do with snippets,” said Chris Nichols, Director of Capital Markets at SouthState Bank.

The solution also helps staff automate multiple tasks, including generating expense reports, composing emails, and detecting suspicious activity. Since it was deployed, the bank has seen a significant boost in productivity: a task that previously took 10-15 minutes now takes mere seconds.


Why Organizations Need Private LLMs

Large Language Models (LLMs) have proven so effective and productive for so many organizations that there is little doubt every enterprise should consider a private language model. From automating daily tasks to disseminating insights among employees, LLMs handle a wide range of work efficiently, and their use cases can be tailored to an organization’s operational requirements. As the examples from these market giants show, the possible use cases depend on each organization’s needs and its market. To stand out in a crowded, competitive market, businesses should stay ahead by leveraging technologies like Gen-AI and LLMs.


Get yourself a Private Large Language Model

Introducing Salevant.ai, an LLM-powered solution trained specifically on your organization’s internal documents, giving in-house teams access to the valuable data and insights they need to guide customers. It lets your customer-facing and sales teams handle live customer queries and respond quickly without trawling through piles of documents for an answer.




Salevant.ai, trained on your organization’s knowledge repositories, is a one-stop solution for getting valuable data and information all the way to the last mile of your customer-facing and sales teams. Handling customer queries on a call is difficult when agents have to go through multiple documents to find an answer, and keeping the customer waiting that long leaves a negative impression and makes for a bad experience. Salevant.ai helps you maintain an enriched customer experience instead.






