Bloomberg develops GPT-based language model

Bloomberg, develop, GPT, language, AI, artificial intelligence

Financial data provider Bloomberg is the latest firm to investigate the use of artificial intelligence (AI) for data processing and analysis.

The financial services conglomerate has developed BloombergGPT, an AI-based large language model (LLM) that will support natural language processing (NLP) tasks involving financial data.

These tasks include sentiment analysis, named entity recognition, news classification and question answering.
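To illustrate what a sentiment-analysis task looks like in this context, here is a deliberately naive keyword-based scorer for financial headlines. This is not Bloomberg's approach: an LLM such as BloombergGPT learns sentiment patterns from data rather than from hand-written word lists, but the sketch shows the kind of input/output involved:

```python
# Naive keyword-based sentiment scorer for financial headlines.
# Purely illustrative: LLM-based systems learn sentiment from data
# rather than relying on fixed word lists like these.
POSITIVE = {"beat", "beats", "surge", "record", "upgrade", "growth"}
NEGATIVE = {"miss", "misses", "plunge", "downgrade", "loss", "default"}

def sentiment(headline: str) -> str:
    # Normalise each word: strip trailing punctuation, lowercase.
    words = {w.strip(".,!?").lower() for w in headline.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("Acme beats earnings estimates, shares surge"))  # positive
print(sentiment("Regulator warns of loss after weak quarter"))   # negative
```

A word-list approach fails on negation and context ("failed to beat estimates"), which is precisely why firms are turning to language models for this work.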

Interest in AI and NLP has skyrocketed since the November 2022 release of ChatGPT, a chatbot developed by OpenAI and built on its GPT family of large language models.

While some banks and asset managers have banned ChatGPT as an internal communications tool, other firms have been researching how it could be used within the investment world. For example, Morgan Stanley has run various tests within its wealth management division to use ChatGPT with its financial advisors. And JP Morgan boss Jamie Dimon recently called the technology “extraordinary and groundbreaking” while disclosing that the bank has more than 300 AI use cases in production.

BloombergGPT was trained on the firm's archive of financial data spanning the last 40 years, which yielded a 363-billion-token dataset. This was augmented with a 345-billion-token public dataset drawn from sources including Google and Wikipedia, creating a training corpus of more than 700 billion tokens.

By comparison, GPT-3, the OpenAI model family underpinning the original ChatGPT, was trained on a corpus of roughly 500 billion tokens.
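A "token" here is the unit in which language models measure text, usually a word or word fragment produced by a subword tokeniser. As a rough sketch only, a whitespace word count gives a lower bound; real tokenisers such as byte-pair encoding split words into smaller pieces, so genuine token counts run higher:

```python
# Rough token-count sketch. Real LLM tokenisers (e.g. byte-pair
# encoding) split text into subword pieces, so actual token counts
# are higher than a simple whitespace word count like this one.
def approx_token_count(text: str) -> int:
    return len(text.split())

sample = "Bloomberg has developed BloombergGPT for financial NLP tasks."
print(approx_token_count(sample))  # 8
```

At this scale, a 700-billion-token corpus corresponds to hundreds of billions of words of financial and general text.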

According to Bloomberg, the BloombergGPT model outperforms existing open models of a similar size on financial tasks “by large margins while still performing on par or better on general NLP benchmarks”.

“For all the reasons generative LLMs are attractive – few-shot learning, text generation, conversational systems, etc. – we see tremendous value in having developed the first LLM focused on the financial domain,” said Shawn Edwards, Bloomberg’s chief technology officer.

©2023 fundsTech