Bloomberg will move historical trading data to cloud in support of clients’ machine-learning ambitions
Bloomberg’s CIO for enterprise, Tony McManus, says clients’ shifting attitudes to cloud are changing the way the firm makes its real-time trading data available to the financial services community
Bloomberg is gearing up to move all of its historical market data to the cloud so that its financial services clients can use the data to back-test algorithms and train machine learning models, its CIO has confirmed.
Tony McManus, CIO for enterprise at Bloomberg, said the organisation is increasingly fielding requests from clients for access to its historical datasets for big data analytics purposes and, in response, is turning its attention to making that data accessible via the cloud.
“That’s our next line of work,” said McManus, during a roundtable discussion about the financial services industry’s cloud usage habits. “Our customers want pricing [data] going back as far as they can get, they want corporate actions going back, security reference data and alternative datasets.
“Almost every time we talk to a hedge fund now about a dataset, the first question they ask is: do you have history? What is the quality of that history? Are there many anomalies in the data? Can you help us evaluate whether that dataset is appropriate for a machine-learning use case? That is the dialogue now, almost exclusively.”
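To make those quality questions concrete, the sketch below shows the kind of sanity checks a quant team might run on a historical price series before using it to train a model. It is illustrative only: the file and column names are hypothetical, not a Bloomberg schema.

```python
import pandas as pd

# Hypothetical daily close series; the file and column names are
# illustrative, not a Bloomberg data format.
prices = pd.read_csv("daily_closes.csv", parse_dates=["date"],
                     index_col="date")["close"].sort_index()

# Gaps in the history: business days with no observation.
expected_days = pd.date_range(prices.index.min(), prices.index.max(),
                              freq="B")
missing = expected_days.difference(prices.index)

# Crude anomaly flag: daily returns more than five standard deviations
# from the mean, which often indicates bad prints or a series that was
# never adjusted for corporate actions such as splits.
returns = prices.pct_change().dropna()
outliers = returns[(returns - returns.mean()).abs() > 5 * returns.std()]

print(f"{len(missing)} missing business days, "
      f"{len(outliers)} suspect daily returns")
```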
McManus stopped short of outlining a delivery timeline for the project, but cloud is becoming an increasingly important part of the technology proposition Bloomberg offers its clients, whose appetite for off-premise technologies continues to grow.
“Over the last couple of years, we’ve started to notice a change in the rhetoric that we’re hearing from clients in terms of their appetite for moving to the cloud,” said McManus.
“My view is that the capital markets have been somewhat slower than other segments in moving to the cloud… so there has been a cultural reticence, but we’ve seen a change.”
Some of this change has come about because organisations’ concerns over the security of cloud have largely been allayed, and because firms feel more comfortable with the practicalities of moving their infrastructure and applications off-premise.
“Trying to figure out how to lift and shift [workloads] and how much effort that would be probably slowed things down a little bit,” said McManus.
Bloomberg backs cloud
In line with this change in attitude, Bloomberg made its first foray into using the public cloud to serve its customers in November 2018, with the release of an Amazon Web Services (AWS)-hosted version of its flagship real-time trading data feed, B-Pipe, for US clients.
“There are still some structural hurdles that mean the pace of movement to the public cloud is slow, but that change in rhetoric and sentiment is absolutely clear,” said McManus.
“We wanted to make that advancement and investment so we were ready for when our customers start to move. We didn’t want to be in a position where customers came and said ‘we want to go now’, and then we would have to put a two-year project in place to support them through that process.”
The B-Pipe offering forms part of Bloomberg’s wider enterprise data product portfolio and is relied on by finance and investment organisations to access real-time market data and information, aggregated from a reported 330 trading exchanges and 5,000 external contributors.
Previously, end-user organisations could access on-premise or hosted variants of the service but, since November 2018, clients have been able to use it via AWS PrivateLink, so they do not have to rely on a public internet connection to run it.
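With PrivateLink, the feed is reached through an interface endpoint inside the client’s own virtual network rather than over the public internet. As a rough sketch of the client side of such a setup, using AWS’s boto3 library: all identifiers here are placeholders, and Bloomberg’s actual endpoint service name and onboarding steps are not described in this article.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create an interface VPC endpoint that privately connects this VPC to
# a provider's endpoint service. All identifiers are placeholders; the
# real service name would come from the provider's onboarding material.
response = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.vpce.us-east-1.vpce-svc-EXAMPLE",
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
)
print(response["VpcEndpoint"]["VpcEndpointId"])
```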
“From a customer perspective, it’s exactly the same service they get in a deployed [on-premise] model,” said McManus. “So, if they have an application that is currently consuming real-time data from Bloomberg that is running in a customer datacentre and they want to move it to the cloud, then from the perspective of how they interface with Bloomberg, very little changes.
“And that’s really important to us because we don’t want customers to think there’s some sort of compromise to the service as a consequence of them moving into the cloud. It’s not in any determinable way slower.”
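That continuity is plausible because Bloomberg’s real-time feeds are consumed through its BLPAPI library, where the connection endpoint is configuration rather than code. A minimal subscription sketch in Python follows; the host name is a placeholder, and a production B-Pipe connection would also need authentication and entitlements, which are omitted here.

```python
import blpapi

# Only the host would typically differ between an on-premise deployment
# and a cloud-hosted endpoint; "bpipe.example.internal" is a placeholder.
options = blpapi.SessionOptions()
options.setServerHost("bpipe.example.internal")
options.setServerPort(8194)  # default BLPAPI port

session = blpapi.Session(options)
if not session.start():
    raise RuntimeError("failed to start Bloomberg API session")

# Subscribe to last-price updates for a single security.
subscriptions = blpapi.SubscriptionList()
subscriptions.add("IBM US Equity", "LAST_PRICE", "",
                  blpapi.CorrelationId("ibm"))
session.subscribe(subscriptions)

# Consume events; SUBSCRIPTION_DATA events carry the tick updates.
while True:
    event = session.nextEvent(500)  # timeout in milliseconds
    if event.eventType() == blpapi.Event.SUBSCRIPTION_DATA:
        for msg in event:
            print(msg)
```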
Read more about financial services and cloud
- The financial services community has gone from being one of the least likely sectors to adopt cloud to becoming one of its keenest users, as regulator attitudes to using the technology have become more accommodating.
- Nationwide Building Society is in the throes of a cloud and DevOps-focused effort to re-platform its digital banking and mortgage services, its director of mobile and digital, James Smith, tells Computer Weekly.
As detailed during the roundtable, Bloomberg is now expanding the availability of the AWS-hosted B-Pipe to clients in Europe and Asia.
“It’s absolutely identical,” said McManus. “We said, let’s roll that out in one region first, let’s get the adoption, iron out any wrinkles and then follow the customer demand to other regions.”
In line with that sentiment, the company is also working on bringing B-Pipe to the platforms of other major cloud providers, said Andrey Rybka, head of compute architecture in the office of the CTO at Bloomberg, during the roundtable.
“It just happens that the customer demand was mostly in AWS [at that time] and that’s where we started [the roll-out],” said Rybka. “But we are definitely investing in all the major cloud providers and will be wherever customer demand rises.”
In a statement supporting the geographical expansion of B-Pipe’s availability, Scott Mullins, head of worldwide financial services business development at AWS, said capital markets firms, in particular, are increasingly seeing the benefit of running more of their applications and workloads in the cloud.
“The ability to connect quickly and effortlessly to Bloomberg’s enterprise-quality market data in AWS regions around the world and immediately bring to bear our industry-leading data analytics, data management and machine learning tools is a competitive advantage in an industry where the quality of data, and the speed at which you can derive insights from it, matter,” said Mullins.