With increasing speed and data availability, front office systems for risk and portfolio management are now moving to cloud environments. At the same time, the petabytes of data individual firms handle have made it a necessity for them to consider the cloud for some of their mission-critical activities.
The move to cloud has been enabled by an ecosystem of new cloud startups popping up seemingly every week, disrupting both legacy infrastructures and applications. Meanwhile, public cloud providers are expanding services in IaaS, PaaS, and data and analytics. Amidst all the pressures, both buy-side and sell-side firms are migrating front-office applications to cloud.
New firms that are not burdened with legacy systems are now, in some cases, operating entirely in cloud, while many others are working in a hybrid model.
Firms generally start with less time-sensitive functions such as end-of-day pricing, but are quickly moving up to more latency-sensitive applications, where timing can matter at the microsecond or nanosecond level.
“It’s rapidly transforming for the last three to four years. The trend that’s really coming to fruition now: all workloads are available in the cloud,” said Cory Albert, head of enterprise data cloud strategy at Bloomberg.
Security: New technology, new concerns
Using cloud technology requires fundamental shifts in how technologists configure systems, creating some uncertainty. At Bloomberg’s recent Data & Tech Summit in New York, 65% of attendees cited security concerns as the biggest obstacle in moving mission-critical front- to middle-office workflows to the public cloud. Coming in a distant second was the complexity of interconnected systems, followed by latency concerns.
CTOs often find themselves in the role of helping others get comfortable with the idea of cloud or hybrid cloud technology. Speaking on a panel at the summit, technologists acknowledged that concern about security is more about lack of familiarity. They observed that in addition to providing a wealth of software security tools, public cloud providers such as AWS or Microsoft Azure often invest more in the physical security of their data centers. Panelists also said they’ve focused on security by limiting access points and segmenting, so businesses that don’t talk to one another don’t need to be connected.
Creating barriers between businesses – like watertight compartments on a ship – helps contain damage in case there is a security breach. “Even if there is a leak, you’ve limited and contained it to that specific area,” said Arun Kumar, chief architect at Citadel LLC. He added that segmenting is also helpful in other ways, such as billing and cost controls.
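The compartmentalization idea can be made concrete with a small sketch. The toy policy below (all service and segment names are hypothetical, not from the source) denies traffic between business units by default and permits cross-segment connections only through an explicit, auditable allow-list, mirroring the "watertight compartments" approach the panelists described:

```python
# Toy model of network segmentation: each business unit lives in its own
# segment, and a connection is denied unless both endpoints share a segment
# or an explicit cross-segment allow rule exists. Names are illustrative.

SEGMENTS = {
    "equities-trading": {"pricing-svc", "order-gateway"},
    "risk-analytics": {"var-engine", "scenario-runner"},
    "corporate-it": {"hr-portal"},
}

# Explicit exceptions, kept deliberately small and easy to audit.
ALLOW_RULES = {("risk-analytics", "equities-trading")}

def segment_of(service: str) -> str:
    """Return the segment a service belongs to."""
    for seg, members in SEGMENTS.items():
        if service in members:
            return seg
    raise KeyError(f"unknown service: {service}")

def is_allowed(src: str, dst: str) -> bool:
    """Default-deny: allow only same-segment or explicitly permitted traffic."""
    s, d = segment_of(src), segment_of(dst)
    return s == d or (s, d) in ALLOW_RULES

# A breach inside one compartment cannot reach unrelated segments:
assert is_allowed("pricing-svc", "order-gateway")    # same segment
assert is_allowed("var-engine", "pricing-svc")       # explicit allow rule
assert not is_allowed("hr-portal", "order-gateway")  # contained by default
```

In a real deployment the same default-deny principle is typically expressed with cloud-provider primitives such as separate accounts, VPCs, and security group rules rather than application code; segment labels of this kind also map naturally onto the billing and cost-allocation tags Kumar alluded to.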
Even though barriers are essential for security, interconnectivity is also crucial to cloud architecture, whether it’s connecting on-premise systems to the cloud or linking multiple cloud providers.
In the past, data availability was chief among those barriers to cloud adoption: the task of feeding data into cloud-based applications fell to the firm consuming the data. Bloomberg is working to ensure all datasets are accessible from the cloud, and has made the Bloomberg Market Data Feed (B-PIPE) available through cloud-native delivery via AWS.
Cloud-native delivery eliminates many of the latency limitations of working with an internet connection-based API.
“The growth of cloud is changing the nature of the business, with data providers increasingly catering to platforms rather than people,” said Albert.
Enabling business to move faster
The shift to cloud is creating some fundamental changes to the industry, including the cadence of development. Cloud is freeing up technologists to spend more time on development work and less time “keeping the lights on,” as is required with large legacy systems.
Cloud also enables faster innovation. What used to take months for a traditional on-premise installation can now be done in hours.
“Before, you had to provision the machines, institute the processes, and take months to get a proof of concept done. Customers get to experiment much faster and spin machines up and down and more. It’s so easy, and you don’t have wars with admins over provisioning machines,” said Mauricio Gonzalez Evans, chief executive of BCC Group.
That pace is likely to accelerate further as more application providers emerge on the cloud, giving technologists even more tools to improve the pace and scale of software deployments. Stephen Muench, Chief Technology Officer at Murex North America, told the panel about a client that found latency issues with a managed cloud service that would support applications they were planning to move to the cloud. Within a year, however, the service had improved so much that its latency was not just acceptable; it was better than what the client had on-premise.
Size and agility
The experience of the move to cloud varies dramatically based on the size and structure of the firm and the legacy of their existing systems. Kaveh Ghahremani, Head of Platform Engineering, Architecture and Cloud at TD Securities, commented on his experience: “TD is a sizeable company: seventy thousand people, many departments, thousands of applications and systems, and for us just jumping on the cloud was not an option. We had to work our way into it.” Prioritizing which of hundreds or thousands of applications might move to the cloud is something that requires careful planning and needs to be done in a stepwise fashion.
Still, both large and small firms have adopted a cloud-first strategy for new technologies. Market pressure is also pushing both buy-side and sell-side firms to offer new services that help their users make quicker, better-informed decisions. All these factors are driving faster development and adding pressure to move the front office, and other parts of the business, to cloud.