VMblog recently connected with Scott Gnau, VP of Data Platforms at InterSystems, to learn more about the InterSystems IRIS data platform, solving challenges related to data management, and tips for capital markets firms undergoing digital transformation.
VMblog: What are the biggest challenges related to data
management for capital markets firms and how does InterSystems IRIS address
these challenges?
Scott Gnau: Some of the largest data management challenges that capital markets organizations face are access to, and interoperability of, accurate and timely data. Trading tools rely on massive amounts of data from
millions of sources, varying in structure, to complete transactions
successfully and to drive reporting, compliance, and analytics applications. Increasingly large volumes of data are assembled, integrated, and processed in real time, not only to drive the transactional business but also to provide analysts, traders, and brokers with a full view of their operational landscape. Even the smallest increases in data latency, gaps in processing efficiency, or inaccurate information will skew results and lead to incorrect outcomes. We call this "unhealthy data."
With access to a solution such as the InterSystems IRIS® data platform, which can clean and normalize data easily while eliminating silos, capital markets organizations can power their applications with the latest insights quickly and accurately.
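As a rough illustration of what cleaning and normalizing data can look like in practice, the sketch below maps trade records from two hypothetical feeds with different layouts onto one common schema before they reach a shared platform. The feed formats, field names, and the normalize_trade helper are illustrative assumptions for this example, not part of InterSystems IRIS itself.

```python
from datetime import datetime, timezone
from decimal import Decimal

# Hypothetical raw records from two upstream feeds with different layouts.
FEED_A = {"sym": "IBM", "px": "128.3400", "qty": "500", "ts": "2020-06-01T14:03:22+00:00"}
FEED_B = {"ticker": "IBM", "price": 128.34, "shares": 500, "epoch_ms": 1591020202000}

def normalize_trade(record: dict) -> dict:
    """Map a raw feed record onto one common schema: symbol, price, quantity, executed_at."""
    if "sym" in record:  # feed A layout
        return {
            "symbol": record["sym"].upper(),
            "price": Decimal(record["px"]),
            "quantity": int(record["qty"]),
            "executed_at": datetime.fromisoformat(record["ts"]),
        }
    return {  # feed B layout
        "symbol": record["ticker"].upper(),
        "price": Decimal(str(record["price"])),
        "quantity": int(record["shares"]),
        "executed_at": datetime.fromtimestamp(record["epoch_ms"] / 1000, tz=timezone.utc),
    }

# Both feeds now land in one consistent, silo-free shape that downstream
# transactional and analytical applications can rely on.
normalized = [normalize_trade(r) for r in (FEED_A, FEED_B)]
```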
VMblog: Why is a transactional and analytical database
like InterSystems IRIS important for financial applications?
Gnau: Today's finance industry is moving at an accelerated rate, with immense amounts of data being created every day and accurate decisions required in a matter of microseconds. For mission-critical applications such as financial trading platforms, having a transactional and analytical database in place that supports agility, scalability, security, and speed is vital to deploying a high-performing application that can handle these vast amounts of data.
The notion of simplicity driving efficiency applies
here. In many instances, InterSystems IRIS is able to manage applications
inside a single data platform as opposed to legacy architectures that are
deployed with multiple solution engines including persistent databases,
document databases, in-memory databases, cache distribution engines, and
streaming solutions. InterSystems IRIS, as a multi-model storage engine
integrated with an enterprise cache and integration engine, can deliver the
performance and scale to replace multiple solutions. This means overall data
latency and solution supportability are improved out of the box.
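To make the single-platform idea concrete, here is a minimal sketch of the combined transactional-plus-analytical pattern described above: one connection handles both an operational insert and an analytical aggregate over the same live data, with no separate engine or ETL hop in between. Python's built-in sqlite3 module is used purely as a stand-in so the example runs anywhere; the table and SQL are illustrative, not an InterSystems IRIS-specific API.

```python
import sqlite3

# One engine, one connection: the transactional write path and the analytical
# query path operate on the same live data. sqlite3 is only a stand-in here.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE trades (symbol TEXT, price REAL, quantity INTEGER, executed_at TEXT)"
)

def record_trade(trade: dict) -> None:
    # Transactional path: persist the trade as part of the operational workload.
    db.execute(
        "INSERT INTO trades (symbol, price, quantity, executed_at) VALUES (?, ?, ?, ?)",
        (trade["symbol"], trade["price"], trade["quantity"], trade["executed_at"]),
    )
    db.commit()

def position_summary(symbol: str):
    # Analytical path: aggregate over the same up-to-date data, no ETL hop needed.
    cur = db.execute(
        "SELECT symbol, SUM(quantity), AVG(price) FROM trades WHERE symbol = ? GROUP BY symbol",
        (symbol,),
    )
    return cur.fetchone()

record_trade({"symbol": "IBM", "price": 128.34, "quantity": 500,
              "executed_at": "2020-06-01T14:03:22+00:00"})
print(position_summary("IBM"))  # -> ('IBM', 500, 128.34)
```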
With its proven speed, scalability, and reliability,
InterSystems IRIS continues to power a wide range of mission-critical,
high-throughput applications throughout the front, middle, and back office for
leading global banks. If organizations don't have the correct infrastructure in
place, they put themselves at risk of mishandling customer demand. This can range from increased downtime during updates and maintenance to, worse, an outage caused by unprecedented spikes in usage.
VMblog: What tips do you have for capital markets firms
undergoing digital transformation? What capabilities should they prioritize in
this process?
Gnau: As the finance industry continues to digitally
transform, developers must take the lead in driving innovation by making
healthy data a priority, having the infrastructure in place to avoid costly
outages, and implementing necessary internal workflows to efficiently build and
deploy next-generation applications.
DevOps will also play an increasingly large role. To
successfully meet the real-time demands of the finance industry, development
teams must adopt a DevOps mentality that enables them to be agile and flexible
in their work, and to build, test, and deploy innovations quickly, with zero downtime.
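As one concrete illustration of a zero-downtime release, the sketch below follows a simple blue-green switch-over: the new version is brought up alongside the old one, health-checked, and only then promoted. This is one common pattern rather than anything prescribed in the interview, and the health_check and route_traffic_to functions are hypothetical stand-ins for a real load balancer or service mesh.

```python
def health_check(version: str) -> bool:
    # Hypothetical probe: in practice, hit the candidate version's health endpoint.
    return True

def route_traffic_to(version: str) -> None:
    # Hypothetical switch: in practice, update the load balancer or service mesh.
    print(f"routing 100% of traffic to {version}")

def blue_green_deploy(current: str, candidate: str) -> str:
    """Promote the candidate only if it passes health checks; otherwise keep serving current."""
    if health_check(candidate):
        route_traffic_to(candidate)  # users never see an outage during the flip
        return candidate
    return current  # promotion aborted; the old version keeps serving traffic

active = blue_green_deploy("trading-app-v1", "trading-app-v2")
```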
VMblog: How are current market volatility and remote work
settings forcing new trends in data infrastructure management for capital
markets firms?
Gnau: As global markets face high volatility and heavy trading volumes due to COVID-19, financial services organizations have had to
adjust quickly and digitally transform at record speeds. This, on top of an
increasingly dispersed workforce, has made accurate, reliable, and real-time
data sharing more important than ever before. For analysts, advisors, brokers,
traders, and others in the industry, failing to accurately keep up with
incoming data can lead to miscommunication, downtime, and significant financial
loss. Organizations are finding that data platform infrastructure is critical, as extreme volatility stresses complex solutions and bottlenecks can create outages at precisely the wrong time. They've also found this transition to be
much easier if they were able to adopt and deploy new tools that enhanced
collaboration and on-demand services during these turbulent times.
VMblog: New applications help enterprises sustain and
grow their businesses, but how can organizations ensure that they're fully
leveraging all of this data being produced?
Gnau: When
it comes to a company's data, many organizations are only focused on the outcome
and what they can glean immediately from their data. While the immediate output
is important, it's also vital for businesses to understand that data and the
ongoing enrichment of that data is also an asset and should be treated as such.
So for companies who are deploying new applications, my recommendation would be
to adopt the mindset that data is valuable in the long term as well, and take
the appropriate steps to maintain and protect it.
VMblog: With the amount of data available only set to
increase, how do you think the data management space will evolve in the future?
What types of questions do businesses need to ask about their data strategy in
order to optimize it going forward?
Gnau: With
the increasing amounts of diverse data streaming in from a myriad of sources,
organizations will need to continue to think about recasting their data
management toolset to address this growing complexity; it's no longer a "one-size-fits-all" environment. Organizations should reevaluate their storage
capabilities to make room for this surge in data, look into connectivity and
traceability tools, and strategically think about the different kinds of
solutions that need to come together to optimize the value of their incoming
data.
##