Virtualization Technology News and Information
DataTorrent 2018 Predictions: Big Data Analytics Strategies Take a Speedy Turn

VMblog Predictions 2018

Industry executives and experts share their predictions for 2018.  Read them in this 10th annual series exclusive.

Contributed by Guy Churchward, CEO, DataTorrent

Big Data Analytics Strategies Take a Speedy Turn

Data volume, velocity and variety increased in 2017 with the rise of new technologies such as sensors, smart devices, social platforms and autonomous vehicles, and that growth is expected to continue throughout 2018. A growing influx of data brings new business opportunities that are accessible to many, but how a business harnesses and analyzes this data will determine how much value the extracted insights actually deliver.

2018 will be the year businesses realize that acting on data while it's in motion, rather than analyzing data at rest, unlocks the most valuable outcomes. Here are the top three reasons why:

1. Customers, not vendors, are driving innovation

Technology innovation is happening at a dizzying pace, with myriad technology companies, large and small, taking credit for it. In reality, it's the customers who are the driving force behind many of the game-changing solutions being developed. In our world of big data analytics, customers know what outcomes they need to be more competitive and, in turn, deliver value to their own customers. The problem is that, more often than not, they can't find what they need off the shelf. As a result, innovation is happening at the customer site: customers create their own data science recipes and pursue advanced development using the building blocks the industry provides. In 2018, we'll see that model evolve further: more customers will seek out an accelerated development lifecycle that lets them keep innovating and evolving in smaller, more rapid increments. This might not sound like much of a shift, but it means the vendor community is no longer the one breaking new ground. The innovation baton is firmly in the hands of customers, pushing vendors both to deliver outcome-based, rapidly assembled building blocks and to find ways of bringing this wealth of customer-originated ideas to a wider audience with monetary reciprocity.

2. The data lake is recognized as an antiquated catch-all analytics approach and the Achilles' heel of the fast insight and action needed to compete. Et tu, Batch!

With the advent of the Internet of Things, data growth is set to accelerate. Data sources are shifting from humans to machines as workloads move from web to mobile to machine-generated data, creating a dire need to scale out data pipelines cost-effectively. Big data and cloud ecosystems have realized they cannot be just about search indexing or data warehousing; they need to service all enterprise data flows, whether human-generated (web, mobile) or machine-generated. The ability to respond in real time provides a dramatic competitive advantage: an enterprise that can do predictive analytics in real time will gain an edge over one that cannot. Big data needs to be real-time, agile, and operable. Moreover, to get real-time responsiveness, agility, and to some extent operability back, data lakes cannot sit in the middle of this data flow. The data lake served companies fantastically well through the data "at rest" and "batch" era; by 2015 it was already becoming clear the architecture was overused, and it has now become the Achilles' heel of REAL real-time data analytics. Parking data first and analyzing it afterward puts companies at a massive disadvantage. When it comes to gaining insights and taking action as fast as compute allows, companies relying on stale event data create a total eclipse of visibility, action, and any possible immediate remediation. This is one area where "good enough" will prove strategically fatal!
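The contrast above between "park first, then analyze" and acting on data in motion can be sketched in a few lines. This is a toy illustration only, not DataTorrent or Apache Apex code; the sensor values and alert threshold are hypothetical, chosen to show that the batch path can only surface anomalies after the whole data set has landed, while the streaming path can react to each event as it arrives.

```python
import time

THRESHOLD = 100  # hypothetical alert threshold for this sketch

def event_stream():
    """Simulated sensor readings arriving over time (hypothetical values)."""
    for reading in [12, 47, 9, 103, 55, 8, 120]:
        yield {"value": reading, "ts": time.time()}

def batch_analyze(events):
    """Data at rest: land everything in a 'lake' first, then scan it.

    Anomalies are only visible after the whole batch has been collected,
    so any remediation happens after the fact."""
    lake = list(events)  # park the data first
    return [e["value"] for e in lake if e["value"] > THRESHOLD]

def stream_analyze(events):
    """Data in motion: inspect each event as it arrives, no intermediate store.

    In a real pipeline the alert would trigger remediation immediately,
    while the event is still fresh."""
    alerts = []
    for e in events:
        if e["value"] > THRESHOLD:
            alerts.append(e["value"])  # act now, not after a batch window
    return alerts

print(batch_analyze(event_stream()))   # same anomalies, found after the fact
print(stream_analyze(event_stream()))  # same anomalies, flagged on arrival
```

Both paths find the same anomalies; the difference the article argues for is *when* they are found, and whether the event data is still fresh enough to act on.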

3. The commoditization of enterprise applications is on the rise

Long gone are the days when critical enterprise applications were developed in multi-year cycles. The rapid speed at which companies, and their adversaries (ransomware, account takeovers, etc.), are evolving requires the technology industry to change its enterprise application development and delivery model. The market moves much faster now: enterprise-grade applications need to be deployed in weeks or months, certainly not years, yet new apps generally don't get the "bake-in" time that builds industrialized robustness. A general rule of thumb: if a customer knows they need a change, they're already late. Commoditized enterprise applications give customers critical building blocks at enterprise grade, getting them to an outcome-based result within a quarter or two at most. The demands are such that there won't be any leeway on availability or timelines, so let's get used to developing in the digital economy.


About the Author

Guy Churchward

Guy Churchward is President and CEO of DataTorrent. Prior to DataTorrent, Guy was at Dell EMC, where he served as President of its Core Technologies Division, responsible for redefining the modern data center with a comprehensive portfolio of core storage and data protection solutions. Prior to joining EMC, Churchward was President and CEO of LogLogic, an enterprise log and security intelligence platform company. He has also served as Vice President and General Manager of the Data Protection Group at NetApp and Vice President and General Manager of BEA's WebLogic Products Group. In addition, Churchward has held senior management positions at Sun Microsystems (formerly Tarantella Inc.), The Santa Cruz Operation (formerly IXI), Accenture (formerly Binder Hamlyn) and Olivetti.

Published Tuesday, January 09, 2018 7:43 AM by David Marshall