Virtualization Technology News and Information
Datameer 2018 Predictions: Big Data in the New Year

VMblog Predictions 2018

Industry executives and experts share their predictions for 2018.  Read them in this 10th annual series exclusive.

Contributed by Datameer Management Team

Big Data Predictions - 2018

2017 has been an exciting year for big data industry trends as big data in the cloud continues to take off. And, as always, we're hopeful that 2018 will give rise to even more breakthroughs and innovation in the big data ecosystem.

So, what can we look forward to in 2018? We persuaded four of the experts at Datameer to give their predictions for what we'll be seeing in the coming year.

Keep reading to learn more!

Frank Henze, Vice President of Innovation

Frank Henze 

  1. Businesses will increase the agility of analytics by moving more workflows to the cloud. This will not replace on-premises storage or computation solutions; vendors that can bridge both worlds and run analytics where the data resides will gain significant traction.

  2. Self-service data preparation and data quality assurance will become an increasingly integrated part of the analyst's toolbox. AI will help analysts cleanse data, recommend next actions, and even automatically discover patterns, trends, and insights.
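A recommendation step of this kind can be pictured with a simple rule-based sketch. The `suggest_cleanups` helper and the sample DataFrame below are purely illustrative, not part of any Datameer product; real AI-assisted prep would learn such rules rather than hard-code them:

```python
import pandas as pd

def suggest_cleanups(df: pd.DataFrame) -> list[str]:
    """Scan a DataFrame and recommend next cleansing actions,
    in the spirit of AI-assisted self-service data prep."""
    suggestions = []
    for col in df.columns:
        null_frac = df[col].isna().mean()
        if null_frac > 0:
            suggestions.append(f"{col}: impute or drop {null_frac:.0%} missing values")
        if df[col].dtype == object:
            # Inconsistent casing often signals duplicate categories
            values = df[col].dropna()
            if values.str.lower().nunique() < values.nunique():
                suggestions.append(f"{col}: normalize casing to merge duplicate categories")
    return suggestions

df = pd.DataFrame({"city": ["Austin", "austin", "Boston", None],
                   "sales": [100, 120, None, 90]})
for s in suggest_cleanups(df):
    print(s)
```

Each suggestion names the column and the recommended action, which is the "recommend next actions" pattern in miniature.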

Raghu Thiagarajan, Vice President of Product

Raghu Thiagarajan 

  1. The move to Hadoop hasn't been as effective as anticipated because of the challenge of distilling and extracting value from the data lake. Instead, data gravity is shifting (and will continue to shift) to the cloud and will be driven by two trends. The first: Getting quality data into alternative stores on the cloud through self-service data prep. The second: Enabling easy access to that data through the emerging class of cloud data warehouses.

  2. Because of the separation of compute and storage, the cloud will continue to become even more attractive for large-scale data processing and access.

Andrew Brust, Datameer's Advisor for Marketing and Innovation, and CEO of Blue Badge Insights

Andrew Brust 

  1. Data volumes will continue to grow at such a tear that we'll simply accept it as normal. Because of this, platforms and tools built on robust scale-out architectures will be essential. Big Data won't be a big deal anymore, but small data tools, used on their own, will become unacceptable aberrations.

  2. The industry will start clamoring for machine learning standardization. There are so many libraries, platforms, and approaches that ML's state of the art is simply too fragmented. Algorithm libraries may need to shake out so we can focus on two or three key ones. More importantly, tools will need to create an abstraction layer over these libraries so that they can be used and integrated in a uniform fashion.
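One way to picture such an abstraction layer is a thin common interface with per-library adapters behind it. This is a minimal sketch, not an existing standard; `Model`, `MeanBaseline`, and `evaluate` are hypothetical names, and the baseline stands in for an adapter that would wrap a real library such as scikit-learn or TensorFlow:

```python
from abc import ABC, abstractmethod

class Model(ABC):
    """Hypothetical uniform interface papering over disparate ML libraries."""
    @abstractmethod
    def fit(self, X, y): ...
    @abstractmethod
    def predict(self, X): ...

class MeanBaseline(Model):
    # Stand-in for a library-backed adapter; always predicts the training mean.
    def fit(self, X, y):
        self.mean = sum(y) / len(y)
        return self
    def predict(self, X):
        return [self.mean] * len(X)

def evaluate(model: Model, X, y):
    """Tooling written against Model works with any adapter,
    regardless of which library backs it."""
    preds = model.fit(X, y).predict(X)
    return sum(abs(p - t) for p, t in zip(preds, y)) / len(y)

print(evaluate(MeanBaseline(), [[1], [2], [3]], [2.0, 4.0, 6.0]))  # mean absolute error
```

The point is that `evaluate` never mentions a concrete library: swapping the backing framework means writing a new adapter, not rewriting the pipeline.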

  3. AI, meet politics. Given the way analytics and targeted messaging have changed the way political campaigns are run, it seems inevitable that artificial intelligence will enter the fray as well. While this will only just get started in 2018, expect to see AI used in campaign strategy, ad buys, rally logistics and more. AI won't replace humans in these roles, but will become an important tool in their work.

John Morrell, Senior Director of Product Marketing

John Morrell 

  1. The gap between BI and big data will finally be bridged in a manner that will allow people to take advantage of, explore and use all of their big data in the most effective manner.

    Big data platforms will expand to unify data engineering, management, and access facilities, providing ONE place where big data can be curated, explored, and consumed by traditional BI tools. This eliminates the need to re-create the EDW stack on big data, and eliminates the need to use slow SQL access methods on Hadoop.

    This will enable BI users to finally explore new areas with different datasets to answer a new generation of business questions.

  2. An integrated approach to governance, data security, and metadata will emerge that takes into account what the data is and how it is used, and applies policies based on the real business aspects of the data. Until now, most approaches have revolved around securing the data rather than governing it based on how the business uses it.

    Big data is changing the ways companies use data, enabling reuse of data assets in a variety of different ways. Security and governance policies will no longer be applied directly to datasets, but rather to metadata tags and constructs. The result will be a much more holistic approach to data management, making the data curation, governance, and stewardship processes more efficient and effective.
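Tag-driven policy resolution of this kind can be sketched in a few lines. The tag names, catalog entries, and default-deny choice below are all hypothetical, purely to illustrate policies attaching to metadata tags rather than to individual datasets:

```python
# Policies keyed by metadata tag, not by dataset
POLICIES = {"pii": "mask", "financial": "restrict", "public": "allow"}

# Hypothetical metadata-catalog entries mapping columns to tags
COLUMN_TAGS = {
    "customer.email": ["pii"],
    "customer.region": ["public"],
    "orders.revenue": ["financial"],
}

def effective_policy(column: str) -> str:
    """Resolve the strictest policy attached to a column's tags;
    untagged columns fall back to default-deny."""
    strictness = {"allow": 0, "mask": 1, "restrict": 2}
    tags = COLUMN_TAGS.get(column, [])
    actions = [POLICIES[t] for t in tags] or ["restrict"]
    return max(actions, key=strictness.__getitem__)

print(effective_policy("customer.email"))  # mask
```

Because enforcement keys off the tags, re-tagging a column (or reusing a dataset in a new context) changes its treatment everywhere without touching per-dataset rules.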

  3. I see four different big data in the cloud usage models evolving:
  • On-Demand Bursting: Most processing will take place on-premises, but overflow is directed to the cloud (e.g., retailers during the holiday season).
  • Hybrid: Some data processing will take place in the cloud and some on-premises; the results will then be brought together in a hybrid fashion to deliver an end result.
  • Targeting Cloud Data Warehouses: Data will be sourced from the cloud or on-premises, and the Enterprise Data Warehouse will move to the cloud.
  • General Purpose: All processing will take place in the cloud, with full flexibility and scaling as needed (e.g., a large financial asset management firm with a variety of business units that wants to use the processing power of the cloud but might not know the extent of its processing needs; as the business grows, it can scale vertically and horizontally as needed).
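The bursting model in particular reduces to a simple routing decision. This toy sketch (with hypothetical names and thresholds) only shows the shape of that logic:

```python
def route_job(pending_jobs: int, on_prem_slots: int) -> str:
    """On-demand bursting: keep work on-premises while capacity
    remains, and direct the overflow to the cloud."""
    return "on-prem" if pending_jobs <= on_prem_slots else "cloud"

# A retailer's normal load fits on-premises; holiday peaks burst out.
print(route_job(pending_jobs=40, on_prem_slots=100))   # on-prem
print(route_job(pending_jobs=400, on_prem_slots=100))  # cloud
```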


Published Tuesday, December 26, 2017 6:47 AM by David Marshall