VMblog's Expert Interviews: Gary Oliver of Blazent Talks Data Quality Management and Latest Research Findings


New findings from a 451 Research report commissioned by Blazent reveal a sharp disconnect between C-level executives' and data scientists' attitudes toward data quality, despite the negative impact poor data can have throughout the enterprise.  In fact, fewer than half (40%) of executives said they were "very confident" in the quality of their organization's data, while 94% cited areas hurt by inaccurate data, chief among them lost revenue (42%) and bad decision making (39%).

To find out more about this subject and the research report, I spoke with Gary Oliver, CEO of Blazent. 

VMblog:  What motivated Blazent to conduct a survey around this specific topic, and are you surprised at what you found?

Gary Oliver:  We deal with these issues routinely as part of our product and service deployments. Our sense was that the problem was far more pervasive than what we were seeing directly, and we wanted to quantify its extent. The results did not completely surprise us; anyone in business who has dealt with IT services or operations has run into the effects of poor data quality, because it has a direct impact on their day-to-day work.

VMblog:  What's the cause of the disconnect on the severity of data quality and its impact on ROI throughout the enterprise?

Oliver:  The likeliest scenario is poor reporting, or reporting that is not geared to the information needs of business users. IT and lines of business (LoB) will look at the same data and see very different things, and most data quality tools are geared to the needs of IT, not the business user. Using tools that are not designed to address both technical and business needs, combined with poor alignment between IT and LoB, leads to a very pronounced disconnect.

VMblog:  Although 94% of enterprises recognize the importance of accurate data for creating business value by unearthing new revenue streams and lowering costs, why are so few respondents looking to implement data quality management tools in the next 12 months?

Oliver:  Most people intuitively assume that good data leads to good decisions, but those same people may not know they have data issues. From our work over the past several years, we have found that roughly 40% of the data in use is poorly aligned, inconsistent, or incomplete, and our customers were not aware of the depth or breadth of the problem. When we show it to them, the urgency to fix the problem grows, because it has a tremendous impact on operational performance and risk.
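
To make that kind of finding concrete, here is a minimal sketch of the sort of profiling pass that surfaces incomplete or inconsistent records. It is illustrative only; the field names and rules are hypothetical, not Blazent's product or methodology:

```python
# A minimal, hypothetical profiling pass: flag records that are incomplete
# or internally inconsistent. Field names and rules are illustrative only.

REQUIRED_FIELDS = ("asset_id", "hostname", "owner", "last_seen")

def flagged_fraction(records):
    """Return the share of records failing completeness or consistency checks."""
    if not records:
        return 0.0
    flagged = 0
    for rec in records:
        # Completeness: every required field must be present and non-empty.
        missing = any(not rec.get(field) for field in REQUIRED_FIELDS)
        # Example consistency rule: the site code should appear in the hostname.
        site = rec.get("site", "")
        inconsistent = bool(site) and site not in rec.get("hostname", "")
        if missing or inconsistent:
            flagged += 1
    return flagged / len(records)

records = [
    {"asset_id": "A1", "hostname": "nyc-web01", "owner": "ops",
     "last_seen": "2016-01-20", "site": "nyc"},
    {"asset_id": "A2", "hostname": "web02", "owner": "",
     "last_seen": "2016-01-21", "site": "sfo"},
]
print(f"{flagged_fraction(records):.0%} of records flagged")  # -> 50% of records flagged
```

Even a simple pass like this, run across a full inventory, is what turns a vague suspicion of bad data into a number the business side can act on.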

VMblog:  How can/should the enterprise remedy this situation? 

Oliver:  Better alignment between LoB and IT would be a big step in the right direction, together with tools that are geared to the information needs of business users. The impact of improved data quality will be most noticeable on the business side, in areas critical to the success of the business such as customer satisfaction (CSat), profitability, and time to value. When that side of the equation starts pushing hard for improvements in data quality, the needle will start to move. For that to happen, you need quantification of the depth and breadth of the problem, and reports like the 451 survey are extremely helpful in this regard.

VMblog:  What statistic or finding jumped out at Blazent the most?

Oliver:  Probably the perceived lack of faith in existing data quality management (DQM) initiatives. Either the initiatives aren't meeting expectations, or expectations are so low that any information reaching the business side of the equation is filtered through a worst-case assumption.

VMblog:  How do you expect data quality management to evolve over the next 12 months, as enterprises expect data volumes and sources to triple with the rise of big data, machine learning, predictive analytics, etc.?

Oliver:  In the short term, data quality issues are likely to get worse.  As an obvious example, the consumerization of IT has, for better or worse, opened a Pandora's box of data and workflow interaction. Social media is now closely coupled with transactional and other data, so there is an inconsistent deluge of data coming in from a vast array of relatively unstructured sources, and the operating structure of most IT departments (many of which have legacy elements that pre-date social media) is not set up to handle this variety or volume of data. Social media is moving faster than IT's ability to build infrastructure for it, and as long as the two operate in the same ecosystem, there is going to be a challenge.

And that's just one example; you can apply the same logic to nearly any data-centric aspect of business (CRM, mobility, etc.), and all of these are major drivers of the need for better machine learning tools. Data quality will improve over the long term as IT continues to adjust, but in the short term companies will struggle to keep up. Blazent's entire focus is on solving the problem of getting complete, accurate data in context to drive good decisions, and as companies see the impact this has on business performance, we expect data quality initiatives to get the priority and focus they deserve.
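
A simplified sketch of the reconciliation problem Oliver describes: when multiple sources report the same asset with conflicting attributes, a pipeline should surface the conflict for review rather than silently pick a winner. The source and field names below are hypothetical illustrations, not Blazent's implementation:

```python
# A hypothetical sketch of multi-source reconciliation: merge per-source
# views of the same asset and collect conflicting values for review
# instead of silently overwriting them.

def reconcile(views):
    """views: list of (source_name, record) pairs describing one asset."""
    merged = {}     # field -> first-seen value
    conflicts = {}  # field -> list of (source, disagreeing value)
    for source, record in views:
        for field, value in record.items():
            if field not in merged:
                merged[field] = value
            elif merged[field] != value:
                conflicts.setdefault(field, []).append((source, value))
    return merged, conflicts

views = [
    ("cmdb",       {"asset_id": "A1", "os": "RHEL 6", "owner": "ops"}),
    ("monitoring", {"asset_id": "A1", "os": "RHEL 7"}),
]
merged, conflicts = reconcile(views)
print(conflicts)  # -> {'os': [('monitoring', 'RHEL 7')]}
```

Keeping the first-seen value while logging every disagreement is a deliberately naive policy; the point is that conflicts become visible data quality signals instead of silent overwrites.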

##

Once again, thank you to Gary Oliver, CEO of Blazent, for taking time out to speak to VMblog.com.  And if you'd like to read their full report, you can find it here.

Published Thursday, January 28, 2016 6:31 AM by David Marshall