2019 survey reveals disconnect between data trust and data quality

Forty-Seven Percent of Respondents Report Untrustworthy or Inaccurate Insights from Analytics Due to Poor Data Quality

The results of a Syncsort survey exploring data quality and organizations’ confidence in data across their enterprise reveal a disconnect between trust in data, understanding of that data, and how it informs business decisions.

While 69 percent of respondents stated their leadership trusts data insights enough to inform business decisions, they also said only 14 percent of stakeholders had a very good understanding of the data. Of the 27 percent who reported sub-optimal data quality, 72 percent said it negatively impacted business decisions.

"This disconnect raises questions about whether executives who rely on data for insights are in fact being properly informed," said Gary Allemann, MD at Syncsort partner Master Data Management. "Bias and inaccuracies in reporting ultimately lead to a breakdown in trust, and may, for example, be a factor in the relatively low adoption of artificial intelligence and machine learning technologies in Southern Africa."

Multiple Data Sources, Governance and Volume Are Top Data Quality Challenges

  • The top three challenges companies face when ensuring high quality data are multiple sources of data (70%), applying data governance processes (50%) and volume of data (48%).
  • More than three quarters (78%) have challenges profiling or applying data quality to large data sets.
  • Twenty-nine percent say they have a partial understanding of the data that exists across their organization, while 48 percent say they have a good understanding.

Data Profiling Tool Adoption Low, Leading to Lack of Visibility into Data Attributes

  • Fewer than 50 percent of respondents take advantage of a data profiling tool or data catalog.
  • Instead, respondents rely on other methods to gain understanding of data, with more than 50 percent using SQL queries and over 40 percent using a BI tool.
  • Of those who reported partial, minimal or very little understanding of their data, the top three attributes respondents lacked visibility into were: relationship between data sets (63%), completeness of data (56%) and validation of data against defined rules (56%).

The Consequences of Poor Data Quality Are Wide-Ranging, from Customer Dissatisfaction to Barriers to Emerging Technology Adoption

  • Of those who reported fair or poor data quality, wasted time was the number one consequence (92%), followed by ineffective business decisions (72%) and customer dissatisfaction (67%).
  • Twenty-five percent of respondents who reported sub-optimal data quality say it has prevented their organization from adopting emerging technology and methods (such as AI, Machine Learning (ML) and blockchain).
  • Only 16 percent of respondents are confident they aren’t feeding bad data into AI and ML applications.
  • Seventy-three percent are using cloud computing for strategic workloads, but 48 percent of them have partial to no understanding of the data that exists in the cloud. Twenty-two percent rate the quality of their data in the cloud as fair or poor.

“This survey confirms what we’ve been seeing with our customers – that good data simply isn’t good enough anymore,” said Tendü Yoğurtçu, CTO, Syncsort. “Sub-optimal data quality is a major barrier, especially to the successful, profitable use of artificial intelligence and machine learning. The classic phrase ‘garbage-in, garbage-out’ has long been used to describe the importance of data quality, but it has a multiplier effect with machine learning — first in the historical data used to train the predictive model, and second in the new data used by that model to make future decisions.”

“There is so much data being produced today, and it’s creating a significant number of new opportunities and challenges,” said Tendü Yoğurtçu, CTO, Syncsort. “We are seeing cloud and hybrid cloud as the mainstream trends, which is consistent with the results of our 2018 cloud survey. With the gravity of data shifting, organizations are trying to take advantage of the cloud’s elasticity and gain the ability to analyze and deliver trusted data into application pipelines as quickly as possible. These are the precursors to improving data accessibility and taking advantage of the emerging technologies, like machine learning and streaming analytics, that will help deliver more value out of data.”

For more information on the study results, read our blog.
