Blazent Study Reveals Disconnect Between Data Quality and Enterprise Readiness to Pursue Machine Learning and Analytics

Blazent, a leader in IT data quality management, has released new findings from its report, The Role of Data Quality Management in Machine Learning and Predictive Analytics, conducted in partnership with 451 Research. The report, which polled 200 C-level and senior IT leaders, revealed that although 67 percent of respondents have a strong appetite for advanced analytics technologies, more than half (60 percent) have low confidence in their organization’s data quality management practices. Given IT’s reliance on data to drive these initiatives, until data quality processes are properly managed, businesses will continue to suffer the limitations of garbage-in, garbage-out results.

Based on this new research, the pace of adoption for more sophisticated algorithmic analysis is accelerating more quickly than originally anticipated, with 22 percent of respondents already having a program in place and 42 percent planning to implement one in the next 12 months. However, to capitalize on this promise, the enterprise must move data quality management initiatives to the top of the list of both IT and C-suite priorities – a far cry from where it currently stands, with nearly 40 percent still relying on rudimentary manual data cleansing processes.

“Advanced analytics technologies, which were once thought to be cutting edge, are now moving toward mainstream adoption, but enterprises are still struggling to put the proper data management processes in place,” said Carl Lehmann, Research Manager, Hybrid IT Architecture, Integration & Process Management at 451 Research. “As the research reveals, the promise of more advanced technologies like machine learning and predictive analytics will remain a pipe dream until organizations have remedied their data quality management issues.”

Additional findings from the research, including the primary contributors to these data quality issues, include:

  • In addition to the 67 percent of respondents who will use machine learning for predictive analytics, 66 percent will use it to enhance their recommendation engines and 59 percent for analysis segmentation.
  • More than 50 percent aren’t sure whether all of the data sources required for their business purposes have been integrated before the cleansing process begins.
  • Nearly 45 percent of organizations are in a reactive mode, relying on reports to find data errors and then hoping the proper corrective action, which is often manual, takes place.
  • Nearly 10 percent of organizations have avoided data quality management completely, opting for a “hope for the best” approach.
  • Half of respondents believe that the data quality management practices in place, and therefore the quality of the data used, are only satisfactory or good enough, but admittedly not great.

“As foundational as data quality is to an organization’s success, we continue to see that a majority of IT executives are not confident in their data quality management practices. The pace of change and the seemingly never-ending increase in the amount of data and data sources are significant drivers of this lack of confidence. As a result, critical business decisions are made without a complete and accurate picture,” said Charlie Piper, CEO at Blazent. “These findings further validate how crucial it is for IT and the C-suite to continue to prioritize data quality, employing a streamlined, organization-wide process for data management.”

Conducted by 451 Research, the survey comprised 200 C-level, senior IT, and key business decision makers from companies with at least $500 million in annual revenue, spanning industries including technology, manufacturing, financial services, and healthcare.

 
