Data Quality

What Is Data Quality Management? – An Introduction

Graham Needham
May 17, 2022 | 7 min read
Bad data has a serious impact because it leads to bad decisions, and it is well known that it costs businesses time, effort, and money regardless of industry, location, or organization size. So, what can be done about it? In this blog post we present a primer on solving this challenge with data quality management.

What is data quality management?

Data quality management (DQM) is not easy to define, but we can start with this: it is a set of principles, processes, guidelines, roles, responsibilities, and tools that enables an organization to achieve data quality, and it can be used as part of the organization's overall data governance strategy. We have just published a new whitepaper, How to Establish a Data Quality Management Framework, and below is an overview of some of its content.

Data quality management can directly influence how data consumers (external consumers, or customers, as well as internal ones) perceive the organization. Consumers are sensitive to bad data because it creates frustration and distrust, so managing data quality improves their satisfaction. In short, implementing data quality management mitigates threats to an organization's internal and external image, lends credibility to its outputs, and significantly increases the added value and benefits delivered to data consumers.

What is data quality?

The definition of data quality is highly dependent on an organization's business, context, and needs, but specific aspects (or "data quality dimensions") can be categorized as follows (a short sketch after the list shows how a few of them can be measured in practice):

  1. Accuracy – How well does your data describe reality?

  2. Consistency – Do your various data stores hold the same, matching records?

  3. Completeness – Is your data as comprehensive as you need it to be?

  4. Uniqueness – Is your data free of duplicate records?

  5. Timeliness – Is your data acceptably up to date?

  6. Validity – Does your data exist in the right format and data type?
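
To make these dimensions tangible, here is a minimal, purely illustrative sketch of how a few of them could be measured on a single table with pandas. The table and its columns (customer_id, email, updated_at) are invented for this example and are not tied to any particular tool.

```python
# Illustrative only: simple measurements of completeness, uniqueness,
# validity, and timeliness on a small, invented customer table.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@example.com", "b@example", None, "d@example.com"],
    "updated_at": pd.to_datetime(["2022-05-01", "2022-05-10", "2021-01-01", "2022-05-16"]),
})

# Completeness: share of non-missing values per column
completeness = df.notna().mean()

# Uniqueness: share of customer IDs that are not duplicates
uniqueness = 1 - df["customer_id"].duplicated().mean()

# Validity: share of emails matching a simple format rule
validity = df["email"].str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False).mean()

# Timeliness: share of records updated within the last 90 days
as_of = pd.Timestamp("2022-05-17")
timeliness = ((as_of - df["updated_at"]) <= pd.Timedelta(days=90)).mean()

print(completeness, uniqueness, validity, timeliness, sep="\n")
```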

These dimensions cover common data quality problems, and there are some famous examples of getting it wrong. In 2019, Hawaiian Airlines charged customers' credit cards amounts ranging from US$17,500 to US$674,000 instead of deducting their airmiles for free tickets – a customer claiming a free ticket that should have cost 150,000 airmiles was instead charged US$150,000!

Worse still, in 1999, NASA's US$327 million Mars Climate Orbiter was destroyed on arrival at the planet and completely lost, simply because one team (Lockheed Martin) was working in imperial units while the other (NASA) was working in metric, and this discrepancy was never picked up during quality control.

Poor data governance and bad data cause disgruntled consumers and cost money.

What produces bad data and how can data quality be improved?

There are many ways bad data can be produced, but several typical root causes of data issues are: the human factor; reports that are built on correct data and contain correct insights but convey misleading information or lead to incorrect decisions because of a lack of business understanding; and technical issues, e.g., a broken sensor.

Data quality management gives you a universal, effective, and targeted framework to identify the specific root cause of an individual data issue and assess its impact. At the same time, it gives you the information needed to decide how to correct the issue and with what priority. The path to data quality improvement depends heavily on how an organization's processes and checks are set up, the issue's ultimate root cause, its impact and priority, and, for example, the budget dedicated to data quality improvement.

Knowing your data is absolutely crucial for data quality management – you need to know the path of your data from the beginning (where it is actually captured) to the end (where it is calculated in reports). Data catalog and business glossary tools, such as those in the Accurity platform, are very helpful for collecting, maintaining, and managing this information across an organization. They combine information about the data sources (data catalog), the business meaning of the data (business glossary), and the relations between them, to clearly capture the data flows and describe the data well.
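
As a purely illustrative sketch (this is not the Accurity data model or API), the information such tools manage can be pictured as simple linked records: catalog entries describing data assets, glossary terms describing their business meaning, and upstream references capturing the data flow. All names below are invented for the example.

```python
# Illustrative only: a toy representation of a data catalog, business
# glossary, and data lineage as plain Python data structures.
from dataclasses import dataclass, field

@dataclass
class GlossaryTerm:              # business meaning of the data
    name: str
    definition: str

@dataclass
class CatalogEntry:              # technical description of a data asset
    name: str                    # e.g. a table or a report
    source_system: str
    terms: list[GlossaryTerm] = field(default_factory=list)
    upstream: list["CatalogEntry"] = field(default_factory=list)  # lineage

revenue = GlossaryTerm("Net revenue", "Gross sales minus returns and discounts.")
orders = CatalogEntry("sales.orders", "ERP")
report = CatalogEntry("monthly_revenue_report", "BI tool",
                      terms=[revenue], upstream=[orders])

# Walking `upstream` answers: where does this report's data come from?
print([entry.name for entry in report.upstream])
```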

Simultaneously, it is necessary to educate, support, and lead the data users and creators. In general, you need to continuously build and increase the awareness about data quality management and the importance of it across the organization. It is always more effective and usually cheaper to prevent the creation of the data issues, rather than finding and mitigating already existing ones in the later stages of the data handling process, via data quality measurement and correction.

You can now get a free trial version of our data quality and data observability solution.

The Data Quality Management process

The recommended data quality management techniques and process steps are listed below; a brief sketch after the list illustrates how some of the rule-related steps can look in practice:

  1. Data profiling and assessment

  2. Requirements management

  3. Rule design

  4. Rule implementation

  5. Rule testing

  6. Metrics definition

  7. Aggregated metrics

  8. Rule execution

  9. Reporting and monitoring

  10. Data issues management

  11. Incident management

  12. Managing data quality continuously
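
As a purely illustrative sketch (not the Accurity DQM process itself), the rule-related steps above, i.e. rule design, implementation, execution, metrics, and feeding data issues management, can be pictured roughly like this. The rule, records, and field names are invented for the example.

```python
# Illustrative only: a data quality rule designed as a predicate, executed
# against records, and summarised into a metric plus a list of issues.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]    # rule implementation: True = record passes

def execute(rule: Rule, records: list[dict]) -> dict:
    failures = [r for r in records if not rule.check(r)]
    return {
        "rule": rule.name,
        "metric": 1 - len(failures) / len(records),  # pass rate as a DQ metric
        "issues": failures,                          # input for issue management
    }

# Example rule (invented business requirement): order amounts must be positive
positive_amount = Rule("order_amount_positive", lambda r: r.get("amount", 0) > 0)

records = [{"amount": 120.0}, {"amount": -5.0}, {"amount": 0.0}]
result = execute(positive_amount, records)
print(result["rule"], f"pass rate = {result['metric']:.0%}", result["issues"])
```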

Learn how to establish your own data quality management framework

Our new whitepaper will help you plan for and achieve your own data quality management framework. It covers everything we have mentioned here in much more detail and walks through the DQM process steps in significantly more depth. This DQM whitepaper is a free download from the Accurity website.

Or let us show you how our Accurity all-in-one data intelligence platform can help you utilize high quality data to make business-critical decisions with confidence – book your own free, personalized 1-on-1 demo online.

Graham Needham
Content Editor