6 Data Quality Metrics You Can't Afford To Ignore

Written by
Pavel Najvar

‘Poor quality destroys business value.’

That’s according to a Gartner report, which found that businesses attribute losses of up to $15 million every year to poor data quality.

So, from the outset, it’s clear that improving data quality is key for driving real business outcomes. And with statistics like this in your pocket, championing the importance of mature data quality management just got a little easier.

But when faced with budget constraints, bureaucracy, complex systems, and an ever-growing list of security and compliance regulations, how do you improve data quality within your organization?

Well, to begin, you need to track the right data quality metrics, so we’ve listed six we believe you can’t afford to ignore.

1. Completeness

This measures whether all the necessary data is present in a specific dataset. You can think about completeness in one of two ways: at the record level or at the attribute level. Measuring completeness at the attribute level is a little more complex, however, as not all fields will be mandatory.

An example metric for completeness is the percent of data fields that have values entered into them.
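To make that concrete, here’s a minimal sketch in Python. The records and field names are invented for illustration, and the rule that treats None and empty strings as ‘missing’ is an assumption you’d adapt to your own schema.

```python
# A sketch of completeness: percent of fields with a value entered.
# The records below are illustrative; None and "" count as missing here.
records = [
    {"name": "Ada Lovelace", "email": "ada@example.com", "phone": None},
    {"name": "Alan Turing", "email": None, "phone": "555-0101"},
]

total_fields = sum(len(r) for r in records)
filled_fields = sum(1 for r in records for v in r.values() if v not in (None, ""))

print(f"Completeness: {100 * filled_fields / total_fields:.1f}%")  # 66.7% here
```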

2. Accuracy

How accurately does your data reflect the real-world object? In the financial sector, data accuracy is usually black or white – it either is or isn’t accurate. That’s because the number of pounds and pence in an account is a precise figure. Data accuracy is critical for compliance and regulatory approval in large organizations, where the penalties for failure are high.

An example metric for accuracy is the percentage of recorded values that match the actual, real-world value.
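As a rough sketch, accuracy can be computed by comparing recorded values against a trusted source of truth, assuming you have one. The account IDs and balances below are made up:

```python
# Accuracy: percent of recorded values that match a trusted reference.
# Account IDs and balances are invented for the example.
recorded = {"acct_1001": 250.00, "acct_1002": 99.95, "acct_1003": 10.00}
actual   = {"acct_1001": 250.00, "acct_1002": 99.95, "acct_1003": 12.00}

matches = sum(1 for k in actual if recorded.get(k) == actual[k])
print(f"Accuracy: {100 * matches / len(actual):.1f}%")  # 66.7%
```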


3. Consistency

Keeping data synchronized across different databases is essential, and software systems are often the answer for maintaining consistency on a daily basis. Transaction consistency, for example, is where a system is programmed to detect incomplete financial transactions so that the errors can be rolled back and both balances remain in check.

CloverDX helped one client improve the consistency of their data by ensuring email addresses and phone numbers matched across all of their databases. That saved the client over $800,000 – not bad for a simple adjustment to their data quality strategy.

An example metric for consistency is the percent of values that match across different records/reports.
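Here’s one way that check might look, assuming two systems – a hypothetical CRM and billing database – keyed by the same customer ID:

```python
# Consistency: percent of customers whose email matches across two systems.
# The system names and values are hypothetical.
crm     = {"c1": "ada@example.com", "c2": "alan@example.com",   "c3": "grace@example.com"}
billing = {"c1": "ada@example.com", "c2": "alan.t@example.com", "c3": "grace@example.com"}

shared = crm.keys() & billing.keys()
consistent = sum(1 for c in shared if crm[c].lower() == billing[c].lower())
print(f"Consistency: {100 * consistent / len(shared):.1f}%")  # 66.7%
```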

4. Validity

Validity measures how well data conforms to required formats and value rules – for example, ensuring all dates follow the same format, whether day/month/year or month/day/year. If we take our previous case study as an example, the client relied heavily on direct mail, but addresses in the database weren’t formatted correctly, which made it hard to identify members of the same household or employees of the same organization. Improving their data validation process eliminated this issue for good.

An example metric for validity is the percentage of values that fall within the domain of acceptable values.
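A small sketch of that kind of check, assuming ISO dates (YYYY-MM-DD) are the acceptable format; the sample values are invented:

```python
import re

# Validity: percent of values conforming to an expected format,
# here assumed to be ISO dates (YYYY-MM-DD). Sample values are invented.
ISO_DATE = re.compile(r"^\d{4}-\d{2}-\d{2}$")
dates = ["2019-05-31", "31/05/2019", "2019-06-01", "June 1, 2019"]

valid = sum(1 for d in dates if ISO_DATE.match(d))
print(f"Validity: {100 * valid / len(dates):.1f}%")  # 50.0%
```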


5. Timeliness

Timeliness reflects how up to date data is at a specific point in time. For example, when a customer moves to a new house, how quickly do they inform their bank of the new address? Few people do this immediately, so the timeliness of their data suffers.

Poor timeliness can also lead to bad decision making. For example, strong results from the first 3 months of a banking reward scheme can be used as evidence to continue the scheme. But you shouldn’t use that same data (from the initial 3 months) to justify the scheme’s extension after 6 months. Instead, update the data to reflect the full 6-month period. In this case, old data with poor timeliness will hamper effective decision making.

An example metric for timeliness is the percent of data that can be obtained within a certain time frame, for example, weeks or days.
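As an illustrative sketch, assuming each record carries a last-updated timestamp and that a 30-day freshness window counts as ‘timely’ (both assumptions, not rules):

```python
from datetime import datetime, timedelta

# Timeliness: percent of records refreshed within an acceptable window.
# The 30-day threshold and the timestamps are assumptions for the sketch.
now = datetime(2019, 5, 31)
last_updated = [
    datetime(2019, 5, 28),   # fresh
    datetime(2019, 5, 1),    # exactly on the 30-day boundary
    datetime(2018, 12, 24),  # stale
]

window = timedelta(days=30)
timely = sum(1 for t in last_updated if now - t <= window)
print(f"Timeliness: {100 * timely / len(last_updated):.1f}%")  # 66.7%
```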

6. Integrity

To ensure data integrity, it’s important to maintain all the data quality metrics we’ve mentioned above as your data moves between different systems. Typically, data integrity is broken when data is stored in multiple systems.

For example, as client data moves from one database to another, does it remain the same? Or, equally, are there any unintended changes to your data following the update of a specific database? If the data comes through unchanged, its integrity has remained intact.

An example metric for integrity is the percent of data that is the same across multiple systems.
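A minimal sketch of that comparison, with invented records keyed by a shared ID:

```python
# Integrity: percent of records that survive a move between systems
# unchanged. The record contents are illustrative only.
source = {"r1": ("Ada", "London"), "r2": ("Alan", "Manchester")}
target = {"r1": ("Ada", "London"), "r2": ("Alan", "Manchester ")}  # stray space crept in

intact = sum(1 for k in source if target.get(k) == source[k])
print(f"Integrity: {100 * intact / len(source):.1f}%")  # 50.0%
```

Even a stray trailing space counts as an unintended change, which is exactly what this metric is meant to catch.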

Moving from data quality metrics to actionable fixes

When the stakes are so high, it's important not to make things worse by applying the wrong processes to data quality management. While it’s not brain surgery, a careful hand is still required to keep your data clean so you end up seeing healthy data metrics across the board.

If you combine data monitoring with the right validation and cleansing processes, you’ll be on the right path to improving your data quality.

But there’s one piece to the puzzle that can make all the difference…

Let software do the leg-work so you don’t have to

Poor-quality data led the Financial Conduct Authority to fine twelve firms in 2018, with the damage totaling $42 million. But it’s not only about avoiding fines – reducing data errors will also help grow your business and deliver on its desired outcomes.

We talked earlier about how proactive data validation and cleansing helped one business save over $800,000. Well, in the same stroke, we also increased their orders by 12 percent – all by improving data quality. The lesson: never underestimate the power of a good spring-clean.

While strong data quality management starts with understanding and monitoring the metrics we’ve discussed above, doing this manually is problematic. Not only is it difficult to get right, but it also saps time from your development team that could be better spent implementing strategic change.

However, with the right data quality tools, you can automate the validation and cleansing process at the source, reducing errors, complexity and the time it takes to deliver real value from your data. While poor data quality will continue to drain money from your organization, good data quality will revolutionize your decision-making processes and increase your chances of generating profit.

For more insights and an in-depth look at what proactive management can do for your business, check out our complete data quality guide. 


Posted on May 31, 2019