Black Tiger Insights

The Hidden Cost of Poor Data Quality

High-quality data is essential for effective decision-making and compliance. This blog explores how data quality impacts business success, the costs of poor data, and real-world cases such as NASA’s Mars Climate Orbiter and GDPR enforcement. Companies can ensure data quality through governance, technology, a data-aware culture, and regular audits.

Imagine you’re the CEO of a Fortune 500 company. You’re about to make a decision that will shape the future of your business for years to come. You’ve dug through the reports, analyzed market trends, and consulted with your top executives. You’re confident in your choice. But what if I told you that your decision was based on a single misplaced decimal point?

This isn’t a hypothetical scenario. In 1999, NASA lost its $125 million Mars Climate Orbiter because one engineering team used metric units while another used English (imperial) units for a key spacecraft operation. A simple unit discrepancy became one of the most expensive data errors in history.

While not all data quality issues result in spacecraft-destroying mistakes, the principle remains the same: the quality of your data directly impacts the quality of your decisions.

The critical role of data quality in business

Today, data is the backbone of companies, much as electricity powers the machines around us. It fuels innovation, informs strategic decisions, and drives profitability when used correctly. However, for data to be a true asset, its quality must be non-negotiable.

According to Forrester, companies that prioritize data quality are 58% more likely to exceed their revenue goals compared to those that don’t. The reason is simple: high-quality data leads to better insights, which drive more effective strategies and operations.

However, achieving high data quality is a significant challenge for most organizations. Harvard Business Review reveals that only 3% of companies’ data meets basic quality standards.

But what exactly do we mean by “data quality”? It is defined by data that is:

1. Accurate: Correctly representing the real-world entity or event

2. Consistent: Free from contradictions across different systems or reports

3. Valid: Conforming to the format, type, and range defined for each field

4. Complete: Containing all necessary information (including metadata)

5. Timely: Up-to-date and available when needed

6. Unique: Avoiding duplication of data records

To these, we should add one more dimension:

7. Compliant: Following data standards and regulations
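Several of these dimensions can be checked mechanically. As a minimal sketch (the field names, rules, and thresholds below are illustrative assumptions, not a real product’s API), validity, completeness, timeliness, and uniqueness checks over a batch of records might look like this:

```python
import re
from datetime import date, timedelta

# Hypothetical customer records; field names are assumptions for illustration.
records = [
    {"id": 1, "email": "anna@example.com", "country": "NL", "updated": date(2024, 5, 1)},
    {"id": 2, "email": "not-an-email", "country": "NL", "updated": date(2020, 1, 1)},
    {"id": 1, "email": "anna@example.com", "country": "NL", "updated": date(2024, 5, 1)},
]

def check_quality(rows, today=date(2024, 6, 1)):
    """Flag records that break validity, completeness, timeliness, or uniqueness rules."""
    issues = []
    seen_ids = set()
    for row in rows:
        # Validity: the email field must match a basic pattern.
        if not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", row["email"]):
            issues.append((row["id"], "invalid email"))
        # Completeness: every required field must be present and non-empty.
        if not all(row.get(f) for f in ("id", "email", "country", "updated")):
            issues.append((row["id"], "incomplete record"))
        # Timeliness: records untouched for over a year are considered stale.
        if today - row["updated"] > timedelta(days=365):
            issues.append((row["id"], "stale record"))
        # Uniqueness: a repeated ID indicates a duplicated record.
        if row["id"] in seen_ids:
            issues.append((row["id"], "duplicate id"))
        seen_ids.add(row["id"])
    return issues
```

Accuracy and consistency are harder to automate, since they require comparison against the real world or across systems rather than against a local rule.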

When data meets these criteria, it becomes a powerful tool for decision-making. However, when it falls short, the consequences can be very expensive. The HBR article points out that poor data quality is not just a technical issue – it’s a business problem that affects the entire organization.

Moreover, HBR emphasizes that improving data quality is not a one-time project but an ongoing process. It requires a combination of technology, processes, and a data-aware culture where every employee understands their role in maintaining data quality.

The hidden costs of poor data quality

While the Mars Climate Orbiter incident is a dramatic example of the consequences of poor data quality, the everyday impact on businesses can be just as significant, though less visible.

Gartner suggests that poor data quality costs organizations an average of $12.9 million annually. This figure includes decreased productivity, missed opportunities, and damaged reputation.

But the costs go beyond immediate financial impact. Bad data erodes trust within organizations. When employees can’t rely on the information they’re given, decision-making slows, innovation stops, and morale suffers.

Real-world consequences: case studies

Let’s dive into some real-world examples illustrating the consequences of poor data quality:

1. The $6 billion Excel error: In 2012, JPMorgan Chase reported a trading loss of $6 billion, partly due to errors in their value-at-risk (VaR) model. A small error in their Excel spreadsheet led to a major underestimation of potential losses.

2. The misplaced patients: In 2017, the UK’s National Health Service faced a crisis when it was revealed that 709,000 patient documents had been misplaced. This data quality issue put patient lives at risk and cost millions to fix.

3. The tax office blunder: In 2014, Amsterdam’s tax office paid out €188 million in government rent subsidies to around 10,000 households instead of the intended €2 million. The reason? Government software calculated payments in cents instead of euros. This error cost an additional €300,000 to resolve.
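The Amsterdam case is, at bottom, an implicit-unit bug: the same number means two very different amounts depending on whether it denotes cents or euros. A minimal sketch (the figures are simplified and hypothetical) of how forcing the caller to name the unit surfaces the 100x discrepancy:

```python
# Simplified illustration of the cents-vs-euros mismatch; amounts are made up.
CENTS_PER_EURO = 100

def subsidy_in_euros(amount, unit):
    """Normalize a monetary amount to euros, forcing the caller to name the unit."""
    if unit == "euros":
        return amount
    if unit == "cents":
        return amount / CENTS_PER_EURO
    raise ValueError(f"unknown unit: {unit}")

# A payment computed in cents but paid out as if it were euros is 100x too large:
intended = subsidy_in_euros(15500, "cents")   # EUR 155.00 intended per household
mistaken = subsidy_in_euros(15500, "euros")   # what gets paid if the unit is assumed
assert mistaken == intended * CENTS_PER_EURO
```

The same pattern, units carried explicitly alongside values, is exactly what was missing in the Mars Climate Orbiter interface between the two engineering teams.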

GDPR and the legal risks of bad data

The introduction of the General Data Protection Regulation (GDPR) in 2018 added a new dimension to the data quality challenge. Under GDPR, organizations are required to ensure personal data’s accuracy and relevance.

Article 5 of GDPR states that personal data must be “accurate and, where necessary, kept up to date.” This means that maintaining high data quality is not just a business imperative; it’s a legal requirement.

The consequences of non-compliance can be severe. GDPR allows for fines of up to €20 million or 4% of global annual turnover, whichever is higher. In 2019, the UK’s Information Commissioner’s Office announced its intention to fine British Airways £183 million for a data breach affecting around 500,000 customers (the final penalty, issued in 2020, was £20 million). This case underscores the potential scale of GDPR penalties.

Poor data quality can also lead to:

1. Inability to fulfill Data Subject Rights (DSRs): If you can’t find or verify an individual’s data, you can’t comply with requests for access, rectification, or erasure.

2. Breach of data minimization principle: Retaining unnecessary or outdated data violates GDPR’s data minimization requirement.

3. Reputational damage: Data quality issues that lead to GDPR violations can drastically damage public trust in an organization.

Best practices for ensuring data quality

Given the high stakes, how can companies ensure they maintain high data quality? Here are some best practices:

1. Implement a data governance framework: Establish clear policies and procedures for data management. Assign Data Stewards responsible for data quality in their domains.

2. Use data quality tools: Systems that automatically validate data against predefined rules. Top tools also offer data quality scores.

3. Foster a data-aware culture: Train employees on the importance of data quality, encouraging everyone to see themselves as stewards of organizational data.

4. Regular audits: Periodically review data assets to identify and rectify quality issues.

5. Invest in the right technology: Ensure consistency across all data sources and systems.
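The data quality scores mentioned in point 2 are typically just an aggregate over rule checks: the share of (record, rule) pairs that pass, rolled up into one number per dataset. A minimal sketch (the rules and dataset below are hypothetical assumptions, not any vendor’s scoring method):

```python
def quality_score(rows, rules):
    """Return the fraction of (record, rule) checks that pass, as a 0-100 score."""
    if not rows or not rules:
        return 0.0
    passed = sum(1 for row in rows for rule in rules.values() if rule(row))
    return 100.0 * passed / (len(rows) * len(rules))

# Hypothetical rules for a customer dataset:
rules = {
    "has_email": lambda r: bool(r.get("email")),
    "known_country": lambda r: r.get("country") in {"NL", "BE", "DE"},
}
rows = [
    {"email": "a@example.com", "country": "NL"},
    {"email": "", "country": "FR"},
]
print(quality_score(rows, rules))  # 50.0: 2 of 4 checks pass
```

Tracking such a score over time turns the periodic audits of point 4 into a continuous metric that Data Stewards can own.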

Conclusion: The future of business decisions

As we move into the age of AI and machine learning, the importance of data quality will only grow. These advanced technologies are only as good as the data they’re trained on. Poor quality data can lead to biased algorithms and flawed AI models.

The future belongs to organizations that recognize data quality as a strategic asset. By investing in data quality, businesses avoid costly mistakes and build a foundation for innovation, growth, and competitive advantage.

In the words of W. Edwards Deming, the father of quality management: “In God we trust. All others must bring data.” Today we might add: “And that data better be good.”
