Data Quality: Be Practical, Not Idealistic

Visualizing data quality as low-hanging fruit can be misleading


Many data leaders I have spoken to started off visualizing data quality as low-hanging fruit, believing everyone would wholeheartedly support the need for clean data. But often, after committing to ensure complete data quality, they find the needle moving at a glacial pace, with progress driven more by customers or regulators than by a proactive desire to do the right thing.

Ram Narasimhan


Why does this dichotomy exist when there is so much to gain by having great data quality?

The challenge of achieving the right data quality more often than not boils down to priorities. Business teams have their hands full with day-to-day tasks: they have a set sequence of activities to perform, and over the years those activities have expanded to include workarounds that mitigate bad data. They are used to IT or other teams cleaning up data temporarily, just enough to carry on with their daily work.

Another reason is the trade-off between cost and value. Fixing data quality permanently often carries a high upfront cost relative to the incremental value it generates over the years. The time and effort needed are often prohibitive and tedious, especially when the value is intrinsic and not readily visible. This lack of visibility, combined with the glacial pace of value realization, prompts businesses to focus on tactical data quality rather than invest in a long-term solution. In my interactions with the business, the question I get most often is about data quality, but on delving a bit deeper I realize the need is more transactional: just enough quality to get on with daily tasks.

We also need to quantify the value of data quality beyond the conventional markers of revenue, cost, and efficiency. Let's widen the metrics to include cycle times, automation (e.g., data availability through APIs), regulatory compliance, customer experience and satisfaction, and data security and protection. Data quality does not only result in great analytics; it also enables far more automated operations.
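These wider markers only help if they can be measured. As a minimal sketch of what such measurement might look like, assuming a hypothetical set of customer records (the field names, rules, and freshness window below are illustrative, not any particular standard), completeness, validity, and freshness can be computed directly:

```python
from datetime import date

# Hypothetical customer records; None marks a missing value.
records = [
    {"id": 1, "email": "a@example.com", "updated": date(2024, 5, 1)},
    {"id": 2, "email": None,            "updated": date(2019, 1, 15)},
    {"id": 3, "email": "not-an-email",  "updated": date(2024, 2, 10)},
]

def completeness(rows, field):
    """Share of rows where the field is populated."""
    return sum(r[field] is not None for r in rows) / len(rows)

def validity(rows, field, check):
    """Share of populated rows that pass a validation rule."""
    populated = [r for r in rows if r[field] is not None]
    return sum(check(r[field]) for r in populated) / len(populated)

def freshness(rows, field, max_age_days, today):
    """Share of rows updated within a retention window (a simple
    stand-in for 'keeping personal data up to date')."""
    return sum((today - r[field]).days <= max_age_days for r in rows) / len(rows)

today = date(2024, 6, 1)
print(f"email completeness: {completeness(records, 'email'):.0%}")
print(f"email validity:     {validity(records, 'email', lambda v: '@' in v):.0%}")
print(f"freshness (<2y):    {freshness(records, 'updated', 730, today):.0%}")
```

Tracking such scores over time turns data quality from a vague aspiration into a trend a business sponsor can see moving.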

One of the little-known impacts of data quality is in the areas of data protection and security. With countries across the world bringing in their own GDPR-like data protection bills, and with the increased focus on PII data, the ambit of data quality has grown to include keeping personal data up to date, data retention, data segregation, archiving, and audit. India's PDP (Personal Data Protection) bill, akin to GDPR and still under debate, will bring in a raft of data regulations for global firms and GCCs in India, and data leaders need to widen the ambit of data quality to cater to these too.

The importance of strategic data

It has made me wonder: does strategic data quality exist, or is tactical data quality the need of the hour? This is especially true in telecom, where disparate application stacks across BSS and OSS often span generations of technology, reams of regulations, and wafer-thin product margins, leading teams to do whatever it takes just to get an order provisioned and delivered.

Considering the challenges stacked against a long-drawn-out data quality exercise, a smart and realistic strategy is to abandon 100% data quality as a vanity exercise and instead focus on small but significant goals that deliver incremental value. The approach should morph into a mix of strategic and transactional data quality initiatives.

Let's split the focus into a hypothetical matrix of data quality choices, an alternative high-value approach that applies to almost all organizations, and to telecom companies in particular:

  1. Prioritize mandatory data, such as master and reference data, as well as data essential for business acceleration, including customer, product, and equipment data. Focus strategic data quality initiatives on these entities.
  2. Use a tactical approach to clean operational data that needs immediate attention, including incremental changes to prevent future data corruption.
  3. Leverage transformation programs, application rationalizations, and M&A activities to push for data quality improvements.
  4. Implement a program akin to bounty hunting to incentivize data quality. For example, teams that enter or modify data accurately could receive a percentage of the savings from reduced data-cleaning costs.
  5. Automate data analysis using AI/ML to help identify and resolve data issues quickly and efficiently, ultimately improving data quality.

The right approach

The above approaches offer a mixed bag of results when it comes to ensuring long-term data quality. While data leaders may still fall short of perfectly clean data, these approaches can give businesses much better data, leading to increased productivity and supporting better data governance and quality in the long run.

One aspect of data quality that is often overlooked is its maintenance over time. This requires a cultural shift among users and constant effort to keep unwanted data from mixing with clean data, much like keeping a river clean: pollutants keep flowing in from upstream, so the water must be monitored and filtered continuously rather than cleaned once. Although contamination can never be completely prevented, approaches like incentivization, monitoring and cleaning cycles, ongoing user training, increased ownership, and automation can minimize it. While no approach is foolproof, an incremental approach lets data leaders focus on the larger goal of driving value from data and transforming their organization into a data-driven one.
- The author of this article is Chief Data Officer at Colt Technology Services
