Tectonic Business Models And The Inevitability Of Poor Data Quality

One of the recurring questions I’ve been asked by business leaders in the past is this: given the huge sums of money spent on applications, data management, staff and tools, why are you convinced we will have poor data quality if we don’t invest in a data quality management strategy?

It’s a fair point; one must always empathise with people who sign off IT and information management budgets only to discover that their data is of substandard quality.

One of the principal reasons I believe poor data quality is all but guaranteed when data quality management is omitted lies in the often tectonic shifts that take place in the business models of all organisations.

Let’s take a telecoms company as an example, since in my experience they often suffer from extremely poor levels of data quality.

I routinely find in telecoms companies that no sooner has an application been commissioned and launched than it starts to experience growing pains. Over time a gap emerges between the business model the telco wants to innovate and deliver services against and what their data management landscape actually supports.

If you’ve watched the John Owens Business Modelling for Data Quality webinar series you will know that data is not created by processes but by functions. 

The set of functions performed by the various service units and customer interactions makes up the current business model of the organisation. However, as business leaders devise new business models, we experience major shifts in business function requirements that can appear almost tectonic in nature: periods of relative calm punctuated by almighty upheavals as our current applications and data struggle to meet the new business model head-on.

In one telecoms company I witnessed this phenomenon first-hand over a period of five years, as constant regulatory and competitive pressures forced the company to reinvent its business model again and again.

This constant change is why data quality levels can gradually deteriorate over time without anyone even noticing. Workers simply adapt by modifying local practices and developing new standard operating procedures to cope with these warps and buckles in the current business functions, until their applications and data no longer support the new world they have been steered towards. At this point whole new systems are commissioned, migrations take place and the whole cycle repeats.

This is why poor data quality is a virtual certainty without data quality management.

Every company, regardless of size and sector, will experience this disconnect between business models, functions, systems and data. It is incredibly difficult, if not impossible, to keep them 100% aligned. They only need to drift out of alignment by a fraction for poor data quality to arise.

We need data quality management to keep these gaps and tectonic shifts across our organisations under control, so that we can monitor their impact and react accordingly.

Sometimes we simply have no option but to cleanse and transform our data so that data from the old business model can be shoe-horned into the newly revised model. It may be contentious to some, but it’s still a fact of life for most companies.
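To make that “shoe-horning” a little more concrete, here is a minimal sketch in Python of mapping records captured under an old business model into a revised one. The field names and the mapping table are purely hypothetical illustrations rather than anything drawn from a real telco system; the point is simply that unmappable records should be flagged for remediation rather than silently dropped.

```python
# Hypothetical sketch: shoe-horning legacy records into a revised product model.
# The field names and mapping table below are illustrative assumptions only.

LEGACY_TO_NEW_PRODUCT = {
    "PAYG-BASIC": "PREPAID-STARTER",
    "PAYG-PLUS": "PREPAID-FLEX",
    "CONTRACT-12M": "POSTPAID-ANNUAL",
}

def transform_record(record):
    """Map a legacy record to the revised model, or report why it cannot be mapped."""
    legacy_code = record.get("legacy_plan")
    new_code = LEGACY_TO_NEW_PRODUCT.get(legacy_code)
    if new_code is None:
        return None, f"no mapping for legacy plan {legacy_code!r}"
    return {**record, "product_code": new_code}, None

def migrate(records):
    """Split a batch into migrated records and issues needing manual remediation."""
    migrated, issues = [], []
    for record in records:
        new_record, issue = transform_record(record)
        if issue:
            issues.append(issue)  # route to a remediation queue, don't drop silently
        else:
            migrated.append(new_record)
    return migrated, issues
```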

It’s also the reason why continuous data quality assessment and monitoring should not be scorned as a needless cost centre. As business models change and adapt, we have no option but to monitor the impacts on our data and ensure that any gaps are managed accordingly.
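As a rough illustration of what continuous assessment can look like in practice, the sketch below evaluates two simple quality rules, completeness of a mandatory field and validity of its format, and raises alerts when results fall below a target. The field name, pattern and thresholds are invented for illustration only; real rules would be derived from the business functions the current model actually requires.

```python
import re

# Hypothetical quality rules; the field, pattern and thresholds are illustrative only.
MSISDN_PATTERN = re.compile(r"^\+?\d{10,15}$")

def completeness(records, field):
    """Fraction of records in which the field is present and non-empty."""
    if not records:
        return 1.0
    return sum(1 for r in records if r.get(field)) / len(records)

def validity(records, field, pattern):
    """Fraction of populated values that match the expected format."""
    values = [r[field] for r in records if r.get(field)]
    if not values:
        return 1.0
    return sum(1 for v in values if pattern.match(v)) / len(values)

def assess(records, completeness_target=0.98, validity_target=0.99):
    """Return alerts for any rule that falls below its target; run on a schedule."""
    alerts = []
    c = completeness(records, "msisdn")
    if c < completeness_target:
        alerts.append(f"msisdn completeness {c:.1%} below target {completeness_target:.0%}")
    v = validity(records, "msisdn", MSISDN_PATTERN)
    if v < validity_target:
        alerts.append(f"msisdn validity {v:.1%} below target {validity_target:.0%}")
    return alerts
```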

When a business leader next asks the question “Why should I fund your continuous data quality management initiative?”, perhaps we should respond by saying there is absolutely no need at all, provided they never plan to innovate or adapt the way the business delivers value to the marketplace.