10 Reasons Why Data Quality and Data Governance Initiatives Fail

With her extensive experience leading Data Quality and Data Governance initiatives across different domains, Purvi Ramchandani, Certified Information Management Professional Expert (CIMP-Ex), describes ten situations in which Data Quality and Data Governance initiatives have a higher probability of failure.


1. When the focus is on working in a reactive mode instead of a proactive mode

According to a report by Gartner titled 'Measuring the Business Value of Data Quality', 40% of business initiatives fail due to poor data quality.

When asked for support, some leaders respond that the organization has too many fires to put out and that Data Quality is not a high priority, since they do not hear the Business complaining about it.

The industry is receptive to Data Governance and Data Quality because of competitive pressures, the desire for financial transparency, and the mandate to comply with regulatory directives such as Sarbanes-Oxley, Dodd-Frank, CCAR, Basel III, BSA/AML, and HIPAA. However, data tends to be governed in a reactive mode, after an issue has grown into a problem too large to ignore.

In today’s era of data-driven decision making, good data is king, and data quality is what gives the king power. A king without power is of no use, and data without quality serves no purpose. If this is true, shouldn’t data be governed for quality in the first place?

2. When people leading these initiatives have a lack of knowledge, experience, and passion

Data Governance and Data Quality are distinct and relatively new disciplines compared to established functions such as Data Warehousing, Architecture, and Application Development. Hence they cannot be managed in the same way as those traditional functions.

Often, when people with no previous experience in these disciplines are asked to lead them, they try to apply a 'one-size-fits-all' strategy that becomes a recipe for imminent failure.

For example, some leaders feel that Data Quality is a straightforward process, simply consisting of identifying rules and fixing exceptions. They feel it is a trivial effort that can be delivered by any third party consultant.

However, Data Quality is not just identifying rules; that is only one of the first steps in the process. Data Quality is the end-to-end process and framework to assess, monitor, and improve data quality through cross-functional collaboration among many different business and IT teams.
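For illustration only, here is a minimal Python sketch of that distinction, using pandas and made-up data: the rule itself is one line, while the assessment and exception routing around it are what turn it into a repeatable process.

```python
# A minimal sketch (not a full framework): a single data quality rule plus the
# assessment and exception routing around it. Data and column names are
# illustrative assumptions, not from any real system.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "email": ["a@example.com", None, "not-an-email", "d@example.com"],
})

# Step 1: define the rule, the part often mistaken for the whole effort.
email_is_valid = customers["email"].str.contains("@", na=False)

# Step 2: assess, quantifying conformance so it can be tracked over time.
print(f"email validity: {email_is_valid.mean():.0%}")

# Step 3: route exceptions to business stewards for root-cause analysis,
# rather than silently patching them in a downstream copy.
print(customers.loc[~email_is_valid, ["customer_id", "email"]])
```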

People attempting to implement Data Governance and Data Quality read white papers and hire outside experts to help them. However, Data Governance and Data Quality cannot be accomplished just by listening to PowerPoint presentations or reading high-level documents.

Instead, Data Quality comes from continuous and copious amounts of effort.

In the end, it is the execution that matters the most.

You cannot underestimate the amount of work and commitment involved in making these processes a success. To put forth a plan with realistic expectations, you need people with knowledge and proven experience on staff to guide and sustain the ongoing effort.

3. When the right people with the right skills are not involved in the execution of the initiative

Data Quality and Data Governance initiatives require specific skills and training for people to understand the concepts and apply them to ensure successful execution and ongoing support.

Simply adding headcount is not enough.

Hiring one person with the right experience can save the cost of a dozen inexperienced resources.

The onshore-offshore consulting model does not work effectively for Data Quality initiatives. While offshore development capabilities can be leveraged to build the rules and scorecards (a very small fraction of a DQ initiative), the majority of the effort lies in working closely with the business, understanding its processes, and driving effective resolution to improve data quality. This level of engagement requires folks with a business/analytic mindset.

We often see IT driving Data Quality initiatives, which leads to constant confrontation between the business and technology teams.

On the business side, remediation stewards need:

  • The knowledge of business processes

  • The ability to collaborate with different teams

  • The authority to make important decisions

  • The capacity to identify root causes of issues

  • The ability to recommend process/system improvements that prevent recurring issues

Some of the soft skills that help make Data Quality and Data Governance initiatives successful are described in depth in her previous post: Implementing an enterprise data quality initiative: Interview with Purvi Ramchandani.

4. When Data Governance initiatives are driven without Business involvement and collaboration

The business community needs to be engaged as a partner in all important decisions from the outset of any Data Quality and Data Governance initiative.

Data Quality is everybody’s issue, and it can only be resolved conclusively through a clear separation of duties between the Business and IT teams: IT focuses on technology delivery, and the Business focuses on quality improvement.

The success of Data Quality depends on the active collaboration between IT and Business working towards a shared goal of building trust in data. Frequent and clear communication can be a very useful tool in keeping everybody informed. IT and business leaders need to work together to manage, monitor, and improve the quality of data.

5. When there is a lack of Business engagement and commitment

High-level business sponsorship is a start.

However, the Business has to be committed not only to initiating the Data Governance effort but also to living it on a day-to-day basis, because a Data Governance initiative has a higher probability of success when it is driven by the Business.

Data Governance needs to be led by an experienced business leader, supported by business-facing teams that are trained to embrace the tools and processes as day-to-day responsibilities.

The Data Governance Leader needs to act as a change agent, actively encouraging people to do things differently from what they are used to. If this change is not implemented correctly, it may do more harm than good.

Additionally, business folks need to accept Data Governance and Data Quality processes as aids that help them perform their jobs more effectively, not as an additional responsibility on top of their day job.

Good data helps everyone!

6. When there is a lack of clarity on expectations, accountability, and ownership

The concept of a ‘fluid organization’ does not work effectively in the case of Data Governance and Data Quality initiatives. They require a clear definition and assignment of ownership, accountability, roles, and responsibilities.

There needs to be a clear separation of duties between Business and IT to avoid a duplication of effort, potential confusion, and inevitable chaos.

IT can focus on enabling the Business with proactive data profiling, implementing tools and processes, and training business stewards in the use of those tools and in data lineage.

On the other hand, the Business can focus on actual improvements to the data: fixing data in the source system, monitoring quality trends, identifying the root causes of issues, and recommending process and system improvements.

7. When the expectation is to accomplish too much too soon without a realistic execution plan and resources to help execute the plan

Data Governance and Data Quality are processes that span several years.

You cannot go from not having any data governance program to having an enterprise-wide implementation overnight. You need to start with a project plan that identifies priorities, tasks, resources, and responsibilities.

Data governance requires discovery (profiling), business-rule definition, and documentation phases, with many discussions, interviews, and meetings. Business and IT have to reach a common and holistic understanding of what the data represents and how it is used to make business decisions.
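As a hedged illustration of what the discovery (profiling) phase looks for, the sketch below computes basic completeness and distinct-value measures with pandas; the file and column names are assumptions, and real programs typically use a dedicated profiling tool for this.

```python
# A minimal data-profiling sketch, assuming a tabular extract named
# "accounts.csv" (hypothetical). Dedicated profiling tools report the same
# basic measures: completeness, cardinality, and example values per column.
import pandas as pd

accounts = pd.read_csv("accounts.csv")  # hypothetical source extract

profile = pd.DataFrame({
    "non_null_pct": accounts.notna().mean().round(3),  # completeness per column
    "distinct_values": accounts.nunique(),             # cardinality per column
    "example_value": accounts.apply(
        lambda col: col.dropna().iloc[0] if col.notna().any() else None
    ),
})
print(profile)
```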

Business people representing different functions and processes need to agree on the data definitions and business transformations used to support business decisions. This is a time-consuming and people-intensive process. If you short-circuit it, the definitions and rules are unlikely to be accepted or used by the Business. We cannot boil the entire ocean! Prioritization is important to break the huge amount of work into manageable iterations.

8. When these initiatives are treated as a project instead of an ongoing program

Data Quality and Data Governance are both ongoing efforts, so they cannot be handled as short-term projects with fixed end dates. They need to be run as ongoing programs, with appropriate funding and resources to accomplish the prioritized goals.

The tools, technology, and processes need to be set up with the long-term benefit of the organization in mind. The goal should be to implement the end-to-end framework and functionality, instead of simply implementing rules.

The end-to-end framework should cover assessment (measurement), monitoring, improvement, and the publishing of improvement metrics.
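As a sketch of the monitoring and publishing aspects (table, column, rule, and file names below are assumptions), the same assessment can be re-run on a schedule and its pass rate appended to a trend that stewards and leadership can review.

```python
# A minimal sketch of monitoring and publishing: each scheduled run turns a
# rule result into a scorecard row so quality can be trended over time.
# Table, column, rule, and file names are assumptions for illustration.
import datetime as dt
import pandas as pd

def scorecard_row(rule_name: str, passed: pd.Series) -> dict:
    """Summarize one rule run as a publishable metric."""
    return {
        "run_date": dt.date.today().isoformat(),
        "rule": rule_name,
        "records": len(passed),
        "pass_rate": round(float(passed.mean()), 3),
    }

loans = pd.DataFrame({"loan_id": [10, 11, 12], "rate": [4.5, None, 3.9]})
row = scorecard_row("rate_is_populated", loans["rate"].notna())

# Append to a running scorecard; in practice this would land in a reporting
# table or dashboard rather than a local CSV.
pd.DataFrame([row]).to_csv("dq_scorecard.csv", mode="a", header=False, index=False)
print(row)
```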

9. When there is a lack of transparency

Transparency means providing insight into which processes work and which do not. Since regulators and stakeholders both require confidence in the data, transparency offers a clear view of the quality of the data and how it has been measured.

When the authoritative sources of master and reference data are shared freely across the enterprise, both productivity and performance improve. Automated tools for managing metadata, lineage, and the business glossary can help build a common understanding of enterprise-critical data.

When transparency reveals that something is not working properly, the situation can be addressed quickly. Otherwise, the problem can go undetected and eventually lead to errors and inferior data.

10. When the right tools, technology, and processes are not used

Data Quality Management and Certification is not sustainable without reliable, automated tools to measure and monitor quality, with trending over time.

Integrated and automated tools allow you to reuse and share data quality rules throughout the organization.
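A small sketch of what that reuse can look like in practice, under the assumption that rules are kept as named, shareable checks rather than hard-coded into a single pipeline (the rule and dataset names are illustrative):

```python
# A minimal sketch of shared, reusable data quality rules: each rule is defined
# once and applied to any dataset and column. Names are illustrative assumptions.
import pandas as pd

RULES = {
    "id_is_unique": lambda df, col: ~df[col].duplicated(keep=False),
    "value_not_null": lambda df, col: df[col].notna(),
}

def pass_rate(df: pd.DataFrame, rule: str, col: str) -> float:
    """Apply a shared rule to a dataset and return the fraction of rows that pass."""
    return float(RULES[rule](df, col).mean())

customers = pd.DataFrame({"customer_id": [1, 2, 2]})
vendors = pd.DataFrame({"vendor_id": [7, 8, 9]})

print(pass_rate(customers, "id_is_unique", "customer_id"))  # ~0.33: duplicates found
print(pass_rate(vendors, "id_is_unique", "vendor_id"))      # 1.0: all unique
```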

These tools also need to make it easy to update rules to meet new regulations, new market risks, or changes in business processes.

There is also a need to invest in the right automated tool, with the workflow functionality required to foster collaboration between business teams to resolve exceptions and fix data in the source system.

Studies have found that business users can spend up to 50% of their time fixing data quality errors. The right tools help save time and reduce the cost of delivering Data Quality, providing a scalable platform that can be leveraged across the organization.


About the Author - Purvi Ramchandani

Purvi Ramchandani is a Certified Information Management Professional Expert (CIMP EX) with over 20 years of experience in various information management disciplines.

In the last 10 years, she has been living her passion for Data Quality, Data Governance, and Data Stewardship, focusing on leading enterprise data quality profiling, monitoring, and improvement initiatives. She has led enterprise data quality improvement and data analysis/analytics teams at SVB (Silicon Valley Bank), NetApp, and Washington Mutual Bank (now JPMorgan Chase).

She has led efforts to establish data quality frameworks, standards, and practices. Beyond monitoring and fixing data, she is passionate about finding gaps in systems and processes and driving improvements at the source to prevent problems from recurring. She believes that cross-functional collaboration is the key to the success of any data quality program. With her passion for leadership and her ability to influence without authority, she has successfully driven many cross-functional business and IT initiatives.
