Interview with Larry English, Creator of TIQM

Larry English is recognised as one of the ‘Founding Fathers’ of information quality management.

His ground-breaking "Improving Data Warehouse and Business Information Quality: Methods for Reducing Costs and Increasing Profits" was first published over a decade ago and became essential reading for everyone looking to master information quality. 

More recently Larry published an update to his earlier work in the form of "Information Quality Applied: Best Practices for Improving Business Information, Processes and Systems", introducing some changes to the original methodology and adding several new sections.

We recently interviewed Larry to put forward a range of questions that were supplied by members of Data Quality Pro. Many thanks to all our members who submitted questions for this interview, your assistance was much appreciated.


Data Quality Pro: "Improving Data Warehouse and Business Information Quality” is still one of the most popular publications on information quality. Have there been any changes in your approach to information quality management since the book was originally released almost a decade ago?

Larry English: In my new book, "Information Quality Applied: Best Practices for Improving Business Information, Processes and Systems", I have made some changes, mostly concerning how to use proven techniques from Manufacturing Quality Systems to support Information Quality Management.

The most significant change is an expanded treatment of how to establish a sustainable Information Quality Culture. In Chapter 3, "Implementing and Sustaining an Effective Information Quality Environment (TIQM P6)," I address why TQM and IQ initiatives have failed in the past and how you can avoid those failures as you implement an Information Quality Culture across the Enterprise.

Another key change is rearranging the Process Numbers and precedence in TIQM Processes 4 and 5. The former TIQM P5 "Improve Information Process Quality" has become TIQM Process P4, while the former P4 "Reengineer and Correct Data" has become P5 "Correct Data in the Source and Control Redundant Data." This change establishes that "Information Process Improvement" is the core competency of an effective IQ Culture and Environment. An organization should never correct data before identifying the root cause of the information defects and defining process improvements that eliminate those causes. Data correction is a major COST of Poor Quality Information Processes.

It is the habit of Continuous Process Improvement that earns an organization the right to use the term "Quality" in its IQ Function name.

I have also added a new process step to TIQM P3 "Measure Costs and Risks of Poor Quality Information." This process step, TIQM P3.7, is "Measure Return-on-Investment of Information Process Improvement." It is an essential part of the Business Value Proposition: when the gains of process improvement are compared with the costs of the status quo, it demonstrates that IQ Management is a Profit Center when one concentrates on "designing quality in," not on inspecting and correcting quality out.

I also describe how to use several proven Quality Tools and Techniques from the established Quality Systems, such as Deming, Juran, Crosby, Imai, Baldrige and Six Sigma.

These tools include:

  • The SIPOC (Supplier-Input-Process-Output-Customer) Chart: This is used to capture downstream Knowledge Workers’ Information Quality Requirements

  • QFD (Quality Function Deployment): QFD captures the Customers’ Requirements for new products or services, such as Consumer Goods or Information Systems. I describe how to apply QFD to Information Systems Engineering in Chapter 15.

  • The FMEA (Failure Mode and Effects Analysis) Chart: This tool is effective for analyzing potential risk in major decisions, such as Mergers and Acquisitions or Strategic Investments in new products or services.

Data Quality Pro: The TIQM framework you developed consists of several interrelated components. How do organisations typically implement these components? 

For example, is there a natural tendency to start on one particular component to gain traction or do organisations need to adopt a more substantial implementation of the framework by implementing several components in unison?

Larry English: The TIQM Quality System is a holistic System for Total Quality Information Management, a term which means that Information Quality has become a natural part of Management and Staff behavior.

Organizations often start with a data profiling tool to analyze problems in some key data set. Most often, they follow that with "Data Cleansing” [sic. "Data Correction” or "Information Scrap and Rework” activities]. This is NOT the recommended approach.

The most successful organizations will conduct an IQMM Assessment first, in which we analyze the current degree of maturity of the organization’s ability to apply Information Management and Information Quality Management principles, processes and habits in their organization and the results of their applying these management tools. The biggest barriers that IQ professionals face are cultural. These include performance measures, accountability (or lack of), Management by Specialization (Business Area or Functional Management) as opposed to Management of the Enterprise as a Single System, treating Information Systems as "Products” and not as the automation of work.

After you have conducted an IQMM Assessment, generally with an outside organization, you will have prioritized Information Quality issues and barriers and have focused on a Critical-to-Quality (CTQ) set of information where known problems are high. You then assess the "Information Product Specification" data with a TIQM P1 Assessment for correctness of data names and for completeness, correctness and clarity of data definition and business rule specification. This is followed up, not with data profiling techniques, but by measuring Accuracy, non-Duplication, Completeness, Information Presentation Quality (lack of bias, in proper media), and other CTQ IQ Characteristics.

Once you have discovered the current degree of defects, you should measure the costs of the poor quality information (TIQM Process P3) on the downstream processes. This will provide you a business case argument. By measuring the costs of recovery from process failure and of the information scrap and rework or workarounds performed, you establish a baseline of the costs of poor quality information. The next step is to conduct a TIQM P4 (Improve Information Process Quality) to eliminate the causes of the defects. Finally you come back to P3 and conduct a P3.7 "Measure the ROI of the Information Process Improvement." Now you have the complete Value Circle of finding the defects, improving the process and measuring the positive ROI of TIQM.
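The arithmetic behind that Value Circle can be sketched in a few lines. This is an illustrative sketch only: the function names and dollar figures are invented, not part of TIQM itself.

```python
def cost_of_poor_quality(failure_recovery, scrap_and_rework, workarounds):
    """Baseline annual cost of the status quo (the TIQM P3 measurement)."""
    return failure_recovery + scrap_and_rework + workarounds

def roi_of_improvement(baseline_cost, residual_cost, improvement_cost):
    """The P3.7 step: return on the process-improvement investment."""
    annual_savings = baseline_cost - residual_cost
    return (annual_savings - improvement_cost) / improvement_cost

# Hypothetical figures for a single improvement initiative.
baseline = cost_of_poor_quality(
    failure_recovery=400_000, scrap_and_rework=250_000, workarounds=100_000
)
roi = roi_of_improvement(baseline, residual_cost=150_000, improvement_cost=200_000)
print(f"baseline cost: ${baseline:,}; ROI of improvement: {roi:.0%}")
```

The point of the sketch is the ordering: the baseline (P3) must exist before the improvement (P4), or there is nothing against which to compute the ROI (P3.7).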

Data Quality Pro: What approach do you typically adopt when attempting to obtain senior executive approval for an enterprise information quality program?

Larry English: I take the approach I mentioned in the previous question. The Executive Leadership Team (ELT) is a critical stakeholder and audience. I seek to find out where their hot buttons are. I have specific questions for the Executive Leaders as input to the IQMM Assessment. I help the ELT understand the 180 degree Paradigm Shift from the Industrial Age "Management by Specialization” to the Information Age "Manage the Enterprise as a Single System.” These are fundamental concepts to Quality Management as espoused by Peter Drucker (Management as a Symphony Conductor), W. Edwards Deming (Manage Enterprise as a System), and Masaaki Imai (the next process is the Customer).

Data Quality Pro: It's not just "Big Business” which benefits from information and data that is Accurate, Complete, Consistent, Understood, Relevant, Timely and Trusted.

Government, society as a whole, and indeed the Planet would all clearly benefit. Given this is easy to understand and agree with in principle, why do you think Business, Government and society have been extremely slow to learn, demand change and adapt?

Larry English: My estimate is that only two percent of the Private Sector and only one percent of the Public Sector institutions have the Vision and Maturity to implement successful TIQM Quality Systems. 

Shortly after the inauguration of the new Administration, I submitted an Op-Ed piece to the Washington Post, calling for the Administration to establish an Information Quality Czar to oversee implementing an Information Quality Culture in the Federal Government. 

It was not published. 

Shortly thereafter, I keynoted an IQ Conference attended by a number of federal agency personnel. I gave the attendees a copy of my Op-Ed piece and asked them to seek out people at high enough level to get me in front of the right people to see this happen. 

Not one of the attendees got back with me on this matter!!! 

The tragedy of this is that the federal agencies are among those with the highest rates of costs of Information Scrap and Rework - all paid for by innocent Tax Payers.

During the Industrial Quality Revolution in Japan, Kaoru Ishikawa, the great Japanese Quality Leader, who gave us the Cause-and-Effect Analysis tool, was able to get Quality Management established as a national priority.

Data Quality Pro: Traditionally, when times get tough from an economic point of view, data quality projects are often the first ones to be dropped in order to "save money." However, in recent years we have experienced positive results with regard to data quality associated with major efforts such as MDM, for example. Have you seen these results affect those trends?

Larry English: Management that does not understand the Value Proposition of Information Quality Management will tend to make short-term decisions to "save money." I tell Management that the best costs to cut are the costs and wastes of resources caused by Poor Quality Information. One of my Telco clients saw their entire IQ Team laid off during the Telco overcapacity crisis a few years ago—even after the Team had discovered they were not invoicing some $50 million of services. About three years later, the Leader of the IQ Team saw me at an IQ Conference and told me that they were all brought back in. Within a year, the Team had recovered another $8-9 million per year by improving and error-proofing the processes to prevent future failure.

Data Quality Pro: Do you think that the global credit crunch is an opportunity for information and data management? If so, why and what do you think needs to be done to ensure the opportunity is not lost?

Larry English: The best way forward is to help management measure the costs of poor quality information in their own organization. Conduct Information Process Improvements to prevent recurrence of the problems and keep measuring the ROI of your Process Improvement Initiatives. You must help management feel the pain of the status quo, and then you must demonstrate delivered Value to the Enterprise and its Stakeholders.

Data Quality Pro: Based on your experience, what is your recommendation on the optimal organisational positioning of the Information Quality Function within a company?

Larry English: The optimum function placement is at the top of the enterprise with a Chief Information Quality Officer. If you have a Chief Quality Officer with a sound Quality System in place, it would be optimum to have the IQM function aligned with it. If you cannot start with executive placement, then seek out the highest level of management who is feeling the pain and work with them to create a Center of Excellence. Start by identifying a major IQ issue and help them solve that problem by measuring the Costs of the Status Quo and using the Plan-Do-Check/Study-Act Cycle. Measure the ROI and report it to the ELT. Then you will get support to elevate the IQM capability. The secret to sustainability is that you must keep delivering value with every IQ initiative. Also, you must remember that every dollar you spend on Data Correction (Information Scrap and Rework) is waste, because you have not error-proofed your processes to prevent Information Defects.

Data Quality Pro: What are your current views on data quality technology? Is it finally mature enough to support your approach to information quality management or are more innovations required?

Larry English: The current data quality technology does relatively well what it does. Unfortunately, the technologies focus on data profiling, assessment, and data cleansing (Information Scrap and Rework), which are costs caused by poor quality information. There is no value in measuring IQ. That simply tells you the extent of the problem. Data correction is also a cost item caused by allowing the status quo process to continue to produce defective information. Both inspection and correction are the costs of poor quality, whether in manufacturing or information. Your goal is to continually reduce the costs of inspection and data correction for they are waste of the status quo.

There are limitations to data profiling, assessment and data correction. Data profiling simply tells you how poor your data standards and quality controls are. Data Assessment can only measure IQ characteristics such as Existence, Valid Value, conformance to defined business rules, value ranges, value uniqueness and potential duplication.

I am disappointed that some DQ Assessment suppliers allow their assessment reports to report "Data Quality" without identifying which characteristics are assessed. Without knowing that you are seeing a measurement of Validity, a Knowledge Worker might assume it means Accuracy. Even worse are the KPIs that attempt to combine multiple IQ Characteristics into a single Metric. This blinds you to the real IQ causes.
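The objection to blended KPIs can be shown with a toy example (the records, codes and flags below are invented for illustration): reporting each IQ characteristic separately reveals which one is failing, while a single combined score hides it.

```python
# Four hypothetical customer records. "verified_accurate" stands in for a
# real-world accuracy check that no software can perform on its own.
records = [
    {"marital_status": "M", "verified_accurate": True},
    {"marital_status": "M", "verified_accurate": False},  # valid code, wrong fact
    {"marital_status": "X", "verified_accurate": False},  # invalid code
    {"marital_status": "S", "verified_accurate": True},
]
VALID_CODES = {"S", "M", "D", "W"}

# Measure each IQ characteristic on its own.
validity = sum(r["marital_status"] in VALID_CODES for r in records) / len(records)
accuracy = sum(r["verified_accurate"] for r in records) / len(records)

# A blended KPI: neither failure mode is visible in the single number.
combined = (validity + accuracy) / 2

print(f"validity={validity:.2f} accuracy={accuracy:.2f} combined={combined:.3f}")
```

Here Validity scores 0.75 and Accuracy only 0.50, yet the blended metric reports a single middling figure that points at no specific cause.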

The core competency in IQ Management is "Process Improvement." However, I am distressed that those suppliers whose tools do allow source application programs to access business rules and valid value sets to conduct validity tests at the source rarely promote this capability.

The goal of the IQ/DQ software providers should be to eliminate the need for data correction by implementing edit and validation rules that test data as it is created, preventing validity and valid-value defects at the source.
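A minimal sketch of validating at the moment of capture rather than cleansing downstream. The rule set and field names are hypothetical, chosen only to show the shape of the idea:

```python
class ValidationError(ValueError):
    """Raised when a captured value violates a business rule."""

# Hypothetical business rules for a customer record.
RULES = {
    "marital_status": lambda v: v in {"S", "M", "D", "W"},
    "age": lambda v: isinstance(v, int) and 0 <= v <= 120,
}

def capture(record):
    """Reject defective data at the source instead of correcting it later."""
    for field, is_valid in RULES.items():
        if not is_valid(record.get(field)):
            raise ValidationError(f"{field}: invalid value {record.get(field)!r}")
    return record  # only rule-conformant records enter the database

capture({"marital_status": "M", "age": 42})    # accepted
# capture({"marital_status": "X", "age": 42}) # would raise ValidationError
```

Note the limitation English describes still applies: such rules can only enforce validity; they cannot confirm that "M" is the person's actual marital status.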

There is a limitation in DQ software that cannot be solved electronically. The software cannot measure "Accuracy," nor can it correct data to an accurate value electronically. For example, software can determine that an address is valid according to the postal authority, but it cannot ensure that the person associated with that address is still there. The DQ software cannot tell you that the value for the distance from a house to the fire hydrant is accurate. Someone must physically take a tape measure and compare the data provided with the actual distance from the fire hydrant to the baseline point of the house. Such assessments, and data captured at the source or corrected subsequently, must be verified against the real-world object or with the person themselves.

Data Quality Pro: What do you feel are the fundamental differences between information quality management and data quality management?

Larry English: The more you study the proven Quality Systems, the more you realize that the proper term for our discipline is Information Quality Management. Does any reader of this Interview know of a sound, proven Manufacturing Quality System that calls itself "Raw Material Quality Management”? Does any reader know of a legitimate Quality System that calls itself "Raw Material and Product Quality Management” or "Product and Raw Material Quality Management”?

It is true that valid Manufacturing Quality Systems do address quality of raw materials used to produce the finished product, but the consumer is buying the finished product, not a collection of component parts.

Data in a database is an inventory of component parts. It is not until the data is retrieved from the database, combined with other data and presented to the Knowledge Worker in an intuitive and clear way that you have an Information Product.

My main concern in developing the TIQM Quality System has always been that if we follow the tried and true techniques that come from the proven Manufacturing Quality Systems, we will have effective IQ Management Systems. These include tools like the Statistical Quality Control Chart, the Shewhart Process Improvement Cycle of Plan-Do-Check/Study-Act for analyzing root cause and defining improvements to prevent recurrence and put the process in control, Root-Cause Analysis, SIPOC, QFD, FMEA and others.

Data Quality Pro: Virtually everyone in business today has some sense of the importance of data and information quality to their business. It seems, though, that the solutions [LE note: "solutions” is not a good term here] are from a frame of reference that includes only large organizations. When I talk with people from smaller organizations, they often appear to feel powerless to do anything on their own. They recognize that their problems are the same, but the solutions seem to be beyond their grasp. What suggestions can you offer to help the smaller business step up to the starting line?

Larry English: Actually, smaller organizations have a much better prognosis for implementing Information Quality Management. Being smaller, they have fewer organization units, often with more contact among them, so people feel part of the enterprise, not just of a function.

Remember that TIQM is a Value System (I value my Information Customers), a mindset of Excellence in all products and services, including Information, and a habit of Continuous Improvement in all core Business Processes of the enterprise.

Data Quality Pro: One of the big problems facing the industry is that data can receive a clean bill of health when assessed solely through data quality tools (ie. formatted correctly, de-duplicated etc.). However, when compared to reality the data can be incredibly inaccurate. Are there any techniques or innovations you have witnessed or taught that can help organisations improve the way they manage data accuracy?

Larry English: You have identified the limitations of DQ software assessment tools. One of my clients found no invalid values in the Marital Status Code of their Customers. An Accuracy assessment, however, found that the Marital Status value was wrong in one out of four Customer records. Yes, there are effective ways of ensuring accuracy at the point of data capture. The Japanese term is Poka Yoke, which literally means to "Error-Proof" a process. It is the same as the Carpenter's Rule of Thumb: "Measure twice, cut once."

Accuracy is the most important "Inherent” IQ Characteristic. When a Call Center Rep takes a call for an Order, you error-proof the process by having the Customer spell their name, and you repeat the spelling back. You do the same with other Critical-to-Quality Attributes, such as social security numbers or addresses for example. 

If you are uploading new product sales prices for the check-out counters, you review and ensure the prices and effective dates are accurate with the Product Managers before uploading.
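The read-back step can be error-proofed in software the same way: require a second, independent entry of a Critical-to-Quality attribute and accept the value only when the two agree. This is a generic double-entry sketch, not any particular vendor's feature:

```python
def double_entry(first: str, confirmation: str) -> str:
    """Poka Yoke at data capture: accept a value only when two independent
    entries agree (comparison ignores case and surrounding whitespace)."""
    if first.strip().casefold() != confirmation.strip().casefold():
        raise ValueError("entries disagree; re-capture the value")
    return first.strip()

# A Call Center Rep types the spelling the Customer gives, then the
# spelling read back; mismatches force a re-capture instead of a defect.
name = double_entry("English", "english")
print(name)
```

The same pattern applies to the price-upload example: a second reviewer's confirmation of price and effective date plays the role of the confirmation entry.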

Data Quality Pro: There seems to be some confusion about the terminology relating to data governance and data stewardship. Some experts choose only the concept of data stewardship, some only data governance, and some even make a point of the clear distinction between the two. What is your view on this?

Larry English: As with Information Quality versus Data Quality, there is no such thing as "data governance” except in the DBMS that ensures that provided data is captured as presented and that two Information Producers do not inadvertently make simultaneous updates to the same record, overlaying one another’s data values.

"Information Governance” is the proper term. There already is an Information Governance body in place in every Enterprise—the Executive Leadership Team (ELT). They are making decisions that have an impact on Information as a routine during their Team meetings. These decisions may be good or bad depending on their understanding of the importance of the Information that is part of their decisions.

Efforts that seek to appoint a mid-level or senior-level management Information Governance body that is dissociated from the ELT will always fail. In large organizations, there is often an ongoing Operating Information Governance Team appointed by the ELT. This Body must include at least two of the ELT Executives to maintain continuity in the Information Policies and Information Decisions carried out.

With respect to Information Stewardship, I see nine distinct Business Roles of Information Stewardship that must be natural to a person's Job Responsibilities. There are about 30 Information Stewardship roles associated with Information Systems personnel's Job Responsibilities. I describe these on the "Information Quality Applied" book Web site: http://www.IQApplied.com

Data Quality Pro: Data Governance has clearly gained traction in recent years within many organisations but there is still some difference of opinion over how a data governance initiative and an information quality programme should co-exist. In your view, how should these two disciplines operate within the organisation?

Larry English: Information Governance is a natural and required part of the Executive Leadership Team. But the ELT must be properly educated as to its responsibilities and obligations with respect to Information Management and Information Quality Management (two separate disciplines). The first addresses the management of information as an enterprise resource. The second (TIQM) applies quality management principles to information as a product of the enterprise.

Remember that Sarbanes-Oxley has correctly placed accountability for Financial Information on the shoulders of the CEO and CFO, with consequences of high fines and imprisonment for failure to accurately provide Financial Information to Investors and other Stakeholders.

Accountability for Information Quality is an inherent part of every job position in the enterprise. There is no conflict between Information Governance and Information Quality Management, even as there is no conflict between Business Governance and Product Quality Management.

Data Quality Pro: What is your view on the changes our profession has faced over the last decade and what challenges do you see for the profession both today and the next 10 years?

Larry English: I believe we are moving into the third decade of the Information Quality era. 

In 1992-93 we saw the introduction of Tom Redman's first book, my groundbreaking Information Quality Seminars, and the start of MIT's Information Quality Program.

The decade of the 1990s was the storming stage of trying to understand the principles of what was then called "data quality."

Some found the principles in extensions to database technology. Others found principles in the data quality assessment software then beginning to emerge. I, along with a few others, focused on the proven Quality Systems of Deming, Juran, Crosby, Imai, Baldrige, Six Sigma and others.

During the second decade, 2000-2009, we saw increased software capabilities, including data correction capabilities. Many contentious debates over IQ versus DQ terminology clouded the issues and kept the discipline from maturing.

At the same time, however, those organizations that implemented IQ principles and methods based on the proven quality systems found huge benefits as they improved their information processes and eliminated the high costs of information scrap and rework.

The next decade must be dedicated to fully understanding the Principles, Processes and Practices that enabled Manufacturing Quality Excellence. After all, Information is a product, and although information is intangible, the same principles and processes apply to manage Information Quality.

Data Quality Pro: You mentioned earlier, your new book "Information Quality Applied: Best Practices for Improving Business Information, Processes and Systems” – can you tell us more about the aim of the book and how it relates to your earlier publication?

Larry English: My new book is essentially an extension of my first. In it I describe the six processes of TIQM, providing a step-by-step guide to conducting them. I also describe Deming's 14 Points of Quality and how they apply to Information Quality. From there I describe how to implement an effective and sustainable IQ Culture and Environment, putting the 14 Points into practice. I define nine business roles of Information Stewardship (accountability) and the framework for Information Governance.

In Part III, "Information Quality Applied” I define the concept of Business Value Circles that represent the interdependent Business Processes that must be managed, not as separate processes, but as an integral part of the Single System of the Enterprise. I describe the unique Information Quality problems among these Value Circles, including:

  • Customer Care: "Prospect-to-Valued-and-Valuable Customer”

  • Product Development: "Product-Idea-to-Product-Retire”

  • Supply Chain Management: "Forecast-to-Satisfied-Customer”

  • Financial and Risk Management: "Budget-to-Profit”

  • Internet and e-Business: "e-Surfer-to-Satisfied-e-Customer”

  • Document and Knowledge Management: "Words-to-Wisdom” and "Ideas-to-Innovation”

  • Information Management and Information Systems Engineering: "’I-Need-to-Know-So-I-Can-Do’ to ‘I-Can-Do-Because-I-Know’”

Data Quality Pro: Finally, what do you consider to be your most important achievement in a long and illustrious career in information quality?

Larry English: I consider the most important achievement in my career to have inspired and enabled caring people in organizations across the world to implement quality management principles and processes that have reduced the waste caused by defective information processes, and thereby helped make the world a better place to live.


About Larry English

Larry English is an internationally recognized authority in information and knowledge management and information quality improvement. He has provided consulting and education in more than 28 countries on five continents. He was featured as one of the "21 Voices for the 21st Century" in the January 2000 issue of Quality Progress.

DAMA awarded him the 1998 "Individual Achievement Award" for his contributions to the field of information resource management. Mr. English's methodology for information quality improvement, Total Information Quality Management (TIQM®), has been implemented in several organizations worldwide.

He writes the "Plain English on Information Quality" column in the DM Review. Mr. English's widely acclaimed book Improving Data Warehouse and Business Information Quality, is also available in Japanese.
