In this article I talk with Danette McGilvray, president of Granite Falls and author of Executing Data Quality Projects. Danette is a highly experienced data and information quality practitioner, trainer and author.
Danette provides a range of insights including how she progressed her career, what mistakes organisations commonly make when starting out with data quality and how to create a data quality business case.
Data Quality Pro: Can you describe your background Danette? How did you get involved in the field of data quality?
Danette McGilvray: When I first learned about data quality, I was working at Hewlett-Packard (HP) in their Direct Marketing Organization in a group responsible for managing customer information.
My manager led a council consisting of other managers also responsible for customer information that supported the sales and marketing functions at HP. They found they had similar challenges – they knew high quality customer information was critical to the success of sales and marketing but were all having a difficult time showing the value of information quality in order to get funding and support for managing the information.
At the suggestion of the council’s executive sponsor, a project was funded to get outside help, and the council brought in Larry English. The day before Larry arrived onsite, my manager assigned me to work with him so the knowledge would not leave the company when he left.
That was my introduction to data quality and I have been working in this field ever since. And that was in 1993. As time went on, my experience expanded to include data quality in other areas such as finance, order management, procurement, manufacturing, etc. I was a practitioner, program and project manager, and internal consultant before starting my own consulting company.
All of this firsthand experience has served me well as I continue to work with clients in many industries.
Data Quality Pro: Most people will probably know you from your “Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information” publication, which has certainly proven to be a bestseller in our online bookstore.
For those who haven’t yet seen the book can you briefly describe the Ten Steps and how they help organisations execute data quality projects?
Danette McGilvray: The Ten Steps methodology is made up of a framework, concepts, and processes for improving and creating information and data quality within any organisation.
The Ten Steps process contains concrete instructions for executing information and data quality improvement projects. To implement The Ten Steps process effectively, it is necessary to understand some concepts related to data quality; for this reason a Framework for Information Quality and several key concepts are presented in the early chapters.
The book operates under the “just enough” principle. Readers are given just enough background on the underlying concepts to understand the components necessary for information quality and to provide the foundation for the step-by-step instructions.
The instructions provide just enough structure for readers to understand what needs to be done and why. The beauty of the approach is that it provides just enough structure to know how to proceed, but flexibility so those using it can also incorporate their own knowledge, tools, and techniques. It was written to fill the gap between, yet be a complement to, existing books that provide higher level concepts or processes and other books that dive deep into specific subjects.
Organisations can choose the applicable steps, activities, and techniques for their situation and use the methodology:
- For information quality-focused projects, such as a baseline data quality assessment of a particular database or a business impact assessment to help determine appropriate investments in data quality activities.
- To integrate specific data quality activities into other projects and methodologies, such as building a data warehouse, implementing an ERP (Enterprise Resource Planning) system or migrating data for any application development project.
- In the course of daily work, whenever you are responsible for managing data quality or the work you do impacts data quality.
- As a foundation for creating your own improvement methodology, or to integrate data quality activities into your organisation’s standard project life cycle or software/solution development life cycle (SDLC).
Use of this approach can save an organisation time and money because the foundational work for the process has already been done. Therefore, your time is spent determining how to make it work for your specific situation, taking action, and getting results!
Data Quality Pro: When I reviewed the book, I made the comment that it is constructed to be highly practical in nature, have you received feedback from any readers of the book on how they are adopting the methodology “on the ground”?
Danette McGilvray: Thank you!
I’m glad you see it as practical because that was one of my goals. Other feedback has echoed your experience. Following are just a few examples of how it has been applied:
- A high tech company took the Ten Steps and combined them into eight steps. They identified where they could use materials already available within the company. For example, other projects already had interaction maps and application diagrams that could be used for the data quality assessments.
- A Canadian oil and natural gas exploration and production company used the methodology and list of deliverables to create their own process for their data quality assessment projects.
- A biotech company used the process to assess their customer master data. Separately, a Data Services Center of Excellence was created and training on the process was integrated into data profiling tool training. Attendees got the benefit of how to use the tool along with a process to help them prepare to use the tool and then better act on the results from the tool.
- A water district applied some of the business impact techniques to show the value of their data governance work.
- An insurance company used one of the business impact techniques to prioritize and show the business why it was important NOT to spend time on particular data. Use of the root cause analysis techniques helped them direct their efforts to real causes, not symptoms.
- An oil company used the methodology to associate the data with the business needs. The analysis clearly showed how data quality impacted the human resources information that supported the company strategy. The result was a high level of support and funding for the proposed data quality initiative.
- Several organisations have used the 12 data quality dimensions as a starting point for the dimensions most important to their business. They choose the dimensions to meet their needs and may modify the vocabulary or how they categorize the dimensions to fit what works in their environment.
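Measuring a chosen dimension usually comes down to counting how many values meet a rule. The sketch below scores two commonly used dimensions, completeness and validity, over a small set of customer records; the field names, records, and validation rule are hypothetical illustrations, not the book’s own dimension definitions.

```python
# Minimal sketch: scoring two illustrative data quality dimensions
# (completeness and validity) over a list of customer records.
# Records, field names and the email rule are hypothetical examples.
import re

records = [
    {"name": "Acme Corp", "email": "info@acme.example", "country": "US"},
    {"name": "Globex",    "email": "not-an-email",      "country": "US"},
    {"name": "",          "email": None,                "country": "CA"},
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def completeness(rows, field):
    """Fraction of rows where the field is present and non-empty."""
    filled = sum(1 for r in rows if r.get(field))
    return filled / len(rows)

def validity(rows, field, rule):
    """Fraction of populated values that satisfy the rule."""
    values = [r[field] for r in rows if r.get(field)]
    if not values:
        return 0.0
    return sum(1 for v in values if rule(v)) / len(values)

print(f"name completeness:  {completeness(records, 'name'):.2f}")   # 2 of 3 names filled
print(f"email completeness: {completeness(records, 'email'):.2f}")  # 2 of 3 emails filled
print(f"email validity:     {validity(records, 'email', EMAIL_RE.match):.2f}")
```

An organisation adapting the dimensions to its own vocabulary, as described above, would swap in its own fields, rules, and thresholds.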
Data Quality Pro: There is a great deal of data governance related content in the Ten Steps publication, can you articulate the difference between a data governance policy and a data quality framework because there often appears to be an increasing amount of overlap?
Danette McGilvray: Data governance outlines and enforces rules of engagement, decision rights, and accountabilities for the effective management of information. It implements the right level of policies, procedures, structure, roles, and responsibilities to manage information.
Data governance provides venues for interaction and communication paths to ensure appropriate representation from business, information, and technology to make decisions, identify and resolve issues, and implement changes. Appropriate representation means that you have the people with the knowledge and the right level of authority involved.
Information quality is the degree to which information and data can be a trusted source for any and/or all required uses. It is having the right set of correct information, at the right time, in the right place, for the right people to make decisions, run the organisation, serve customers, and achieve company goals.
Data quality cannot be sustained without data governance. Let’s assume you have great support for data quality during your project. You have integrated data quality activities into your data warehouse project, your ERP implementation, or your new application development. All these efforts resulted in the right data being loaded according to schedule. Even with all this, if there is no ongoing accountability for the data, it will start to decay the moment the project goes live.
The governance structure formalizes needed accountability for information quality in ongoing operations.
Another example: data profiling is a data quality activity and out of it an organisation finds errors that need to be corrected and improvements that need to be implemented such as updating business processes to prevent the problems from happening again. Data governance provides the process to discuss the issues and ensures the right people act on those results.
Often data governance efforts are put into place to ensure data quality so it is natural to see a relationship between data quality and governance. Many of the data governance policies (the set of principles which guide actions) specifically support information quality activities. I show the connection to data governance in my Framework for Information Quality (FIQ) under the broad-impact component titled “Responsibility.”
The Ten Steps process provides the “how-to” when you have to deal with data quality issues. One aspect of that is how people and organizations impact data quality and that is another linkage point to data governance.
One last comment here, the book is focused on data quality, but it is true that much of what is written can be applied to data governance. For example, the business impact techniques are qualitative and quantitative measures for determining the effects of data quality on the organisation. These same techniques have also been successfully applied to determine the effects of data governance on the organisation.
Data Quality Pro: You mention Larry English at several points throughout the book, how instrumental has Larry been in shaping your career and business?
Danette McGilvray: As I mentioned in the answer to your first question, I got my introduction to information quality from Larry and I give him full credit for that. What I learned from him laid the foundation. I have also learned from and credit many others – both consultants and practitioners within organisations.
In addition to Larry, I directly reference in the book or include in the bibliography such people as Tom Redman, David Loshin, Richard Wang, Larissa Moss, Graeme Simsion, Peter Aiken, David Hay, Martin Eppler, John Zachman, Michael Bracket, John Ladley, Len Silverston, Arkady Maydanchik, Elizabeth Pierce, Gwen Thomas, Bob Seiner, Michael Scofield, Andres Perez, Jack Olson, Ron Ross, David Plotkin, Beth Hatcher, Mary Levins, Mehmet Orun, Wonna Mark, Sonja Bock, Eva Smith, Anne Marie Smith, Lwanga Yonke, Susan Goubeaux.
Well – you get the point – and the list goes on!
I have taken what I learned from others and adjusted it as needed to fit my particular situation. That was one of the driving forces for me to write the book – to share what had worked for me and to provide an approach that gives people the foundation needed to implement data quality within projects, knowing that each person, organisation, and project will adjust and modify what is there to meet their specific needs.
Data Quality Pro: Through your consulting business, Granite Falls, you obviously work with a lot of organisations who are looking to improve data quality, what are some of the common mistakes you see organisations making when they’re starting out on the road to data quality maturity?
Danette McGilvray: Some common mistakes when starting out and how to avoid them:
- Thinking that data quality should be done for the sake of data quality. Often those who “get” data quality cannot understand why others do not see the importance of data quality. This causes frustration on all sides – those trying to improve data quality and those being asked to fund activities they don’t understand. To overcome this, ensure that data quality activities will address business needs and those things the organisation and management cares about. If you are not sure how to show value, use the business impact techniques in Step 3 to help you get started. Continue to increase your ability to communicate the business impact.
- Failing to analyze the information environment before conducting an in-depth data quality assessment. The temptation is to immediately extract some data and start analyzing it. A lot of time is then spent assessing data that is actually not associated with the problem being solved, which often results in multiple extracts and rework before getting to the relevant data. Avoid these problems by spending just enough time in Step 2 – Analyze Information Environment. Learn just enough about the data, processes, people and organisations, and technology to provide a foundation of understanding that will be used throughout the project by: 1) ensuring you are assessing the data associated with the business issues, 2) collecting requirements – the specifications against which data quality is compared, and 3) providing a context for understanding the results of the data assessments and helping with root cause analysis. The more you understand the environment that affects the data, the better you will understand what you see when analyzing the data, and therefore the better action you will be able to take. That being said, it is also important to keep yourself moving through this step and not get bogged down in too much detail. Ask yourself the following questions: Will the detail have a significant and demonstrable bearing on the business issue? Will the detail provide evidence to prove or disprove a hypothesis about the quality of the data? Determining what is relevant, appropriate, and the right level of detail takes experience. Get the foundational information needed to proceed effectively, but don’t go further into detail than is necessary. You can always come back later in the project and get additional detail if needed.
- Failing to reuse information already available. For example, an in-depth data quality assessment requires information about the data – where it is stored (tables, column names, etc.) and an understanding of the structure and relationships. This information is often already available in a data dictionary or data model. Team members should expect that 80% of the information required in Step 2 – Analyze Information Environment already exists somewhere. Often it is a matter of collecting and updating anything that is not quite current. Supplement existing materials with original research only as needed.
- Failing to “naturally integrate” data quality activities into what is already going on in your organisation. If your project (e.g. to build a data warehouse, implement an ERP, develop a new application) is already using another methodology, project life cycle, or SDLC, get to know the project approach. Work to integrate the data quality activities into that approach. Make certain the data quality deliverables are in the project plan and fit into the schedule. Ensure the people working on data quality are recognized members of the team (whether core or extended resources), receive the same communication as other team members, and get invited to the necessary meetings.
- Failing to communicate enough. Communication is so vital that it is noted in the methodology as a step that should be done as part of every other step. Data and information quality affect multiple organizations and many people. We cannot do the work alone, and we must have the support of others. The projects will result in recommendations to modify business processes or will cause people to change the way they do things. In short, we are instituting change. Calling out communication explicitly is a way to open the door and recognize the importance of the human factor in the success of our projects. Realize that communication and working with people is an integral part of data quality, and allow time for it.
Data Quality Pro: Aside from writing the book, training and presenting you still perform data quality and data governance consulting services. What are some of the typical challenges your clients face?
Danette McGilvray: Lack of time, money, human resources and the pressure of project deadlines are typical challenges. Lack of understanding that a focus on data and information will actually help a project keep within budget, meet timelines, and yield successful results is another. These challenges lead into your next question.
Data Quality Pro: One of the most frequent questions raised by our members is one of how to establish a business case for data quality, are there any specific techniques from your methodology that can help with this issue?
Danette McGilvray: Step 3 – Assess Business Impact outlines eight techniques that can be used to help show the impact of data quality (or lack thereof) on the organization. Results from the assessment(s) can be used to establish a business case for data quality.
It is important to do something related to business impact – don’t assume there isn’t time. That is why I place the techniques on a continuum of relative time and effort, ranging from those that take less time and are less complex to those that take more time and are more complex. Experience has shown that any of these techniques can be effective.
Use the techniques that best support your situation and fit within the time and resources you have available. Whenever building a business case I use the acronym WIIFT and ask myself the question “What’s in it for them?”
For those you are going to for support – do they care about impact on revenue, money saved, operational efficiency, headcount, risk, etc.?
Do they have personal or professional reasons for supporting your work?
Translate your business impact results to be most meaningful to those from whom you need support. Another point to consider: when you approach people about including data quality activities in projects, the first questions asked are often: How much will it cost? How many people do you need, and what kind of experience and skills should they have? Do you need any tools? How long will it take?
You can help build a business case by being able to answer these questions. You can answer these questions by considering the detailed steps throughout the methodology, choosing those which apply to your situation, and estimating the costs, people, tools, and time needed.
For example, choose the applicable data quality dimensions from Step 3 and create your estimates from there.
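A quantitative business impact estimate of the kind Step 3 supports often boils down to simple arithmetic on a few measured inputs. The sketch below estimates annual rework cost caused by defective records; every figure is a hypothetical placeholder to be replaced with numbers gathered from your own assessment, and the formula is an illustration rather than a technique taken verbatim from the book.

```python
# Minimal sketch of a cost-of-poor-data-quality estimate for a business case.
# All input figures are hypothetical placeholders.

def annual_cost_of_poor_quality(records_per_year, error_rate,
                                rework_minutes_per_error, hourly_rate):
    """Estimate yearly rework cost caused by defective records."""
    errors = records_per_year * error_rate           # defective records per year
    rework_hours = errors * rework_minutes_per_error / 60
    return rework_hours * hourly_rate

cost = annual_cost_of_poor_quality(
    records_per_year=500_000,     # orders processed annually (assumed)
    error_rate=0.04,              # 4% of records need manual correction (assumed)
    rework_minutes_per_error=15,  # average fix time per record (assumed)
    hourly_rate=45.0,             # loaded labour rate (assumed)
)
print(f"Estimated annual rework cost: ${cost:,.0f}")  # prints $225,000
```

Even a rough figure like this, translated into the terms your sponsors care about (revenue, headcount, risk), answers the “how much will it cost?” question in both directions: what poor quality costs today, and what the proposed work will cost to fix it.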
Data Quality Pro: What are your aims for the future Danette, do you have plans for any new publications or will you continue to focus on your consulting and training activities?
Danette McGilvray: I will definitely continue to focus on consulting and training. The positive response to my approach continues to provide motivation and confirmation that it does work – in all kinds of organisations and with all types of data. I always have other ideas, but nothing that I’m ready to share at this point.
Finally, thanks for the opportunity to talk with the readers of Data Quality Pro, Dylan. Please encourage them to continue to provide feedback, share what they have done, and contact me if I can help. I can be reached at danette(-at-)gfalls.com.
Danette McGilvray is president and principal of Granite Falls Consulting, Inc., a firm that helps organizations increase their success by addressing the information quality and data governance aspects of their business efforts.
A skilled facilitator, program and project manager, she has worked with people in all levels of the organization and from all functional areas, giving her a valuable perspective on organizational challenges based on real-life experience. Critically, she also emphasizes communication and the human aspect of information quality and governance.
Danette promotes the intentional management of information assets with the same rigor used to manage other company assets such as products and human and financial resources.
In 2012, Danette received IAIDQ’s Distinguished Member Award, “in recognition of her outstanding contributions to the field of information and data quality.” She is an invited speaker at conferences around the world. She contributes articles to various industry journals and newsletters and has been profiled in PC Week and HP Measure Magazine. She was invited to the People’s Republic of China to discuss roles and opportunities for women in the computer field.
Danette is a founding member of the International Association for Information and Data Quality Professionals (IAIDQ) and an active member of DAMA International.