Information Quality Management Framework (IQMF): An Overview with Ismael Caballero


In this interview we talk with Ismael Caballero, one of the leading contributors to the Information Quality Management Framework (IQMF).

We find out more about the reasons for creating the framework, who was involved and how our readers and members can benefit from using it.


Data Quality Pro: For the benefit of our readers, can you briefly describe the core components of the framework?

Ismael Caballero: The Information Quality Management Framework (IQMF) is aimed at evaluating the maturity of information quality management in Information Systems of organizations.

The unit under evaluation is what we have named the Information Management Process (IMP), a process intended to bring together both the manufacturing and the (quality) management activities for data and information. In a sense, an IMP is a business process. It is worth noting that an Information System can be seen as a set of different IMPs sharing several resources, such as organizational databases.

The framework is based on the idea of:

  1. Identifying the most critical IMPs of the organization (i.e., those causing most of the problems that impact organizational performance)

  2. Assessing them against a reference model, IQM3 (Information Quality Management Maturity Model), using MAIMIQ (Methodology for the Assessment and Improvement of Information Quality Management), which is founded on the idea of continuous improvement.

The idea of MAIMIQ is simple: considering IQM3 as a set of requirements that an ideal IMP must satisfy, the aim of the methodology is to close the gaps between the IMP under study and the ideal IMP described by IQM3.
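The gap-closing idea above can be sketched in a few lines of code. This is only an illustrative toy, not the actual MAIMIQ methodology: the requirement names below are hypothetical and are not taken from IQM3.

```python
# Toy gap analysis in the spirit of MAIMIQ: compare an IMP under study
# against an "ideal IMP" expressed as a set of requirements.
# All requirement names here are made up for illustration only.

IDEAL_IMP = {
    "data_requirements_defined": True,
    "quality_dimensions_measured": True,
    "improvement_process_in_place": True,
}

def assess_gaps(imp_under_study: dict) -> list:
    """Return the requirements the IMP under study fails to satisfy."""
    return [req for req, required in IDEAL_IMP.items()
            if required and not imp_under_study.get(req, False)]

# An IMP under study: one requirement met, one failed, one missing entirely.
current_imp = {
    "data_requirements_defined": True,
    "quality_dimensions_measured": False,
}

print(assess_gaps(current_imp))
```

In a continuous-improvement cycle, the gaps returned here would become the targets of the next improvement iteration, after which the IMP is reassessed.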

Data Quality Pro: When was the framework first published?

Ismael Caballero: The first version of the framework was published in the Proceedings of the International Conference on Information Quality in 2004 (http://mitiq.mit.edu/iciq/ICIQ/iqdownload.aspx?ICIQYear=2004&File=GettingBetterIQ.pdf). There it was named CALDEA, in honour of the R&D project we were working on.

Data Quality Pro: What was the main reason for creating the model?

Ismael Caballero: We were working on an R&D project named CALDEA (http://alarcos.esi.uclm.es/interfaces/spa/proyectos/proyectos.aspx#CALDEA), as I said earlier. CALDEA stands for (in Spanish) Calidad en Almacenes de Datos (Quality in Data Warehouses).

After studying several works published at that time, we realised that assessing only the quality of the data stored in data warehouses/databases was insufficient: we understood that data quality is an organizational issue, not an isolated one.

Given that all of us are professors in the Software Engineering field, we had the idea of structuring all data quality management requirements in the same way as the software process concept (used in CMMI, ISO/IEC 15504, …), complementing it with data quality issues (this is the idea behind the IMP concept). We thought this structure would provide the organizational view we were looking for.

After this, we drew up the reference model (initially named CALDEA), and then we added the evaluation methodology (initially named EVAMECAL). In addition, we must say that although the structure of IQM3 is meant to be stable, for each of the activities we proposed (not imposed), organizations are free to use different tools and techniques based on their preferences. In this sense, IQM3 can also be seen as a conceptual map that different practitioners and stakeholders use to help guide their work.

Data Quality Pro: Who was involved and in what capacity?

Ismael Caballero: Currently we are working on IQMF with several researchers.

Dr. Mario Piattini, leader of the Alarcos Research Group (http://alarcos.esi.uclm.es/defaultEng.aspx), is the head of the research. Dr. Coral Calero, Dr. Angélica Caro and I are working on refining the framework and applying it in several scenarios.

Alongside us, other researchers are working to gather and complement the existing knowledge in the area, and all of us are trying to identify research challenges.

We also try to be present at the most important international forums on Data and Information Quality, to stay as aware as possible of new ideas that could complement the framework, in an effort to make it complete and usable.

Data Quality Pro: How long did it take to create?

Ismael Caballero: The framework is continuously being enhanced. From the time we began working on it until we published the first version of CALDEA took around three years. After the first version, we reviewed it, and we could say that it is now more or less stable.

The last major review of IQM3 was in early 2008. Since the structure of IQM3 is more or less stable, new additions must be allocated properly within it. For instance, in Fall 2008, when ISO/IEC 25012 appeared, we incorporated it as an artefact of one of the Key Process Areas of IQM3. The same is true for the ISO 8000 family of standards.

We are currently working on some measures to better represent the results of the assessment and improvement process described in MAIMIQ.

Data Quality Pro: How do you envisage people benefiting from the framework?

Ismael Caballero: In the same way that a software factory uses models such as CMMI and ISO/IEC 15504, any other organization could assess (and even certify) its IMPs. Certification of a provider's processes could then serve as an assurance to customers that its data and information products are delivered at a certain level of quality.

Data Quality Pro: What existing resources did you find of use when creating the model?

Ismael Caballero: Different projects and works were very valuable. Within the field of Data and Information Quality, the most valuable were the works by Wang et al. at MIT (the TDQM program), the research published in the proceedings of the different editions of the ICIQ, and the works by Larry English, Thomas Redman and David Loshin.

From the Software Engineering field, we found the CMMI framework and its associated assessment/appraisal methodologies (such as CBA-IPI and SCAMPI) to be very important.

Finally, from the Database field, above all the previous work on measurement by Dr. Calero and Dr. Piattini was of value to the project.

Data Quality Pro: The framework was created by members of the academic community. Have any elements of the framework been used in a commercial context?

Ismael Caballero: We have applied the framework in two kinds of organizations: one public (a local administration) and one private.

Due to the nature of the results, we were allowed to publish only those corresponding to the public organization, at ICIQ 2005 (http://mitiq.mit.edu/iciq/ICIQ/iqdownload.aspx?ICIQYear=2005&File=ImprovingIQMgntUsingCaldea%26Evamecal.pdf).

Some companies have shown interest in the framework, although they have applied their own approaches.

Data Quality Pro: Where can people find out more about the information quality research and publications you are delivering?

Ismael Caballero: At this moment, the most recent and up-to-date publication of our work is the one in the Journal of Universal Computer Science (to which we are very grateful, especially to Dana Kaiser, for their support).

We are finalizing another work on MAIMIQ, which we hope can be published soon. In addition, we are developing a tool to automate the assessment of IMPs and to generate improvement advice.


Ismael Caballero

Ismael is an Associate Professor of Software Engineering, Quality and Measurement in Information Systems (at the UCLM’s Ph.D. Programme on Advanced Computing Techniques).

Specialties include: Data and Information Quality, Data and Information Quality Management

LinkedIn: http://es.linkedin.com/pub/ismael-caballero/7/221/b38
