The Internet of Things (IoT): A Data Quality Manager's perspective, by Daniel Vincent

This is a guest post by Daniel Vincent, Head Of Data Quality for EMCOR Group.

The Internet of Things (IoT) offers a number of real opportunities to those of us who capture, analyse and present data from the physical world.

As a former IT Operations Manager, I am used to managing my IT estate using real-time data streams from monitoring tools such as SolarWinds, System Center or Nagios, or from cloud providers’ dashboards. These were all designed to mimic the closed control systems that have been in use for many years in heavy industry, power generation, manufacturing, and construction.

Who can forget the big banks of dials and flashing lights that were cinematic shorthand for impending calamity in 70s disaster movies?

Just as those dials and flashing lights have become interactive online dashboards, the closed control systems are being augmented, and in some cases replaced, by Internet-enabled devices. These devices record state, condition or telematic data, which is then collated, aggregated, analysed and presented back in the form of insights. This is especially true within the Facilities Management industry, where, until recently, mission-critical state and condition data was collected in a manual and labour-intensive manner.

On that basis, data collection using connected devices shouldn’t be viewed as doing something exotic and new. It should be viewed instead as an opportunity to improve existing processes, allowing them to scale up in a cost-effective way.

Those scalable, repeatable measurements empower knowledge workers to draw better-informed insights, often creating many more data points than would be available through other methods.

The proliferation of cheap, highly available, low-powered devices also gives us the opportunity to leave sensors in place, unattended, for extended periods of time. These deployments can often overlap with closed systems that may not be suitable for, or accessible to, FM partners for logistical or commercial reasons.

So, the commercial arguments are clear, but where does that leave us regarding data quality?

1. More consistently measured data points = better data

Let’s say we need to measure the temperature in a large workshop area because the air conditioning is sometimes reported as too cold, and sometimes as too hot.

Taking a manual measurement can introduce a number of environmental variables:

  • Time of day
  • Location of measurement
  • Length of measurement
  • Calibration and consistency of measuring apparatus

When the engineer arrives to take a measurement, where do they take it? And where was the last measurement made?

Take all these factors into account and you find yourself (or your boss) asking the question:

How do you guarantee the provenance of your data?

Measuring from a consistent location, with adequately calibrated sensors, at a frequency that accounts for the other environmental factors, allows an analyst to better observe trends and gain insight.

In our example, having multiple sensors in the same building would also enable us to spot outliers in the data set: perhaps one end of the room is heated up by the sun coming through the window?
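To make that concrete, here is a minimal sketch in Python of how simultaneous readings from several sensors make outliers easy to flag. Every reading is invented, and the sensor names and cut-off are assumptions for the example:

```python
from statistics import median

# Hypothetical simultaneous readings (in °C) from sensors placed around one workshop.
readings = {
    "north-wall": 20.9,
    "centre": 21.2,
    "south-window": 26.4,  # this end of the room catches the afternoon sun
    "east-door": 21.0,
}

values = list(readings.values())
room_median = median(values)

# Median absolute deviation: robust to the very outliers we are hunting for.
mad = median(abs(v - room_median) for v in values)

for sensor, temp in readings.items():
    # Flag anything more than 3 MADs from the room median (an arbitrary cut-off).
    if mad and abs(temp - room_median) > 3 * mad:
        print(f"Possible outlier: {sensor} reads {temp}°C (room median {room_median:.1f}°C)")
```

The median is used here rather than the mean because a single rogue sensor would drag the mean (and standard deviation) towards itself and mask its own anomaly. None of this is possible with a single manual spot-check.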

2. Security

The leading IoT network providers all offer a reasonable level of encryption which can be “baked in” to the hardware. This gives us adequate security during collection, but we must be mindful of where the data goes beyond that.

Many of these same providers will aggregate the data on their own servers before making it available via API or web-based dashboards.

This aggregation process puts our IoT data security strategy back into the same ballpark as cloud infrastructure. Tight governance of access control lists, regular rotation of keys (where the provider allows it) and periodic tests with third-party specialists are all essential to ensuring that IoT-collected data is not only secure but provable back to the device itself.
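As an illustration of what “provable back to the device itself” can mean in practice, here is a minimal sketch assuming a per-device shared secret of the kind providers bake into hardware. The key, device ID and field names are all hypothetical:

```python
import hashlib
import hmac
import json

# Hypothetical per-device secret, provisioned at manufacture and also held by the platform.
DEVICE_KEY = b"replace-with-per-device-secret"

def sign_reading(payload: dict) -> dict:
    """Device side: attach an HMAC so the reading can be traced back to this device."""
    body = json.dumps(payload, sort_keys=True).encode()
    signed = dict(payload)
    signed["signature"] = hmac.new(DEVICE_KEY, body, hashlib.sha256).hexdigest()
    return signed

def verify_reading(signed: dict) -> bool:
    """Platform side: recompute the HMAC and compare in constant time."""
    payload = {k: v for k, v in signed.items() if k != "signature"}
    body = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(signed.get("signature", ""), expected)

reading = sign_reading({"device_id": "ws-temp-03", "temp_c": 21.2, "ts": 1672531200})
print(verify_reading(reading))  # True, unless the payload was altered in transit
```

Under this model, key rotation becomes a matter of re-provisioning DEVICE_KEY, which is exactly why it matters whether your provider allows it.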

It is also important to understand your IoT data journey from a “safe harbouring” perspective. Although last year’s EU ruling invalidating the Safe Harbour framework makes the legislative aspect of this something of a minefield, individual client organisations may still impose restrictions through their data security policies.

IoT partners, therefore, need to be transparent when it comes to the transport of data from capture through to presentation. Will your client be happy if their GPS data is stored (even temporarily) outside your country, or outside the EU?

3. Scalability, diversity and ubiquity = better insights

If we take a step back from individual use cases, there is a perceived opportunity to prove new causal relationships using well-maintained, diverse datasets.

For example, workforce productivity is considered by many to be the Holy Grail of commercial IoT analysis: measuring environmental conditions and biometric data alongside more traditional data-gathering methods (surveys, employee attendance, etc.) allows employers to configure their workplace, and their workforce, for optimum efficiency, productivity and well-being.

As knowledge workers, we will inevitably find new correlations in our data as its size and diversity increase. We must, however, exercise caution and restraint until causality is proven.
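To make that caution concrete, here is a toy sketch (every number is invented) showing how readily a strong correlation appears in a diverse dataset, and why it proves nothing about cause:

```python
from statistics import correlation  # Python 3.10+

# Invented daily aggregates for one office: environment readings and a productivity proxy.
temp_c = [20.5, 21.0, 22.3, 23.8, 24.5, 25.1, 26.0]
co2_ppm = [450, 480, 620, 700, 760, 820, 900]
tickets_closed = [41, 40, 38, 33, 31, 30, 27]  # made-up output measure

# A striking negative correlation between CO2 and output...
print(f"CO2 vs output: r = {correlation(co2_ppm, tickets_closed):.2f}")

# ...but temperature tracks CO2 almost perfectly, so correlation alone cannot
# tell us which variable (if either) drives output. Proving causality needs
# controlled interventions, not simply more columns of data.
print(f"Temp vs CO2:   r = {correlation(temp_c, co2_ppm):.2f}")
```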

Summary

IoT certainly feels like a new frontier, with many platforms and vendors entering (and exiting) the market. This, coupled with competing connectivity standards and incomplete network coverage, may dissuade some organisations from investing in IoT solutions in the short term.

The direction has been set, however, and although some of the details are yet to be finalised, it’s clear that IoT is going to be a big part of the data landscape in the years ahead.

From a data quality perspective, don’t expect things to change considerably: many of the challenges are not too dissimilar from those the advertising and PR industries faced at the turn of this decade when onboarding social media data. You will be dealing with multiple data sources; you will be dealing with a lot of data; and you will need to negotiate a lot of subtly different formats.

You will need to stick to first principles:

  • Know the source of your data
  • Capture and store it securely
  • Present it in context

All of this means that IoT will be a great opportunity to make things bigger, better and, dare I say, more fun.


Daniel Vincent - Head of Data Quality, EMCOR (UK)

Daniel Vincent has worked in Management Information for 20 years, as an application and database developer, IT Operations Manager and IT Director.

As Head of Data Quality for EMCOR Group (UK) plc he works with an in-house data analysis team, as well as data platform and IoT partners, to create better insights for the Group’s UK clients.

Visit: http://www.emcoruk.com

