Asset Data Quality Management in the Rail Industry, Expert Interview with Ian Rush of Network Rail

How can you manage asset data quality in the rail sector? Image: York train station, Creative Commons.

Asset Data Quality Management in the Rail Industry

with Ian Rush of Network Rail, UK

So how would you manage data quality for a national rail infrastructure? 

Expert panelist Ian Rush has been focused on that exact challenge at Network Rail, the company that owns and operates the UK rail network.

Ian provides a fascinating insight into the techniques he and his team are applying across a large, complex and mission-critical set of data assets.


Dylan Jones: For the benefit of our readers, please describe your current role and the type of data quality activities you undertake?

Ian Rush: I currently hold the position of Asset Data Quality Manager within the Asset Information Directorate of Network Rail. 

For the last couple of years my team have been identifying the tactical data quality problems and influencing the business to make appropriate improvements. 

We have achieved this through measurement, specification, data management standards and increased awareness of the impact that poor data has on business processes. 

Data quality, as a centralised activity, is relatively new to Network Rail, so raising awareness of the team and our activities has been a major focus for me.

Presently we are moving our activities away from improvement, as this sits better elsewhere in the organisation, and concentrating on strategic thinking. 

There are always tactical issues to manage and resolve, and this has prevented the root cause analysis that is needed. We are therefore thinking more strategically and are presently defining a data assurance framework concentrating on specification and governance activities. 

We measure data quality, but with limited success. We have concentrated on the areas that can be measured relatively easily, such as completeness and validity. 
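
To make this concrete, here is a minimal sketch (not Network Rail's actual tooling) of how desktop completeness and validity measures can be expressed over a tabular asset register; the column names and the validity rule are hypothetical.

```python
# Minimal sketch of desktop completeness and validity checks on an asset register.
# Column names and the validity rule are hypothetical, for illustration only.
import pandas as pd

register = pd.DataFrame({
    "asset_id": ["BR-001", "BR-002", "BR-003"],
    "asset_type": ["bridge", "bridge", None],     # missing value -> incomplete
    "install_year": [1962, 2031, 1988],           # implausible year -> invalid
})

# Completeness: proportion of populated values per attribute.
completeness = register.notna().mean()

# Validity: proportion of values passing a simple business rule.
validity = register["install_year"].between(1825, 2015).mean()

print(completeness.round(2).to_dict())  # {'asset_id': 1.0, 'asset_type': 0.67, 'install_year': 1.0}
print(round(validity, 2))               # 0.67
```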

We are now looking at accuracy through verification so that we can improve the confidence we have in our asset register. Successful measurement will be achieved once we have completed the specification work and identified the roles and responsibilities beyond the day-to-day input and management of data. 

Our next stage will be to provide assurance on the data sets through audit and measurement; not just the data but the processes too. 

Change management techniques and skills will play a key role in the successful transition.  Work is currently underway to define the what, how and who. 

Research, documenting policy and stakeholder management are the areas taking up a lot of our time at present. We have procured the services of a consultant to give us an independent review of our activities, which has helped us in our learning and in managing stakeholder expectations. 

Dylan Jones: Describe your career progression, how did you become a senior data quality practitioner?

Ian Rush: I joined the company as a document controller when our records were paper based and physical signatures were required to confirm receipt of documentation. I soon appreciated that a database was required to identify employees and their documentation needs, as well as to track document movement and staff turnover. 

The benefits of electronic record-keeping were recognised by senior management, and so my natural progression was into records management. I spent seven years managing a records centre containing around 2 million engineering drawings in various formats. These records were managed either by a card index system or simply placed in a specific order on miles of shelving. 

Again, I converted all the index cards into a database and catalogued those that were simply in sequential order. This provided cross-referencing capability and easy search functionality.

After a short spell working with engineers and learning about the physical assets, I became part of an already established team working on the population and implementation of an asset register. The task was to create a register of all the infrastructure assets to enable the company to meet its regulatory requirements. 

After several years of managing data improvement and migration projects, during which the asset register was constantly evolving in both content and software platform, I began my current role as Asset Data Quality Manager. This came about as a result of moving core activities, including data quality, from regional to central management. 

The role has changed constantly over the last three years as corporate reliance on asset data has expanded. Better data is now leading to further efficiencies through robust business plans.

Dylan Jones: You mentioned that increasing the awareness of your data quality team is a major focus for you personally, what are you doing to increase awareness specifically? 

Ian Rush: Awareness is key to gaining the support required to deliver the various activities. A small team cannot change cultural behaviours on its own, and a small unit of people will struggle to make significant improvements without corporate support. 

The first step in increasing awareness was to widen the distribution of the regular reporting suite from those that need to know, as part of their day-to-day activities, to include those that should know. The quality of the data impacts many business and decision-making processes, and it is the people making those decisions who, I felt, should understand how good or bad data can influence the basis of their decisions. The wider distribution had an instant positive impact on awareness of our team. Invitations to present issues, remedies and behavioural changes suddenly became the norm. 

This had positive results as blockers were suddenly removed by the senior leadership team: sponsorship of improvement projects became easier to secure. 

Data is maintained both centrally and locally, so the next logical step was to create a data quality forum as a communication tool to get people to understand and adopt the need for change. This also helped to break down barriers and local interpretations of corporate standards. Staff managing data in the field need to be able to share best practice and simply to have a voice.

The organisational position of a data quality team is fundamental to how much importance is placed upon the work. We are currently working with organisational change leaders to ensure the team gets the focus it needs. If you bury the team too deep within the organisation it can send the wrong message about the importance of data.

Dylan Jones: Some senior managers may see all the added effort and governance surrounding data quality as a cost centre, how are you demonstrating value to the business in order to ensure ongoing support for your team? 

Ian Rush: Managing data has high costs attached, but the benefits can far outweigh these costs if the right activities are undertaken. However, measuring the costs and benefits of good data can be difficult, as other processes often intervene before poor decisions are made. For example, engineers often identify data weaknesses and adjust the business decision accordingly. These intervening activities can mask the underlying data issues. 

Data improvement benefits need to be measured in some way; this is not always financial as improved data can have other benefits such as safety risk reduction.   

A data improvement programme has been in place for some time. This exercise started because we simply believed 'it needed to be done'. However, being challenged on resource utilisation meant that the costs had to be identified and weighed against any benefits. One such method is the use of planning models. 

Recently we populated the models with data from the asset registers taken at face value. This was then compared to the same models populated from sampled, on-site verified data; the difference between the two outputs showed a potentially huge business plan variance. This variance is therefore used to measure improvement benefit. That said, it is not always possible to access the assets easily, so there will always be an element of targeted improvement based upon 'gut feeling'.
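
To illustrate the comparison Ian describes, here is a minimal sketch using an invented planning model driven purely by recorded condition grades; the asset records and costs are hypothetical.

```python
# Minimal sketch: run the same (hypothetical) planning model twice, once on the
# register taken at face value and once on sampled, on-site verified data, then
# treat the difference between the two outputs as the business plan variance.
def planned_renewal_cost(assets):
    # Illustrative model: cost driven purely by recorded condition grade (1 good .. 5 poor).
    unit_cost = {1: 0, 2: 5_000, 3: 20_000, 4: 60_000, 5: 120_000}
    return sum(unit_cost[a["condition"]] for a in assets)

register_view = [{"id": "S1", "condition": 2}, {"id": "S2", "condition": 2}]  # face value
verified_view = [{"id": "S1", "condition": 4}, {"id": "S2", "condition": 2}]  # after site visit

variance = planned_renewal_cost(verified_view) - planned_renewal_cost(register_view)
print(f"Plan variance attributable to data quality: £{variance:,}")  # £55,000
```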

Dylan Jones: You mentioned that you’ve had limited success with measuring data quality, can you explain why? 

Ian Rush: Traditionally the measures undertaken have been those easily achieved via desktop methods. There are many reasons for this: limited access to the physical assets, limited resources, the cost of verification and the issue of repeatability. The use of secondary data sources is also limited, because the secondary source may be less reliable than the master. 

To date, we have concentrated data quality measures around attribute completeness, validity and consistency. These are relatively easy to achieve as desktop activities and can be repeated frequently using standard desktop applications. However, producing an Asset Data Quality Index (a rolled-up score) per asset type gives a false impression of overall accuracy when accuracy itself is not measured. 
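
As a small illustration of why such an index can mislead, here is a minimal sketch of a rolled-up score with invented dimension scores and weights; accuracy does not appear anywhere in the calculation.

```python
# Minimal sketch of an Asset Data Quality Index rolled up per asset type.
# Dimension scores and weights are invented; accuracy is deliberately absent.
dimension_scores = {
    "points":  {"completeness": 0.97, "validity": 0.92, "consistency": 0.88},
    "signals": {"completeness": 0.81, "validity": 0.90, "consistency": 0.85},
}
weights = {"completeness": 0.4, "validity": 0.3, "consistency": 0.3}

adqi = {
    asset_type: round(sum(weights[dim] * score for dim, score in scores.items()), 3)
    for asset_type, scores in dimension_scores.items()
}
print(adqi)  # {'points': 0.928, 'signals': 0.849} -- says nothing about accuracy
```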

As previously mentioned, it can be difficult to access the physical assets for verification when they are hidden (buried) or spread nationally, or when access would increase personal safety risk during operational times.

We have had to look at other ways to capture accuracy. One potential method is to capture it at the point of inspection or maintenance. This requires cultural and behavioural changes for front-line staff: they need to check the data record, not just the physical asset, and report back any corrections necessary. This involves extra paperwork, or extra input where mobile devices are in use.
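
In data terms, reporting corrections back from the field could look something like the minimal sketch below: compare the field observation against the register entry and return the differences. The attribute names are hypothetical.

```python
# Minimal sketch: compare a field observation with the register entry and
# report any discrepancies as corrections. Attribute names are hypothetical.
from dataclasses import dataclass

@dataclass
class Correction:
    asset_id: str
    attribute: str
    register_value: object
    observed_value: object

def corrections(register_entry, observation):
    return [
        Correction(register_entry["asset_id"], attr, register_entry.get(attr), observed)
        for attr, observed in observation.items()
        if attr != "asset_id" and register_entry.get(attr) != observed
    ]

print(corrections(
    {"asset_id": "PT-104", "heater_fitted": True, "motor_type": "HW2000"},
    {"asset_id": "PT-104", "heater_fitted": False, "motor_type": "HW2000"},
))  # one correction: heater_fitted recorded as True, observed as False
```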

We have found that a good incentive for fostering a data quality culture is the use of league tables, which drive success through competition. No business unit leader wants to be at the bottom. This method comes with 'health warnings', as it can create excessive pressure to improve; after all, it is not possible for everyone to be in the top five. 
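
A league table itself is little more than a ranking of business units by their rolled-up score, as in this minimal sketch; the unit names and scores are invented.

```python
# Minimal sketch of a data quality league table ranked by rolled-up score.
unit_scores = {"Route A": 0.91, "Route B": 0.84, "Route C": 0.88, "Route D": 0.79}

ranked = sorted(unit_scores.items(), key=lambda kv: kv[1], reverse=True)
for rank, (unit, score) in enumerate(ranked, start=1):
    print(f"{rank}. {unit}  {score:.2f}")   # no unit leader wants to be printed last
```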

The use of new technology is increasing our ability to view the assets without the need for human intervention.  Technology such as RADAR, video and ultrasonic equipment is fitted to engineering trains to capture condition data.  The same technology is being enhanced to capture missing data and verify record presence and location accuracy.  In the near future we will be in a position to use the outputs in our measurement reports.

Dylan Jones: As you rightly point out, completeness and validity are relatively easy to manage when compared to verification, what techniques are you adopting to have more confidence in the accuracy of data? 

Ian Rush: We have a 'confidence grading' process in place, approved by the regulatory body, to assign a level of confidence to any particular data set or report output. It has been in place for some time, but the methodology is quite subjective in its application. 

We have developed a new methodology that requires a more objective view of the detail in the data being used. By encouraging the business to use this new methodology, it is easier to get their ownership of the identification and resolution of any weaknesses. 

When using the confidence grading we assume the worst-case scenario if we simply do not know something, such as the accuracy. This may result in lower confidence than previously thought, but at least we do not have false hope.
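
A minimal sketch of that worst-case rule, using an invented A-D banding scheme: any dimension that has not been measured defaults to the lowest band, and the overall grade is the weakest dimension.

```python
# Minimal sketch of worst-case confidence grading. The A-D banding is hypothetical.
BANDS = {"A": 4, "B": 3, "C": 2, "D": 1}   # A = highest confidence
WORST = "D"

def confidence_grade(measured):
    required = ("completeness", "validity", "accuracy")
    grades = [measured.get(dim, WORST) for dim in required]   # unknown -> worst case
    return min(grades, key=lambda g: BANDS[g])                # overall = weakest dimension

print(confidence_grade({"completeness": "A", "validity": "B"}))  # accuracy unknown -> 'D'
```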

Dylan Jones: Finally, you mentioned that in the past few years you’ve been focused on helping the company deliver tactical business initiatives by leveraging the support of your team; can you give some examples of how your team have helped to support another project? 

Ian Rush: As the team members have generally been working with asset data, in one way or another, over many years, they know how the data is structured and how it can be accessed. This results in requests for help, such as secondments, to build report outputs and undertake analysis. Also, when you know what's wrong you are generally expected to put it right, and this is a drain on resources and impacts the delivery of assurance. 

A current project to centralise the management of a core dataset, moving it from local responsibility, has drawn upon process expertise from the team. Added to some short-term improvement projects requiring extra resources, this affects the team's strategic development activities. Whilst data improvement projects are currently business critical, the root causes still need full investigation; remedial work can therefore come at the expense of preventing re-occurrence. 

Some hard decisions have had to be made regarding our willingness to supply expertise to improvement projects, as well as fostering an awareness of the importance of strategic development.


Ian Rush

Ian Rush has spent the last 20 years in the Rail industry managing documentation, records and data. 

Ian has spent the last 7 years leading data quality for Network Rail and has successfully managed data migration, cataloguing and system disposition projects as well as data improvement activities in an asset intensive industry. 

He has excellent experience in designing and implementing data management processes.

http://www.linkedin.com/pub/ian-rush/15/35b/580

About the Author

I am the editor and founder of Data Quality Pro, with over 20 years of experience in data quality initiatives.
