How does data governance in academic organisations compare to commercial organisations?
This is just one of many questions I recently put to Alan Duncan, former Director of Data Governance for UNSW Australia and long-time data governance practitioner.
Alan is an executive-level leader and evangelist for information management, analytics and technology-enabled business, with over 20 years of experience in delivering improved business outcomes.
Dylan Jones : What is your current role?
Alan Duncan : In October 2012 I was appointed to the newly created role of Director of Data Governance for UNSW Australia (The University of New South Wales).
In this role, I am responsible for the creation, implementation and oversight of the UNSW-wide information and data management goals, standards, practices and processes in support of the University’s strategic objectives.
I will be leaving UNSW in April 2014 as I’m relocating back to the UK, though I don’t know what I’m going to be doing there yet!
I am also a member of the Advisory Board for QFire Software, an Australian startup that delivers the capability for organisations to understand, collect, validate, protect, monitor and enrich their data.
Dylan Jones : How does implementing data governance in academia differ from the commercial sector?
Alan Duncan : In most typical businesses, there’s a fairly strong and simple measure of success or failure – it all turns on whether or not we made a profit.
Everyone’s role within the business is expected to contribute to the bottom line in one way or another (either directly or indirectly). The business functions, services, organisational structures and key processes are put in place to support that overarching and very measurable goal.
Data Governance’s role is then to help assure the business’ data, such that the right data can be used to control or enable any given business function. While that assurance role may in itself be a complex one (anything that involves getting people to do things differently is complex!), in some respects the profit motive at least makes it pretty straightforward to find the right context for any data-centric activities.
You then just (just? Ha!) need to herd the cats – and over the years, I think I’ve become a pretty good Cat Herder.
In contrast, the organizational dynamics and social contract at the Uni are entirely different.
While at some level, we need to be sure that the institution operates in the black, money is not the prime consideration for most people and we don’t have any strong motivators of efficiency, effectiveness or productivity.
The drivers relate to intellectual pursuit, academic excellence and reputational status, which can mean about as many different things as you have people (at UNSW, that’s over 5000 permanent staff, plus about another 6000 contract, casual and associate staff, and something in the order of 50,000 students).
We also have the principle of ‘Academic Freedom’, which means each member of staff has the right to pursue his or her own area of study without interference.
By association, ‘academic freedom’ also implies a fair amount of ‘institutional freedom’.
Most people here are super smart, very autonomous and entirely focused on their subject matter, so the institutional functions and processes of the university entity aren’t really part of their world – the institution exists to create and maintain the organisational, technical and physical infrastructure that enables organic and flexible development of the research and academic goals.
So it’s not so much like herding cats, it’s more like bee-keeping.
With care and patience, you can plant more flowers to influence the production of more honey, plant different flowers to change the honey’s flavour, move the swarm to a different area of the garden, or split the swarm when it gets too large.
But you’ve got to make sure you’ve got your protective gear on and blow plenty of smoke up them before you try to do anything!
Dylan Jones : You’ve obviously been involved with many data governance initiatives at varying stages of implementation. What are some of the common implementation pitfalls that you’ve witnessed organisations falling into?
Alan Duncan : “Fire, ready, aim!” is a typical problem, especially in IT-literate organisations.
With delivery-oriented people, the starting point for any initiative is very often “how do we do it?”, rather than “what are we delivering?” and most importantly “why are we doing this?”.
I always try to start any new initiative with a discovery phase that explores the business drivers and organisational dynamics first – once you’ve got those well understood, then you can start to plan out what solutions are required, and how to go about delivering them.
I’m also very conscious of the human element being the major factor in success or failure.
In my view, as practitioners we tend to concentrate way too much on the practical control processes and mechanisms of the Data Governance discipline (whether it be data profiling, cleansing, MDM, data classification, archiving and retention, Data Councils, etc.).
We need to spend a lot more of our effort on the communication, collaboration and coaching of the people involved – our role is first and foremost to encourage and foster a data-savvy workforce and analytic, enquiring culture.
Dylan Jones : A lot of companies take a centralised approach to data quality management. What’s your preference for implementing a data quality capability across the organisation?
Alan Duncan : It’s a discussion that still divides opinion!
I used to be a strong advocate of a top-down approach supported by a centralised team as the focal point of an organisation’s capability – the Information Management Competency Centre. And that can be effective for some organisations where there is a combination of political will and investment funding.
However, in more recent years I’ve come to the conclusion that centralisation isn’t the only approach.
In many cases it would actually be counter-productive. I’m getting pretty fed up with the ‘Four Legs Good, Two Legs Bad’ commentators in the industry who are still pushing the mantra of ‘get a sponsor, plan strategically, work across the Enterprise.’
I’d suggest that grass-roots activities can be just as effective in kick-starting a focus on data, and can deliver tangible results in environments where money is tight or where a senior stakeholder isn’t available to champion the cause.
Understand the business objectives, understand the culture, understand the organisational situation – THEN make your decisions about how to go about implementing an effective Data Governance regime.
What you do need in any circumstances are passionate people who understand the value of data, whether that comes from an Executive-level champion or from an operational team just trying to get their job done.
I covered this issue in more depth in my recent discussion paper.
Dylan Jones : Where do you see the Data Governance and Data Quality market heading in the next two to three years?
Alan Duncan : I think we need to get back to basics. In many respects, we may well have actually gone backwards in terms of data quality and business responsibility for data.
Back when I first started out in the industry in the early 1990s, we used to put a whole lot of effort into developing our business systems in such a way that data definitions and data structures were well understood. Not just for referential integrity purposes, but at the member record level too.
Many businesses still had dedicated data processing teams (clerks) who were responsible for – and took pride in – the accuracy and completeness of the data. And because this was more or less their only role, they were damned good at it; diligent, conscientious and fast.
The result was high-quality data that could be queried, reported on, and acted upon by the business. Everything was pretty focused on executing whatever process mattered, and the computer systems were simply there to speed up the recording process and ensure its rigour.
Fast forward twenty years and the world looks like a pretty different place. We’re living in a mobile, connected, graphical, multi-tasking, object-oriented, cloud-serviced world, and the rate at which we’re collecting data is showing no sign of abatement.
The tools, technologies and methods available to us are so much more advanced and powerful than those green-screen, one-size-fits-all centralised systems of the mid-eighties and early nineties, but I think our progress has come at a significant (unacknowledged, or even unrecognised) cost.
Distributed computing, increased personal autonomy, self-norming organisations and opportunity for self-service were meant to lead to better agility, responsiveness and empowerment.
The trade-offs are in the forms of dilution of knowledge, hidden inefficiencies, reduced commitment to discipline and rigour, and unintended consequences. It’s messy, and almost guaranteed to get worse. I sometimes refer to this state of decay as ‘business entropy’.
The advent of ‘Data Governance’ as an emerging discipline (and indeed other forms of governance – process, architectural, security etc.) could be considered as a reactionary attempt to introduce a degree of structure, moderation and resilience into this ever-evolving state of chaos.
Ideally, we’ll get back to a point where we don’t need to be ‘doing Data Governance’. I aspire for good data management to come from good business practices, and vice versa – a virtuous circle.
Data quality can’t be an afterthought, it should be an embedded part of running an efficient and effective company.
Dylan Jones : What about the developing market for Data Governance tools?
Alan Duncan : We’ve currently got a lot of product companies trying to sell us their ‘Data Governance’ tools first, and only then working out what the business problem is. That can end up with the wrong tool for the wrong job. You could say ‘buyer beware’, but that attitude kind of galls me and I think it’s unethical.
The fact that I see quite a few tech vendors latching onto the term ‘Data Governance’ as a buzzword to re-brand their existing archiving/database/ETL products is a whole other story, but that gets my goat too!
This isn’t just in the Data Governance/Quality space, by the way – the whole Information Management sector is still very tools-centric. Bill Inmon has been very vocal about this recently, and – surprise, surprise – some of the tool vendors (naming no names) have reacted pretty forcefully. Methinks they doth protest too much.
So, I’m hopeful that the winners in the market will be companies that start out with the aim to address a cultural or process problem, and then deliver products and services that engage with providing a solution to that challenge.
That’s one of the reasons I’m so excited to be working on an advisory basis with QFire Software. They’ve recognised that data quality is largely a human and social problem, and are developing their products to support a collaborative, co-operative and democratised approach to delivering better data within the context of real business outcomes.
Dylan Jones : Finally, for the CDO or data leaders faced with the huge challenge of delivering change in a modern data-driven organisation, can you describe the end state for them? Where you’ve implemented these changes successfully what does life look like for workers and leadership alike, particularly with respect to information management?
Alan Duncan : First and foremost, CDOs have got to be educators and communicators.
I see it as their number one priority to evangelise the value of, and accountability for, data – to drive a culture of evidence-based decision-making and to champion the idea that ‘data is the life blood of our business.’
Everyone in the business has conscious competence for data, to the extent that they need it to fulfil their role.
That means educating everyone across the organisation to be aware of the uses of, and responsibilities for, data as a part of their role.
It also means highlighting the impacts on, and interconnections with, the colleagues around them. Where data is concerned, the CDO is the ‘chief steward’ of the data’s context and narrative. The CDO’s office is a service function that offers data-enabled facilitation, communication and collaboration (and coercion when necessary).
They’ve got to be careful though, otherwise the business can end up abdicating responsibility – the actual usage and value from data needs to sit front-and-centre with business functions.
The CDO should be driving an agenda where data becomes a seamless part of business operations, not an afterthought of business process or an adjunct to I.T. systems. I’d expect to see discussions about the data forming an early and integral part of any and all project planning, solutions design and business operations.
It’s a partnership with I.T., not a subservient relationship. And I’d also counsel caution, because it would be all too easy for CDOs to focus on the technology agenda and end up in competition with I.T. delivery. That’s counter-productive.
I also observe a significant lack of data literacy in the Executive group – that scares me (though I suppose I should look upon it as an opportunity!).
In this day and age, winning businesses have data-savvy leaders and CDOs need to engage with the other executive leaders to make them painfully aware of their accountability for data and the impact that it has.
For example, I’ve been participating in a LinkedIn thread recently where it was suggested that a particular CDO didn’t see the benefit of having a Data Council, and that some CDOs are being appointed without any data experience!
If that’s really the case, then those organisations have made very bad appointments, and I hope their businesses suffer accordingly!
About Alan Duncan
Alan D. Duncan is an evangelist for information and analytics as enablers of better business outcomes and a member of the advisory board to QFire Software. Formerly Director of Data Governance at UNSW Australia (The University of New South Wales), he was named by Information-Management.com in their 2012 list of “Top 12 Data Governance gurus you should be following on Twitter”.