Data Quality and Data Governance Strategies for Dodd-Frank, Solvency, Basel, HIPAA
with Jay Zaidi
There has never been more pressure on organisations to comply with regulatory and risk management directives.
This pressure is driving increased demand for robust data quality and data governance strategies, but how should companies get started? What does the regulatory landscape look like? And how can companies adopt data quality management holistically?
In this interview with leading data quality strategist Jay Zaidi, a leader of global risk management and compliance based data management projects, we learn of the challenges and strategies for data quality and data governance within regulated industries.
Image credits – Creative Commons epicharmus
Data Quality Pro: There is significant interest in this topic, since existing and new regulations are complex and their scope is global in nature. Given your tenure at one of the largest Financial Services firms in the world and several years in the Health Care industry, would you mind sharing your thoughts on this topic before we dive into specifics?
Jay: As you rightly mentioned, I have spent a significant amount of time in the Financial Services and Health Care industries and therefore am familiar with their business models and compliance challenges. Data and information play a very important role in supporting regulatory compliance and risk management activities, and data strategists and practitioners have to develop solutions to support them. I’ve conducted research on regulations such as the Dodd-Frank Wall Street Reform Act, Solvency, Basel and the Health Insurance Portability and Accountability Act (HIPAA) to identify patterns and information-related requirements. I shall highlight the patterns that emerge from a data perspective and connect the dots for your readers, so that they can see how they are affected and what actions they can take to add business value to their firms. Finally, I will share some information management strategies that can be utilized.
Data Quality Pro: Please describe some of the regulations that companies in the Insurance, Financial Services, and Health Care domains have to comply with.
Jay: In my opinion, Dodd-Frank, Basel, Solvency and HIPAA are the most important ones.
Dodd-Frank affects virtually all publicly traded companies that do business in the United States or have business dealings with US firms. Its impact spans many business domains, although its primary focus is on the financial institutions that drove the financial crisis of 2008. The Act addresses systemic risks, increases oversight, regulates capital adequacy, and provides for the managed liquidation of large institutions in the event of failure.
Basel was established with two fundamental objectives: to strengthen the soundness and stability of the international banking system and to obtain “a high degree of consistency in its application to banks in different countries with a view to diminishing an existing source of competitive inequality among international banks” (Basel Committee on Banking Supervision 1988).
Solvency was initiated by the European Commission (EC) in 2000 to implement a fundamental change to European insurance regulations. The project aims to create a more harmonized, risk-orientated solvency regime resulting in capital requirements that are more reflective of the risks being run. The EC’s objectives are (1) Deepening the integration of the EU insurance market (2) Improving the protection of policyholders and beneficiaries (3) Improving the international competitiveness of EU insurers and (4) Better regulation of the EU insurance markets.
HIPAA was implemented to support the security and privacy of patient data in the Health Care industry in the United States, as electronic medical records were introduced.
Data Quality Pro: What common themes does your analysis of the above regulations indicate?
Jay: Five major themes emerged from my analysis of the above regulations. These are (1) a focus on better and proactive risk management at the enterprise level (2) greater transparency into business measures such as capital adequacy (3) consistency and standardization of key business processes and data entities across companies (4) security and privacy of customer and patient data and (5) a focus on protecting the interests of consumers and patients.
Data Quality Pro: Dodd-Frank is a massive regulation – the biggest in the US in the last 70 years. What was the driver for Dodd-Frank Wall Street Reform Act and what types of companies does it apply to?
Jay: In order to minimize the negative impacts of the financial crisis of 2008, US and European central banks were forced to bail out large financial institutions and step in to provide liquidity to other industries that were impacted. The financial meltdown was caused by risky lending practices and highly leveraged positions. The US authorities introduced a sweeping set of new regulations called the Dodd-Frank Wall Street Reform and Consumer Protection Act (“Dodd-Frank Act” or Act) to address systemic risks, increase oversight, regulate capital adequacy, and provide for the managed liquidation of large institutions in the event of failure. The Act applies to all regulated public companies (e.g. banks, insurance companies, hedge funds, etc.) that meet certain criteria, and has implications for their risk management, regulatory compliance, securitization, investor reporting, financial reporting, corporate governance, executive compensation and decision support functions.
Data Quality Pro: Do you think there was specific risky behavior by certain large Financial Services and Insurance firms? Could this have been avoided? What role did data play in all this?
Jay: My research and analysis of industry data shows that there were numerous causes for the financial meltdown in 2008. Firms and national governments made risky decisions related to underwriting and lending, there were cases of “irrational exuberance”, incorrect assumptions were used (e.g. that the value of homes would continue to increase), underwriting guidelines were not stringent, borrowers’ ability to pay was not verified, and exotic products were issued (e.g. no-documentation loans, teaser-rate loans, balloon mortgages, option Adjustable Rate Mortgages, etc.). In addition, complex derivatives were issued that most companies couldn’t price accurately, and in many cases even trained professionals could not unravel the internals of such instruments to determine risk exposure. To make matters worse, credit rating agencies were financially supported by the same Financial Services firms whose products they were rating – a clear conflict of interest. This was a failure of massive proportions that affected the economies of many countries in the world.
I’m sure it could have been avoided if the warning signs had been heeded, firms had not been as leveraged, prudent risk management practices had been used and the assumptions had been validated based on facts and history. We had the relevant data and there were many pointers indicating issues, but no one took them seriously. In such situations, one has to look at the data and the output of various models, validate assumptions and then develop conclusions. It’s hard to raise red flags when everyone around you is bullish and no one wants to hear bad news. Hindsight is always 20/20.
From a data perspective, what we can do now is learn from our mistakes and put proper risk management, information governance, information quality controls, timely reporting, predictive analytics and adequate checks and balances in place, to ensure that such a situation doesn’t occur in future. There is also a need for management to get near real-time access to business intelligence that may indicate systemic issues.
Data Quality Pro: Let’s focus on the Financial Services-related regulations for now. Who is going to monitor compliance across the globe, to ensure that a “Financial Meltdown” similar to the one that occurred in 2008 doesn’t occur again?
Jay: The regulatory landscape is complex and uncoordinated. It comprises a host of EU and US agencies such as:
- Securities and Exchange Commission (SEC)
- Federal Deposit Insurance Corporation (FDIC)
- Financial Stability Board (FSB)
- Commodity Futures Trading Commission (CFTC)
- US Federal Reserve (Fed)
- Federal Housing Finance Agency (FHFA)
- Office of Financial Research (OFR)
- Financial Stability Oversight Council (FSOC)
- Financial Services Authority (FSA)
- European Securities Committee (ESC), and
- Committee of European Securities Regulators (CESR)
These agencies have to harmonize and rationalize their compliance mandates, to ensure that they are doing their job. I believe this will be an on-going process and things will stabilize over time. But you can imagine how difficult this will be for firms that are being overseen by the regulatory bodies.
In addition to the complex regulatory landscape, there are several reporting standards utilized in the industry. The notable ones are managed by the following entities:
- International Standards Organization (ISO)
- Depository Trust & Clearing Corporation (DTCC)
- Society for Worldwide Interbank Financial Telecommunication (SWIFT)
- American National Standards Institute (ANSI), and
- International Financial Reporting Standards (IFRS)
Firms should consider re-evaluating their reporting architecture, to ensure that it is flexible enough to accommodate current and future regulatory mandates.
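One way to picture such a flexible reporting architecture is to treat each regulatory report as a declarative definition that a common engine renders, so a new mandate becomes a new definition rather than a new bespoke program. The sketch below illustrates the idea; all names (`ReportDefinition`, `render`, the capital-adequacy fields) are hypothetical examples, not anything prescribed by the regulations discussed here.

```python
# Minimal sketch: config-driven regulatory reporting.
# Each report is declared as data (required fields plus a scope filter);
# one shared engine projects source records onto any definition.
from dataclasses import dataclass
from typing import Callable

@dataclass
class ReportDefinition:
    name: str
    fields: list          # data elements the regulator requires
    row_filter: Callable  # which records are in scope for this report

def render(definition, records):
    """Project the in-scope records onto the required fields."""
    return [
        {f: rec.get(f) for f in definition.fields}
        for rec in records
        if definition.row_filter(rec)
    ]

# Hypothetical capital-adequacy extract: only regulated entities are in scope.
capital_report = ReportDefinition(
    name="capital_adequacy",
    fields=["entity_id", "tier1_capital", "risk_weighted_assets"],
    row_filter=lambda r: r.get("regulated", False),
)

records = [
    {"entity_id": "B1", "tier1_capital": 120, "risk_weighted_assets": 900, "regulated": True},
    {"entity_id": "B2", "tier1_capital": 80,  "risk_weighted_assets": 700, "regulated": False},
]
print(render(capital_report, records))
```

Supporting a revised mandate would then mean editing or adding a `ReportDefinition`, leaving the rendering engine untouched.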
Data Quality Pro: There is some talk about designating certain firms as “too big to fail” and “stress tests”. How does this work and why should our readers care? What should these firms be doing differently from the rest?
Jay: Given the exposure of the global economy to the very large firms, the Financial Stability Board (FSB) has labeled 29 global banks as “too big to fail”. The list includes eight U.S. banks and seventeen European banks, along with three Japanese institutions and one from China. These companies are being asked to conduct regular stress tests, to ensure that they can withstand major disruptions without impacting the firms they conduct business with. They have to strengthen their internal controls and their risk management and compliance operations significantly. It is important to note that information management, information quality management and information governance play a critical role in this.
Data Quality Pro: There are certain terms such as “Systemic Risk” that are used a lot in the media, especially after the 2008 Financial meltdown. What does this really mean?
Jay: Due to the inter-connected nature of the global economy, the failure of one firm could potentially bring down the entire ecosystem. One of the primary focuses of the regulators is to monitor regulated firms to ensure that they do not take actions or decisions that create “Systemic Risk” – risk that could bring the entire system down. However, since the regulators will not be able to check each firm’s practices at regular intervals, the responsibility lies with those firms to do the right thing, by putting proper controls and risk management processes in place, with the associated transparency into key business indicators.
Data Quality Pro: Do you think just meeting the compliance requirements is sufficient? Will this guarantee that the company will be risk free and does not have to do more?
Jay: Not really. Regulations merely provide a framework and some parameters within which a business should operate. Each firm must develop a strategy to comply based on its specific constraints and business drivers. Merely complying with regulations isn’t sufficient – and doesn’t guarantee success.
Data Quality Pro: Some of these regulations such as Dodd-Frank Wall Street Reform Act are extremely complex and open to interpretation. In addition to this, companies are lobbying to water down some provisions or eliminate them completely. Seems like most are in a wait and see mode. Your thoughts?
Jay: The current market conditions are bad and many companies are still recovering from the 2008 crisis, so there is push back. The regulation is complex and hard to interpret and some firms feel that certain clauses should be eliminated or watered down. There is a significant amount of lobbying going on and there is certainly a wait and see attitude in the US. I am aware of the fact that several large financial services firms have started implementing programs to address the key requirements of Dodd-Frank. For those that haven’t, my suggestion would be to use this as an opportunity to assess their information management maturity, identify gaps or weaknesses, develop strategies to address them and initiate projects to address the larger gaps or the higher priority items.
Data Quality Pro: Regulatory compliance seems quite challenging and expensive, but firms have no option but to comply. I believe you have some thoughts on addressing this?
Jay: A review of leading industry publications shows that companies are under tremendous pressure to comply with the numerous complex regulations. These regulations are open to interpretation, in some cases have opposing stances on certain aspects such as capital adequacy and continue to undergo revisions. Firms have to make a significant initial investment to modify their processes, technology, data and systems to comply with the initial version of a regulation, followed by ongoing expense to maintain and enhance the solution, to comply with any new revisions. Given the legal, reputational and financial exposure for non-compliance, they have very little room to maneuver. I won’t go into the merits of deregulation versus re-regulation, but the bottom line is that this is a reality that must be dealt with. Data and information are at the heart of all this. That is why firms must invest in improving their Information Management maturity and capabilities.
Data Quality Pro: One thing that strikes me is that so many organisations respond to these compliance directives with what often appears to be blind panic.
Jay: Given the global scope of the regulations and the urgency of the situation, many corporate leaders are grappling with developing targeted strategies and implementation options. In addition to external competitive pressures and market forces, they are faced with internal challenges related to legacy systems, immature information management capabilities, siloed business models, siloed information management practices, lack of work flow between components of the information supply chain, and internal politics. So, this is certainly a very complex problem to solve, with aggressive deadlines and serious ramifications for non-compliance. I would suggest that companies take a strategic approach to this and focus on the larger themes that are emerging from an information management perspective. I touched upon the major ones earlier.
Data Quality Pro: It is quite apparent that all regulations will evolve over time, so compliance isn’t a one-time activity, but an ongoing process that may require process, data and application changes. You seem to have some thoughts on how companies can deal with this, in a cost effective manner?
Jay: I certainly have strong views on this particular topic… because we have treated these regulations like any other corporate initiative – without taking a step back to understand the key themes that emerge, without putting ourselves in the shoes of the regulators to determine what their mandate is, and by treating each regulation as just another project to be handled within our siloed business model. I am making some very strong statements, so let me explain what I mean. Regulations are a reality and will continue to evolve. New regulations may emerge in future and certain regulations may be retired once they’ve achieved their objective. To address the complexities of business, firms will have to deal with this – from a people, process, technology and data perspective – a major expense.
So, how are we tackling this now? We stand up large cross-functional teams to address new regulations. Each regulation is handled as an independent program – rather than as a set of inter-dependent programs sharing the common information management requirements and themes I’ve identified. This approach promotes siloed thinking, doesn’t address information quality or governance holistically, doesn’t address the reporting challenges that cut across the regulations, and is very expensive to launch and sustain from a compliance perspective. Regulations will continue to evolve and new versions will arise (e.g. Basel II, Basel III, Solvency II, etc.). In my humble opinion, this is not the most cost-effective approach for addressing regulatory compliance and risk management.
Here’s what I am proposing. Firms should take a holistic approach to data quality (HDQ), implement Federated Data Governance, develop a flexible reporting architecture to deal with the various reporting requirements to support regulations, focus on counter party and customer data and invest in predictive analytics and business intelligence. They can certainly leverage the investments they’ve already made in the above mentioned information management areas, as well as in regulatory compliance functions, but orient them as horizontal support functions (i.e. shared services), rather than addressing them in vertical silos.
There is an initial investment involved here – to lay out the Federated Governance Model, the HDQ Framework and Deployment model, a data architecture that supports real or near real time analytics and a new and improved Reporting Architecture – but this one time investment can be enhanced easily to support new regulations or changes to existing ones – with a smaller incremental cost. You don’t have to stand up large teams of business and data experts any more, every time regulatory changes occur. I am talking about a paradigm shift – one that must be seriously considered, to address the current and future operational and compliance challenges in a cost-effective and strategic manner.
This approach provides significant benefits with respect to identification of systemic risks, holistic view into information quality across the information supply chain, view into a firm’s counter party exposure, better information governance tied to accountability at the senior management levels and high quality information that can be used for decision making and risk management.
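The shared-services idea behind this proposal can be sketched in code: instead of each regulatory program writing its own checks, one data quality rule registry is applied to records from any business silo. The rule names and fields below are illustrative assumptions for the sketch, not rules drawn from any of the regulations discussed.

```python
# Minimal sketch: data quality rules as a horizontal shared service.
# One registry of reusable rules serves every compliance program
# (Dodd-Frank, Basel, Solvency, ...) instead of per-program silos.
def not_null(field):
    """Rule factory: the given field must be populated."""
    return lambda rec: rec.get(field) is not None

def in_range(field, lo, hi):
    """Rule factory: the given field must fall within [lo, hi]."""
    return lambda rec: rec.get(field) is not None and lo <= rec[field] <= hi

# Shared rule registry (hypothetical rule names and fields).
RULES = {
    "counterparty_id_present": not_null("counterparty_id"),
    "exposure_non_negative": in_range("exposure", 0, float("inf")),
}

def score(record):
    """Return the names of the rules this record fails."""
    return [name for name, rule in RULES.items() if not rule(record)]

print(score({"counterparty_id": "CP-7", "exposure": -5}))  # → ['exposure_non_negative']
```

Because the rules live in one place, adding a check for a new regulation means registering a new rule, and every silo that feeds the information supply chain is measured against the same definitions.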
Data Quality Pro: So, what you are saying is that companies can take some steps now with respect to their data architecture, data quality, data governance and metadata management to prepare for regulations. What are some things that they should be focusing on?
Jay: Many firms have initiated corporate-wide programs to address the implications of the Act on their operations. Those that haven’t are well advised to develop a strategy and an execution plan to get their house in order, to ensure compliance. Since complying with all aspects of the Act is highly dependent on the availability of high quality information and strong governance processes, a firm’s strategy and execution plans must pivot around these two emerging disciplines, across its information supply chain (not in business silos). The onus of implementing robust regulatory compliance and risk assessment frameworks as part of an integrated information supply chain solution falls on institutions impacted by the Act.
Data Quality Pro: Many firms have a significant number of legacy systems that may be tightly coupled and a data architecture that has grown organically. These are huge constraints that create major challenges. What are some things companies can do to comply, while dealing with such constraints, in addition to budgetary and market pressures?
Jay: My suggestion would be for firms to develop a future state or target state data architecture first, driven by the key themes and requirements of the various regulations and business priorities, if they haven’t already. This can be used as the mechanism to identify applications or data stores that must be phased out. Because legacy systems are hard to replace all at once, firms will have to develop a strategy to do this in a phased manner. One strategy could be to focus on retiring those systems that are harder to maintain, are running on obsolete technology, are not serving their business purpose and do not align with the future state architecture first. In parallel with this activity, companies could build out their information governance, information quality (holistically not in silos), new reporting architecture, master data management (MDM) capabilities (prioritize counter party management and customer data management), information privacy and security capabilities and predictive analytics, in a phased manner.
Information governance will ensure consistent business semantics, definition of security and privacy details, and information management policies, standards and procedures. Holistic Information Quality across the information supply chain will ensure the availability of high quality information for decision making, risk management, customer relationship management, internal and external reporting and financial disclosures. A new reporting architecture will simplify the creation of standard reports for dissemination to internal and external consumers. MDM focused on counterparties and customers will ensure single versions of the truth for these two critical data entities, and predictive analytics will provide business insights for better decision making and business management. Since information privacy and security are extremely important, focus should be on identifying the data elements that must be secured and building appropriate controls around them to ensure that they are not compromised. The European Union has recently passed a stringent set of measures targeting information privacy and security – with severe financial and reputational ramifications if breached.
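The last point, identifying sensitive data elements and building controls around them, can be illustrated with a small sketch: governance metadata classifies which fields are sensitive, and any outbound record is masked against that metadata before it leaves the controlled environment. The field names and classification below are hypothetical examples, not a HIPAA- or EU-mandated list.

```python
# Minimal sketch: field-level privacy controls driven by governance metadata.
# Sensitive elements are declared once; masking is applied uniformly to
# every record on its way out of the controlled environment.
SENSITIVE_FIELDS = {"ssn", "date_of_birth"}  # classified via governance metadata

def mask(record):
    """Replace sensitive elements with a redaction marker, leave the rest intact."""
    return {
        k: ("***REDACTED***" if k in SENSITIVE_FIELDS else v)
        for k, v in record.items()
    }

patient = {"patient_id": "P-100", "ssn": "123-45-6789", "date_of_birth": "1970-01-01"}
print(mask(patient))
```

Keeping the classification in metadata rather than in each application means a newly regulated data element only has to be added to the registry once.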
Your readers should refer to the articles on Information Governance and Holistic Data Quality (HDQ) management that I have authored, if they wish to delve deeper into these topics.
Data Quality Pro: So, based on what you’ve shared with us, there is a lot of ground work needed from a Strategy, Planning and Execution stand point, in the areas of data management, data integration, data governance, data quality management and metadata management.
Any words of wisdom, on how firms can do this given all the constraints they are under?
Jay: The primary focus of any data strategy should be to add business value and become an enabler for the business to grow – while supporting better regulatory compliance, risk management and reporting.
The project plans and execution steps will be driven by the data strategy. From an architecture perspective, we are talking about the intersection of in-memory databases, big data, complex event processing and predictive analytics. Such solutions are not easy to design and implement, but utilizing disruptive thinking, internal and external expertise, and cutting edge technology, will enable success.
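A toy example may help make the complex event processing and near-real-time analytics piece concrete: a sliding window over an event stream that raises an alert when aggregate exposure breaches a limit. The window size, limit and event shape are hypothetical; a real deployment would sit on a streaming engine and in-memory data store rather than this in-process sketch.

```python
# Minimal sketch in the spirit of complex event processing: monitor a
# stream of exposure events and alert when the windowed total breaches
# a limit (all thresholds here are illustrative assumptions).
from collections import deque

class ExposureMonitor:
    def __init__(self, window_size, limit):
        self.window = deque(maxlen=window_size)  # keeps only the last N events
        self.limit = limit

    def on_event(self, exposure):
        """Ingest one event; return True if the windowed total breaches the limit."""
        self.window.append(exposure)
        return sum(self.window) > self.limit

monitor = ExposureMonitor(window_size=3, limit=100)
alerts = [monitor.on_event(x) for x in [40, 30, 50, 10]]
print(alerts)  # the third event pushes the 3-event window total past 100
```

The same pattern, evaluated continuously over counterparty or market data, is what gives management the near real-time view into systemic issues discussed earlier.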
Data Quality Pro: Jay, thanks for shedding some light on this very complex topic and providing our readers with your thoughts on this. I’m sure the readers will pick up some important nuggets.
Jay: You are welcome Dylan! Thanks for giving me an opportunity to discuss some very important topics with your readers. I hope I was able to shed some light on the impact of global regulations on information management, connect a few dots for them and show them why a paradigm shift with respect to Information Management is required, to address the complex business challenges.
Read these other articles by Jay Zaidi:
- How to Deliver Enterprise Data Quality Management: The Jay Zaidi Interview
- How to create Enterprise Data Quality Dashboards and Alerts, with Jay Zaidi and Bonnie O’Neill
- Business Case for Stronger Enterprise Data Management Capabilities in Financial Institutions
- “Holistic Data Quality” – A Paradigm Shift in Enterprise Data Quality Management
- Enterprise Data Quality Dashboard – “Holistic Data Quality”
- Should Users Switch from Office Productivity Tools to a Commercial Data Quality Tool?
Jay Zaidi is an astute, hands-on, versatile and results-oriented leader with proven success in Enterprise Data Management, Strategic Planning, and Program Management.
He is passionate about solving multidimensional problems. During his professional career, Jay has conceptualized and led business transformation and change management programs in the Financial Services business vertical.
He has led global data management projects to address Regulatory Compliance, Risk Management and Operational challenges. He consults with and influences all levels of management and works to bridge gaps, facilitate communication and develop integrated business solutions. He has a proven record of providing strategic guidance to leaders in Fortune 100 firms.
Jay can be reached via his LinkedIn Profile.