Expert Interview With Phil Simon, Author of Why New Systems Fail and The Next Wave of Technologies

In this article I interview Phil Simon, a popular author, blogger, and consultant who writes regularly about emerging technologies, data quality, data migration, and a range of associated topics.

Phil blogs at philsimonsystems.com, where you can read extracts from The Next Wave of Technologies, Why New Systems Fail and The Age of the Platform.

Data Quality Pro: Bloor Research uncovered some interesting findings in their recent data migration industry survey. They found that the failure rate was extremely high. What do you feel is preventing these companies from learning from their mistakes?

Phil Simon: In a word, experience. Many of my clients have never been through a major new system implementation before. Most have never replaced their legacy back office systems with one integrated solution, at least not with their current staff and staffing levels. Such a project is obviously much more time-consuming than a minor upgrade of a stand-alone application that isn’t really connected to anything else.

Data Quality Pro: This leads us on to one of the common system implementation road map issues, called “unanticipated data issues” in your book. This is obviously a major obstacle early in the system implementation life cycle, as it could lead to conflicts between internal parties, external partners, and so on. In those kinds of situations, where do you think the responsibility lies for transforming these unanticipated issues into managed ones?

Phil Simon: It’s a great question. Software vendors, system integrators, and clients all have key roles to play on each project.

Software vendors want to sell software and know that software is reasonably flexible. Looking back over my last ten years working as a consultant, many times organizations have underestimated the extent of their own data problems, and that has nothing to do with vendors or consultants. Maybe clients didn’t think their data was in dire need of cleansing (or at least not to the extent required). Maybe clients opted for the consulting firm with the lower quote.

I would argue that no organization should be completely oblivious to the state of its data. When writing my first book, Why New Systems Fail, I realized that I had never worked on a project with a company that performed comprehensive data profiling or data analysis before the project commenced. That’s just scary.

The client often doesn’t know that its data is a mess until consultants get their hands dirty. Again, that’s scary. Data needs to be extracted and cleansed before it is loaded into the new system. So, if pressed, I would say that the fundamental responsibility probably lies with the client, but many times they don’t know that their data needs major purification, and that costs time and money. Unfortunately, the project plan and budget don’t allow for these activities to be performed properly.
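As an aside, the kind of up-front data profiling Phil describes can be surprisingly lightweight. The sketch below is purely illustrative, assuming a hypothetical legacy extract in CSV form with made-up file and column names; it simply counts missing values, distinct values, and duplicates with pandas so that data problems surface before the project plan and budget are locked down.

```python
# Illustrative sketch only: a quick profiling pass over a hypothetical
# legacy extract. The file name and column names are made up.
import pandas as pd

df = pd.read_csv("legacy_customer_extract.csv", dtype=str)

profile = pd.DataFrame({
    "non_null": df.notna().sum(),                   # populated values per column
    "null_pct": (df.isna().mean() * 100).round(1),  # percentage missing per column
    "distinct": df.nunique(dropna=True),            # cardinality hints at candidate keys
})
print(profile)

# Simple red flags worth surfacing before any consultants are engaged
print("duplicate rows:", int(df.duplicated().sum()))
if "customer_id" in df.columns:
    print("duplicate customer_id values:",
          int(df["customer_id"].duplicated().sum()))
```

Even a summary this crude gives the client evidence of how much cleansing work is really ahead, rather than discovering it mid-project.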

Data Quality Pro: You mentioned that the companies, in many cases, just have no idea of impending issues. As we get smarter at implementing systems, using the latest migration and integration software for example, do you find that one of the issues is that we’re actually trying to implement too fast? As the project progresses, the time frames to deliver are so short that important tasks inevitably get cut. What are your views?

Phil Simon: Sure, sometimes organizations try to do too much too fast. Senior management often doesn’t understand that, during a major IT project, many key end users essentially have two jobs. They have to handle “future” responsibilities while concurrently doing their day jobs: running accounts payable, processing payroll, administering security, or handling customers.

Now, consider what happens when a crisis occurs. Let’s say that, in the middle of testing the new system, I have to run production payroll or close the GL. My day job is always going to take precedence over my “future” job.

In my experience, things like this really add up over the course of a project. As a result, these projects often do not turn out as well as everyone had initially envisioned. The staffing calculations at the outset are often seriously flawed.

Data Quality Pro: Going back to the earlier questions, we talked a little about systems integrators (SIs). In your experience, do you find that SIs fully understand the processes involved in large systems migrations, or disciplines such as data quality? Or do you find that these things don’t really fit their business model?

Phil Simon: Do they understand what’s required? Absolutely. Are they incentivized to perhaps make some compromises or possibly bad judgments in order to win the business in the first place? Sure, and that can often create problems.

Over the last couple of years I have worked on several projects that, had they been scoped properly, would never have started.

For clients, a far smarter move is to spend two or three months thoroughly understanding their data and resolving related issues before bringing in a whole team of external resources to start the project. For example, most IT projects would really benefit from data profiling before external consultants even start.

Data Quality Pro: We’re seeing an increase in the “factory” style approach to data quality improvement, often on migrations as part of new system implementations. Do you see this working effectively in your experience? And if not, why do you think that is?

Phil Simon: Organizations should pay more attention to data quality, period. Whether they are turning on a new system is irrelevant; DQ is critical.

Unfortunately, on data migration projects, most organizations in my experience attempt to load existing data sans cleanup. With respect to outsourcing, I’ve actually never seen DQ activities successfully outsourced to another country.

In Tony Fisher’s The Data Asset, he asks, “Why would you let other people make those sort of core decisions about your data?” That’s really lost on many people on these projects. Any new system or application is only as good as the data stored in it. Why would you outsource that to people who don’t necessarily know the first thing about your business? I think it can be really dangerous. To me, the squeeze just isn’t worth the juice.

“Package slams” should be avoided because you’re just giving people access to bad information.

My second book, The Next Wave of Technologies, covers technologies such as cloud computing, SaaS, mobile computing, etc. People were talking about many of these ten years ago. Remember that these emerging technologies are just different means of creating, storing, and accessing data. I make the point in the book that, even though the technologies may have changed, you still need to present accurate information to your employees, customers, suppliers, vendors, etc.

Data Quality Pro: For companies that are starting down the road to implementing a new system, is there any kind of practical advice you can offer?

Phil Simon: Well, there is a great deal of new content in the second edition of Why New Systems Fail. The final chapter of the book offers plenty of tips and advice, but I’ll try to keep the answer to your question short and on-point here.

In the middle of system testing, design, and training, don’t expect everyone to hold up the project while users clean up millions of records in the process. There is not enough time for it. It’s one of the biggest reasons that new systems go live with so much junk already in them. In turn, bad data breeds trust issues with the new system. You get one bite at the apple with a new system activation. If you can’t get good information out of it, then you have really missed an opportunity. Spend the time and clean up your data.

Many times that’s just lost on people. The software vendor or consultancy typically doesn’t insist on it. From a data perspective, don’t load anything inherently questionable. It’s far easier to cleanse data outside of a system than within it.
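To make that last point concrete, here is a purely illustrative pre-load gate, assuming a hypothetical vendor master extract with made-up file names, column names, and rules: records are cleansed and validated in a staging step outside the target system, and anything questionable is quarantined for review rather than loaded.

```python
# Illustrative sketch only: a pre-load quality gate for a hypothetical
# vendor master extract. Column names and rules are made up.
import pandas as pd

df = pd.read_csv("vendor_master_extract.csv", dtype=str)

# Cleanse outside the target system: trim whitespace, treat blanks as missing
df = df.apply(lambda col: col.str.strip())
df = df.replace("", pd.NA)

# Only rows passing basic rules are staged for load; the rest are quarantined
# for manual review instead of being pushed into the new system.
required = ["vendor_id", "name", "country_code"]
valid = df.dropna(subset=required)
valid = valid[~valid["vendor_id"].duplicated(keep="first")]

rejected = df.loc[~df.index.isin(valid.index)]
valid.to_csv("staged_for_load.csv", index=False)
rejected.to_csv("quarantined_for_review.csv", index=False)
```

The real rules will vary by system, but the design point stands: rejecting or repairing a record in a staging file is far cheaper than correcting it after go-live.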


About Phil Simon

Phil Simon is a seasoned independent systems consultant with extensive knowledge of both well-known and homegrown applications. He started independent consulting in 2002 after six years of related corporate experience. During that time, he has cultivated more than thirty clients from a wide variety of industries, including health care, manufacturing, retail, and the public sector. Phil is the author of the acclaimed Why New Systems Fail: An Insider’s Guide to Successful IT Projects and, most recently, The Next Wave of Technologies: Opportunities from Chaos.

Phil is a graduate of the School of Industrial and Labor Relations at Cornell University (MILR) and Carnegie Mellon (B.S., Policy and Management). He lives in northern NJ, USA. You can find out more about him by visiting his site: www.philsimonsystems.com.