The IT industry is fast moving, and is full of people in their twenties and thirties. So perhaps it is not surprising that it does not learn well from past mistakes. Those rather longer in the tooth can permit themselves a wry smile as the industry embraces a new fad for “executive dashboards”, “cockpits”, “balanced scorecard applications” and the like. The idea is fine: rather than having to wade through endless reports, senior executives of a company will be presented with a pretty display, like the cockpit of an airplane, showing the critical operational numbers of how their business is performing: sales, gross margins, production quality and such. The more entertaining demos of such products show a mythical executive “drilling in” on a problem area, immediately identifying an operational problem from the pretty charts that come up, and fixing it via a swift email to the offending business unit.
Like many fairy tales, this sounds nice, but the reality is somewhat different. I recently met the chairman of a major brewery, and he told me a very different story: his dashboard has numerous missing business areas, and little notes attached to the numbers carrying endless caveats. Why should this be? The problem is not the dashboard applications themselves; the problem is actually getting at the underlying data that feeds the dashboards. Those with long memories will recall Executive Information Systems (EIS), which were sold to senior executives in the late 1980s on exactly the same promise. How many of those are still around? Those systems failed because it took the same army of analysts who produce the nice PowerPoint slides that executives review to feed the numbers that would be prettily displayed on the EIS screens. Most people would probably rather scan through a set of well-presented slides than click around on a screen anyway, so no value was added. All the same analysts were employed, plus a few more to feed the EIS, so the systems were mostly quietly retired when the sponsoring executive moved on. The problem is a fundamental one: the information behind figures like gross margin by product, channel and customer is buried away in a multiplicity of separate operational systems: multiple instances of ERP, plus CRM systems, supply chain systems and the many, many other systems that large companies use. It is the multiple sets of business definitions embedded in these systems that cause the problem.
For example, rebates and commission structures may vary by country, potentially distorting revenue figures when you add up net revenues across the globe. Costs are even more complex, with each business unit or country having a slightly different ERP instance and slightly different ways of allocating costs back. Do you allocate all the costs of a business transaction back to the original transaction via some allocation rule, or are some costs held at the country, regional or global level (management overhead, office costs and the like)? What about marketing costs? Are these allocated back to each business unit, or to each individual transaction or customer account? It is a rare multinational company that can honestly say these rules are identical throughout the globe. I have personally worked in two of the largest and most successful global companies, and if I had a dollar for every time I heard someone in a meeting arguing about whose data was the “right” data, I would be relaxing on a beach rather than writing this article.
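To make the point concrete, here is a minimal sketch, with invented business-unit names and rebate figures, of how two units can book the very same sale and arrive at different "net revenue" simply because their rules differ:

```python
# Hypothetical illustration: two business units record the same sale
# under different local rules. All names and rates are invented.

def net_revenue_unit_a(gross, rebate_rate=0.05):
    """Unit A nets rebates off revenue at the transaction level."""
    return gross * (1 - rebate_rate)

def net_revenue_unit_b(gross):
    """Unit B books gross revenue; rebates sit in a country-level
    cost pool, so they never touch this figure."""
    return gross

sale = 1000.0
a = net_revenue_unit_a(sale)  # rebate deducted per transaction
b = net_revenue_unit_b(sale)  # rebate held elsewhere
# Summed at head office, identical sales produce different totals,
# and neither unit is "wrong" under its own local definition.
print(a, b)
```

Multiply this by dozens of countries and several allocation rules per cost line, and the arguments over whose data is "right" become inevitable.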
The reality in large companies is that a small army of analysts applies mind-boggling sets of rules to the data sent in from the various business units, ironing out these little differences between business units and countries and so presenting a moderately coherent view of things at the corporate level. Tools which seemingly reach out directly into the operational systems in these countries and retrieve the data look pretty in demos, but cannot work in practice: they would have to resolve the semantic inconsistencies between these systems, as well as deal with the very real issue of data quality. Since the average number of “master” definitions of key items like “product” and “customer” in a large corporation is not one but eleven (according to a recent survey by Tower Group), it can be seen that such an approach is fundamentally flawed.
The usual way to resolve this is to build a corporate data warehouse, into which the different source systems feed data that is then automatically massaged (“transformed”) into a single consistent form, reducing the average of eleven definitions to one. The problem is that these systems are built on shifting sands. When one of the business units reorganizes, or the company acquires another one, this briefly consistent picture is shattered, and traditional data warehouses take weeks or months to restructure after a major structural shift in the underlying source data. If there are many sources, then every one of them can change, with a knock-on effect on the data warehouse and a further effect on all the reporting systems that use the warehouse for source data. It doesn’t matter how prettily your reporting tool can format results or draw graphs for you if the data underlying the report is wrong. The new generation of dashboard products may be cheaper and prettier than the 1980s EIS systems, but the IT industry appears to be adopting a selective memory when it comes to remembering the problems that occurred with just such systems not that many years ago.
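The "transform" step above can be sketched in miniature. The system names and product codes below are invented for illustration; the point is that the mapping from local codes to one master definition is hand-maintained, and a reorganization or acquisition introduces codes the mapping has never seen:

```python
# Hypothetical sketch of a warehouse transform: each source system has
# its own product codes, mapped by hand to one master definition.
# System names and codes are invented for illustration.

master_product_map = {
    ("erp_europe", "WIDG-01"): "WIDGET",
    ("erp_americas", "W100"): "WIDGET",
    ("crm", "Widget Std"): "WIDGET",
}

def to_master(source_system, local_code):
    """Resolve a local product code to the single master definition."""
    try:
        return master_product_map[(source_system, local_code)]
    except KeyError:
        # A reorganization or acquisition ships codes nobody mapped yet;
        # until an analyst maintains the map, the load fails or the
        # record lands in a reject queue.
        raise ValueError(
            f"unmapped code {local_code!r} from {source_system}"
        )

print(to_master("crm", "Widget Std"))  # resolves to "WIDGET"
```

The brittleness the article describes lives in that dictionary: every structural change upstream means reworking the mapping, and every report downstream is only as good as its current state.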
Regrettably, tedious issues like inconsistent master data, data quality and brittle data warehouses don’t play well in sales demos, so you won’t be hearing about them any time soon from your dashboard, cockpit or scorecard vendor. Since salesmen can be very persuasive, it looks like a new generation of EIS systems will soon be built, with an entirely predictable outcome: further growth in that old favorite software category, shelfware.
About the Author
Andy is an established enterprise software industry expert and commentator, named a Red Herring Top 10 Innovator in 2002. Andy founded Kalido as an independent software company after originally setting up the software venture within the Shell Group. He became an independent consultant in August 2006. Prior to leading Kalido's spin-off from Shell in June 2003, Andy became CEO of Kalido Ltd in January 2001. In previous roles at Shell, Andy led a 290-person global consultancy practice at Shell Services International, and was Technology Planning Manager of Shell UK Oil. Prior to Shell, Andy worked in a number of senior technology positions within Exxon.
A 20-year veteran of data warehousing and integration projects, Andy is a regular speaker at international conferences such as ETRE, Tornado Insider, Red Herring, Gartner and Enterprise Outlook. See his award-winning blog www.andyonenterprisesoftware.com for his insights on the industry.
Andy has a BSc (Hons) Mathematics degree from Nottingham University.