We would like to congratulate IBM, Informatica, SAP and Oracle on their continued strong showing in the latest Gartner Magic Quadrant for Master Data Management (MDM) of Customer Data Solutions.
I am not going to spend time restating the benefits that organisations can reap from a properly implemented Master Data strategy: increased revenues and reduced customer churn through better knowledge of the customer, the ability to offer an enhanced service, reduced costs and improved decision making. I do, however, want to explore the role of application metadata in a Master Data implementation.
Just think of your own experiences with your bank, telecommunications or utility provider. How well connected are the various systems that you touch, or that touch you, when you interact with them, and what could they do to improve that? The answer will give you a clue as to the advantages and importance of a 'single view of the customer/citizen/patient'.
I would, however, like to draw your attention to an important aspect of all Master Data Management projects, irrespective of vendor solution, and one which Andrew White of Gartner considers vital to their success. Read his blog on MDM and Metadata Management here.
This is the topic of metadata, an underlying architectural consideration whose importance I believe is often missed or understated by both vendors and analysts when looking at MDM projects, tools or methods.
I would also stress that by metadata I do not, in this context, mean the metadata created within a vendor's product, which helps it manage the data flows through that product. I mean the metadata (tables, fields, descriptions, relationships and customisations), sometimes called the data model, which underpins the source applications that are to be incorporated into the Master Data solution.
Gartner defines Master Data Management of Customer Data Solutions as being software products which:
- “Support the global identification, linking and synchronization of customer information across heterogeneous data sources through semantic reconciliation of master data
- Create and manage a central, persisted system of record or index of record for master data
- Enable delivery of a single customer view to all stakeholders, in support of various business benefits
- Support ongoing master data stewardship and governance requirements through workflow-based monitoring and corrective-action techniques” (Gartner, 2014)
Their document also references the differences in approach that can be taken to implement a solution: “The instantiation of the customer master data, ranging from the maintenance of a physical “golden record” to a more virtual, metadata-based, indexing structure.”
There are specific requirements mentioned about the need to model complex relationships between source applications, to express data in terms of logical data models with associated metadata, and to provide business-consumable metadata as part of the management of data, business rules and sources. In summary, the role of metadata is acknowledged. I would argue, however, that understanding how the source applications to be incorporated into any Master Data Management system are structured, in terms of their data model, is a fundamental requirement. If you don't know which tables represent, for example, "customer" in your source applications, how they relate to other tables, and how they might have been customised, then you cannot be sure that you are using or updating the right data. And if you do not have easy access to the metadata of your source applications, your project could be delayed, go over budget or under-deliver as you struggle to find it.
MDM vendors typically refer to metadata in the context of their own solutions, and it is indeed important that they have such a concept, as it makes the implementation and management of their software more effective. However, the task of finding the metadata in the source or 'spoke' applications and mapping it to that internal metadata is vital, and the time it takes will of course depend on the nature of the applications to be integrated with the MDM solution.
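To make the mapping task concrete, here is a minimal sketch of what such a source-to-master catalogue looks like. All application, table and field names are hypothetical; the point is simply that each of the hub's logical customer attributes must be tied to a physical field in every source, and the gaps are what the metadata discovery exercise must close:

```python
# Sketch of a source-to-master metadata mapping check (illustrative names only).
# The MDM hub defines logical customer attributes; each source application
# exposes physical "TABLE.FIELD" locations for some or all of them.

MASTER_CUSTOMER_ATTRIBUTES = ["customer_id", "full_name", "postal_code", "email"]

SOURCE_MAPPINGS = {
    "crm_app": {
        "customer_id": "CONTACTS.CONTACT_ID",
        "full_name": "CONTACTS.DISPLAY_NAME",
        "email": "CONTACTS.EMAIL_ADDR",
        # postal_code has not yet been located in this source
    },
    "billing_app": {
        "customer_id": "ACCOUNT.ACC_NO",
        "full_name": "ACCOUNT.HOLDER_NAME",
        "postal_code": "ADDRESS.POST_CODE",
        "email": "ACCOUNT.CONTACT_EMAIL",
    },
}

def unmapped_attributes(mappings, master_attrs):
    """Return, per source, the master attributes with no known physical field."""
    return {
        source: sorted(set(master_attrs) - set(mapping))
        for source, mapping in mappings.items()
    }

if __name__ == "__main__":
    gaps = unmapped_attributes(SOURCE_MAPPINGS, MASTER_CUSTOMER_ATTRIBUTES)
    for source, missing in gaps.items():
        print(f"{source}: missing {missing or 'nothing'}")
```

For two small sources this is trivial to maintain by hand; the argument of this article is about what happens when a single source contributes tens of thousands of candidate tables.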
For some applications this job will be very easy and straightforward. Reverse engineering a database or application with a small number of tables and then mapping them to the internal metadata will be a relatively simple task which can often be accomplished manually.
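For a small schema, that reverse engineering can be as simple as querying the database catalogue. A minimal sketch, using Python's built-in sqlite3 module against a throwaway in-memory database (the tables here are invented for illustration):

```python
import sqlite3

# Build a tiny example schema in memory so the introspection below is runnable.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, postcode TEXT);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customer(id),
        total REAL
    );
""")

def describe_schema(conn):
    """Return {table: [column, ...]} by reading the database catalogue."""
    schema = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        schema[table] = [col[1] for col in cols]  # col[1] is the column name
    return schema

print(describe_schema(conn))
```

With a handful of tables like these, mapping the output to the MDM hub's internal model by inspection is perfectly feasible.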
Doing the same job for a large, complex and usually customised packaged software application, for example from SAP, Oracle or others, can be an arduous and difficult task, prone to delay and error. We know of no MDM vendor whose products have effective tools for reverse engineering the metadata from these systems and allowing data analysts and architects to easily locate what they need for their solution.
Typically their solutions work from prebuilt templates, or simply from lists of tables, which require the user to know what they are looking for before they start. The alternative is to engage consultants or pull application specialists into the project, adding to its cost and resourcing.
My argument is that by taking due account of the metadata foundations which underpin the application data, and by developing a real understanding of them, MDM solutions would be better positioned to deliver projects that meet their time, budget and quality targets. This would involve the more tightly integrated use of modelling and metadata discovery tools and methods to complement the product suites currently delivered by those in the Magic Quadrant.
Trying to find your way around the data model of an ERP or CRM application without some form of accurate guide is a bit like trying to navigate from one part of London to another on the Underground without a map showing the stations, lines and interchanges.
If you know the route and which stations to change at then it is easy. If you have no map then you can either try to find someone to ask, or guess, or use trial and error to find your way.
A map makes it easier and quicker to find where you are going. Use a map for your metadata and understanding the data model for large complex applications in the context of your project becomes more straightforward and faster.