Methodology for implementing master data management
Master data management is a discipline for creating and maintaining a trusted source of reference data. While technology is an important enabler, it should not be the driver.
We recommend a four-step process for initiating and implementing master data management.
Business engagement

Business engagement and alignment are critical to ensuring that the investment in master data management, which can be expected to run into several years and millions of dollars for a large enterprise, delivers on business expectations.
In our experience, master data programs that are IT driven and do not have strong business engagement end up perceived as failures, irrespective of what is delivered.
Business vision and goals can be used to identify the accountable business executives who must be engaged to define the master data scope and objectives.
We recommend that these executives form the core of a strategic data governance program to determine and prioritise the financial benefit that data brings to the organisation, and to identify and mitigate the business risks associated with poor data practices and quality.
Canonical data model
Master data should be defined in terms of a best-practice set of attributes that represents an ideal data set able to support the business goals.
This ideal data set can be described as a canonical data model and should be defined to allow the organisation to map current and future data requirements. Industry standard models may be sourced and adapted, or a model may be built based on defined business requirements.
Existing data sources should be mapped to this logical model to ensure completeness and viability, and to support the implementation.
In a large environment we would recommend starting with a small subset of two to five systems that will provide the key data. Additional systems can be mapped and added later.
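As an illustrative sketch only, a source-to-target mapping against a canonical model can be captured as simple declarative structures that are easy to review with business stakeholders. All system names and field names below are hypothetical, not drawn from any particular implementation:

```python
# Illustrative sketch: mapping records from two hypothetical source
# systems ("crm" and "billing") onto a small canonical customer model.
# All system, field and attribute names are invented for illustration.

CANONICAL_ATTRIBUTES = ["customer_id", "full_name", "email", "country"]

# Source-to-target mapping: canonical attribute -> source field name.
SOURCE_MAPPINGS = {
    "crm": {
        "customer_id": "cust_no",
        "full_name": "name",
        "email": "email_addr",
        "country": "country_code",
    },
    "billing": {
        "customer_id": "account_id",
        "full_name": "account_holder",
        "email": "contact_email",
        "country": "country",
    },
}

def to_canonical(source_system: str, record: dict) -> dict:
    """Map a source record onto the canonical model.

    Missing source fields surface as None, which highlights completeness
    gaps when the mapping is validated against real data.
    """
    mapping = SOURCE_MAPPINGS[source_system]
    return {attr: record.get(mapping[attr]) for attr in CANONICAL_ATTRIBUTES}

crm_record = {"cust_no": "C-001", "name": "Thandi Nkosi",
              "email_addr": "thandi@example.com", "country_code": "ZA"}
print(to_canonical("crm", crm_record))
```

Keeping the mapping as data rather than code makes it straightforward to add the remaining systems incrementally, as recommended above.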
Data standards, terms and definitions
From a data governance perspective it is valuable at this time to agree definitions of key data terms, identify and prioritise existing sources for key data, and identify categories such as sensitive or protected data that may need special handling during implementation. Data standards, validation rules and match rules should also be defined and agreed with the accountable business users.
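Once agreed with the business, such standards can be expressed as executable rules. The sketch below shows one possible shape, with an invented email format rule and a simple deterministic match rule on normalised name plus email; these specific rules are examples, not recommendations:

```python
import re

# Illustrative sketch of agreed data standards expressed as code.
# The email format rule and the match key below are invented examples.

def validate(record: dict) -> list:
    """Return a list of rule violations for a canonical customer record."""
    errors = []
    if not record.get("customer_id"):
        errors.append("customer_id is required")
    email = record.get("email") or ""
    if email and not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", email):
        errors.append("email does not match the agreed format")
    return errors

def match_key(record: dict) -> tuple:
    """Deterministic match rule: records that agree on this key are
    candidates for consolidation into a single master record."""
    name = (record.get("full_name") or "").strip().lower()
    email = (record.get("email") or "").strip().lower()
    return (name, email)

a = {"customer_id": "C-001", "full_name": "Thandi Nkosi",
     "email": "Thandi@example.com"}
b = {"customer_id": "B-77", "full_name": " thandi nkosi ",
     "email": "thandi@example.com"}
print(validate(a))                    # no violations -> []
print(match_key(a) == match_key(b))   # the two records match -> True
```

Encoding the rules this way also supports the readiness assessment described next, since the same rules can be run against existing sources to measure how far current data is from the agreed standard.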
A master data readiness assessment is a valuable tool both to discover de facto rules and standards that may exist within the data, and to identify data risks that must be mitigated in order to deliver master data management.
These data governance and data quality steps are important prerequisites for successful master data management.
Technology identification and implementation
The three stages above create the foundation for a successful master data implementation. We are now ready to assess technologies to support the master data implementation.
We should have a clearly defined and business driven scope that includes:
- Business goals and priorities
- A canonical data model with source to target mapping
- Data standards, validations, match rules and risks that must be catered for during implementation.
Master Data Management (MDM) is intended to provide a pervasive, consistent, enterprise view of a core set of master data. Not surprisingly, therefore, successful MDM programmes are highly dependent on data quality.
Data quality solutions, by contrast, should provide fit-for-purpose data across the enterprise.
While MDM is certainly a driver for data quality, the benefits of data quality must extend beyond master data to encompass all business-critical data, most notably to support business intelligence, compliance requirements and operational efficiency.
A good data quality solution must manage both master data and related transactions.
So where should you start?
Obviously, it depends on your business requirement.
To use an analogy: data quality is the horse to the master data management cart. For a while the horse will be able to carry the required load. Eventually, the load may become too much for the horse and a full Master Data Management solution may be required. In this case, the data quality "horse" will continue to provide the pulling power that allows the cart to function.
On the other hand, the expensive cart is fundamentally useless without a functional horse.
This position is summarised by Aberdeen Research analyst, Nathaniel Rowe. "If you put MDM in place but you're using old, substandard data, you won't see much value from the effort," he said. "You'll have issues with the data if it isn't standardized."
"If you only have the budget to do data quality, that's more important, but keep looking toward the horizon for the next step,"
Data quality should not just be a tick box in the MDM stack. Data quality management is a complex problem that is made worse when multiple sources of data are consolidated for MDM. Data quality should be assessed independently of MDM to ensure that you have a solution that is appropriate for your needs.
We have delivered a number of MDM projects with varying architectures, depending largely on what could be leveraged within the existing environment.
The common factor in each project was to ensure that the underlying data was able to support the business requirement.
By blending a business and data focus we deliver incremental benefits that justify the existing spend and build the business case for additional phases.
Master data management (MDM) is the discipline that is intended to link all critical data into a single, reliable point of reference.
A common error is to assume that MDM is achieved through the implementation of a so-called MDM tool.
In fact, master data management can be achieved without the use of an MDM tool, although in most instances the right combination of tools will provide a lower-risk option than trying to build everything you need from scratch.
Data governance is about agreeing, sharing and enforcing policies and rules for the use of data to support your key business processes. If you are trying to implement an MDM stack without understanding how the data must be used by different stakeholders then you don't have MDM, you have data integration.
If you are thinking about MDM you should consider the Collibra Data Governance Center.
Widely accepted to be the leading business-centric data governance platform available, the Data Governance Center allows you to quickly and easily document, govern and share your data assets with both business and technical stakeholders.
A 2008 Information Difference survey found that the average company had 6 customer master systems and 9 product master systems. Larger organisations had many more than this, with 13% of companies having more than 100 systems holding critical master data.
Implementing an MDM product alone is simply a recipe for consolidating all of this data into yet another database - creating, in essence, a 7th customer master or a 10th product hub.
Data quality is the differentiator that allows you to standardise and combine related records to create a trusted master data source. In fact, data quality can deliver 80% of the value of a full blown MDM stack, at a fraction of the cost.
Our enterprise data quality platform, from Trillium Software, is a highly scalable, cost-effective solution that has been proven in the South African context. Out-of-the-box business rules for South Africa, and for 200 other countries, ensure that you get immediate value.
The Trillium Software System provides certified connectors to all major MDM solutions - from vendors as diverse as IBM, SAP, Oracle, Software AG and Informatica. Trillium gives you the power to quickly assess and improve the quality of your source data, and to painlessly maintain the quality of data in your new MDM hub.