Our MDM implementation approach
In recent years, organizations have grown more aware of their data and the role it plays in the success or failure of their most critical business operations. Technology disruptions such as the Internet of Things (IoT), connected devices, mobility, cloud computing and digitalization have increased the flow of data, making “data everywhere” the new normal. More data, and the availability of cheaper storage that supports emerging technology like big data computing, has only pushed organizations to store virtually everything. Unfortunately, this store-everything mantra has resulted in inconsistent controls around mission-critical, vital and sensitive data points, in many cases leading to the creation of data swamps full of dark data.
Today, organizations are struggling with the overall governance and movement strategies of critical operational data. The move towards a consolidated master data set from siloed data environments is something many are considering and trying to implement. The goal of such master data programs, or Master Data Management (MDM) initiatives, is to identify, validate and resolve data issues as close to the source as possible, as well as create a “golden copy” of critical enterprise data for consumption.
Yet, not all data is equally important. Business engagement and input into your master data program is critical to prioritise technical deliverables and meet business priorities.
Create a focused scope by supporting key business processes first.
A successful MDM program ensures data consistency, completeness and accuracy, but implementing an MDM program by itself cannot ensure success unless business-supported data management programs are run in parallel.
MDM should be used to optimise key business processes. The master data governance committee or steering committee should include the business stakeholders who rely on these processes (producers, consumers and owners of data) as well as the IT executives responsible for enablement and delivery.
Engaging these stakeholders early and often allows a focused master data scope to be defined and implemented, whilst ensuring that other stakeholders understand when and how their requirements will be accommodated.
Data governance is a critical success factor to ensure business alignment for the MDM project.
Governance should enable collaboration between all stakeholders, identify key data requirements and standards, and create the context and priorities for successful MDM. Without data governance, MDM projects frequently unravel due to conflicting needs and political agendas.
Governance ensures strategic planning and communication of decisions and discoveries related to data such as: What does the data mean? Where does the data come from? How do I find the data I need? Who owns the data?
Identify critical data elements supporting these processes and the producing and consuming systems
What master data is required, where is it currently stored, how is it changed?
Conduct an MDM Readiness Assessment on the key master data.
Are there any unpleasant surprises lurking in the data? Data Quality efforts typically take much longer and cost much more than planned.
An MDM Readiness Assessment is vital to understanding and managing the data risk inherent in MDM, to define a data model that will minimise data migration failures, and to keep implementation time scales within scope and budget.
The MDM Readiness Assessment will provide you with critical metadata for planning the most cost-effective data models, migration strategy and implementation plan. Ultimately, data quality is the difference between trusted master data and yet another silo.
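As a sketch of the profiling that feeds a readiness assessment, the snippet below computes basic completeness and key-uniqueness metrics over illustrative customer records. The field names and records are assumptions, not a real source system.

```python
# Minimal profiling sketch for an MDM Readiness Assessment.
# Records, field names and the master key are illustrative assumptions.

def profile(records, key_field):
    """Return basic completeness and distinct-value metrics per field,
    plus a duplicate count on the proposed master key."""
    total = len(records)
    fields = {f for r in records for f in r}
    metrics = {}
    for f in sorted(fields):
        values = [r.get(f) for r in records]
        filled = [v for v in values if v not in (None, "")]
        metrics[f] = {
            "completeness": len(filled) / total,  # share of populated values
            "distinct": len(set(filled)),         # candidate-key signal
        }
    keys = [r.get(key_field) for r in records]
    metrics["_duplicate_keys"] = len(keys) - len(set(keys))
    return metrics

customers = [
    {"cust_id": "C1", "name": "Acme Ltd", "country": "ZA"},
    {"cust_id": "C2", "name": "Acme Limited", "country": ""},
    {"cust_id": "C1", "name": "ACME", "country": "ZA"},  # duplicate key: a red flag
]
m = profile(customers, "cust_id")
```

Even this crude pass surfaces the “unpleasant surprises” mentioned above: duplicate keys and sparsely populated attributes before any migration is attempted.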
Develop the master data model
What attributes are required to support key processes?
The metadata generated during the MDM Readiness Assessment is a key feed into this model, as are the business requirements defined by your Data Governance team.
The goal is to define an extensible model that supports your existing requirements and allows you to map all source system attributes to a “best fit” physical model. However, the model should not be more complex than required: the more data that is added, the more difficult the model becomes to manage.
Ideally, we define a model at a logical level to ensure key business requirements are supported. We can then map the actual source and target datasets to our logical model.
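To illustrate the logical-to-source mapping described above, the sketch below defines a small logical customer model and maps hypothetical source columns onto it. All attribute, column and system names are assumptions for illustration.

```python
# Illustrative sketch: a logical customer model and source-to-logical mappings.
# System names (crm, erp) and column names are assumptions.

LOGICAL_CUSTOMER = ["customer_id", "legal_name", "country_code", "vat_number"]

# Map each source system's physical columns onto the logical model.
SOURCE_MAPPINGS = {
    "crm": {"cust_no": "customer_id", "acct_name": "legal_name", "ctry": "country_code"},
    "erp": {"kunnr": "customer_id", "name1": "legal_name",
            "land1": "country_code", "stceg": "vat_number"},
}

def to_logical(system, record):
    """Translate a source record into the logical model; unmapped fields are dropped,
    missing logical attributes stay None."""
    mapping = SOURCE_MAPPINGS[system]
    out = {attr: None for attr in LOGICAL_CUSTOMER}
    for src_col, logical_attr in mapping.items():
        if src_col in record:
            out[logical_attr] = record[src_col]
    return out

crm_rec = to_logical("crm", {"cust_no": "C42", "acct_name": "Acme Ltd", "ctry": "ZA"})
```

Keeping the mapping as data (rather than hard-coded transforms) makes it easy to extend the model when a new source or attribute is onboarded.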
Generate and test the master data
Enterprise data quality capabilities must be used to standardise and consolidate various representations of each record into the single best “golden” record.
You need to be able to trust your match process: data stewards do not have time to manually validate every match.
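A heavily simplified sketch of standardise-match-merge is shown below: company names are normalised into a match key, and records merge under a naive first-non-empty survivorship rule. Real MDM match engines use far richer probabilistic or rule-based matching; this only illustrates the shape of the process.

```python
# Simplified standardise-match-merge sketch. The normalisation rules and
# survivorship policy are assumptions, not a production match engine.
import re

def norm(name):
    """Standardise a company name for matching: case, punctuation, legal suffixes."""
    n = re.sub(r"[^a-z0-9 ]", "", name.lower())
    for suffix in (" limited", " ltd", " inc"):
        n = n.removesuffix(suffix)
    return n.strip()

def merge(records):
    """Group records by match key; keep the first non-empty value per field."""
    golden = {}
    for rec in records:
        key = norm(rec["name"])
        best = golden.setdefault(key, {})
        for field, value in rec.items():
            if value and not best.get(field):  # naive survivorship rule
                best[field] = value
    return golden

rows = [
    {"name": "Acme Ltd", "country": ""},
    {"name": "ACME Limited", "country": "ZA"},
]
g = merge(rows)
```

The two spellings collapse into one golden record, with the country filled from whichever source had it.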
Implement Data Quality metrics
Is your master data fit for purpose? Master data that is of poor quality can quickly spread to consuming systems, degrading the overall quality of data within your business.
Data stewards, and other interested stakeholders, should monitor data quality dashboards to ensure that master data remains fit for purpose, and to allow corrective action if it reaches unacceptable levels.
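As an illustration of such a metric, the sketch below computes completeness for one field and flags it against a threshold, the kind of check a steward dashboard might run on a schedule. The 95% threshold, field name and records are assumptions.

```python
# Illustrative data quality metric with a threshold flag.
# The field name and the 95% threshold are assumptions.

def completeness(records, field):
    """Share of records with a non-empty value for the field."""
    filled = sum(1 for r in records if r.get(field))
    return filled / len(records)

def check(records, field, threshold=0.95):
    """Return the score and a status a dashboard could surface to stewards."""
    score = completeness(records, field)
    status = "OK" if score >= threshold else "ACTION REQUIRED"
    return score, status

masters = [{"vat_number": "ZA123"}, {"vat_number": ""}, {"vat_number": "ZA456"}]
score, status = check(masters, "vat_number")
```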
Modify the producing and consuming systems
Existing systems may need to be modified to use the new master data. At the very least, systems should not add new master records without first checking whether the record already exists.
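The check-before-create pattern can be sketched as follows; the hub class and its API here are hypothetical, standing in for whatever search service your MDM platform exposes.

```python
# Hypothetical sketch of check-before-create against a master data hub.
# The MasterHub class and match-key scheme are illustrative assumptions.

class MasterHub:
    def __init__(self):
        self._by_key = {}

    def find(self, match_key):
        return self._by_key.get(match_key)

    def create_or_get(self, match_key, record):
        """Return the existing master record if one matches, else register a new one."""
        existing = self.find(match_key)
        if existing is not None:
            return existing, False  # reused existing record, no duplicate created
        self._by_key[match_key] = record
        return record, True         # newly created

hub = MasterHub()
_, created1 = hub.create_or_get("acme|ZA", {"name": "Acme Ltd"})
_, created2 = hub.create_or_get("acme|ZA", {"name": "ACME"})
```

The second call reuses the existing record instead of minting a duplicate, which is exactly the behaviour producing systems must adopt.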
Master Data Management Success factors
Gartner indicates that only 33% of MDM programs succeed in demonstrating value to the organization. MDM programs without a data management strategy most likely just move data around, redundancy included. This is a prescription for failure. Consolidating critical data points across different application environments (CRMs, ERPs and other core applications) is very complex, because many applications share common data points that are not related through enforced data model layers.
Therefore, implementing an MDM program is not as easy as downloading a new piece of software. It’s a process that requires many steps. But before implementing an MDM program, it’s important to identify and address the following key success factors:
- End-User Buy-In: When starting the program, it’s important that scope and objectives are created with the end users’ buy-in. As a business-driven, IT-enabled program, MDM requires business stakeholders to take ownership of both the data and the outcomes. Most MDM outcomes are expected to improve the efficiency of business process management programs or redefine existing business processes. Expecting business teams to accept change without buy-in is one of the most common causes of MDM failure.
- Data Classification and Policies: Data may be classified in many ways based on both internal and external policies. In recent years, with far-reaching regulations such as BCBS 239, GDPR and PoPIA, the need to classify critical or personal data and to understand policy, usage, access and distribution rights has come to the fore, with increased fines and financial and reputational risks for non-compliance.
- Golden Source Rule: The ability to establish the authority of a data source using clear, verifiable business criteria that prove the data’s origin. This also includes data ownership that defines when and how data from the source can be overwritten or changed, and who can change it.
- Data Catalog/Inventory: Most MDM programs start with a small data set like customers or geography. As they incrementally expand, addressing multi-domain MDM becomes more difficult and complex without a proper data catalogue or inventory. A well-maintained catalogue lets business users identify and use the right sets of master data for decisions.
- Enterprise Data Quality: Can you trust your data? Data quality needs to address end-user challenges that are more subjective in nature such as data believability, data trustworthiness, data provenance, etc.
- Data Ownership: Identifying who owns the data has to occur before an MDM program is rolled out. Once the data custodian is identified, that individual then has to identify and bless all required rules and policies to ensure MDM success.
- Measurable outcomes: MDM programs usually propel business process changes within the user community. Before implementing your MDM program, it’s important to benchmark business efficiency and then measure incremental benefits achieved by the business team post-implementation.
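The Golden Source Rule above can be illustrated with a source-precedence survivorship sketch: each attribute is filled from the highest-authority source holding a non-empty value. The source names and their ordering are assumptions for illustration.

```python
# Illustrative Golden Source Rule: attributes are filled from sources in a
# declared order of authority. Source names and ordering are assumptions.

SOURCE_AUTHORITY = ["erp", "crm", "marketing"]  # most to least authoritative

def survive(candidates):
    """candidates: {source_name: record}. Pick each field from the
    highest-authority source that has a non-empty value."""
    fields = {f for rec in candidates.values() for f in rec}
    golden = {}
    for f in fields:
        for source in SOURCE_AUTHORITY:
            value = candidates.get(source, {}).get(f)
            if value:
                golden[f] = value
                break
    return golden

golden = survive({
    "crm": {"legal_name": "Acme Ltd", "phone": "+27 11 555 0100"},
    "erp": {"legal_name": "ACME (Pty) Ltd", "phone": ""},
})
```

The ERP wins the legal name because it outranks the CRM, but the phone falls through to the CRM because the ERP value is empty, so authority and completeness both shape the golden record.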
The success of the above requirements is contingent upon solid data management and data governance programs. Tools and technology can help address any of the areas above that are pain points rather than success points. Ideally, organizations want an easy-to-manage, single solution that can be implemented quickly in a cloud environment while complementing core MDM tools.
Post MDM Implementation Concerns
After a successful MDM implementation, the work isn’t over. Like other implementations, the work is only beginning. You can’t set it and forget it. Issues will arise, and below is a list of the most common in an MDM implementation. Learn from these experiences and make a concerted effort to ensure failure doesn’t haunt you at every turn.
- Perception of Data Trustworthiness: If users fail to recognize MDM as the golden source and instead turn to other sources for their data needs, they not only burden the system with bad data, they also make the MDM program investment redundant.
- Data Quality Issues: MDM initiatives and data quality are intimately linked. Missing data and non-standard data are two common data quality challenges in MDM programs. The root cause often lies in the identified authoritative source, which may carry data anomalies while other, complementary sources hold the information that could fill the gap. Most of the time, rigid MDM source hierarchy rules mean missed opportunities to augment the right information from non-authoritative sources, eventually leading to more work for stewards.
- Dynamic Data Metrics: As the MDM program grows to a multi-domain objective, providing metadata capabilities alone might not win user acceptance of the MDM data set. Adding data metrics such as volume or date range information may help users understand the volatility of the data. Other factors, like the frequency of the data’s occurrence at the golden source and the frequency of changes on master data points, can provide users with much-needed insight to guide better data usage.
- Leveraging Transactional Controls & Analysis: As noted earlier, a successful MDM program has to run in conjunction with solid data management and data governance programs. Automated controls provide capabilities such as multi-point reconciliation that can be deployed across multiple source systems. They then feed your MDM application and reconcile data differences by generating dynamic reconciliation reports that help both SMEs and MDM architects bring application data in line with the enterprise master data source. While not all controls are the same, it’s critical to find those that can be deployed in any environment and can access any source within the data flow process without interference. The controls are meant to monitor and capture data metrics that help dissect the process metrics and data quality metrics around the MDM data. Such information can greatly aid information architects in continually improving MDM value.
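A minimal sketch of one such control, a two-point key reconciliation between a source system and the MDM hub, might look like this; real controls reconcile many systems and many attributes, not just keys.

```python
# Minimal two-point reconciliation control sketch: compare the key sets held
# by a source system and the MDM hub. Key values are illustrative assumptions.

def reconcile(source_keys, mdm_keys):
    """Return a reconciliation report of keys missing on either side."""
    source, mdm = set(source_keys), set(mdm_keys)
    return {
        "matched": len(source & mdm),
        "missing_in_mdm": sorted(source - mdm),   # never loaded to the hub?
        "orphaned_in_mdm": sorted(mdm - source),  # retired at the source?
    }

report = reconcile(["C1", "C2", "C3"], ["C1", "C3", "C4"])
```

Reports like this give stewards and MDM architects a concrete worklist for bringing application data back in line with the enterprise master data source.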