Master data management - also known as MDM - doesn't sound very exciting. At a time when knowledge doubles every five years or so, is it still necessary to deal with this supposedly "trivial" topic?
Business practice shows that correct data is still a complex issue: marketing campaigns fail to deliver the desired results because email addresses are not properly maintained, contracts are returned as undeliverable, and duplicates cause unnecessarily high costs. System migrations often fail because, due to a lack of flexibility in the data model, fields in the old system were misused, which makes it difficult to map the data correctly. [1] The following formula applies:
Overall quality (GQ) = System quality (SQ) x User quality (AQ)
It illustrates that, although master data management can be supported by high-quality software, user quality - that is, awareness of the importance of correct master data and of the serious consequences of incorrect master data - is just as crucial for overall quality. Only when both SQ and AQ come as close as possible to 1 can a high overall quality be achieved (target: GQ = 1).
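The multiplicative relationship can be illustrated with a short sketch. The function and the example scores below are hypothetical and only serve to show that a weak factor pulls the product down, regardless of how good the other factor is:

```python
# Illustration of the multiplicative quality formula GQ = SQ x AQ.
# Both factors are assumed to be scores in the range [0, 1].

def overall_quality(system_quality: float, user_quality: float) -> float:
    """Overall quality (GQ) as the product of system quality (SQ)
    and user quality (AQ)."""
    if not (0.0 <= system_quality <= 1.0 and 0.0 <= user_quality <= 1.0):
        raise ValueError("quality scores must lie in [0, 1]")
    return system_quality * user_quality

# Even a very good system cannot compensate for careless users:
print(round(overall_quality(0.95, 0.60), 2))  # 0.57
# Only when both factors approach 1 does overall quality approach 1:
print(round(overall_quality(0.95, 0.95), 2))  # 0.9
```

Because the factors are multiplied rather than added, overall quality is always at most as good as the weaker of the two.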
Information is only useful if it is up to date, consistent and uniformly presented. Ensuring data quality is therefore an organizational challenge that must be mastered at the point of origin and use of the master data in the specialist departments.
The costs of good data quality include the costs of error detection and prevention. In addition to the hardware and software costs of supporting data quality management, this also includes the personnel costs of defining data requirements. Set against these are the costs of poor data quality, i.e. the costs caused by data quality problems. These include, among other things:
A functional architecture that addresses the levels of strategy, organization and systems is ideal for professional master data management.
The management consultancy Roland Berger cites the following aspects as key success factors:
Strategy - management buy-in: master data management and the associated data harmonization are long-term programmes that often require procedural and organizational changes. Strong management buy-in and support are a fundamental prerequisite for long-term and sustainable success.
Organization - Master data management is not just an IT project: the project is a cross-cutting issue within the company. The activities, processes, functionality and structures of master data management must therefore be coordinated across the various business areas. As master data is always managed by the business and not by IT, master data management is primarily a task for the specialist departments and all users accessing their systems at the point of sale (POS). If necessary, a separate management system and a specific process and organizational structure must be set up for this purpose.
Ongoing progress and quality checks are also essential: In order to measure overall progress and sustainably improve data quality, key performance indicator systems and controls should be established. Practice shows that simply setting rules is often not enough. The desired effects can only be achieved with the use of control mechanisms. Simple metrics can be developed quickly, such as the fill rate for optional fields. The general rule is: turn those affected into participants. If ideas from the specialist departments are taken into account, this in turn contributes to increased compliance with the input rules.
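The fill-rate metric mentioned above can be computed in a few lines. The record layout and field names below are illustrative assumptions, not part of any particular system:

```python
# Minimal sketch of a simple data-quality metric: the fill rate of an
# optional field across a set of records.

def fill_rate(records: list[dict], field: str) -> float:
    """Share of records in which `field` is present and non-empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if str(r.get(field) or "").strip())
    return filled / len(records)

# Hypothetical customer records; two of four have a usable email address.
customers = [
    {"name": "Miller", "email": "miller@example.com"},
    {"name": "Schmidt", "email": ""},
    {"name": "Weber"},  # optional field missing entirely
    {"name": "Fischer", "email": "fischer@example.com"},
]

print(f"email fill rate: {fill_rate(customers, 'email'):.0%}")  # email fill rate: 50%
```

Tracked over time, even a metric this simple makes the effect of input rules and control mechanisms visible.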
When surveyed, companies realistically rate the effort required to maintain data quality as rather high.
It is important to set up precise processes for recording data changes (e.g. customer relocation) and to provide employees with simple options for checking, correcting and adding to the databases for which they are responsible. This is the only way to ensure that the efforts invested in data quality using resources and budget are not in vain.
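One way to make such a change process traceable is to record every change together with who made it and when. The following sketch is a hedged illustration under assumed names; a real implementation would live in the MDM system itself:

```python
# Sketch of recording a master-data change (e.g. a customer relocation)
# with a simple audit trail, so that corrections remain traceable.
# All class, field and user names here are illustrative assumptions.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CustomerRecord:
    customer_id: str
    address: str
    history: list = field(default_factory=list)  # audit trail of changes

    def update_address(self, new_address: str, changed_by: str) -> None:
        """Record who changed what and when, then apply the change."""
        self.history.append({
            "field": "address",
            "old": self.address,
            "new": new_address,
            "by": changed_by,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        self.address = new_address

# A customer relocates; the responsible employee records the change.
record = CustomerRecord("C-1001", "Old Street 1, Hamburg")
record.update_address("New Road 7, Munich", changed_by="jdoe")
print(record.address)       # New Road 7, Munich
print(len(record.history))  # 1
```

The audit trail is what makes later checking and correcting possible: an employee can see the previous value and who entered the current one before deciding whether a correction is needed.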
The following motives can be cited for successful master data management: [5]
For many years, our goal has been to support our customers effectively in the complex process of procuring and preparing the data required to process credit applications - from the offer to the customer, through the application with fully automated credit decisioning and processing, to contract management.
This is made possible both by helpful functionalities in our Credit Management Solution (afb-CMS) and by our comprehensive data services.
Selected functionalities for efficient data management workflows:
1 The complexity of this topic increases further if master data management is considered not only company-wide but also across company boundaries.
2 Rolf Scheuch: Ensuring data quality - master data management needs order (original: "Datenqualität sichern - Stammdaten-Management braucht Ordnung"); www.computerwoche.de/a/stammdaten-management-braucht-ordnung,2516260
3 Christian Fürber: Measuring data quality costs (original: "Messung von Datenqualitätskosten"); www.iqinstitute-gmbh.de/blog/2012/12/06/messung-von-datenqualitatskosten/
4 Andreas Dietze / Thomas Fischer: Success factors for master data management (original: "Erfolgsfaktoren fürs Stammdaten-Management"); www.rolandberger.de/medien/news/2013-10-10-rbsc-news-Erfolgsfaktoren_fuers_Stammdaten_Management.html
5 Rolf Scheuch: Ensuring data quality - master data management needs order; www.computerwoche.de/a/stammdaten-management-braucht-ordnung,2516260