    Four Best Practices for Accelerating Time-to-Value of Master Data Management

    Master data management (MDM) has become recognized as a key way for businesses to gain consistent, valuable insight into data that is typically distributed across applications and systems. While the anticipated benefits of MDM are usually clear, a number of factors must be considered to implement an effective MDM program and ensure actual success and return on investment. Talend has identified four best practices for accelerating successful returns on an MDM initiative.

    Four Best Practices for Accelerating Time-to-Value of Master Data Management - slide 1

    Click through for four best practices for accelerating successful returns on an MDM initiative, as identified by Talend.

    Four Best Practices for Accelerating Time-to-Value of Master Data Management - slide 2

    Finding just the right starting point for MDM is critical to getting it off the ground. You want to get executive buy-in for your project, and you want to start producing value before that buy-in wears off, or executive sponsors move on. Start by picking a clearly delineated area that is causing a current business pain, and make sure the scope is reasonable and achievable within weeks – months at most.

    Once the MDM project successfully delivers the anticipated returns, its expansion will become obvious. With this approach, however, you should always keep in mind the “big picture” and ensure that you are not building a disposable project, but a foundational one that will allow you to add to it over time.

    Four Best Practices for Accelerating Time-to-Value of Master Data Management - slide 3

    MDM projects also need to account for the wealth of “new data” that is now readily available and has become part of the organization’s extended data assets: new types of information coming from inside or outside the firewall, from unexpected data sources or from sources that were simply not accessible before. These new types of information must be full constituents of an MDM infrastructure – either managed in the MDM hub itself, or linked from the MDM hub in a federated approach. They typically include social data and public/open data, as well as “dark data” – data hidden in log files, manufacturing equipment and various other systems.

    Four Best Practices for Accelerating Time-to-Value of Master Data Management - slide 4

    Incorporate Big Data into your MDM strategy – and vice versa. Before the end of 2013, Big Data will be driving an essential part of the requirements for MDM programs as incorporating new types of data becomes a strong requirement. This trend, initially driven by customer-centric organizations or divisions, is already expanding to other domains such as manufacturing or logistics. Big Data augments conventional MDM sources to provide a complete view of the required domain.

    Adding Big Data to MDM does not mean that the master data hub will be stored in Hadoop (although some organizations are exploring the use of NoSQL databases), nor does it mean that its size will grow exponentially in a short timeframe. Rather, it means that some of the Big Data (or new data) will be managed in the MDM hub itself, linked from the MDM hub in a federated approach, or will simply benefit from the consistency, resolution and enrichment services that MDM provides.
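    The federated approach described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the class and field names are invented for this sketch, not part of any real MDM product): the hub keeps a small, governed master record and stores lightweight links to high-volume new data, rather than copying that data into the hub itself.

    ```python
    from dataclasses import dataclass, field

    # Hypothetical sketch of a federated master record. The hub stores
    # governed attributes plus references to big-data sources; the bulky
    # data itself (social feeds, logs, open data) stays in its source system.

    @dataclass
    class FederatedLink:
        source: str   # e.g. "social", "open-data", "dark-data"
        locator: str  # a URI or key understood by the source system

    @dataclass
    class MasterRecord:
        master_id: str
        attributes: dict                               # governed master attributes
        links: list = field(default_factory=list)      # federated references only

        def add_link(self, source: str, locator: str) -> None:
            """Register new data in a federated way instead of storing it."""
            self.links.append(FederatedLink(source, locator))

    record = MasterRecord("CUST-001", {"name": "Acme Corp", "country": "FR"})
    record.add_link("social", "twitter://acmecorp")
    record.add_link("dark-data", "logs://plant-7/sensor-stream")
    print(len(record.links))  # the hub holds 2 lightweight references
    ```

    The point of the sketch is that the hub's size does not grow with the linked data: it manages identity and consistency, while the linked sources remain where they are.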

    Four Best Practices for Accelerating Time-to-Value of Master Data Management - slide 5

    An MDM project is not simply about building, governing and maintaining the master data hub. An important dimension of MDM is how it participates in the larger application architecture. Rather than letting MDM become part of the “accidental architecture,” design it as a full constituent of IT. And rather than restricting data flows to direct synchronization with participating applications, incorporate the MDM system into the service-oriented architecture: at a minimum, have it publish master data to other applications and systems through master data services, and make its functions (such as deduplication or enrichment) available as services to other applications.
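    To make the "functions as services" idea concrete, here is a minimal, assumed sketch of two such functions. The matching rule (normalize email, keep the first record per key) and all names are stand-ins invented for this example; a real MDM hub would apply its own governed matching and survivorship rules behind a service interface.

    ```python
    # Hypothetical sketch: deduplication and enrichment logic that an MDM hub
    # could expose as services, so other applications call the hub instead of
    # re-implementing matching rules locally.

    def deduplicate(records):
        """Collapse records sharing a normalized email -- a stand-in for the
        hub's real matching rules."""
        seen = {}
        for rec in records:
            key = rec["email"].strip().lower()
            seen.setdefault(key, rec)  # keep the first record per match key
        return list(seen.values())

    def enrich(record, reference_data):
        """Fill missing attributes from governed reference data in the hub."""
        enriched = dict(record)
        for k, v in reference_data.get(record["email"].strip().lower(), {}).items():
            enriched.setdefault(k, v)  # only fill attributes that are absent
        return enriched

    incoming = [
        {"email": "Jane@Example.com", "name": "Jane"},
        {"email": "jane@example.com", "name": "J. Doe"},
    ]
    unique = deduplicate(incoming)
    print(len(unique))  # 2 duplicate records collapse to 1 master candidate

    reference = {"jane@example.com": {"segment": "enterprise"}}
    print(enrich(unique[0], reference)["segment"])  # prints "enterprise"
    ```

    Wrapping functions like these behind a service endpoint is what lets every consuming application see the same deduplicated, enriched view of the master data.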
