

November 12, 2007

FIMA Europe (part II)

Last time I spoke about the first of two hot topics at FIMA London: Data Governance. Today I want to hit on the second area I repeatedly heard about, which was integrating downstream applications with the security master.

This came up constantly. Many companies have either built their master or purchased it. (A note for those of you who are not in the finance world: you can buy a pre-populated MDM system where the data sources are already pre-mapped to each other. This is possible because there are 100 or so companies in the industry that provide data feeds to financial firms, and the financial MDM vendors sell not only the MDM software but also the model with those data feeds pre-integrated.) Regardless of which path they took, when it comes to integrating with the hundreds of downstream legacy applications that currently get their data directly from outside data vendors, they find that the cost of this integration is prohibitive.
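
To make the “pre-mapped” idea concrete, here is a toy sketch in Python of what a pre-integrated feed-to-master crosswalk boils down to. The vendor names, feed field names, and master schema below are entirely made up for illustration; they are not any real vendor's model.

    # Hypothetical illustration only: a pre-built crosswalk from two
    # imaginary vendor feeds into one common security-master schema.
    FEED_TO_MASTER = {
        "vendor_a": {          # imaginary feed A field names
            "isin_cd": "isin",
            "sec_name": "security_name",
            "mat_dt": "maturity_date",
        },
        "vendor_b": {          # imaginary feed B field names
            "ISIN": "isin",
            "LongName": "security_name",
            "Maturity": "maturity_date",
        },
    }

    def to_master(feed: str, record: dict) -> dict:
        """Rename a raw feed record's fields into the master schema."""
        mapping = FEED_TO_MASTER[feed]
        return {master: record.get(src) for src, master in mapping.items()}

    # e.g. to_master("vendor_b", {"ISIN": "US0378331005",
    #                             "LongName": "Apple Inc",
    #                             "Maturity": None})

The value in the commercial offerings is that mappings like these already exist for the common industry feeds, so the firm doesn't have to build and maintain the crosswalk itself.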

This last mile of integration is clearly a case of “If you build it, they will [not] come” (for the “Field of Dreams” and Kevin Costner fans!). The problem is multifold:
a. Integrating with the downstream applications is an expensive, manual process.
b. The IT group that built the master often publishes an interface and tells the downstream groups to integrate with it. The problem is that the downstream application groups don’t have the skills or the budget to do the integration, so it doesn’t happen.
c. The IT groups don’t want to map their data to the downstream applications themselves because they don’t know the data structures of the downstream applications, and SMEs for the downstream apps aren’t available to help. A classic Catch-22.

The result is that millions of dollars are spent building these reference masters, and they end up underutilized.

My talk at FIMA was on this very topic. I went through a real example of a talented data analyst who had to map 6 columns of data to 3 different data sources; what was expected to take 3 weeks ended up taking 7 months. As it turned out, the 6 columns were overloaded columns that contained different codes depending on which of the 50 U.S. states a customer was located in. In effect, something like 300 columns had been shoved into those 6, and sorting out the mess was very time consuming. This is just one example of why no one wants to step up to doing this work. The second half of my presentation covered a new approach that automates the discovery of the relationships between the master and the downstream application data and speeds up the process.
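
To give a flavor of what that automated discovery can look like, here is a simplified sketch of the general idea: value-overlap profiling between master columns and a downstream column, with an extra per-state pass to expose overloaded columns. This is my own illustrative sketch of the technique, not the actual product logic, and every name in it is invented.

    from collections import defaultdict

    def overlap_score(master_vals, downstream_vals):
        """Fraction of distinct downstream values that also appear in the master column."""
        m, d = set(master_vals), set(downstream_vals)
        return len(d & m) / len(d) if d else 0.0

    def suggest_mappings(master_cols, downstream_col, by_state=None, threshold=0.8):
        """Suggest master columns whose values overlap heavily with a downstream column.

        master_cols:    {column_name: [values]} profiled from the master.
        downstream_col: [values] profiled from one downstream column.
        by_state:       optional {state_code: [values]} split of that same column;
                        overloaded columns often only match when profiled per state.
        """
        candidates = {}
        for name, vals in master_cols.items():
            score = overlap_score(vals, downstream_col)
            if score >= threshold:
                candidates[name] = score
        per_state = defaultdict(dict)
        if by_state:  # re-profile per state to catch overloaded columns
            for state, vals in by_state.items():
                for name, mvals in master_cols.items():
                    score = overlap_score(mvals, vals)
                    if score >= threshold:
                        per_state[state][name] = score
        return candidates, dict(per_state)

    # e.g. suggest_mappings(
    #     {"security_type": ["EQ", "BD", "OP"]},
    #     ["EQ", "BD", "XX"],
    #     by_state={"NY": ["EQ", "BD"], "TX": ["XX"]})
    # -> no match at the table level, but a clean match for NY once split by state.

The point of the per-state pass is exactly the mess described above: a column that looks like noise at the table level can resolve into clean matches once you profile it within each state's subset.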

My thinking on this is that if financial institutions want to resolve this issue, their master data management IT groups will have to step up to the plate (“Field of Dreams” pun intended) and develop a competency in this last mile of integration. It just doesn’t make sense to make the downstream app groups do the work: each of them would only do it once, for their own application, and it doesn’t make sense to staff up for that. The central group, however, can build that competency, and since there is now technology to help, there is really no longer an excuse for not providing full service to their users and developing repeatable processes to solve this problem.

Posted by Todd Goldman at November 12, 2007 12:30 PM
