

November 12, 2007

FIMA Europe (part II)

Last time I spoke about the first of two hot topics at FIMA London, Data Governance. Today I want to hit on the second area I repeatedly heard about: integrating downstream applications with the security master.

This came up constantly. Many companies have either built their master or purchased it. (A note for those of you who are not in the finance world: you can buy a pre-populated MDM system where the data sources are already pre-mapped to each other. This is possible because there are 100 or so companies in the industry that provide data feeds to financial firms, so the financial MDM vendors sell not only the MDM software but also the model with those data feeds pre-integrated.) Regardless, when it comes to integrating with the hundreds of downstream legacy applications that currently get their data directly from outside data vendors, these companies find that the cost of integration is prohibitive.

This is clearly a case of “If you build it, they will [not] come” (for you “Field of Dreams” and Kevin Costner fans!). The problem is multifold:
a. Integrating with the downstream applications is an expensive, manual process.
b. The IT group that built the master often publishes an interface and tells the downstream groups to integrate with it. The problem is that the downstream application groups have neither the skills nor the budget to do the integration, so it doesn’t happen.
c. The IT groups don’t want to map their data to the downstream applications themselves because they don’t know the data structures of the downstream applications, and SMEs for the downstream apps aren’t available to help. Very Catch-22.

The result is that millions of dollars are spent building reference masters that end up being underutilized.

My talk at FIMA was on this very topic. I went through a real example of a talented data analyst who had to map 6 columns of data to 3 different data sources; what was expected to take 3 weeks ended up taking 7 months. As it turned out, the 6 columns were overloaded columns that contained different codes depending on which of the 50 U.S. states a customer was located in. In effect, something like 300 logical columns had been shoved into those 6 physical columns, and sorting out the mess was very time consuming. This is just one example of why no one wants to step up to doing this work. The second half of my presentation covered a new approach that automates the discovery of the relationships between the master and the downstream application data and speeds up this process.
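To make the overloaded-column problem concrete, here is a minimal sketch (in Python, with invented states, codes, and column contents) of the kind of value-pattern profiling that exposes it. It is only an illustration of the idea, not the actual approach from the presentation:

```python
# A minimal, hypothetical sketch of profiling an "overloaded" column:
# one physical column whose meaning changes with the customer's state,
# so it actually hides many logical columns. All data here is made up.
from collections import defaultdict

# Toy rows: (state, code) pairs drawn from one overloaded column.
rows = [
    ("CA", "R-17"), ("CA", "R-09"),  # California: R-nn style codes
    ("NY", "07A"),  ("NY", "12B"),   # New York: nnX style codes
    ("TX", "X9"),   ("TX", "X4"),    # Texas: Xn style codes
]

def shape(value: str) -> str:
    """Reduce a value to its shape: letters -> 'A', digits -> '9'."""
    return "".join("A" if c.isalpha() else "9" if c.isdigit() else c
                   for c in value)

# Bucket the distinct value shapes observed under each state.
shapes_by_state = defaultdict(set)
for state, code in rows:
    shapes_by_state[state].add(shape(code))

for state, shapes in sorted(shapes_by_state.items()):
    print(state, sorted(shapes))
# CA ['A-99']
# NY ['99A']
# TX ['A9']
# Three different shapes in one column means three logical columns, not
# one; scale that to 50 states across 6 columns and you approach the
# "300 columns" the analyst had to untangle by hand.
```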

My thinking on this is that if financial institutions want to resolve this issue, their master data management IT groups will have to step up to the plate (“Field of Dreams” pun intended) and develop a competency in doing this last mile of integration. It just doesn’t make sense to make the downstream app groups do the work: each would only do the work once, for its own application, and it doesn’t make sense to staff up for that. The central group, however, can build this competency, and since there is now technology to help, there is really no longer an excuse for not providing full service to their users and developing repeatable processes to solve this problem.


Posted by Todd Goldman at 12:30 PM

November 9, 2007

London FIMA Thoughts

I attended the FIMA conference in London this week. It is a conference for financial services firms, focused on master data management, mainly securities masters.

The two topics that came up over and over again during the two-day marathon sessions were Data Governance and integration of the master with downstream systems. I will write about the data governance issue today and leave the downstream integration for next time (my attempt at a cliffhanger, as I am sure you are all just waiting with excitement to read the next installment ;) ).

Data Governance: This has been a hot topic in the States for the past year since the start of the Data Governance conference, but clearly people are realizing that you can’t just have the data management piece of a reference master without also having business people who provide governance oversight to handle the exceptions.

There was lots of discussion about where governance should sit. Is it an IT function or a business function? Outside of the financial services world, I think there is little debate about this: governance and the data stewards sit on the business side, as they have to make business decisions about the data. At FIMA, however, there didn’t seem to be as much consensus on the topic. Perhaps it is because in financial institutions there is often a lot of overlap, with people moving back and forth between business and IT. In general, there is a lot more IT savvy on the business side in financial institutions, and the IT folks tend to take a strong interest and have a lot of knowledge.

One consultant I spoke to commented that perhaps no one wants to step up to take “ownership” of the data. This is even more likely. “Ownership” would imply responsibility, and with Basel, MiFID, Sarbox, etc. hanging over all these firms, no one at the lower levels of the organization wants to be responsible for the data, because then they might implicitly be accountable in a legal sense.

As for the next topic, integrating with downstream systems, you will have to wait until tomorrow. Time for me to board the plane back to sunny California.


Posted by Todd Goldman at 9:30 AM