
October 16, 2008

Profiting from Your Most Important Business Asset

Tom Redman, a.k.a. the "Data Doc", believes that information is an organisation's most valuable asset, but almost all companies grossly underuse their data assets. From his work with hundreds of companies across many different industries, Tom's diagnosis is that the cost of poor data quality to a business is typically 20% of its revenues. Couldn't your business benefit from a revenue uplift of one fifth right now?


With poor quality data costing organisations so much, it ought to be easy to build a business case for doing something about it, but projecting (and measuring) the return on investment (ROI) is something that many people struggle with. In my experience, data quality programmes nearly always realise sufficient tangible and quantifiable benefits to make their sponsorship a no-brainer.


My advice is to build a business case around the concrete benefits you can measure and demonstrate to your management. I've seen, for example, many customer data quality projects justified on the savings made by eradicating the printing and posting costs of sending mail to duplicated customers or undeliverable addresses. Sure, the improved customer service that results is also a benefit, but how do you measure its impact on the bottom line, especially when other initiatives are delivering improvements in the same area?


Building a business case with a clear ROI and continuing to measure the value of your data quality programme is critical. There's nothing more certain to grab and maintain the interest of your executives. If it were ever acceptable to invest in data quality without achieving a measurable return, those days are surely now over.


Tom made this point in a recent webinar hosted by the IAIDQ; he went as far as saying that you should abandon a data quality initiative if you can't demonstrate a return on investment. "Hear, hear," say I. Tom's new book, Data Driven: Profiting from Your Most Important Business Asset, promises to provide insight into new strategies for profiting from quality data (I'm expecting my copy any day), but I'm also keen to hear your comments here.


Posted by Steve Tuck at 4:30 AM

October 13, 2008

Propping up the House of Cards

Dominique Strauss-Kahn, head of the International Monetary Fund (IMF), says that the world financial system is on "the brink of systemic meltdown"; Vince Cable, the Liberal Democrat spokesman, describes the situation as a "bank tsunami"; the City is talking about a potential banking "Armageddon". The unprecedented global financial crisis has left us all reeling; where were the regulators when we needed them?

Ken Clarke, former Chancellor of the Exchequer, put it quite succinctly on last week's edition of the BBC's Question Time: "the regulators were useless and the new regulation system didn't work," he said. After years of preparing and implementing Basel II, a regulatory regime that was supposed to ensure the capital adequacy of financial institutions and reduce risk, we've seen bank failure after bank failure as the true state of their liquidity is revealed.

Vince Cable talks about the need for a new regulatory deal with the financial community, but warns that this "should not be done when the public mood is understandably for hanging, drawing and quartering anyone connected with banking... The priority now is disaster management." On that front, it's good to see a broadly united front from governments as they pump money (our money) into the financial system in an attempt to restore confidence and stabilise things. There's more finger-crossing and touching wood going on than most of us would like, I'm sure; hope is a key part of the strategy as they try to prop up this house of cards.

Banks have traded in increasingly complex financial instruments without a clear understanding of the market and credit risk. They owe it to their shareholders and the public at large (who may, in any case, become significant shareholders whether they like it or not) to take new measures to scientifically assess and mitigate risk.

The data that financial institutions hold should be put under the microscope for forensic analysis. How many banks, I wonder, rely on incomplete, inconsistent or out-of-date information for their risk assessments? Consider a couple of examples from the retail banking world. A 95% interest-only mortgage taken out a year ago will have turned into a 110% mortgage today. Self-certified or historic income details may have been sufficient to lend money in a time of rapidly rising house prices, but it's the customer's current income that matters. Here are three data-centric suggestions to help financial institutions identify their current risk exposure:

  1. Perform a regular data audit of all key customer information, including calculated fields, to identify errors and anomalies that indicate credit or market risk (a minimal sketch of this kind of audit follows the list).
  2. Ensure that KYC (Know Your Customer) checks are rigorously applied and customers are regularly screened against enhanced due-diligence lists to reduce operational and reputational risk.
  3. If you don't have a single view of your customers, GET ONE NOW. Understanding the complete relationship you have with your customers will allow you to measure your risk exposure in relation to individual entities and enable your marketing department to reduce attrition by targeting customers at risk.
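
To make the first suggestion concrete, here is a minimal sketch of the kind of rule-based audit I have in mind: it recalculates loan-to-value (LTV) from current figures and flags accounts that have drifted into negative equity. The field names and the flat house-price-index adjustment are illustrative assumptions, not a prescription for any particular institution:

    # Illustrative data audit rule: recompute loan-to-value (LTV) and flag
    # accounts whose risk profile has drifted. Field names and the flat
    # house-price-index adjustment are hypothetical.
    HOUSE_PRICE_INDEX = 0.86  # assume prices have fallen 14% since valuation

    def audit_ltv(accounts, ltv_threshold=1.0):
        """Yield accounts whose recalculated LTV breaches the threshold."""
        for acct in accounts:
            current_value = acct["valuation_at_origination"] * HOUSE_PRICE_INDEX
            ltv = acct["outstanding_balance"] / current_value
            if ltv > ltv_threshold:
                yield acct["account_id"], ltv

    accounts = [
        # A 95% interest-only mortgage on a £200,000 property: the £190,000
        # balance is unchanged, but a 14% price fall takes the LTV to ~110%.
        {"account_id": "M1001", "outstanding_balance": 190_000,
         "valuation_at_origination": 200_000},
        {"account_id": "M1002", "outstanding_balance": 120_000,
         "valuation_at_origination": 240_000},
    ]

    for account_id, ltv in audit_ltv(accounts):
        print(f"{account_id}: recalculated LTV {ltv:.0%} exceeds threshold")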

The third suggestion applies to all institutions, but is particularly salient for those involved in a stressed merger. I'm not talking here about migrating legacy systems or implementing a grandiose Customer Data Integration strategy - those things can happen in due course, but now is not the time to be contemplating your navel over an IT project that might take 2-5 years to complete. I'm talking about cutting through the political wrangling and technology bigotry and delivering the information that the business needs to survive today. If that's not clear, call me now!


Posted by Steve Tuck at 3:15 PM

September 22, 2008

Data quality carrots

When I want to teach my dog to do something, I generally find it helps to offer her something in return. A small piece of cheese or other tasty morsel generally does the trick. It doesn't have to be anything big or expensive and, after a short while, when she's learnt what it is I want, she'll respond without the need for anything more than a "good girl" as a thank you.

I'd say the same is pretty much true for my kids (although they respond better to cash than cheese and can generally understand more complex requests). So why is it that some data governance regimes think that everything will be alright if they issue an edict and back it up only with strong-arm tactics - "do it this way, or else"?

If you want to encourage the right behaviour from your front-line staff who collect and enter information that other knowledge workers consume, why not start by offering them some incentive to do it? If you only measure their performance by crude measures, such as call volumes or numbers of records entered, you cannot expect them to worry too much about the quality of the data they're actually typing in.

By measuring the quality of the information they're entering, and rewarding them for doing it right, you'll increase the value of that information, remove costly scrap and re-work, and improve the output of the downstream processes that use the data. As with my dog, the reward doesn't have to be big or expensive and, after a short while, you'll find that the good behaviour becomes second nature, which can be positively reinforced by regular monitoring and a polite "thank you." There's a place for the stick, but it's better to lead with the carrot.
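
Measurement is the prerequisite for any such reward. As a minimal sketch of what I mean, the snippet below scores each agent's records against a handful of simple completeness and validity rules; the rules and field names are illustrative assumptions, not a prescribed standard:

    import re
    from collections import defaultdict

    # Illustrative quality rules: each returns True if the record passes.
    # Field names and rules are hypothetical examples.
    RULES = [
        ("name present", lambda r: bool(r.get("name", "").strip())),
        ("postcode format", lambda r: bool(
            re.fullmatch(r"[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}",
                         r.get("postcode", "")))),
        ("phone present", lambda r: bool(r.get("phone", "").strip())),
    ]

    def agent_scores(records):
        """Return each agent's pass rate across all rules and records."""
        passed, total = defaultdict(int), defaultdict(int)
        for rec in records:
            for _label, rule in RULES:
                total[rec["agent"]] += 1
                passed[rec["agent"]] += rule(rec)
        return {agent: passed[agent] / total[agent] for agent in total}

    records = [
        {"agent": "alice", "name": "J Smith", "postcode": "SW1A 1AA",
         "phone": "020 7946 0018"},
        {"agent": "bob", "name": "", "postcode": "unknown", "phone": ""},
    ]
    print(agent_scores(records))  # {'alice': 1.0, 'bob': 0.0}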

Please note, the author does not recommend the offering of either carrots or cheese as a reward for good data quality.


Posted by Steve Tuck at 9:18 AM

September 19, 2008

Getting their wires crossed

What on earth possessed me to transfer my mobile, office and broadband lines in the same week? How could I be so naive as to think that it would all go smoothly?

After a lot of frustrating calls to premium rate numbers, working through countless automated menus and listening to a lot of dreadful "on-hold" muzak, my office line did get successfully transferred, but the mobile and the broadband were not so lucky. Both look set for a delay of at least a week.

What has frustrated me most in my dealings with the FOUR telephone companies involved is the failure of anyone to take any ownership or responsibility for resolving THEIR issues. Instead I have been passed from pillar to post in my quest to sort things out for them.

It seems that everyone I speak to has a very limited remit and is incapable of talking to their colleagues or even transferring me to the next department. Why do that when you can bump the customer and give them another premium rate number to call, which places them in yet another queue to endure?

I now know more about the internal workings of the ordering process at these companies than most of their staff do, and certainly more than I care to. Each of them has some form of single customer view in place and references me by my telephone and account numbers, but none of them appears to be truly joined up, and the customer service staff have access to information that is incomplete at best and frequently incorrect. They may have the same identifier on the records, but their systems and processes are certainly not working in a joined-up way, leaving the customer to assemble their own single view of the enterprise when things go wrong. As far as I can see, the only saving grace for these organisations is that they are all as bad as each other. Perhaps they should be called miscommunication companies...


Posted by Steve Tuck at 10:15 AM

December 10, 2007

Trillium Software's Identity Crisis

I thought someone was trying to wind me up at first, but it's true - Trillium Software is currently paying for an advertisement on Google that uses one word only - Datanomic! Why would such a well-established data quality software vendor make such prominent use of a competitor's name? And why have they singled out Datanomic for such special attention? I'll let you make up your own mind about that.

Feel free to click on Trillium's link - it takes you to the registration page for a White Paper, but if you want the real Datanomic, simply go to www.datanomic.com. And Kevin, well spotted but no, this doesn't mean that Datanomic has been acquired by Trillium Software! LOL


Posted by Steve Tuck at 12:30 PM

August 15, 2007

The Dos and Don'ts of CRM Data Migration

According to a Gartner Research report, Eight Steps to Implementing a Successful CRM Project (October 2006), CRM project failures continue to run at nearly 50%. What the report fails to identify is that a significant number of these failures are due to inadequate data resulting from deficient data migration practices. Poor data quality leads to poor user adoption and poor CRM performance; getting the data migration right is critical to success.

In my experience of dozens of CRM migration projects, there are two key mistakes that are commonly made:

  • People Assume that Migrating Data will be Easy: Companies invest large sums of money, time and effort in buying and customising a new CRM system, but all too often they overlook the challenge of populating it with fit-for-purpose data, leaving the migration to the last minute and then just dropping data from the legacy system straight into the new one. Think of it like buying a new car and selecting the colour and a range of luxury options, but then fitting it with the engine and tyres from your old vehicle and wondering why it performs so badly.
  • People Leave the Data Migration to the Technical Team Alone: Whilst a data migration undoubtedly needs skilled technical staff to understand the physical data models and to extract and load data in an optimised way, the critical area of data transformation requires business knowledge. Key decisions about how information should be translated from one system to another, or how duplicated records should be matched and resolved, should be made by a business user and ratified by a governance board.

All too often, the implementation of a new CRM system is seen primarily as a technical challenge. The truth is that, to be successful, a CRM programme requires collaboration between the business and technical teams. The data migration element is not just about plumbing the two systems together; it's about understanding the contents of the data and its business context, and constructing a process that delivers data that is optimised to perform in the new system. This requires time and effort; here are eight steps of my own for ensuring a successful CRM data migration:

  • Establish a governance board that sits above both the CRM implementation and the CRM data migration projects. This should consist of representatives from the business and IT.
  • Start the CRM Data Migration project at the same time as work on the tailoring and implementation of the CRM system itself begins. The migration project will often identify shortcomings in the system design - it's best to identify these as early as possible so that they can be properly addressed rather than worked around.
  • Identify all of the source systems. Unless you are doing a like-for-like system replacement, it is likely that the new system will need to be populated with information extracted from a number of legacy applications.
  • Decide on an approach to migrating historical information; will all, some or none of it be migrated? This is most relevant when replacing a CRM system. Including historical data may dramatically increase the volume of data to be migrated (and consequently the time it takes), but not migrating it means you may need to maintain the old system for archive purposes.
  • Assess up-front whether a one-off data migration is required, or if the project actually requires the ongoing integration of different systems. Many so-called migrations actually require continuing feeds of data to be in place.
  • Decide whether a big bang approach is practical (is there sufficient downtime to execute the migration?) or if a phased, or trickle, migration is required. Avoid the mistake of one insurance company that got to within a month of implementing a new CRM system before realising that it would take more than two weeks to migrate all of the data, when it had only allowed for a 48-hour operational window in which to complete it.
  • Use subject matter experts from within the business to map data to the business-level entities (Customer, Address, Contact History) in the new system. When implementing a replacement system, the mapping exercise should be repeated based on the legacy system entities to ensure that no critical data is left behind.
  • Identify whether the data migration needs to include a deduplication process and where it will be done. If one is needed, must it be completed before the data is loaded to the new CRM system, or could it be done afterwards with the data in situ? As with any business rules, those used to match and merge data should be designed by an empowered business user and confirmed by the governance board (a minimal matching sketch follows this list).
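
To illustrate the kind of match rule an empowered business user should own, here is a minimal deduplication sketch using simple name normalisation and a fuzzy string comparison. The 0.9 threshold is an illustrative assumption - exactly the sort of parameter a governance board should ratify:

    from difflib import SequenceMatcher

    def normalise(name):
        """Crude normalisation: lower-case and strip punctuation/whitespace."""
        return "".join(ch for ch in name.lower() if ch.isalnum())

    def candidate_duplicates(records, threshold=0.9):
        """Pair up records whose normalised names look similar enough to review."""
        pairs = []
        for i, a in enumerate(records):
            for b in records[i + 1:]:
                score = SequenceMatcher(None, normalise(a["name"]),
                                        normalise(b["name"])).ratio()
                if score >= threshold:
                    pairs.append((a["id"], b["id"], round(score, 2)))
        return pairs

    records = [
        {"id": 1, "name": "ACME Ltd."},
        {"id": 2, "name": "Acme Ltd"},
        {"id": 3, "name": "Apex Holdings"},
    ]
    print(candidate_duplicates(records))  # [(1, 2, 1.0)]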

At first sight, it appears that little has changed in the 15 years since my first CRM data migration project; significant numbers of CRM implementations are still doomed to failure because inadequate attention is paid to the data that powers them. However, attitudes are, I believe, changing and so too is the technology that is used to deliver data migrations. These projects are finally moving out of the darkened basement, thanks to innovative solutions that provide powerful functions through easy-to-use interfaces and enable business users and IT specialists to collaborate effectively.

Visit my full blog at www.dqview.com to read more

Copyright 2007 Steve Tuck - All Rights Reserved


Posted by Steve Tuck at 4:15 PM

April 16, 2007

Customers: Corporate Asset or Corporate Liability?

European financial services companies are risking their reputations, and possible fiscal and custodial penalties, by failing to recognise their exposure to potential criminal activity. As the deadline for implementation of the 3rd EU Money Laundering Directive fast approaches (15 December 2007), many money laundering reporting officers (MLROs) appear to be oblivious to the size of the problem they face.

The new directive further tightens the screw on financial services suppliers to know their customers. It requires them to take a 'risk-based' approach to screening their customers against prescribed sanctions lists and to identify any client that is a politically exposed person (PEP). The legislation builds on existing efforts to prevent criminals from gaining access to the European Union's financial systems.
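
By way of illustration only, the core of a crude screening pass looks something like the sketch below. Real screening engines rely on fuzzy matching, alias lists and secondary identifiers such as date of birth; the names and list entries here are invented:

    # Crude sanctions/PEP screening sketch. Names and list entries are
    # invented; production screening uses fuzzy matching and alias handling.
    def normalise(name):
        return " ".join(name.lower().split())

    SANCTIONS_LIST = {normalise(n) for n in ["John Doe", "Acme Front Co"]}
    PEP_LIST = {normalise(n) for n in ["Jane Minister"]}

    def screen(customer_name):
        """Return the watch lists this customer name appears on, if any."""
        name = normalise(customer_name)
        hits = []
        if name in SANCTIONS_LIST:
            hits.append("sanctions")
        if name in PEP_LIST:
            hits.append("PEP")
        return hits

    print(screen("John  DOE"))  # ['sanctions']
    print(screen("A N Other"))  # []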

Visit my full blog at www.dqview.com to read more

Copyright 2007 Steve Tuck - All Rights Reserved


Posted by Steve Tuck at 4:45 PM

December 31, 2006

Third Party Data - Silver Bullet or Unreliable Evidence?

It can be very tempting to view third-party data as a silver bullet for all your data quality woes. Take, for instance, the PAF (Postcode Address File) produced by the Royal Mail in the UK. They describe it as "the most up-to-date and complete address database in the UK, containing over 27 million addresses." I'm not going to take issue with their claim, and I regularly recommend that organisations make use of PAF data.

However, I think it's also worth pointing out a few things about PAF data:
1. PAF is produced by Royal Mail to aid with the effective and efficient delivery of postal items - therefore, it is only concerned with postal addresses, not all addressable locations (telecommunication and utility providers deliver services to many other addressable objects, such as streetlamps, traffic lights and road signs).
2. Contrary to popular belief, PAF does not contain a record for every business and residential unit in the UK. Indeed, Royal Mail has actually removed some records for flats where they share a single mailbox.
3. PAF is updated regularly - but the changes can take months to be completely rolled out. Many organisations fail to update their computer systems with PAF changes in a timely manner.
4. PAF is not infallible - it contains errors, omissions and duplicates. Business addresses, especially those in business parks and those that rely on the business name, are particularly prone to inaccuracies.
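
Incidentally, even before any PAF lookup, a cheap structural check will catch obviously malformed postcodes. Here is a minimal sketch; the regex is a simplified approximation of the UK postcode format, not the full official specification:

    import re

    # Simplified approximation of the UK postcode format; the official
    # specification has special cases (e.g. GIR 0AA) this does not cover.
    POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? \d[A-Z]{2}$")

    def plausible_postcode(value):
        """First-pass structural check before any PAF lookup."""
        return bool(POSTCODE.match(value.strip().upper()))

    for pc in ["EC1A 1BB", "sw1a 2aa", "12345", "B0GUS"]:
        print(pc, plausible_postcode(pc))
    # EC1A 1BB True / sw1a 2aa True / 12345 False / B0GUS False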

Authoritative sources of data are indeed useful - just don't count on them to tell the truth, the whole truth and nothing but the truth.


Posted by Steve Tuck at 8:30 AM

August 15, 2006

Dell's hot technology

Dell has today announced the recall of more than 4 million laptop batteries over fears that they could overheat and start a fire.

News coverage: "Dell recalls 4m laptop batteries"; "Dell recalls batteries over fears of explosions"

There may well have been some data quality issues involved in the manufacture of these batteries (by Sony), but what concerns me is the opportunity for error when checking whether a battery is potentially dangerous or not. Here's a shot of the label on my own laptop battery:

Visit my full blog at www.dqview.com to read more

Copyright 2006 Steve Tuck - All Rights Reserved


Posted by Steve Tuck at 2:30 PM

July 26, 2006

A Timely Reminder

To measure the quality of any data item we need to understand its definition. Without that, we might totally misunderstand what we're looking at, and if we're using the information as the basis for making an important decision, the consequences can be dire.

Sometimes it's the presentation of the data that is at fault - take this example:

Date: 07/04/23

What is the date? The 4th of July or the 7th of April? And is the year 1923 or 2023?

If we were dealing with a customer database and the field was defined as the customer's date of birth, I think we could safely assume that the year was 1923, but spot the same value in a field defined as a mortgage repayment date and the decision could go the other way. As to resolving which is the day and which the month, we probably all jump one way or the other based on what we're used to. The problem is that the presentation of the date is ambiguous, and without a clear definition it is open to misinterpretation.
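
To make the ambiguity concrete, here is a minimal sketch showing how the same string yields three different dates depending on the format you assume:

    from datetime import datetime

    raw = "07/04/23"

    # The same string parsed under three plausible format assumptions.
    # Note that strptime maps two-digit years 00-68 to 2000-2068, so the
    # 1923 reading would need explicit handling on top of this.
    for fmt, label in [("%m/%d/%y", "US month-first"),
                       ("%d/%m/%y", "UK day-first"),
                       ("%y/%m/%d", "year-first")]:
        print(label, "->", datetime.strptime(raw, fmt).date().isoformat())
    # US month-first -> 2023-07-04
    # UK day-first -> 2023-04-07
    # year-first -> 2007-04-23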

Visit my full blog at www.dqview.com to read more

Copyright 2006 Steve Tuck - All Rights Reserved


Posted by Steve Tuck at 3:15 PM