Data Quality Problems are Corporate IT’s Dirty Little Secret

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Paul Barth likes to tell the story of a commercial wholesale bank that saw the opportunity to grow business and customer satisfaction by up-selling existing customers. Its sales force had traditionally been organized around product lines, but the bank knew that cross-selling was an unexploited opportunity. The question was what to sell to whom.

Sales management decided to pull all its customer information into one place to look for patterns. By targeting segments, the sales force could sell its products to the most likely buyers. Small businesses, for example, were good prospects for lines of credit.

Armed with these profiles, the bank retrained its sales force to sell a portfolio of products to different customer types. The predictive data allowed reps to focus their energy on the customers most likely to buy. The result: sales to existing customers grew by $200 million, product-per-customer ratios increased and customer satisfaction rose 50%. All of this was achieved without expanding the product line.

Clean Data Yields Insight

This wasn’t an example of technology wizardry. It was an example of clean, well-integrated data leading to a positive business outcome. By taking an integrated view of its customers, the bank was able to spot patterns and derive insights that led to smart business decisions. The cost was minimal, the upside was huge, and the decision created an annuity stream of increased business that lasted for years.

Data quality is an under-appreciated advantage. For companies that apply the discipline and planning to address their data-quality needs and take an innovative approach to identifying new insights, the payoff can be an order of magnitude greater than the cost.

Most large organizations have data quality problems that have resulted from years of acquisitions, stovepipe application development and error-prone data entry. Making sense of the mess looks like a huge job, but it can be done if you follow a disciplined process, suggests Barth, an MIT computer science Ph.D. who’s also co-founder of NewVantage Partners.

First, step through all of your critical data elements and decide what information is really strategic to the business. This process alone can winnow tens of thousands of data elements down to perhaps a few hundred.

Then establish consistent terminology. Barth cites one client that had six different definitions of “active customer.”  Everyone in the company needs to use the same language. A metadata repository can help standardize terms.
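As a rough sketch of what that standardization might look like in practice, a lightweight metadata registry could pin each business term to one canonical, testable definition that every report and application looks up rather than redefines. The term, the 90-day window, the field names and the owner below are illustrative assumptions, not Barth’s methodology or the client’s actual rules.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Callable

@dataclass
class TermDefinition:
    """One entry in a (hypothetical) metadata repository."""
    term: str
    definition: str                # the plain-language meaning everyone agrees on
    rule: Callable[[dict], bool]   # an executable test of that meaning
    owner: str                     # who is accountable for the term

# Illustrative entry: the 90-day window and field name are assumptions,
# not the client's actual definition of "active customer."
ACTIVE_CUSTOMER = TermDefinition(
    term="active customer",
    definition="A customer with at least one transaction in the last 90 days",
    rule=lambda c: (date.today() - c["last_transaction_date"]) <= timedelta(days=90),
    owner="Retail data steward",
)

REGISTRY = {ACTIVE_CUSTOMER.term: ACTIVE_CUSTOMER}

# Reports and applications look the rule up here instead of redefining it locally.
customer = {"last_transaction_date": date.today() - timedelta(days=30)}
print(REGISTRY["active customer"].rule(customer))  # True
```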

“Start to centralize where data is stored, how it’s provisioned and how people access and use it,” Barth recommends. Minimize copies. Each copy of production data creates complexity and the possibility of error.

Put in place a governance process for data quality that specifies what level of quality is acceptable. Create metrics by which to measure quality levels. Establish data ownership. One of the reasons companies have so many data problems is that no one owns the quality process. Ownership creates responsibility and discipline.
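What such metrics look like will vary by organization, but a minimal sketch along these lines shows the idea: measure a few agreed-upon dimensions, such as completeness and duplicate rate, and compare them to the thresholds the governance process has declared acceptable. The fields, sample records and thresholds here are assumptions for illustration only.

```python
from collections import Counter

# A minimal sketch of data-quality metrics checked against governance thresholds.
# The field names, sample records and thresholds are illustrative assumptions.

REQUIRED_FIELDS = ["customer_id", "name", "postal_code"]

records = [
    {"customer_id": "C1", "name": "Acme LLC", "postal_code": "02139"},
    {"customer_id": "C2", "name": "Beta Corp", "postal_code": None},
    {"customer_id": "C1", "name": "Acme LLC", "postal_code": "02139"},  # duplicate row
]

def completeness(rows, field):
    """Share of rows in which the field is populated."""
    return sum(1 for r in rows if r.get(field) not in (None, "")) / len(rows)

def duplicate_rate(rows, key):
    """Share of rows whose key value appears more than once."""
    counts = Counter(r[key] for r in rows)
    return sum(1 for r in rows if counts[r[key]] > 1) / len(rows)

# Governance rules: each metric has an acceptable threshold, and an owner
# is notified whenever the data falls short.
THRESHOLDS = {"completeness": 0.98, "duplicate_rate": 0.01}

for field in REQUIRED_FIELDS:
    score = completeness(records, field)
    status = "ok" if score >= THRESHOLDS["completeness"] else "below threshold"
    print(f"completeness[{field}] = {score:.2f} ({status})")

dups = duplicate_rate(records, "customer_id")
status = "ok" if dups <= THRESHOLDS["duplicate_rate"] else "exceeds threshold"
print(f"duplicate_rate = {dups:.2f} ({status})")
```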

Get a handle on application development. Line-of-business owners shouldn’t be allowed to create rogue applications without central coordination. Many of these skunkworks projects use copied data because it’s more expedient than tapping into production databases.

Identify opportunities to create insights from data. This is the most critical step. Strategic opportunity comes from thinking up innovative ways to apply data analysis.

Annuity Benefits

Here’s one more example of the benefits of data quality. An acquisitive bank had to make decisions about closing and consolidating branches. These choices are usually made based on location, but some analytical thinkers at the bank had a better idea. They built a database of customer transactions across the bank’s many channels, including branches, telephone banking and the Web. Then they looked at customer behavior across those channels.

They discovered that customers who used multiple physical and electronic channels were less likely to leave the bank. That meant that branches serving many of those high-value customers were actually better candidates for closure, since those customers would keep banking through other channels. Using this innovative approach to decision-making, the bank was able to close 50 branches and save $38 million annually without any revenue impact. That’s an annuity benefit that resulted from an innovative analysis of data the bank already had.
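As a rough illustration of that kind of analysis, and not the bank’s actual method or data, the sketch below groups invented customer records by the number of channels each customer uses, then looks at which branches serve mostly multi-channel customers.

```python
from collections import defaultdict

# A rough sketch of the cross-channel analysis described above.
# The customers, branches, channel sets and attrition flags are invented
# purely for illustration; they are not the bank's data or results.

# (customer_id, home_branch, channels used in the period, left_the_bank?)
customers = [
    ("C1", "Elm St",  {"branch", "web", "phone"}, False),
    ("C2", "Elm St",  {"branch", "web"},          False),
    ("C3", "Elm St",  {"branch"},                 True),
    ("C4", "Oak Ave", {"branch"},                 True),
    ("C5", "Oak Ave", {"branch"},                 False),
]

# 1. Attrition rate by number of channels used.
by_channel_count = defaultdict(list)
for _, _, channels, left in customers:
    by_channel_count[len(channels)].append(left)
for n, flags in sorted(by_channel_count.items()):
    print(f"{n} channel(s): attrition {sum(flags) / len(flags):.0%}")

# 2. Share of multi-channel customers per branch. A branch whose customers
#    are mostly multi-channel is a safer closure candidate, because those
#    customers can keep banking electronically.
by_branch = defaultdict(list)
for _, branch, channels, _ in customers:
    by_branch[branch].append(len(channels) > 1)
for branch, flags in by_branch.items():
    print(f"{branch}: {sum(flags) / len(flags):.0%} multi-channel customers")
```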
