Data Quality Problems are Corporate IT’s Dirty Little Secret

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Paul Barth likes to tell the story of a commercial wholesale bank that saw the opportunity to grow business and customer satisfaction by up-selling existing customers. Its sales force had traditionally been organized around product lines, but the bank knew that cross-selling was an unexploited opportunity. The question was what to sell to whom.

Sales management decided to pull all its customer information into one place to look for patterns. By targeting segments, the sales force could sell its products to the most likely buyers. Small businesses, for example, were good prospects for lines of credit.

Armed with these profiles, the bank retrained its sales force to sell a portfolio of products to different customer types.  The predictive data allowed reps to focus their energies on the customers who were most likely to buy. The result: sales to existing customers grew $200 million, product-per-customer ratios increased and customer satisfaction rose 50%.  All of this was achieved without expanding the product line.

Clean Data Yields Insight

This wasn’t an example of technology wizardry.  It was an example of clean, well-integrated data leading to a positive business outcome. By taking an integrated view of its customers, the bank was able to spot patterns and derive insights that led to smart business decisions.  Cost was minimal, upside was huge and the decisions resulted in a new annuity stream of increased business that lasted years beyond the original decision.

Data quality is an under-appreciated advantage.  For companies that apply the discipline and planning to address their data-quality needs and take an innovative approach to identifying new insights, the payoff can be an order of magnitude greater than the cost.

Most large organizations have data quality problems that have resulted from years of acquisitions, stovepipe application development and error-prone data entry. Making sense of the mess looks like a huge job, but it can be done if you follow a disciplined process, suggests Barth, an MIT computer science Ph.D. who’s also co-founder of NewVantage Partners.

First, you need to step through all your critical data elements and decide what information is really strategic to the business. This process alone can winnow tens of thousands of data elements down to perhaps a few hundred.

Then establish consistent terminology. Barth cites one client that had six different definitions of “active customer.”  Everyone in the company needs to use the same language. A metadata repository can help standardize terms.
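A metadata repository can be as simple as a shared glossary that enforces one canonical definition per business term. The sketch below is a hypothetical illustration, not a real product; the class name, method names and the sample definition of "active customer" are all assumptions for demonstration:

```python
class MetadataRepository:
    """A minimal sketch of a shared business glossary: one canonical
    definition per term, so 'active customer' cannot mean six things."""

    def __init__(self):
        self._terms = {}

    def define(self, term, definition):
        # Reject a second, conflicting definition of an existing term.
        key = term.lower()
        if key in self._terms and self._terms[key] != definition:
            raise ValueError(
                f"Conflicting definition for '{term}'; "
                f"canonical is: {self._terms[key]}"
            )
        self._terms[key] = definition

    def lookup(self, term):
        # Lookups are case-insensitive, so every department resolves
        # the same term to the same definition.
        return self._terms[term.lower()]


repo = MetadataRepository()
repo.define(
    "active customer",
    "customer with a funded account and a transaction in the last 90 days",
)
print(repo.lookup("Active Customer"))
```

The point of the sketch is the failure mode: once a term is registered, a department trying to register a different meaning gets an error instead of silently creating definition number six.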

“Start to centralize where data is stored, how it’s provisioned and how people access and use it,” Barth recommends. Minimize copies. Each copy of production data creates complexity and the possibility of error.

Put in place a governance process for data quality that specifies rules about what levels of quality are acceptable. Create metrics by which to measure quality levels. Establish data ownership. One of the reasons companies have so many data problems is that no one owns the quality process. Ownership creates responsibility and discipline.
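To make "levels of quality" concrete, a governance team needs numbers it can set thresholds against. Here is a hedged sketch of what such a metric might look like; the field names, validity rules and sample records are hypothetical, not drawn from Barth's methodology:

```python
import re

# Hypothetical quality rules: each critical field gets a validity check.
RULES = {
    "customer_id": lambda v: bool(v),
    "zip_code": lambda v: bool(re.fullmatch(r"\d{5}(-\d{4})?", v or "")),
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
}

def quality_metrics(records):
    """Return the share of records passing each field's validity check --
    the kind of measurable quality level a governance process can
    monitor and set acceptable thresholds against."""
    passed = {field: 0 for field in RULES}
    for rec in records:
        for field, check in RULES.items():
            if check(rec.get(field)):
                passed[field] += 1
    n = len(records) or 1
    return {field: count / n for field, count in passed.items()}


records = [
    {"customer_id": "C001", "zip_code": "02139", "email": "a@example.com"},
    {"customer_id": "C002", "zip_code": "555-1212", "email": "bad-address"},
]
print(quality_metrics(records))
# → {'customer_id': 1.0, 'zip_code': 0.5, 'email': 0.5}
```

A data owner can then declare, say, that any field falling below 95 percent validity triggers a cleanup effort, turning "acceptable quality" from an opinion into a measurable target.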

Get a handle on application development. Line-of-business owners shouldn’t be allowed to create rogue applications without central coordination. Many of these skunkworks projects use copied data because it’s more expedient than tapping into production databases.

Identify opportunities to create insights from data. This is the most critical step. Strategic opportunity comes from thinking up innovative ways to apply data analysis.

Annuity Benefits

Here’s one more example of the benefits of data quality. An acquisitive bank had to make decisions about closing and consolidating branches. These choices are usually made based on location, but some analytical thinkers at the bank had a better idea.  They built a database of customer transactions across the bank’s many channels, including branches, telephone banking and Web.  Then they looked at customer behavior across those channels.

They discovered that customers who used multiple physical and electronic channels were less likely to leave the bank. That meant that branches that counted many of those high-value customers were actually better candidates for closure. Using this innovative approach to decision-making, the bank was able to close 50 branches and save $38 million annually without any revenue impact. That’s an annuity benefit that resulted from an innovative analysis of data the bank already had.

A Mobile Game-Changer

Apple unveiled the iPhone Application List and boasted that it sold one million of the new 3G (third generation) devices in the product’s first weekend on the market. More important than the sales figures is the coup that Apple has pulled off: The iPhone 3G looks to be the first mobile device to make the leap from telecommunications to data. That makes it the first mobile platform to merit serious attention from marketers.

The iPhone inspires a passion among its users that few technology products have ever achieved. Ask an iPhone user to tell you what he or she likes and you’ll get a 20-minute sermon, complete with demos. What strikes me is that most people tell me they use the iPhone more for data than for telephony. This is where the product is a game-changer.

The mobile Internet has been an unholy mess for several years. Each handset maker, network operator and service provider uses a slightly different technology. This pointless incompatibility has frustrated application developers so much that many have decided simply to wait out the market until consensus is reached. About the only standard anyone has been able to agree upon is Mobile Web, a hobbled subset of the World Wide Web standards that doesn’t do anything particularly well.

Apple is bidding to change all that. The two big innovations in the iPhone are a usable HTTP Web browser and sufficient local memory and storage to run applications on the device. This second feature is critical. Few people will choose to interact with their iPhone primarily through the browser, although they will browse to retrieve information. The beauty of native applications is that they can take full advantage of the iPhone’s speed and interface. When combined with a robust Internet back end, some truly interesting uses will develop.

In industry lingo, this setup is called client/server. Millions of people already take advantage of a mobile client/server architecture every day when they use their BlackBerries from Research In Motion. The BlackBerry’s e-mail interface is second to none, but an equally important usability factor is the device’s rapid performance. That’s because the BlackBerry downloads messages continually and displays them locally for rapid access. If the BlackBerry’s performance were as slow as a Web e-mail program like Yahoo! Mail or Gmail, I suspect few people would bother with it.

Apple’s applications initiative is meant to give developers the means to build client/server applications on the iPhone. This can give users a fast, pleasant experience that’s optimized for the platform. Mobile Web doesn’t come close.

With an impressive list of more than 500 out-of-the-box iPhone applications and the capacity for developers to create really functional client/server programs, the iPhone stands to be the first truly mobile data device.

No cell phone maker I’ve seen has yet produced a meaningful competitor. Their origins are in voice, and most still don’t get the data thing. Apple’s early lead with mobile applications makes it the front runner in this new field.

Here’s the opportunity for marketers. Facebook pioneered the idea of using applications as a means to sell products, but there are so many Facebook applications right now that it’s almost impossible to break through the noise. The iPhone is currently an open field, and since people spend a lot more time away from their computers than in front of them, there’s more potential for audience engagement. The audience may be smaller, but the prospect of getting them to actually use your service is greater.

The applications that succeed on a mobile device will undoubtedly be different than those that work on a social network. Location awareness will be critical. Think in terms of what people want to do and know when they’re standing on a street corner or waiting in an airport. Give them services that help pass the time or entertain them. Better move quickly, though. I suspect the iPhone Application List won’t be a short one for very much longer.

Secrets of Selling Up

Over the last couple of entries, I’ve talked about deploying social networking technology internally for business benefit. While the benefits may be clear to you, chances are you have to get past the sticky process of selling the idea to management, who invariably ask the return-on-investment (ROI) question.

ROI is important but it’s also a smokescreen that executives can use to avoid spending money. Corporations invest in plenty of functions that have vague ROI, including human resources and training. Unfortunately, social networking projects don’t lend themselves to clean ROI calculations. The benefits of better communications, more productive collaboration and access to organizational knowledge don’t fit neatly into a spreadsheet.

If you’re stuck on ROI, point to the often-neglected “I” part of the equation: investment. The new crop of tools, many of which are open source, are often significantly less expensive than the proprietary programs they replace. Then ask managers to consider the cost of a sales opportunity lost because team members didn’t have access to relevant knowledge within the organization. Or look at turnover rates and ask what the value is of capturing the knowledge that those departing workers take with them.

The best way to justify social media investments is through examples. Every large organization has a few people who understand the transformative effects of technology and who are eager to try new things. Embrace these people as your allies, particularly if their organizations produce revenue. Enlist them to participate in a pilot project that applies social networking concepts and technologies to an existing process. Perhaps it’s replacing an over-engineered project management system with a wiki. Or maybe it’s adopting podcasts in lieu of staff-wide conference calls. Offer all the support you can to make the pilot work. The experiences of early adopters will rapidly spread throughout the organization.

One key to successful adoption of any new technology is to eliminate alternatives. If people are told to use a wiki for collaboration, for example, but still have the option of communicating with colleagues by e-mail, they will choose the latter. People default to the path of least resistance, and that usually means sticking with what they know. You have to remove that option to demonstrate the effectiveness of an alternative solution. You can’t eliminate alternatives, but your sponsor can. That’s why choosing allies at the management level is so important.

A successful pilot project is a foundation for revisiting the ROI question. Interview project participants to document their impressions. Identify what they believe are the cost savings or revenue opportunities. Share these field experiences with the skeptics. Real-life stories are more convincing than formulas.

Supplement this with an internal public relations effort. Subscribe to RSS feeds of bloggers and publications that advocate for your position. Share expert opinions and published case studies with your internal decision-makers. Keep an eye out, in particular, for activities by your competitors. Don’t expect results overnight, but with time and repetition, you can wear down the critics.

I'm Going Viral: 250 Free Galleys of My Next Book Available

In the spirit of viral marketing, and with my publisher’s enthusiastic consent, I’m giving away 250 copies of my new book, Secrets of Social Media Marketing, to anyone who fills out this form on my website. I’m betting that if people start talking about the book before it reaches the shelves, it will be good for sales.

I’d like to take credit for the idea, but it was actually proposed to me by my friend and colleague David Meerman Scott, who co-authored Tuned In, a business book that has been number one on Amazon several times in the last few weeks. Scott’s New Rules of Marketing and PR was the number one marketing book of 2007 on Amazon, so if it’s good enough for him…

Creating Content From the Ground Up

One of my clients has been experimenting with an innovative and efficient approach to content development and I want you to know about it.

The company is in a highly specialized and big-ticket b-to-b industry. Its executives are very busy and very well paid. The VP of marketing wanted to develop some thought leadership white papers, but the prospect of pinning down these executives for hours to develop the content wasn’t practical. Instead, the marketing department is using podcasts to construct white papers from the ground up.

Here’s how it works: We schedule a 30- to 45-minute phone call with these busy executives to capture background information and hot topics in their areas of expertise. I then create a list of questions intended to draw out the executives’ thinking (journalists are pretty good at this!).

We record an interview of approximately 30 minutes’ duration. An edited version is posted as a podcast on the company’s website, but the marketing group also has the full interview transcribed via a low-cost outside service. Marketing cleans up and reorganizes the transcript and posts the document as a position paper.

Over a series of interviews, an executive’s observations and experiences can be rolled up in interesting ways. Multiple interviews with one executive can yield an in-depth white paper. Or point interviews with several executives can be combined into a corporate backgrounder. Customers and prospects can also subscribe to the podcast series. For the small transcription fee (services can be had for as little as a dollar a minute) and some inexpensive editing, the VP has a series of byline articles from the most visible people in his company.

Rethinking Research
I’ve recommended this approach to more and more clients lately. New online tools enable us to rethink our approach to assembling complex documents. It used to be that the process demanded hours or days of research. Now we can take notes in real time and assemble them later.

Blogs are ideally structured as collections of thoughts, observations and insights expressed in short bursts. It’s fast and easy to capture these brainstorms online. Got an idea? Twitter it for posterity. When you go back and look at information assembled in this way, you often see relationships that weren’t obvious at the time. Between search, tags and bookmarks, it’s possible to assemble these building blocks in different ways.

Some thought leaders take this to the limit. Marketing guru Seth Godin, for example, is known for writing entire books based on collections of interesting blog posts. The blog is his notepad for ideas that can be combined into coherent themes.

In some (though certainly not all) cases, this is a more efficient way to research a topic than spending hours mining the Web or library stacks. For my client, it’s also a way to repurpose content across multiple media. Maybe it will work for you. What do you think? Twitter me @paulgillin.

Data Quality Problems are Corporate IT’s Dirty Little Secret

In the early days of home broadband, I was a customer of a very large cable company, whose name I’m sure you know. When making one of my frequent calls to its technical support organization, I was typically required to give all my account information to a representative, who then transferred me to a support tech, who asked me to repeat all of the account information I had just given the first person. If my problem was escalated, I got transferred to a third rep who would ask me for, you guessed it, my account information.

This company’s reputation for lousy customer service was so legendary that if you type its name and “customer service” into a search engine today, you get mostly hate sites. One of its biggest problems was poor data integration. For example, its sales database was so fragmented that I routinely received offers to sign up for the service, even though I was already a customer. I’m a customer no longer.

Does this sound familiar? Most of us have had frustrating experiences of this kind. Thanks to acquisitions, internal ownership squabbles and poor project management, most large companies have accumulated islands of disintegrated, loosely synchronized customer data. PricewaterhouseCoopers’ 2001 Global Data Management survey found that 75 percent of large companies had significant problems as a result of defective data. It’s unlikely the situation has improved much since then. Data quality is corporate America’s dirty little secret.

The Path to Dis-Integration
There are several reasons for this, according to Paul Barth, an MIT computer science Ph.D. who’s also co-founder of NewVantage Partners. One is acquisitions. As industries have consolidated, the survivors have accumulated scores of dissimilar record-keeping systems. Even basic definitions don’t harmonize. Barth tells of one financial services company that had six different definitions for the term “active customer.”

A second problem is internal project management. The decentralization trend is pushing responsibility deeper into organizations. There are many benefits to this, but data quality isn’t one of them. When departmental managers launch new projects, they often don’t have the time or patience to wait for permission to access production records. Instead, they create copies of that data to populate their applications. These then take on a life of their own. Synchronization is an afterthought, and the occasional extract, transform and load procedure doesn’t begin to repair the inconsistencies that develop over time.

Data entry is another problem. With more customers entering their own data in online forms and fewer validity checks being performed in the background, the potential for error has grown. Data validation is a tedious task to begin with and really effective quality checks tend to produce a lot of frustrating error messages. E-commerce site owners sometimes decide it’s easier just to allow telephone numbers to be entered in the ZIP code field, for example, as long as it moves the customer through the transaction.
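The phone-number-in-the-ZIP-field example can be made concrete with a small validation sketch. This is a hypothetical illustration of the kind of background check many e-commerce forms skip, not code from any site mentioned here:

```python
import re

def validate_zip(value):
    """Accept a 5-digit US ZIP code (optionally ZIP+4) and reject
    anything else -- including a phone number typed into the field."""
    return re.fullmatch(r"\d{5}(-\d{4})?", value.strip()) is not None

print(validate_zip("02139"))         # True: a valid ZIP
print(validate_zip("617-555-0199"))  # False: a phone number, rejected
```

The tradeoff the column describes is exactly this: the check costs one line, but every rejection is an error message that may stall a paying customer, which is why some site owners choose to let the bad data through.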

Finally, data ownership is a tricky internal issue. Many business owners would rather focus on great features than clean data. If no one has responsibility for data quality, the task becomes an orphan. The more bad data is captured, the harder it is to synchronize with the good stuff. The problem gets worse, and no one wants the responsibility of cleaning up the mess.

Barth has a prescription for addressing these problems. It isn’t fast or simple, but it also isn’t as difficult as you might think. Next week we’ll look at an example of the benefits of good data quality and offer some of his advice for getting your data quality act together.

Daily Reading 07/14/2008

  • Hasbro badly fumbled the opportunity to profit from the popularity of Scrabulous, a knock-off version of Scrabble that is a megahit on Facebook. Instead of partnering with or buying the developers of Scrabulous, Hasbro sued them. Now it is bringing an officially sanctioned version of Scrabble to Facebook, a year late. Expect audience indifference.

    tags: daily_reading, facebook

  • An SEO pro looks at three business websites and offers practical advice to improve their search performance. There is good meat-and-potatoes wisdom in this article. Any company could benefit from these tips.

    tags: daily_reading, search