The Coming Utility Computing Revolution

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Nicholas Carr is at it again, questioning the strategic value of IT.  Only this time I find myself in nearly total agreement with him.

Carr became famous, or infamous, for his 2003 Harvard Business Review article “IT Doesn’t Matter,” in which he argued that IT is an undifferentiated resource that has little strategic business value.  His thinking has evolved since then, and in his new book, The Big Switch, he proposes that utility computing will increasingly become the corporate information infrastructure of the future.

Utility computing means different things to different people.  Some people draw an analogy to the electrical grid, but Carr argues that the information utility is far richer and more strategic.  He outlined some of his perspective in this Q&A interview in CIO Insight magazine.

The utility computing model that Carr foresees encapsulates many of the hottest concepts in IT today: virtualization, modular computing, software as a service, Web 2.0 and service-oriented architecture.  Computing utilities of the future will be anchored in enormous data centers that deliver vast menus of applications and software components over the Internet. Those programs will be combined and tailored to suit the individual needs of business subscribers.

Management of the computing resource, which for many years has been distributed to individual organizations, will be centralized in a small number of entities that specialize in that discipline.  Users will increasingly take care of their own application development needs and will share their best practices through a rich set of social media tools.

In this scenario, the IT department is transformed and marginalized. Businesses will no longer need armies of computing specialists because the IT asset will be outsourced. Even software development will migrate to business units as the tools become easier to use.

This perspective is in tune with many of the trends that are emerging today. Software as a service is the fastest-growing segment of the software market and is rapidly moving out of its roots in small and medium businesses to become an accepted framework for corporate applications. Data centers are becoming virtualized and commoditized. Applications are being segmented into components defined as individual services, which can be combined flexibly at runtime.

There are sound economic justifications for all of these trends, and there’s no reason to believe they won’t continue.  So what does this mean for IT organizations and the people who work in them?

Carr sums up his opinion at the end of the CIO Insight interview: “[I]nformation has always been a critical strategic element of business and probably will be even more so tomorrow. It’s important to underline that the ability to think strategically…will be critically important to companies, probably increasingly important, in the years ahead.”

Taking this idea one step further, you can envision a future in which a pure IT discipline will become unnecessary outside of the small number of vendors that operate computing utilities. University computer science programs, which have long specialized in teaching purely technical skills, will see those specialties merged into other programs. Teenagers entering higher education today are already skilled at building personal application spaces on Facebook using software modules. It’s a small step to apply those principles to business applications.

Sometimes, the past is a good predictor of the future. In my next entry, I’ll give an example of how technology change revolutionized the world a century ago and draw some analogies to the coming model of utility computing.

Function Over Features

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

I’ll admit to being a hopeless gadget freak, the kind who has to have the latest shiny object, if not the day it comes out, then within a few months. One of my addictions is MP3 players, of which I’ve owned at least six dating back to the mid-1990s.

The most feature-laden MP3 player I’ve ever known is the iRiver, a product that was ahead of its time in price/performance. Not only did it boast a spacious 20 GB hard disk two years ago, but it also has very good digital recording capabilities. At $200 in 2006, it was a steal. And yet I barely use it.

Instead, I have consistently opted for the more expensive and less feature-rich Apple iPod.  Why? Because for all its technical sophistication, the iRiver is too damned hard to use. Even after two years, I still struggle with its unintuitive menu system. The iPod, in contrast, is almost joyfully simple. It’s twice the cost and less than half the capacity, yet it’s my MP3 player of choice.

This experience came to mind recently when I was listening to this keynote presentation from the O’Reilly Media Rails Conference by software development expert Joel Spolsky. Download it and listen. It’s 45 minutes of sheer fun.

Spolsky’s point is anything but a joke, however. He tells his techie audience not to let complexity obscure the appeal of simplicity. He cites the example of Apple’s iPhone and compares it to the Samsung Blackjack. In nearly every technical respect, the Blackjack is a superior product. It has more features, better bandwidth and expandable storage. It supports Bluetooth and many Windows applications. It weighs less than the iPhone and has a full built-in keyboard. Yet the iPhone is killing the Blackjack – and everybody else – in the mobile device market.

The Futility of Feature Wars

We’ve seen this phenomenon repeatedly in the consumer electronics market. Technically superior products lose out to rivals that excel at capturing the user’s imagination. Microsoft Windows defeated the technically superior OS/2 on the desktop. TiVo continues to rule the roost in the DVR market, despite the presence of rivals with better price/performance. The Macintosh is now putting pressure on cheaper Windows machines, largely because Apple has tuned it to work well with a small number of really popular applications.

Spolsky makes the point that developers often fixate on features and trivialize user experience.  This isn’t surprising.  Many developers I’ve known dismiss design and user navigation as detail work. They want bells and whistles, which is what appeals to them.

But ordinary consumers couldn’t care less about these things. In many cases, they will make huge trade-offs in functionality and cost in order to get something that just works.

In their best-selling book, Tuned In, authors Craig Stull, Phil Myers and David Meerman Scott cite many examples of this effect. Among them is Nalgene, a plastic water bottle that is marketed to college students and outdoor enthusiasts in a variety of vibrant colors and branded labels.

Few products are more commoditized than plastic bottles, yet a clever marketing campaign built around environmental awareness and students’ need to express themselves through their accessories has made this vessel a hit at twice the price of its competitors. Thermo Fisher Scientific succeeded in marketing a product originally aimed at scientists to a consumer market because it was able to get inside the minds of those target customers.

The authors of Tuned In advocate rigorous market research over gut-level decision-making. Their mantra: “Your opinion, while interesting, is irrelevant.” In other words, companies that produce products for themselves often succeed in selling to precisely that market: themselves.

User experience does count. Just ask any iPod owner.

Open Source Quality

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

The market for data quality software has gone open source, and it’s about time.

Late last month, Talend Open Data Solutions, a maker of data integration and profiling software, made its Talend Data Quality software available under the General Public License. This follows the French company’s move in June to open its Open Profiler product in a similar way. The tools can be used together to assess the quality of the information in customer databases and to correct basic name and address information against a master list maintained by the US Postal Service.

What’s more important, though, is the potential of open sourcing to bring down the costs of the complex and frustrating data cleansing process. As I noted a few weeks ago, data quality is one of the most vexing problems businesses face. Data that’s inconsistent, out of date or incorrectly formatted creates inefficiency, angry customers (have you ever gotten three direct mail pieces at the same time, each addressed a little differently?) and lost opportunity.

Solutions to data quality problems have existed for decades, but they’ve always been sold by small vendors, usually at prices starting in the six figures. Over the last few years, many of these vendors have been snapped up by bigger software firms, which then bundled data quality tools and services into giant software contracts. There aren’t a lot of vendors left that specialize in solving the quality problem specifically.

This fragmentation has frustrated a process that should be a part of every company’s IT governance practice. While each company has its own data quality issues, many are common to a large number of businesses. Talend’s approach doesn’t address the cost of accessing databases of “clean” data, but it has promise to make the cleansing process itself cheaper and more automated.

The secret is the magic of open source, which enables users to easily exchange their best work with each other. For just one example of how this works, check out SugarCRM’s SugarExchange. This collection of third-party applications has been built by contributions from SugarCRM’s customers and independent developers. While some modules carry license fees, many don’t. The point is that software authors who build useful extensions to the base CRM system have the wherewithal to share or sell them to others who have similar needs. That’s difficult or impossible to do with proprietary software.

This so-called “forge” approach to development lends itself particularly well to data quality because so many issues are common to multiple companies. For example, if I come up with a clever way to compare employee records to a database of known felons, I should be able to share it and even charge for it. That isn’t possible when the market is spread across an assortment of small, closely held companies. If Talend can extend its TalendForge library to incorporate a robust collection of data quality components, it can make data cleansing practical and affordable to a much larger universe of companies.
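
To make that concrete, the kind of shareable component imagined here could be as modest as a name-matching routine. The following is a minimal Python sketch with invented names and an arbitrary threshold; it is an illustration of the idea, not an actual Talend or TalendForge component.

```python
# Hypothetical sketch of a shareable data quality component: compare employee
# names against a watch list using simple fuzzy string matching.
# Illustration only; not a Talend/TalendForge component.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity score between two normalized names."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def flag_matches(employees, watch_list, threshold=0.85):
    """Yield (employee, watch_list_name, score) pairs at or above the threshold."""
    for emp in employees:
        for name in watch_list:
            score = similarity(emp, name)
            if score >= threshold:
                yield emp, name, score

if __name__ == "__main__":
    employees = ["John Q. Public", "Jane Doe", "Jon Public"]   # invented records
    watch_list = ["John Public", "Richard Roe"]                # invented list
    for emp, name, score in flag_matches(employees, watch_list):
        print(f"{emp!r} resembles {name!r} (score {score:.2f})")
```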

The data quality problem is so pervasive that it demands a collaborative approach to a solution. This is a good start.

Google’s Chrome is a Game-Changer

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Sometimes, innovation is in knowing what to take out as much as what to put in.

Case in point is Chrome, the new browser from Google. You would think the last thing the world needs is another browser, and you would be right if all that browser did was pile on more features. Google again bucks the conventional wisdom with Chrome, however. It’s fast, simple and designed for the way people use the Web today rather than how they used it a few years ago.

Google released the Chrome beta yesterday to immediate speculation that it’s the foundation for an Internet operating system.  The company vigorously denies these claims, but based on my own admittedly limited tests, I’d say Google’s protests ring hollow.  Chrome is clearly aimed at Microsoft’s jugular, and if it succeeds in gaining widespread adoption, it will hasten adoption of the whole software-as-a-service style of computing.

There is nothing particularly innovative about the Chrome interface, other than its stark simplicity and the clever way in which it integrates search and browsing history.  Simplicity comes from Chrome’s adherence to the Firefox interface and its minimalist features.  For example, the browser has no menu bar.

What blew me away about this early version of Chrome, though, is its speed. For starters, Chrome doesn’t burden users with the constant procession of warnings and dialog boxes that have made Internet Explorer almost unusable. That may change if Chrome becomes a target for hacker attacks, but for now, Google’s relative freedom from security threats and government scrutiny is a plus.

More importantly, Google has done some innovative work under the covers to enhance Chrome’s performance running AJAX applications. AJAX is the foundation of Web apps, but its surging popularity has caused some problems for users. That’s because programs written in JavaScript (the “J” in AJAX) seize control of the browser while they’re working. A single bad script can slow an entire computer to a crawl. Memory management limitations in Firefox have also hampered performance.

Chrome has several features to optimize JavaScript programs. In brief, Chrome allows each JavaScript program to inhabit a unique virtual machine and a unique thread.  This lessens the likelihood that one program can monopolize an entire session.  It also means that a crash in one browser window doesn’t take down the whole application.
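
Chrome implements this isolation natively inside the browser, but the underlying idea, that work running in separate processes can fail independently, is easy to sketch. The following is a rough Python analogy using operating-system processes, not Chrome’s actual mechanism: one “script” crashes without taking the others down.

```python
# Rough analogy of process isolation (not Chrome's implementation): each task
# runs in its own OS process, so a crash in one doesn't bring down the others.
import multiprocessing as mp

def run_script(name: str, should_crash: bool) -> None:
    """Stand-in for a 'script' running in its own isolated process."""
    if should_crash:
        raise RuntimeError(f"{name} crashed")
    print(f"{name} finished normally")

if __name__ == "__main__":
    jobs = [("tab-1", False), ("tab-2", True), ("tab-3", False)]  # invented tasks
    procs = [mp.Process(target=run_script, args=job) for job in jobs]
    for p in procs:
        p.start()
    for p, (name, _) in zip(procs, jobs):
        p.join()
        # A non-zero exit code shows that process crashed; the others completed normally.
        print(f"{name} exit code: {p.exitcode}")
```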

There are also some innovations in memory management and garbage collection that speed performance.  You can satisfy your inner techie by reading an extensive sequence of technical explainers presented in cartoon format at www.google.com/googlebooks/Chrome/.

Google has a self-interest in optimizing AJAX, of course: that’s the way it delivers its Documents line of office applications. Chrome is clearly optimized to work well with Google Documents. In my tests, a Google Document loaded faster than one launched with Microsoft Word or Excel. While Google Documents still don’t approach Microsoft’s functionality, their open architecture has been a magnet for independent developers, who will quickly add features the core applications lack. Google also recently made it possible for word processing documents to run offline using its Gears plug-in. Taken together, Google Documents on Chrome are a much more compelling alternative to Microsoft Office than they have been in the past.

Google now enjoys the pole position on the Internet and has an impressive suite of applications with the capacity to run them offline. Chrome still requires Windows to run, but we can expect that barrier to fall soon. Microsoft is about to fight back with Internet Explorer 8, but previews suggest the browser won’t break a lot of new ground. I suspect throats are tightening a little in Redmond, Wash.

Global Treasure Hunt Unleashes Innovation

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

The start of a week-long vacation has me in a somewhat festive mood today, so I thought I’d tell you about a new hobby I’ve been pursuing recently that’s also given me new insight into the capacity of IT to transform lives.

About two years ago, my wife and I took up geocaching. It’s a global game that was enabled by the Clinton administration’s 2000 decision to open up the government’s network of global positioning satellites (GPS) to civilian use. Geocaching is basically a worldwide treasure hunt. Players use handheld GPS receivers to find containers placed by other enthusiasts in locations ranging from city street corners to remote mountaintops.

[Photo: a classic cache hide in the base of a tree.]

In its simplest form, players provide each other with nothing more than the GPS coordinates for the treasures, which are typically containers filled with trinkets. Geocaches can be as small as a pencil eraser or as large as a suitcase. When players locate one, they note their visit in a log book that’s kept in the container and also post their “find” on the geocaching.com website.
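
For a sense of what a GPS receiver does with those coordinates, here is a small Python sketch that computes the straight-line distance from a player’s position to a posted cache using the standard haversine formula. The coordinates below are invented for illustration.

```python
# Great-circle distance between two GPS coordinates via the haversine formula.
# The coordinates below are made up for illustration.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6371000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Distance in meters between two (latitude, longitude) points in degrees."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

if __name__ == "__main__":
    me = (41.0500, -73.5390)     # hypothetical player position
    cache = (41.0512, -73.5355)  # hypothetical posted cache coordinates
    print(f"Distance to cache: {haversine_m(*me, *cache):.0f} m")
```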

Global Craze

Geocaching has exploded in popularity. In January, 2005, there were about 141,000 geocaches hidden in the world. Today, there are more than 640,000, and a half million “finds” are logged each week. There are geocaches all around you. Nearly 1,000 have been placed within a 10-mile radius of Dallas-Fort Worth airport alone. My wife and I were so taken with this phenomenon that we started to write a book about it – Geocaching Secrets – that will be published in the spring.

One thing that’s captivated us about geocaching is the transformative effect it has on the hobbyists who pursue it. We’ve spoken to several people who have logged more than 20,000 finds over the last eight years, or nearly 10 per day. Two years ago, one team spent months planning an attempt to find the most caches in a single day. They finished with over 300.

People report that geocaching has brought their families together, introduced them to dozens of new friends and led them to destinations that they never would have visited. One enthusiast credits the game with helping him shed 150 pounds and give up smoking. Several have said it saved their marriages. One disabled war veteran told me geocaching gave him a reason to live at a time when he was seriously contemplating suicide.

It’s also brought people outdoors and introduced them to new worlds they never knew. In seeking caches within my local area, I’ve visited dozens of parks, nature preserves and cityscapes that I never knew existed. One recent excursion took my family on a discovery tour through the streets of lower Manhattan, with each stop relating to a different Revolutionary War event.

Platform for Innovation

[Photo: a container hidden as a sprinkler head.]

I’ve been struck by the innovative spirit that this relatively simple idea has unleashed. The people who hide geocaches (some have stashed well over 1,000) come up with devilishly difficult tactics to disguise their treasures. One person specializes in hiding tiny containers inside of animal bones. Another constructs containers out of hollowed-out logs. Many others specialize in scrambling the geo-coordinates in elaborate puzzles and ciphers. One recent search directed my wife and me to watch a movie for a reference to obscure letters written by Benjamin Franklin. We then had to apply a cipher to the writings to zero in on individual words, the first letters of which spelled out the clue. That’s just the tip of the iceberg. Some cache owners devise riddles that would tax the skills of a mathematics Ph.D.
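
The final step of that Franklin puzzle, taking the first letter of each selected word, is simple once you know it is there. A toy Python version, with invented words rather than the actual puzzle text:

```python
# Toy version of an acrostic-style puzzle step: the first letter of each
# selected word spells out the clue. The word list is invented for illustration.
words = ["under", "north", "dry", "elm", "roots"]
clue = "".join(word[0] for word in words).upper()
print(clue)  # -> UNDER
```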

Educators use geocaching to teach their students about geology, mathematics and creative thinking. The Arkansas Parks Service stashed a cache in each of the state’s 52 parks this year in an effort to lure citizens into the outdoors. Hundreds of finds were reported in just the first two weeks.

Like the Internet, the geocaching community is entirely self-guided. There are no governing bodies and no formal leadership. The rules of the game are determined by the consensus of the players in a process that’s loosely democratic but without votes. Bad behavior is regulated by peer pressure.  People do the right thing in order to preserve the simple beauty of the game.

Geocaching is just one example of how technology unleashes creativity that changes people’s lives. It’s also a testament to how big ideas emerge when people are given tools and freedom to discard assumptions and invent new possibilities.

A Distant Mirror of Technology Change

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Reading a recent CIO Insight interview with Timothy Chou brought to mind an analogy to, of all things, the early days of railroading.

Chou is the kind of futurist who makes the guardians of the conventional wisdom cringe: an industry veteran who understands the way things have always been done and who now tours the country arguing for radical change.

After spending 25 years selling IT products to CIOs, Chou joined Oracle in 1999 to launch the company’s fledgling software as a service (SaaS) business. The experience was transformative. In 2004, he penned a provocative book called The End of Software in which he argued that the compelling benefits of SaaS would sweep over the enterprise software industry. He’s now self-published a second book, Seven, that challenges CIOs and software vendors to understand how important IT infrastructure is to their businesses and to make some tough decisions about where to invest. The title refers to seven software business models ranging from traditional licensing at one extreme to Internet-only software businesses like Amazon and Google at the other.

In an interview with Paula Klein, Chou is careful not to lecture CIOs about the choices they should make, but it’s pretty clear where he stands: most of them should be aggressively moving to hand over much of their software and infrastructure to someone else. While CIOs are ideally positioned to understand the business tradeoffs (“Ask any CIO who has completed an ERP implementation, and he or she will tell you more about how the business really runs than anyone on the executive staff,” he says), the decision is a political minefield where many CIOs still fear to tread.

Losing Control

One of the most often-cited reservations CIOs have about SaaS is that they lose control of their IT environment. Chou argues that this is a red herring. Most IT organizations aren’t particularly good at backup and recovery, for example, so the belief that they can do a better job than a commercial vendor is misplaced. “Many people think you have to pay more for reliability, but the opposite is true,” he says. “If you want to buy the most reliable car, don’t pick the $1 million handcrafted sports car; find a Toyota that’s been produced a million times.”

Addressing another common concern – that SaaS deprives IT organizations of the ability to customize their software environments – Chou argues that the tradeoff is usually worth it. Businesses, he says, “may have to forgo much of the customization that they’re used to in order to get simpler, cheaper, more reliable applications.”

His remarks reminded me of an insight I gained while on a recent visit to the Railroad Museum of Pennsylvania. There I learned that in the days before the railroads were built, it took Pennsylvania citizens two full weeks to travel from Philadelphia to Pittsburgh.  This forced communities to become self-sufficient because it was so difficult to obtain goods from the outside world.

The Triumph of Choice

Railroads ushered in the fruits of the Industrial Revolution, making it possible for mass produced goods to reach a wide audience. I’m sure that many people at the time mourned the loss of the local blacksmith, who crafted each horseshoe individually. Some probably chafed at the idea of molding their lives to the demands of a railroad timetable.

In the end, however, most people accepted the fact that low cost and wide selection were a reasonable alternative to expensive, customized goods. And as much as we gripe about the tyranny of the airlines today, we appreciate the fact that they can get us from New York to San Francisco for less than $300.

The alternative to complete control is choice, and the power shift from seller to buyer has brought dramatic benefits, Chou argues. In the past, “Enterprise software vendors sold only $1 million products, and the only channel to buyers was a human,” he tells CIO Insight. “Today, the Internet provides a low-cost, diverse channel for information that can be used to educate anyone. Reference calls have been replaced with hundreds of online forums that let you understand a diverse set of experiences with a product.”

In other words, competition and the rules of the market create a healthy atmosphere for diversity, which increases choice. Buyers may not be able to get exactly what they need, but they can come pretty close. For most of them, that’s a pretty good trade-off.

With Cloud Computing, Look Beyond the Cost Savings

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Back in the early days of data center outsourcing, some pioneer adopters got a rude surprise.  These companies had outsourced all or large parts of their data centers to specialty providers who bought their equipment, hired their staff and offered attractive contract terms that shaved millions of dollars in expenses in the first year.

The surprise was that the contract terms weren’t so attractive once the outsourcer became embedded in the client’s expense line. Customers found that nearly everything carried hefty escalator fees, ranging from unbudgeted capacity increases to software patches to staff overtime. But there was little customers could do. They were locked into the contractor, and the cost of unlocking themselves was prohibitive.

This story came to mind recently during a chat with Bob McFarlane, a principal at facilities design firm Shen Milsom & Wilke. McFarlane is an expert in data center design, and his no-nonsense approach to customer advocacy has made him a hit with audiences around the country.

McFarlane thinks the current hype around hosted or “cloud” computing is getting out of touch with reality.  Cloud computing, which I’ve written about before, involves outsourcing data processing needs to a remote service, which theoretically can provide world-class security, availability and scalability.  Cloud computing is very popular with startups these days, and it’s beginning to creep onto the agenda of even very large firms as they reconsider their data processing architectures.

The economics of this approach are compelling.  For some small companies in particular, it may never make financial sense to build a captive data center because the costs of outsourcing the whole thing are so low.  McFarlane, however, cautions that value has many dimensions.

What is the value, for example, of being able to triple your processing capacity because of a holiday promotion? Not all hosting services offer that kind of flexibility in their contracts, and those that do may charge handsomely for it.

What is the value of knowing that your data center has adequate power provisioning, environmentals and backups in case of a disaster? Last year, a power failure in San Francisco knocked several prominent websites offline for several hours when backup generators failed to kick in. Hosting services in earthquake or flood-prone regions, for example, need extra layers of protection.

McFarlane’s point is to not buy a hosting service based on undocumented claims or marketing materials. You can walk into your own data center and kick a power cord out of the wall to see what happens.  Chances are you can’t do that in a remote facility.  There are no government regulations for data center quality, so you pretty much have to rely on hosting providers to tell the truth.

Most of them do, of course, but even the truth can be subject to interpretation. The Uptime Institute has created a tiered system for classifying infrastructure performance. However, McFarlane recalls one hosting provider that advertised top-level Uptime Institute compliance but didn’t employ redundant power sources, which is a basic requirement for that designation.

This doesn’t mean you should ignore the appealing benefits of cloud computing, but you should look beyond the simple per-transaction cost savings. Scrutinize contracts for escalator clauses and availability guarantees.  Penalties should give you appropriate compensation.  While you won’t convince a hosting service to refund you the value of lost business, you should look for something more than a simple credit toward your monthly fee.

If you can, plan a visit to a prospective hosting provider and tour its facilities.  Reputable organizations should have no problem letting you inside the data centers and allowing you to bring along an expert to verify their claims. They should also be more than willing to provide you with contact information for reference customers. Familiarity, in this case, can breed peace of mind.

Packaged Innovation

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Can you bottle innovation?  Conventional wisdom says no; innovation comes from inspiration backed by knowledge.  It can’t be packaged or automated.

However, a Boston-based company is challenging the conventional wisdom. Invention Machine has developed technology that applies a semantic search engine to the task of mining possibilities for innovative new materials and procedures.

If the phrase “semantic search” means nothing to you, join the club. I got a telephone briefing on Invention Machine’s technology and couldn’t quite figure out what it did. So I stopped in for a visit and got one of the more impressive demos I’ve seen in recent years.

I’m a veteran of thousands of demos and have learned to be skeptical, but this was interesting stuff. Invention Machine’s customer list would indicate that the company is on to something.

Semantic search involves mining text documents not only for terms but for relationships between terms. Most search engines can’t do this. They can deliver some insight by finding words in close proximity to each other, but they don’t establish a clear relationship.

For example, if you search for “smoking” and “cancer” on Google, the results indicate there’s a relationship between the two, but the search engine won’t explicitly define that relationship. Semantic search goes a step further. It’s intended to deliver a small number of results but with terms that are specifically related to each other. For example, a semantic search engine might infer from its search that smoking and cancer are related and return documents that explain that relationship.
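
A toy contrast between the two approaches, purely for illustration and in no way a description of Invention Machine’s engine, might look like this in Python: a keyword check only notes that two terms appear near each other, while a crude pattern matcher pulls out an explicit subject-relation-object triple.

```python
# Toy contrast between keyword co-occurrence and a crude relation extractor.
# Illustration of the idea only; not Invention Machine's engine.
import re

DOC = "Several studies conclude that smoking causes lung cancer in long-term users."

def cooccur(doc: str, a: str, b: str, window: int = 8) -> bool:
    """Keyword-style check: do both terms appear within a few words of each other?"""
    words = doc.lower().split()
    positions = {t: [i for i, w in enumerate(words) if t in w] for t in (a, b)}
    return any(abs(i - j) <= window for i in positions[a] for j in positions[b])

def extract_relation(doc: str):
    """Pattern-style check: pull out an explicit '<subject> causes <object>' triple."""
    m = re.search(r"(\w+)\s+causes\s+(\w+(?:\s+\w+)?)", doc.lower())
    return (m.group(1), "causes", m.group(2)) if m else None

print(cooccur(DOC, "smoking", "cancer"))  # True: the terms merely appear together
print(extract_relation(DOC))              # ('smoking', 'causes', 'lung cancer')
```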

The semantic engine is at the core of what Goldfire, Invention Machine’s product, does. A host of other features are wrapped around it, including a project workbench and a database of scientific and patent literature. The demo I saw showed one example of how innovation can be guided, if not packaged.

Suppose your company makes packaged food and you want to figure out a way to substitute artificial sweetener for sugar.  Engineers can use the workbench to deconstruct ingredients in the current product and then test the substitution of various artificial sweeteners.  Goldfire’s scientific database understands the characteristics of alternative ingredients, such as texture, taste, heat tolerance and chemical interactions.  A researcher could model the impact of substituting different artificial sweeteners and determine which ones are good candidates for a new recipe.  By querying on the attributes of potential substitutes, engineers could also discover new ingredients they hadn’t thought of.
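
Stripped of the science, the screening step is an attribute query. Here is a minimal Python sketch; the sweeteners and their property values are invented for illustration and are not drawn from Goldfire’s database.

```python
# Minimal sketch of attribute-based candidate screening. The sweeteners and
# their property values are invented; Goldfire's modeling is far richer.
sweeteners = [
    {"name": "sucralose", "relative_sweetness": 600, "heat_stable": True,  "aftertaste": "low"},
    {"name": "aspartame", "relative_sweetness": 200, "heat_stable": False, "aftertaste": "low"},
    {"name": "stevia",    "relative_sweetness": 300, "heat_stable": True,  "aftertaste": "medium"},
]

def candidates(ingredients, *, needs_baking: bool, max_aftertaste: str):
    """Filter ingredients against the constraints of the target recipe."""
    order = {"low": 0, "medium": 1, "high": 2}
    for ing in ingredients:
        if needs_baking and not ing["heat_stable"]:
            continue  # ruled out: breaks down under heat
        if order[ing["aftertaste"]] > order[max_aftertaste]:
            continue  # ruled out: aftertaste exceeds the recipe's tolerance
        yield ing["name"]

print(list(candidates(sweeteners, needs_baking=True, max_aftertaste="low")))
# -> ['sucralose']
```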

The patent database comes into play when attempting to innovate on existing intellectual property.  For example, an automotive engineer could deconstruct the components of a patented turbocharger and test the impact of substituting different metal alloys.  This could lead to an improved design that doesn’t infringe on existing patents.  In fact, Invention Machine says this re-engineering of existing patents is one of the most popular applications of its product.

Goldfire isn’t a simple product to use. Customers typically go through several days of training and setup to customize the software to their industry. It also isn’t cheap; installations run in the six figures. For the kinds of problems Goldfire is meant to solve, however, these costs aren’t surprising.

Goldfire is a difficult product to describe, but an easy one to understand once you see it in action.  The company provides several podcasts and videocasts that demonstrate how customers are applying the technology.  This isn’t innovation in a bottle, but it’s a pretty good start.

Incidentally, I have no financial interest in the company or its product. I just think this is a technology that deserves more attention.

Data Quality Problems are Corporate IT’s Dirty Little Secret

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Paul Barth likes to tell the story of a commercial wholesale bank that saw the opportunity to grow business and customer satisfaction by up-selling existing customers. Its sales force had traditionally been organized around product lines, but the bank knew that cross-selling was an unexploited opportunity. The question was what to sell to whom.

Sales management decided to pull all its customer information into one place to look for patterns. By targeting segments, the sales force could sell its products to the most likely buyers. Small businesses, for example, were good prospects for lines of credit.

Armed with these profiles, the bank retrained its sales force to sell a portfolio of products to different customer types.  The predictive data allowed reps to focus their energies on the customers who were most likely to buy. The result: sales to existing customers grew $200 million, product-per-customer ratios increased and customer satisfaction rose 50%.  All of this was achieved without expanding the product line.

Clean Data Yields Insight

This wasn’t an example of technology wizardry.  It was an example of clean, well-integrated data leading to a positive business outcome. By taking an integrated view of its customers, the bank was able to spot patterns and derive insights that led to smart business decisions.  Cost was minimal, upside was huge and the decisions resulted in a new annuity stream of increased business that lasted years beyond the original decision.

Data quality is an under-appreciated advantage. For companies that apply the discipline and planning to address their data-quality needs and take an innovative approach to identifying new insights, the payoff can be an order of magnitude greater than the cost.

Most large organizations have data quality problems that have resulted from years of acquisitions, stovepipe application development and error-prone data entry. Making sense of the mess looks like a huge job, but it can be done if you follow a disciplined process, suggests Barth, an MIT computer science Ph.D. who’s also co-founder of NewVantage Partners.

First, you need to step through all of your critical data elements and decide what information is really strategic to the business. This process alone can winnow down tens of thousands of data elements to perhaps a few hundred.

Then establish consistent terminology. Barth cites one client that had six different definitions of “active customer.”  Everyone in the company needs to use the same language. A metadata repository can help standardize terms.

“Start to centralize where data is stored, how it’s provisioned and how people access and use it,” Barth recommends. Minimize copies. Each copy of production data creates complexity and the possibility of error.

Put in place a governance process for data quality that specifies rules about what levels of quality are acceptable. Create metrics by which to measure quality levels. Establish data ownership. One of the reasons companies have so many data problems is that no one owns the quality process. Ownership creates responsibility and discipline.
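
As a simplified illustration of what such metrics can look like in practice, here is a short Python sketch that scores a batch of customer records for completeness and format validity. The records and rules are hypothetical, not Barth’s methodology.

```python
# Hypothetical data quality metrics: completeness and format validity for a
# batch of customer records. The records and validation rules are invented.
import re

records = [
    {"customer_id": "C001", "email": "ann@example.com", "zip": "02139"},
    {"customer_id": "C002", "email": "",                "zip": "2139"},
    {"customer_id": "C003", "email": "bob@example",     "zip": "10001"},
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")

def completeness(rows, field):
    """Share of records where the field is present and non-empty."""
    return sum(1 for r in rows if r.get(field)) / len(rows)

def validity(rows, field, pattern):
    """Share of records where the field matches the expected format."""
    return sum(1 for r in rows if pattern.match(r.get(field, ""))) / len(rows)

print(f"email completeness: {completeness(records, 'email'):.0%}")        # 67%
print(f"email validity:     {validity(records, 'email', EMAIL_RE):.0%}")  # 33%
print(f"zip validity:       {validity(records, 'zip', ZIP_RE):.0%}")      # 67%
```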

Get a handle on application development. Line-of-business owners shouldn’t be allowed to create rogue applications without central coordination. Many of these skunkworks projects use copied data because it’s more expedient than tapping into production databases.

Identify opportunities to create insights from data. This is the most critical step. Strategic opportunity comes from thinking up innovative ways to apply data analysis.

Annuity Benefits

Here’s one more example of the benefits of data quality. An acquisitive bank had to make decisions about closing and consolidating branches. These choices are usually made based on location, but some analytical thinkers at the bank had a better idea.  They built a database of customer transactions across the bank’s many channels, including branches, telephone banking and Web.  Then they looked at the behavior of customers across those profiles.
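
The mechanics of that analysis need not be exotic. Here is a stripped-down Python sketch of the grouping, with made-up transactions standing in for the bank’s data; it simply counts the channels each customer uses and compares retention across those groups.

```python
# Stripped-down sketch of a multi-channel retention analysis, with made-up data.
from collections import defaultdict

# (customer_id, channel, still_a_customer) -- invented transaction records
transactions = [
    ("A", "branch", True), ("A", "web", True), ("A", "phone", True),
    ("B", "branch", True),
    ("C", "web", False),
    ("D", "branch", True), ("D", "web", True),
    ("E", "phone", False),
]

channels = defaultdict(set)   # channels each customer has used
retained = {}                 # whether the customer is still with the bank
for cust, channel, still_here in transactions:
    channels[cust].add(channel)
    retained[cust] = still_here

by_channel_count = defaultdict(list)
for cust, used in channels.items():
    by_channel_count[len(used)].append(retained[cust])

for n, flags in sorted(by_channel_count.items()):
    rate = sum(flags) / len(flags)
    print(f"{n} channel(s): {len(flags)} customers, retention {rate:.0%}")
```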

They discovered that customers who used multiple physical and electronic channels were less likely to leave the bank. That meant that branches that counted many of those high-value customers were actually better candidates for closure. Using this innovative approach to decision-making, the bank was able to close 50 branches and save $38 million annually without any revenue impact. That’s an annuity benefit that resulted from an innovative analysis of data the bank already had.

Secrets of Selling Up

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Over the last couple of entries, I’ve talked about the benefits of deploying social network technology internally for business benefit. While the benefits may be clear to you, chances are you have to get past the sticky process of selling the idea to management, who invariably ask the return on investment (ROI) question.

ROI is important but it’s also a smokescreen that executives can use to avoid spending money. Corporations invest in plenty of functions that have vague ROI, including human resources and training. Unfortunately, social networking projects don’t lend themselves to clean ROI calculations. The benefits of better communications, more productive collaboration and access to organizational knowledge don’t fit neatly into a spreadsheet.

If you’re stuck on ROI, point to the often-neglected “I” part of the equation: investment. The new tools, many of which are open source, are often significantly less expensive than the proprietary programs they replace. Then ask managers to consider the cost of a sales opportunity lost because team members didn’t have access to relevant knowledge within the organization. Or look at turnover rates and ask what the value is of capturing the knowledge that those departing workers take with them.

The best way to justify social media investments is through examples. Every large organization has a few people who understand the transformative effects of technology and who are eager to try new things. Embrace these people as your allies, particularly if their organizations produce revenue. Enlist them to participate in a pilot project that applies social networking concepts and technologies to an existing process. Perhaps it’s replacing an over-engineered project management system with a wiki. Or maybe it’s adopting podcasts in lieu of staff-wide conference calls. Offer all the support you can to make the pilot work. The experiences of early adopters will rapidly spread throughout the organization.

One key to successful adoption of any new technology is to eliminate alternatives. If people are told to use a wiki for collaboration, for example, but still have the option of communicating with colleagues by e-mail, they will choose the latter. People default to the path of least resistance, and that usually means sticking with what they know. You have to remove that option to demonstrate the effectiveness of an alternative solution. You can’t eliminate alternatives, but your sponsor can. That’s why choosing allies at the management level is so important.

A successful pilot project is a foundation for revisiting the ROI question. Interview project participants to document their impressions. Identify what they believe are the cost savings or revenue opportunities. Share these field experiences with the skeptics. Real-life stories are more convincing than formulas.

Supplement this with an internal public relations effort. Subscribe to RSS feeds of bloggers and publications that advocate for your position. Share expert opinions and published case studies with your internal decision-makers. Keep an eye out, in particular, for activities by your competitors. Don’t expect results overnight, but with time and repetition, you can wear down the critics.