Data Quality Problems are Corporate IT’s Dirty Little Secret

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Paul Barth likes to tell the story of a commercial wholesale bank that saw the opportunity to grow business and customer satisfaction by up-selling existing customers. Its sales force had traditionally been organized around product lines, but the bank knew that cross-selling was an unexploited opportunity. The question was what to sell to whom.

Sales management decided to pull all its customer information into one place to look for patterns. By targeting segments, the sales force could sell its products to the most likely buyers. Small businesses, for example, were good prospects for lines of credit.

Armed with these profiles, the bank retrained its sales force to sell a portfolio of products to different customer types.  The predictive data allowed reps to focus their energies on the customers who were most likely to buy. The result: sales to existing customers grew $200 million, product-per-customer ratios increased and customer satisfaction rose 50%.  All of this was achieved without expanding the product line.

Clean Data Yields Insight

This wasn’t an example of technology wizardry.  It was an example of clean, well-integrated data leading to a positive business outcome. By taking an integrated view of its customers, the bank was able to spot patterns and derive insights that led to smart business decisions.  Cost was minimal, upside was huge and the decisions resulted in a new annuity stream of increased business that lasted years beyond the original decision.

Data quality is an under-appreciated advantage. For companies that apply the discipline and planning to address their data-quality needs and take an innovative approach to identifying new insights, the payoff can be an order of magnitude greater than the cost.

Most large organizations have data quality problems that have resulted from years of acquisitions, stovepipe application development and error-prone data entry. Making sense of the mess looks like a huge job, but it can be done if you follow a disciplined process, suggests Barth, an MIT computer science Ph.D. who’s also co-founder of NewVantage Partners.

First, you need to step through all your critical data elements and decide what information is really strategic to the business. This process alone can winnow down tens of thousands of data elements to perhaps a few hundred.

Then establish consistent terminology. Barth cites one client that had six different definitions of “active customer.”  Everyone in the company needs to use the same language. A metadata repository can help standardize terms.

“Start to centralize where data is stored, how it’s provisioned and how people access and use it,” Barth recommends. Minimize copies. Each copy of production data creates complexity and the possibility of error.

Put in place a governance process for data quality that specifies rules about what levels of quality are acceptable. Create metrics by which to measure quality levels. Establish data ownership. One of the reasons companies have so many data problems is that no one owns the quality process. Ownership creates responsibility and discipline.
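
To make "rules about acceptable quality levels" concrete, here is a minimal sketch of a measurable quality check in Python. The file name, field names and the one-percent threshold are assumptions for illustration, not anything Barth prescribes.

```python
# A minimal sketch of a data-quality metric: duplicate customer IDs and
# missing e-mail addresses in a (hypothetical) customer extract.
import csv

def customer_quality_metrics(path):
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    ids = [row["customer_id"] for row in rows]
    return {
        "duplicate_id_rate": 1 - len(set(ids)) / len(rows),
        "missing_email_rate": sum(1 for row in rows if not row.get("email", "").strip()) / len(rows),
    }

metrics = customer_quality_metrics("customers.csv")                  # hypothetical extract
assert metrics["duplicate_id_rate"] <= 0.01, "quality rule violated"  # assumed 1% ceiling
print(metrics)
```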

Get a handle on application development. Line-of-business owners shouldn’t be allowed to create rogue applications without central coordination. Many of these skunkworks projects use copied data because it’s more expedient than tapping into production databases.

Identify opportunities to create insights from data. This is the most critical step. Strategic opportunity comes from thinking up innovative ways to apply data analysis.

Annuity Benefits

Here’s one more example of the benefits of data quality. An acquisitive bank had to make decisions about closing and consolidating branches. These choices are usually made based on location, but some analytical thinkers at the bank had a better idea. They built a database of customer transactions across the bank’s many channels, including branches, telephone banking and the Web. Then they looked at the behavior of customers across those channels.

They discovered that customers who used multiple physical and electronic channels were less likely to leave the bank. That meant that branches that counted many of those high-value customers were actually better candidates for closure. Using this innovative approach to decision-making, the bank was able to close 50 branches and save $38 million annually without any revenue impact. That’s an annuity benefit that resulted from an innovative analysis of data the bank already had.

Secrets of Selling Up

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Over the last couple of entries, I’ve talked about the benefits of deploying social network technology internally for business benefit. While the benefits may be clear to you, chances are you have to get past the sticky process of selling the idea to management, who invariably ask the return on investment (ROI) question.

ROI is important but it’s also a smokescreen that executives can use to avoid spending money. Corporations invest in plenty of functions that have vague ROI, including human resources and training. Unfortunately, social networking projects don’t lend themselves to clean ROI calculations. The benefits of better communications, more productive collaboration and access to organizational knowledge don’t fit neatly into a spreadsheet.

If you’re stuck on ROI, point to the often-neglected “I” part of the equation: investment. The new tools, many of which are open source, are often significantly less expensive than the proprietary programs they replace. Then ask managers to consider the cost of a sales opportunity lost because team members didn’t have access to relevant knowledge within the organization. Or look at turnover rates and ask what the value is of capturing the knowledge that those departing workers take with them.

The best way to justify social media investments is through examples. Every large organization has a few people who understand the transformative effects of technology and who are eager to try new things. Embrace these people as your allies, particularly if their organizations produce revenue. Enlist them to participate in a pilot project that applies social networking concepts and technologies to an existing process. Perhaps it’s replacing an over-engineered project management system with a wiki. Or maybe it’s adopting podcasts in lieu of staff-wide conference calls. Offer all the support you can to make the pilot work. The experiences of early adopters will rapidly spread throughout the organization.

One key to successful adoption of any new technology is to eliminate alternatives. If people are told to use a wiki for collaboration, for example, but still have the option of communicating with colleagues by e-mail, they will choose the latter. People default to the path of least resistance, and that usually means sticking with what they know. You have to remove that option to demonstrate the effectiveness of an alternative solution. You can’t eliminate alternatives, but your sponsor can. That’s why choosing allies at the management level is so important.

A successful pilot project is a foundation for revisiting the ROI question. Interview project participants to document their impressions. Identify what they believe are the cost savings or revenue opportunities. Share these field experiences with the skeptics. Real-life stories are more convincing than formulas.

Supplement this with an internal public relations effort. Subscribe to RSS feeds of bloggers and publications that advocate for your position. Share expert opinions and published case studies with your internal decision-makers. Keep an eye out, in particular, for activities by your competitors. Don’t expect results overnight, but with time and repetition, you can wear down the critics.

Data Quality Problems are Corporate IT’s Dirty Little Secret

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

In the early days of home broadband, I was a customer of a very large cable company whose name I’m sure you know. When making one of my frequent calls to its technical support organization, I was typically required to give all my account information to a representative, who then transferred me to a support tech, who asked me to repeat all of the account information I had just given the first person. If my problem was escalated, I got transferred to a third rep who would ask me for, you guessed it, my account information.

This company’s reputation for lousy customer service was so legendary that if you type its name and “customer service” into a search engine today, you get mostly hate sites. One of its biggest problems was poor data integration. For example, its sales database was so fragmented that I routinely received offers to sign up for the service, even though I was already a customer. I’m a customer no longer.

Does this sound familiar? Most of us have had frustrating experiences of this kind. Thanks to acquisitions, internal ownership squabbles and poor project management, most large companies have accumulated islands of disintegrated, loosely synchronized customer data. PricewaterhouseCoopers’ 2001 Global Data Management survey found that 75 percent of large companies had significant problems as a result of defective data. It’s unlikely the situation has improved much since then. Data quality is corporate America’s dirty little secret.

The Path to Dis-Integration

There are several reasons for this, according to Paul Barth, an MIT computer science Ph.D. who’s also co-founder of NewVantage Partners. One is acquisitions. As industries have consolidated, the survivors have accumulated scores of dissimilar record-keeping systems. Even basic definitions don’t harmonize. Barth tells of one financial services company that had six different definitions for the term “active customer.”

A second problem is internal project management. The decentralization trend is pushing responsibility deeper into organizations. There are many benefits to this, but data quality isn’t one of them. When departmental managers launch new projects, they often don’t have the time or patience to wait for permission to access production records. Instead, they create copies of that data to populate their applications. These then take on a life of their own. Synchronization is an afterthought, and the occasional extract, transform and load procedure doesn’t begin to repair the inconsistencies that develop over time.

Data entry is another problem. With more customers entering their own data in online forms and fewer validity checks being performed in the background, the potential for error has grown. Data validation is a tedious task to begin with and really effective quality checks tend to produce a lot of frustrating error messages. E-commerce site owners sometimes decide it’s easier just to allow telephone numbers to be entered in the ZIP code field, for example, as long as it moves the customer through the transaction.
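
As a small illustration of the kind of background check that gets skipped, here is a minimal Python sketch that rejects a phone number typed into the ZIP code field. The form field names are hypothetical.

```python
# A minimal sketch of server-side validation for a checkout form.
import re

ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")  # US ZIP or ZIP+4

def validate_checkout_form(form):
    """Return a list of error messages for a submitted form dictionary."""
    errors = []
    zip_code = form.get("zip", "").strip()
    if not ZIP_RE.match(zip_code):
        errors.append("ZIP code must look like 12345 or 12345-6789")
    return errors

print(validate_checkout_form({"zip": "617-555-0142"}))  # flagged: that's a phone number
print(validate_checkout_form({"zip": "02109"}))         # passes
```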

Finally, data ownership is a tricky internal issue. Many business owners would rather focus on great features than clean data. If no one has responsibility for data quality, the task becomes an orphan. The more bad data is captured, the harder it is to synchronize with the good stuff. The problem gets worse and no one wants responsibility to clean up that mess.

Barth has a prescription for addressing these problems. It isn’t fast or simple, but it also isn’t as difficult as you might think. Next week we’ll look at an example of the benefits of good data quality and offer some of his advice for getting your data quality act together.

IT Can Innovate in Cutting Energy Costs

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

As the price of gasoline has raced past the once-unthinkable level of $4 a gallon in the US, everyone is trying to come to grips with the implications of this historic event. Boomers like me remember the last time gas prices tripled in the 1970s. It plunged the US into a protracted recession with 18% annual inflation. Such an outcome is unlikely this time – the economy is much more globalized today than it was in those days – but it’s fair to say that the ripple effects of this shock will continue for years.

But times of crisis are also times of opportunity. The energy scare of the 1970s led to a near tripling of automotive fuel efficiency and much broader awareness of tactics for avoiding waste. It also led to shifts in the balance of power in many markets. Small, efficient players seized the opportunity to chip away at entrenched rivals and make amazing gains. Toyota, which was a bit player in the US in 1970, was a major force a decade later.

The next couple of years are going to be traumatic. The price of everything is going to go up. The market leaders will sigh and say it’s out of their hands, but you don’t have to be so resigned.

In many markets, new leaders will emerge among companies that can hold the line on prices by making quantum gains in efficiency. IT will be a competitive edge.

Quantum gains don’t come from adjusting the power options on PCs or turning off monitors at night. They come from rethinking entire processes. The gainers will be the companies that can innovate in reducing energy costs in areas like these:

Logistics – Moving goods from one place to another as quickly and cheaply as possible will be a competitive differentiator in many industries. The latest modeling and linear programming tools can identify the cheapest and most direct logistics options (see the sketch after this list). Yield management can optimize resources and help companies choose which under-performers to discard.

Workforce management – Airlines have raised fares 21 times this year and that trend will continue as long as fuel prices rise.  Business travel is on its way to becoming an expensive luxury. Back in the office, it’s becoming increasingly pointless for businesses to force their employees to commute to work each day just to sit in meetings. A big part of reducing costs in the future will be reducing real estate footprints, commuting costs and dollars burned on air travel. Technology has a huge role to play here and innovative firms that can create truly mobile and virtual workforces will gain a cost edge over the bigger companies, most of whom still barely even offer telecommuting.

Power management – When it comes to reducing power consumption, IT brings a lot to the table. Data centers consume an estimated 3% of electrical power. Moving processing tasks to off-peak hours, virtualizing under-utilized servers, redesigning data centers to lower cooling costs and switching users from desktops to more power-efficient laptops all have benefits.

The bigger opportunity, though, may be to outsource large parts of the IT infrastructure. I recently wrote about the rise of utility computing services that consolidate many customers into one giant data center. These services are so inexpensive that, for many companies, it simply won’t make sense to buy new servers any more. Even the fully amortized costs of in-house IT won’t match those of a cloud computing service.

Outsourcing – Speaking of outsourcing, the energy crisis lends new momentum to this decade-old trend. Faster networks and better software tools will make it possible for businesses to site operations in low-cost locations, including those with lower energy costs. Many corporations already outsource customer service and software development, but any function that can be performed by distributed teams is a candidate. Look at accounting, marketing operations, data analysis and even help desk as candidates. There is a human factor to be considered in moving work cross-country or overseas, of course, but the vitality of the company may be at stake.
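
Here is the sketch promised under Logistics: a tiny transportation problem solved with SciPy’s linear programming routine. The costs, supplies and demands are invented numbers, purely to show the shape of the technique.

```python
# A minimal transportation-problem sketch: ship from 2 warehouses to 3 stores
# at minimum total cost. All figures below are made up for illustration.
from scipy.optimize import linprog

# Cost per unit shipped: warehouse A -> stores 1,2,3 then warehouse B -> stores 1,2,3
costs = [4, 6, 9,
         5, 3, 7]

supply = [60, 80]       # units available at warehouses A and B
demand = [30, 50, 40]   # units required at stores 1, 2 and 3

A_ub = [
    [1, 1, 1, 0, 0, 0],    # shipments out of warehouse A <= its supply
    [0, 0, 0, 1, 1, 1],    # shipments out of warehouse B <= its supply
    [-1, 0, 0, -1, 0, 0],  # store 1 demand, written as -shipments <= -demand
    [0, -1, 0, 0, -1, 0],  # store 2 demand
    [0, 0, -1, 0, 0, -1],  # store 3 demand
]
b_ub = supply + [-d for d in demand]

result = linprog(costs, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 6, method="highs")
print(result.x)    # optimal shipment plan
print(result.fun)  # minimum total shipping cost
```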

What is your IT organization doing to streamline operations and reduce energy costs? Tell us in the comments area below.

Giving Up Control Unleashes Wisdom of Crowds

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

At IBM, podcasts are now a popular form of internal communication. One IBM executive who used to hold an unwieldy weekly conference call with 500 people spread across the globe now podcasts the same information. Listenership has doubled. At a company in which 40% of the employees work primarily outside of an office, podcasting is revolutionizing internal communication.

It wasn’t always that way. Podcasting was introduced without fanfare at the computing giant, but it quickly achieved traction because employees were allowed to experiment and “play” with the new medium, according to George Faulkner, who is one of IBM’s most visible podcasters.  In fact, one of the first successful internal uses of podcasts arguably had no business value at all: it was an IBM “battle of the bands.”

But that play led to experimentation that yielded practical business applications, which have reduced internal communication costs and improved IBM’s outreach to investors and the public. This is one more example of how letting go of control can unleash the innovative energy within an organization.

Sabre Holdings learned the same lesson with an enterprise Web 2.0 platform that is changing the way its employees communicate and creating a company knowledge base. The software is known internally as SabreTown and is now being packaged for sale as Cubeless. It’s social networking software that any company can use behind its firewall.

SabreTown was derived from a website called Bambora that Sabre constructed for consumers. Members define their areas of expertise and agree to answer questions from other members in those areas. The more exchanges that take place, the richer the member’s profile becomes and the more useful the travel database.

SabreTown has helped unlock untapped expertise within Sabre Holdings, according to Al Comeaux, senior vice president, corporate communications for Sabre. With Sabre’s rapid evolution into a globally distributed company (only 45% of employees are U.S.-based today, compared to 85% eight years ago), there was a need to break down barriers of location and time.

Getting employees to buy in to the community meant giving up control over how they used the tool. Sabre rolled out the application internally without restricting the topics employees could discuss. “The more we can get people talking, the more we can capture,” says Erik Johnson, general manager of Cubeless.

More than 200 groups have formed within SabreTown and personal blog spaces are available to everyone. Any information entered into SabreTown is processed by a relevance engine and built into employees’ personal profiles. Sabre is effectively creating a massive knowledge base that employees willingly populate with their own information. And the company is building out SabreTown’s capabilities to make it into a full-blown social network.

A key element of success was giving up control. Sabre Holdings’ executives say that SabreTown would never have taken off internally if Sabre had tried to dictate how it could be used. By letting people play, innovation took over and business applications emerged. SabreTown is a hit within Sabre and it will pay big dividends as employees share expertise.

As I noted last week, large organizations and their managers struggle with giving up control, but that’s often precisely what they need to do. Web 2.0 technologies have demonstrated that the wisdom of crowds is greater than the knowledge of any one manager. By giving up control, organizations can gain loyalty and respect, which, paradoxically, enhances control.

Enterprise Social Networks Are Key to Corporate Knowledge Bases

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

At the Central Intelligence Agency, Intellipedia is challenging long-held views about information propriety.  Intellipedia is an internal application of a wiki, which is one of the most popular enterprise social media tools.  The CIA is using the wiki to capture intelligence gathered from its global network of field agents and internal researchers.  It’s part of a broad effort by the notoriously secretive organization to break down silos of information and create an organizational knowledge base.

It’s also changing the culture of the agency.  In an address last week to the Enterprise 2.0 conference in Boston, the CIA’s Sean Dennehy noted that the success of shared knowledge bases requires giving up control.  “We need to fight against locked down spaces,” he said. The comment drew murmurs of surprise and some applause from the audience, who couldn’t quite believe it came from a CIA executive.

If the CIA can learn to give up control, imagine what your company can do. Social media is all about sharing. It’s based on the principle that participants give a little to get a lot. The more you contribute to the body of knowledge, the more everyone benefits. This principle underlies the success of a wide range of collaborative Internet sites, ranging from del.icio.us to Wikipedia. If Web 2.0 has demonstrated anything, it is that most people are motivated to do the right thing.

Pharmaceutical giant Pfizer has made the same discovery. Its Pfizerpedia wiki has more than 10,000 articles as well as numerous how-to videos. A nascent podcasting program is now spreading information by voice, and employees are trading Web discoveries through a giant internal social bookmarking platform. Pfizer is learning the power of sharing.

That’s a difficult concept for some managers to internalize. Traditional organizational structures are based on the idea that employees can’t be trusted to do the right thing.  They need to be constantly monitored and corrected to avoid going off the rails.

Social media is demonstrating that the opposite is true. It turns out that when you remove hierarchy, the community usually takes responsibility for policing its members and ensuring quality work. This is especially true within groups of professionals and it couldn’t happen at a better time.

Large organizations need to start capturing their organizational knowledge. There are compelling reasons to do this. Some 64 million baby boomers will retire in the next three years. Over the next 15 years, the size of the workforce between the ages of 30 and 49 will shrink by 3.5 million, and by 2015 there will be 16 million more workers over the age of 50 than there are today.

These workers will each take with them years of accumulated knowledge. While some of that knowledge will become obsolete as businesses evolve, the skills needed to design an assembly line or calculate a cash flow statement won’t change. This makes it more critical than ever for organizations to capture institutional knowledge before it fades away.

Social media tools are an ideal way to do this because people populate the database themselves. In the same way that Facebook or MySpace members continually add personal information to their profiles, business professionals can develop rich descriptions of their skills and experiences by reaching out and helping each other. In the past, enterprises had only rudimentary ways to capture this information. With the arrival of internal social networks, they can now store everything in an enterprise knowledge base.

Over the next couple of entries, I’ll look at how this is playing out in US organizations today and how you can get your users – and managers – on board.

The Collaboration Paradox

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

I’m a big believer in the value of social media, enough to have written two books on the subject.  In the spirit of practicing what I preach, I posted the draft of the first book on my blog 18 months ago with good results.  Several thousand visitors read the chapters and several dozen contributed meaningful feedback.

So when I was writing the second book this spring, I thought I would go one better. I posted the entire draft on a wiki and used my newsletter, blog and personal contacts to invite people to contribute to the finished product.

Few did.  In fact, over the course of six weeks only nine people joined the wiki and only three or four made meaningful changes. It turned out that a blog, with its limited capacity for collaboration, was far more effective in achieving my collaborative goal.

This got me thinking about the paradox of group collaboration. There’s no question that wikis can make teams more productive. Yet they are probably the greatest disappointment of the suite of Web 2.0 tools.

I’m involved in three or four organizations that use wikis to coordinate people’s activities. Not once have I seen them used to their potential. Of the few people who actually contribute to the wikis, most send a duplicate copy of the content by e-mail to make sure everyone is in the loop. Many public wikis survive only because a small number of members maintain them. Few have many active contributors.

Yet there are some phenomenally successful examples of wiki technology.  The most famous is Wikipedia, with its 10 million articles in 253 languages. Wikipedia founder Jimmy Wales recently started Wikia, a library of 2,500 special-interest wikis in 66 languages that allows people to create reference materials from the perspectives that are important to them.

There’s also evidence that wikis are enjoying good success behind the firewall. IBM has said that wikis are its number one social media tool, making it possible for a widely dispersed workforce to collaborate.

Why are wikis such a disappointment when they have so much potential? I think the reason is that productivity has nothing to do with it.

Wikis succeed when the interest of every member is served by participation. Projects that mainly benefit a single individual or organization offer few compelling reasons for others to get involved. It turns out that people are more than happy to comment on another’s work, but getting them to contribute actively requires an extra measure of self-interest. People were happy to comment on my book, but the incentives for them to invest real effort in someone else’s project were insufficient. On the other hand, people who are passionate about coin collecting have an incentive to make the numismatics section of Wikipedia an accurate public record.

Productivity is often held out as an incentive for people to use new technology, but I believe that’s only a minor factor.  People continue to use spreadsheets when databases would do a better job.  They fumble along with e-mail, despite its many limitations, because that’s what they know.  The most successful new technologies have been those that enable people to transform their work or their way of life. Incremental improvements are never enough to sustain meaningful behavior change.

Old PCs Pose Environmental, Regulatory Threat

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

We all know how great it feels to have a new PC plunked down on our desktop or in our briefcase.  But for IT organizations, that exhilaration is increasingly compounded by anxiety.  What should they do about disposing of the computer that’s being replaced?

This issue is gathering importance as the number of old computers grows.  Gartner has forecast that consumers and businesses will replace more than 925 million PCs worldwide by 2010.  And that’s just one category of computer.  Gartner expects another 46 million servers to ship during the next five years, and about one billion mobile phones to be discarded yearly beginning in 2010.

There are obvious ecological concerns that attend this problem, of course. Most personal computers contain chemicals that can poison water supplies and old CRT monitors have lead linings that should never make their way into a landfill.

But the risks to businesses these days can hit even closer to home. Discarded computers can contain proprietary data that, if disclosed, can open a company to a host of legal and compliance problems. Among the regulations that impose severe financial penalties and even imprisonment for improper data protection are the Health Insurance Portability and Accountability Act (HIPAA), the Gramm-Leach-Bliley Act and the Sarbanes-Oxley Act. There are also a host of local regulations to consider, the result of Congress’s decision many years ago to make environmental rules the domain of individual states.

Companies have gotten by for years on ad hoc approaches to computer disposal. Often, they sell old machines to employees, give them to charities or palm them off on trash-hauling businesses that dispose of the equipment in places unknown. But regulators don’t buy the “out of sight, out of mind” philosophy. Most place the onus of ensuring data protection on the original owner. That means that if a PC or cell phone containing protected information turns up in a landfill overseas somewhere, the firm that captured the data is on the hook for any legal obligations.

A particular concern is the trash companies, which often piggyback their computer disposal services on top of their basic business of hauling away Dumpsters full of refuse. While many of these companies are no doubt legitimate, some have tried to cut costs by piling IT equipment into containers and shipping them overseas.

In some cases, this equipment is simply thrown into open holes in the ground, causing unknown public health problems. Many Third World countries also have subcultures of entrepreneurs who disassemble equipment and sell the piece parts on the open market. In 2006, the BBC bought 17 second-hand hard drives in Nigeria for $25 each and recovered bank account numbers, passwords and other sensitive data from them. Under many regulations, the original buyers of that equipment could be liable for any security or privacy breaches that resulted.

Nearly every business should have a plan for disposing of end-of-life computers. If storage equipment is to be repurposed, it needs to be thoroughly erased. The Department of Defense’s 5220.22-M erasure standard ensures that media is completely cleansed of recoverable data. A simpler approach is to take a hammer and smash the storage media to smithereens. Whatever tactic you use, you need to document the data destruction using the appropriate compliance forms.
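
As an illustration of the overwrite idea (a sketch, not an implementation of the DoD standard itself), here is a minimal Python routine that makes three passes over a file. Real compliance work should rely on audited tools and documented procedures.

```python
# A minimal three-pass overwrite sketch: zeros, ones, then random bytes.
import os

def wipe_file(path, chunk=1024 * 1024):
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for pattern in (b"\x00", b"\xff", None):   # None = random pass
            f.seek(0)
            remaining = size
            while remaining > 0:
                n = min(chunk, remaining)
                f.write(os.urandom(n) if pattern is None else pattern * n)
                remaining -= n
            f.flush()
            os.fsync(f.fileno())  # push each pass to physical storage

# wipe_file("/tmp/old-customer-export.csv")  # hypothetical file to destroy
```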

A new practice has also emerged called IT Asset Disposition (ITAD). ITAD vendors essentially outsource the disposal process and provide tracking, verification and even insurance against liability. Some firms can also remanufacture components and sell them, thereby reducing costs for their customers.  Research firm International Data Corp. has published a good study on the market. The site Greener Computing also has helpful advice.

Firefox Solidifies Mind Share Lead

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Market share gains by the Firefox Web browser continued into early 2008, with Firefox now commanding 27 percent of all website visits. In total installed base, it still trails far behind Microsoft’s Internet Explorer, but the open-source browser has already established itself as the mind share leader. That’s a remarkable feat in less than four years.

Firefox’s success is a tribute to the power of community development and the stickiness of open-source applications. It’s an example of how giving up control can enhance market leadership. Today, I would argue, Firefox is the dominant browser on the Internet.

How can that be when Firefox has only about a third of IE’s market share? Here’s a case where share is deceptive. For one thing, Firefox has momentum, having grown from less than 1% in 2004 to its present base of an estimated 140 million users. Secondly, Firefox has the allegiance of the most influential Internet users: enthusiasts, developers and people who contribute actively to social media sites. If there’s anything the history of the software industry has shown us, it’s that platforms that generate developer enthusiasm invariably edge out their competitors.

One can even argue that Firefox is already the top browser among these thought leaders. For example, look at the results of this poll from LifeHacker, a site devoted to computer and personal productivity advice. It’s unscientific, but still interesting. Asked why they use Internet Explorer, only about 15% of respondents said it was because they actually preferred the software. Half of the respondents don’t use IE at all. So while Firefox may have relatively low market share among all computer owners, it has achieved parity among the audience of serious enthusiasts.

This is where the beauty of an open architecture comes into play. Firefox was designed and licensed from the beginning to accommodate user-developed extensions. More than 2,000 of them are listed on the Firefox add-ons site, ranging in weekly download activity from hundreds of thousands to fewer than a dozen. Some of the extensions aren’t very good, but that doesn’t really matter. Users make their own choices and sites like LifeHacker take care of publicizing the best work.

Microsoft also permits developers to write extensions to Internet Explorer, but its approach is quite different. In the early days, the IE software development kit was tightly controlled and add-ons had to pass Microsoft scrutiny in order to even be listed in the official directory. Microsoft has made an earnest effort to loosen up this process, but the company is culturally resistant to unfettered development. In contrast to Firefox’s 2,000 extensions, Microsoft’s official directory of IE add-ons lists fewer than 100 entries. Perhaps that’s because, as the company states on its homepage, “The add-ons available here have been carefully screened by Microsoft and rated by users to help you select the ones that suit your needs and preferences.” Perhaps users don’t want their choices screened.

Listen to this short podcast from last summer’s O’Reilly Emerging Technology Conference. In it, developers contrast the chaotic mess of the Firefox developer forums with the muted restraint of the IE third-party community. As StumbleUpon’s Garrett Camp notes, new Firefox extensions and updates spark endless analysis and debate, while IE developers rarely talk at all. Intensity makes markets dynamic and innovative, and that’s what Firefox has.

This market share war is meaningless from a revenue standpoint because browsers are free. But Firefox’s continuing success is a powerful case for the superiority of the open-source model. Conceding power may paradoxically be the best way to gain power.

Utility Computing Finds its Sea Legs

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Nearly a decade ago, I worked at an Internet startup in the crazy days of the early Web.  Success demanded speed, rapid growth and scalability.  We were frequently frustrated in those days by the demands of building out our computing infrastructure.

The company burned more than $1 million in that pursuit. Racks of Unix servers had to be acquired and configured.  The equipment was set up at a co-location facility that required on-site care and feeding by technical staff.  Installation and testing took weeks.  Maintenance was a time-consuming burden requiring a staff of technicians who had to be available at all hours.  At least twice during the first year, server crashes took the company off-line for more than a day.  Stress and burnout were a constant issue.

Today, I suspect the company would do things quite differently.  Instead of acquiring computers, it would buy processing power and storage from an online service.  Startup times would be days or weeks instead of months.  Scaling the infrastructure would require simply buying more computer cycles. There would be no cost for support personnel. Costs would be expensed instead of capitalized.  More importantly, the company would be up and running in a fraction of the time that was once required.
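
To make the idea concrete, here is a minimal sketch of “buying more computer cycles” through an API. It uses today’s boto3 library, which postdates this column, purely to illustrate the point; the machine image, instance type and region are placeholders.

```python
# A minimal sketch: request three more servers from a utility computing service.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="t3.micro",          # placeholder instance size
    MinCount=3,
    MaxCount=3,
)

for instance in response["Instances"]:
    print("launched", instance["InstanceId"])  # capacity arrives in minutes, no racks to buy
```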

The current poster child of utility computing is, of all companies, Amazon.com.  An article in last week’s Wired magazine describes the phenomenal success of the initiatives called S3 and EC2 that Amazon originally undertook to make a few dollars off of excess computing capacity. Today, the two services count 400,000 customers and have become a model that could revolutionize business innovation.

That’s right, business innovation. That’s because the main beneficiaries of utility computing are turning out to be startups. They’re using the services to cut the time and expense of bringing their ideas to market and, in the process, propelling innovation.

The utility computing concept has been around for years, but questions have persisted about who would use it. Big companies are reluctant to move their data offsite and lose control of their hardware assets. They may have liked utility computing in concept, but the execution wasn’t worth the effort.

It turns out the sweet spot is startup firms. Many business ideas never get off the ground because entrepreneurs can’t raise the $100,000 or more needed for capital investment in computers. In contrast, Amazon says it will transfer five terabytes of data from a 400GB data store for a monthly fee of less than $1,400. If you use less, you pay less. It’s no wonder cash-strapped companies find this concept so appealing. Wired notes that one startup that uses Amazon services dubbed one of its presentations “Using S3 to Avoid VC [venture capital].”
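
To see how metered pricing produces numbers like these, here is a back-of-the-envelope sketch. The per-gigabyte rates are assumptions for illustration, not Amazon’s actual price list.

```python
# A back-of-the-envelope cost sketch for the metered model described above.
STORAGE_RATE = 0.15    # assumed $/GB-month for data stored
TRANSFER_RATE = 0.17   # assumed $/GB for data transferred out

stored_gb = 400            # the 400GB data store cited above
transferred_gb = 5 * 1024  # five terabytes moved in a month

monthly_bill = stored_gb * STORAGE_RATE + transferred_gb * TRANSFER_RATE
print(f"estimated monthly bill: ${monthly_bill:,.2f}")  # roughly $930 at these assumed rates
```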

Now that companies are getting hip to this idea, expect prices to come down even further. Sun already leases space on its grid network. IBM has an on-location variant. Hewlett-Packard has an array of offerings. There are even rumors that Google will get into the market with a free offering supported by advertising. And, of course, there will be startups.

The availability of cheap, reliable and easy-to-deploy computing services could enable a whole new class of entrepreneurs to get their ideas off the ground.  It’s just one more example of IT’s potential for dramatic business change.