Crisis Tests IT’s Influence

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Between checking retirement portfolios and flipping over to wsj.com or Moneywatch every hour or two, a lot of people aren’t getting much work done this week. Who can blame them? Between the unprecedented bankruptcies of some of Wall Street’s biggest firms, the turmoil in the stock market and dire statements from top officials in the U.S. government, it’s easy to believe that the world is caving in.

I’m nervous, too, but I’m also resisting the urge to forecast disaster because I just don’t think the Great Depression can happen again. Part of the reason is information technology.

There are two ways in which IT can make important contributions to pulling the U.S. economy out of chaos: by empowering rapid decisions and by enabling communication. As evidence of the first dynamic, look at the chart below. It shows growth in U.S. gross domestic product from 1930 through 2007. You don’t have to squint much to see where the trend has been going. The huge swings in GDP performance that occurred during the Depression and war years have gradually become gentler and more predictable. Negative growth, a regular feature of much of the 20th century, has occurred only once in the last 25 years. The further right you go on the chart, the more boring economic performance becomes. That’s precisely how businesses like it. Consistency sets the stage for more confident long-term planning.
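If you want to quantify that smoothing yourself, a few lines of code will do it: take the standard deviation of annual growth over a trailing window and watch the number shrink as you move toward the present. The sketch below is purely illustrative; the growth figures in it are placeholders, not the actual GDP series behind the chart.

```python
# Illustrative only: how volatile has annual GDP growth been over each
# trailing five-year window? The figures below are placeholders, not real data.
import statistics

annual_growth = [-8.5, 10.8, 8.0, -1.0, 3.5, 4.2, 3.1, 2.9, 2.8, 3.0]  # percent

def rolling_volatility(growth, window=5):
    """Standard deviation of growth over each trailing window of years."""
    return [
        statistics.stdev(growth[i - window + 1 : i + 1])
        for i in range(window - 1, len(growth))
    ]

print(rolling_volatility(annual_growth))  # smaller values mean gentler swings
```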

Technology’s Shadow Role

There are multiple reasons why the economy has stabilized in recent decades; globalization and the Federal Reserve are certainly two of them. But I would suggest that information technology plays a shadow role. Note that the GDP numbers show a clear smoothing trend beginning around the mid-1960s, which was when computers began to make their way into back offices on a grand scale. The smoothing becomes even more pronounced in the late 1980s, when PCs started landing on every desktop.

I think it’s no coincidence that economic cycles became less volatile when managers and regulators began deploying sophisticated models to predict the path of business. Even as the economy has been roiled by financial crises, 9/11 and the bursting of the Internet bubble over the last two decades, recessions have tended to be shallow and brief, and recoveries have been smoother and more sustained than in previous cycles. One factor may be that economic players have more sophisticated means to model the impact of their decisions than they did before. That leads to better forecasting and quicker mid-course corrections, which makes for less volatility. No one’s suggesting that we aren’t in for some difficult times, but if the past is any indication, we’re better equipped to pull out of the tailspin today than ever before.

The other potentially positive IT influence on economic cycles is the Internet, and in particular Web 2.0. Within just the last five years, businesses have embraced robust new ways to communicate with their constituencies. As new economic surprises have turned up almost daily over the past few weeks, people have flocked to their Facebook groups, Twittered their concerns and voiced their opinions on news sites like never before. Smart business leaders should be tapping in to these conversations and using them to help guide their own decisions. If you want to learn how your customers are thinking about the latest dose of bad news, you need only ask them, or simply listen. Trends that used to take months to identify can now be discerned in a few hours. It’s too early to know what impact this will have on economic performance, but it’s likely to encourage faster and more competent decision-making.

Web 2.0 also enables corporate leaders to communicate directly with their constituents and offer their own perspectives on unfolding events. Unfortunately, they aren’t doing much of that yet. A quick check of blogs operated by Chrysler, Marriott, McDonald’s, Whole Foods, Accenture, Boeing, Wal-Mart and Southwest Airlines shows that none has yet departed from delivering cheery good-news fare to comment on the economic issues that weigh most heavily on American minds. Cheers to General Motors and PricewaterhouseCoopers for attempting to lend some of their perspective to the conversation. I only hope the others are too busy listening at the moment to make time to state their own views. America certainly wants to hear them.

Innovation From Below

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

On a visit to Compaq Computer (remember them?) several years ago, an executive told me proudly that Compaq had reinvented itself to avoid becoming what he referred to as “the Mercedes of PCs”: in other words, the luxury supplier in a market that was increasingly demanding low cost and high volume.

It turned out that Compaq succumbed to cheaper competition anyway. Its quality-at-all-costs culture couldn’t successfully accommodate the “good enough” demands of the emerging market.

This story came to mind this week as I noted the impending launch of the Tata Nano (below), a new low-cost automobile being introduced next month by India’s giant industrial conglomerate Tata Group. Priced at just $2,500, the Nano is a marvel of simplicity.  At a time when the cheapest American car costs nearly $11,000, it’s a bid to rewrite the economics of the global auto industry in an effort to reach a giant emerging class of consumers.

The Nano is remarkable not for its technological sophistication but for what designers chose to leave out.  For example, the base model has no air conditioning, power brakes or radio.  Even its steering column is constructed of hollowed-out steel to save money on materials and reduce weight.  It has a top speed of just 65 mph from its tiny rear-mounted engine.

Tata made these trade-offs in order to accommodate other needs that it believes are more important to its audience. Specifically, the Nano can accommodate four passengers while puttering around at 50 miles per gallon. It meets all Indian emission, pollution and safety standards, although admittedly those are looser than in the U.S. It’s also designed for modular assembly, meaning that local entrepreneurs can add value by swapping in their own components for those provided by the manufacturer. From its lowest-cost perch, it can easily be scaled up without sacrificing its compelling price advantage.

An Open Source Automobile

The Nano may be the world’s first open-source car. Tata intentionally chose to design the vehicle around modular components and to single-source most of those parts in order to keep costs down. Modularity also creates an opportunity for innovation in manufacturing, because the Nano is designed to be assembled in small quantities. Entrepreneurs can set up boutique assembly shops to build the vehicles profitably for their local markets. They can plug in the components that their customers value most and still retain a significant cost advantage.

This kind of innovation goes against the grain of most American thinking.  In our technology-driven culture, we are wedded to the principle that sophistication and elegance define innovation.  We celebrate product over process and take pride in the number of patents our companies earn.

In contrast to Tata’s bare minimalist approach, American automakers have increasingly been outfitting their vehicles with navigation systems, video entertainment and computerized wizardry encoded in proprietary black-box designs. A high-end DVD entertainment system in an American car costs more than an entire Nano. There’s no question that these high-tech products are innovative, but they serve a narrow slice of the luxury market that is already saturated with choice.

Tata, in contrast, is focused on volume. It expects to sell over one million Nanos in the first year and to ramp up steeply from there.  It’s targeting the 98 out of every 100 Indians who don’t own a car.  In contrast, the U.S. has 76 vehicles for every 100 citizens.

There’s a lesson here for tech professionals. As I’ve noted before on this blog, disruptive market change almost always comes from below. In computers, consumer electronics, software and even retailing, low-cost providers who unlock new markets invariably displace high-margin boutique businesses at the high end.

Tata is targeting the hundreds of millions of consumers who will constitute the middle class of the future. These people have never owned a car and have no affinity to a brand. Most have also never owned computers, luxury electronics or online bank accounts. They are a green field of opportunity for companies that can meet their need for affordability and convenience. While Tata will no doubt face skeptics and setbacks, it has defined a clear vision and built a company around it.

Is your company positioned to meet this opportunity? Or is it destined to become the Mercedes of your market? Comment below and tell me whether you think low-cost markets are an important opportunity for you.

The Coming Utility Computing Revolution

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Nicholas Carr is at it again, questioning the strategic value of IT.  Only this time I find myself in nearly total agreement with him.

Carr became famous, or infamous, for his 2003 Harvard Business Review article “IT Doesn’t Matter,” in which he argued that IT is an undifferentiated resource that has little strategic business value.  His thinking has evolved since then, and in his new book, The Big Switch, he proposes that utility computing will increasingly become the corporate information infrastructure of the future.

Utility computing means different things to different people.  Some people draw an analogy to the electrical grid, but Carr argues that the information utility is far richer and more strategic.  He outlined some of his perspective in this Q&A interview in CIO Insight magazine.

Nicholas Carr

The utility computing model that Carr foresees encapsulates many of the hottest concepts in IT today: virtualization, modular computing, software as a service, Web 2.0 and service-oriented architecture.  Computing utilities of the future will be anchored in enormous data centers that deliver vast menus of applications and software components over the Internet. Those programs will be combined and tailored to suit the individual needs of business subscribers.

Management of the computing resource, which for many years has been distributed to individual organizations, will be centralized in a small number of entities that specialize in that discipline.  Users will increasingly take care of their own application development needs and will share their best practices through a rich set of social media tools.

In this scenario, the IT department is transformed and marginalized. Businesses will no longer need armies of computing specialists because the IT asset will be outsourced. Even software development will migrate to business units as the tools become easier to use.

This perspective is in tune with many of the trends that are emerging today. Software as a service is the fastest-growing segment of the software market and is rapidly moving out of its roots in small and medium businesses to become an accepted framework for corporate applications. Data centers are becoming virtualized and commoditized. Applications are being segmented into components defined as individual services, which can be combined flexibly at runtime.

There are sound economic justifications for all of these trends, and there’s no reason to believe they won’t continue.  So what does this mean for IT organizations and the people who work in them?

Carr sums up his opinion at the end of the CIO Insight interview: “[I]nformation has always been a critical strategic element of business and probably will be even more so tomorrow. It’s important to underline that the ability to think strategically…will be critically important to companies, probably increasingly important, in the years ahead.”

Taking this idea one step further, you can envision a future in which a pure IT discipline will become unnecessary outside of the small number of vendors that operate computer utilities. University computer science programs, which have long specialized in teaching purely technical skills, will see those specialties merged into other programs. Teenagers entering higher education today are already skilled at building personal application spaces on Facebook using software modules. It’s a small step to apply those principles to business applications.

Sometimes, the past is a good predictor of the future. In my next entry, I’ll give an example of how technology change revolutionized the world a century ago and draw some analogies to the coming model of utility computing.

Function Over Features

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

I’ll admit to being a hopeless gadget freak, the kind who has to have the latest shiny object, if not the day it comes out, then within a few months. One of my addictions is MP3 players, of which I’ve owned at least six dating back to the mid-1990s.

The most feature-laden MP3 player I’ve ever known is the iRiver, a product that was ahead of its time in price/performance. Not only did it boast a spacious 20 GB hard disk two years ago, but it also has very good digital recording capabilities. At $200 in 2006, it was a steal. And yet I barely use it.

Instead, I have consistently opted for the more expensive and less feature-rich Apple iPod.  Why? Because for all its technical sophistication, the iRiver is too damned hard to use. Even after two years, I still struggle with its unintuitive menu system. The iPod, in contrast, is almost joyfully simple. It’s twice the cost and less than half the capacity, yet it’s my MP3 player of choice.

This experience came to mind recently when I was listening to this keynote presentation from the O’Reilly Media Rails Conference by software development expert Joel Spolsky. Download it and listen. It’s 45 minutes of sheer fun.

Spolsky’s point is anything but a joke, however. He tells his techie audience not to let complexity obscure the appeal of simplicity. He cites the example of Apple’s iPhone and compares it to the Samsung Blackjack. In nearly every technical respect, the Blackjack is a superior product. It has more features, better bandwidth and expandable storage. It supports Bluetooth and many Windows applications. It weighs less than the iPhone and has a full built-in keyboard. Yet the iPhone is killing the Blackjack – and everybody else – in the mobile device market.

The Futility of Feature Wars

We’ve seen this phenomenon repeatedly in the consumer electronics market. Technically superior products lose out to rivals that excel at capturing the user’s imagination. Microsoft Windows defeated the technically superior OS/2 on the desktop. TiVo continues to rule the roost in the DVR market, despite the presence of rivals with better price/performance. The Macintosh is now putting pressure on cheaper Windows machines, largely because Apple has tuned it to work well with a small number of really popular applications.

Spolsky makes the point that developers often fixate on features and trivialize user experience.  This isn’t surprising.  Many developers I’ve known dismiss design and user navigation as detail work. They want bells and whistles, which is what appeals to them.

But ordinary consumers couldn’t care less about these things. In many cases, they will make huge trade-offs in functionality and cost in order to get something that just works.

In their best-selling book, Tuned In, authors Craig Stull, Phil Myers and David Meerman Scott cite many examples of this effect. Among them is Nalgene, a plastic water bottle that is marketed to college students and outdoor enthusiasts in a variety of vibrant colors and branded labels.

Few products are more commoditized than plastic bottles, yet a clever marketing campaign built around environmental awareness and students’ need to express themselves through their accessories has made this vessel a hit at twice the price of its competitors. Thermo Fisher Scientific succeeded in marketing a product originally aimed at scientists to a consumer market because it was able to get inside the minds of those target customers.

The authors of Tuned In advocate rigorous market research over gut-level decision-making. Their mantra: “Your opinion, while interesting, is irrelevant.” In other words, companies that build products to please themselves often end up selling to precisely that market: themselves.

User experience does count. Just ask any iPod owner.

Open Source Quality

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

The market for data quality software has gone open source, and it’s about time.

Late last month, Talend Open Data Solutions, a maker of data integration and profiling software, made its Talend Data Quality software available under the General Public License. This follows the French company’s move in June to open its Open Profiler product in a similar way. The tools can be used together to assess the quality of the information in customer databases and to correct basic name and address information against a master list maintained by the US Postal Service.

What’s more important, though, is the potential of open sourcing to bring down the costs of the complex and frustrating data cleansing process. As I noted a few weeks ago, data quality is one of the most vexing problems businesses face. Data that’s inconsistent, out of date or incorrectly formatted creates inefficiency, angry customers (have you ever gotten three direct mail pieces at the same time, each addressed a little differently?) and lost opportunity.
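The mechanics of cleansing aren’t mysterious, even if doing them well at scale is hard. As a rough illustration (a generic sketch, not Talend’s API; the field names and rules are hypothetical), normalizing records and flagging likely duplicates looks something like this:

```python
# Generic illustration of basic cleansing: normalize records, then flag
# likely duplicates. Field names and rules are hypothetical.
import re

def normalize(record):
    """Lower-case, collapse whitespace, drop periods, standardize abbreviations."""
    out = {}
    for field, value in record.items():
        out[field] = re.sub(r"\s+", " ", value.strip().lower()).replace(".", "")
    out["address"] = (out["address"]
                      .replace(" street", " st")
                      .replace(" avenue", " ave"))
    return out

def find_duplicates(records):
    """Group records that normalize to the same (name, address) key."""
    groups = {}
    for rec in records:
        key = tuple(normalize(rec)[f] for f in ("name", "address"))
        groups.setdefault(key, []).append(rec)
    return [group for group in groups.values() if len(group) > 1]

customers = [
    {"name": "John Smith",  "address": "12 Main Street"},
    {"name": "JOHN  SMITH", "address": "12 Main St."},
]
print(find_duplicates(customers))  # both records collapse to one customer
```

Real tools layer on fuzzy matching, reference data such as the postal master list and rules for merging the surviving record, but the underlying idea is the same.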

Solutions to data quality problems have existed for decades, but they’ve always been sold by small vendors, usually at prices starting in the six figures. Over the last few years, many of these vendors have been snapped up by bigger software firms, which then bundled data quality tools and services into giant software contracts. There aren’t a lot of vendors left that specialize in solving the quality problem specifically.

This fragmentation has frustrated a process that should be a part of every company’s IT governance practice. While each company has its own data quality issues, many are common to a large number of businesses. Talend’s approach doesn’t address the cost of accessing databases of “clean” data, but it has promise to make the cleansing process itself cheaper and more automated.

The secret is the magic of open source, which enables users to easily exchange their best work with each other. For just one example of how this works, check out SugarCRM’s SugarExchange. This collection of third-party applications has been built from contributions by SugarCRM’s customers and independent developers. While some modules carry license fees, many don’t. The point is that software authors who build useful extensions to the base CRM system have the wherewithal to share or sell them to others who have similar needs. That’s difficult or impossible to do with proprietary software.

This so-called “forge” approach to development lends itself particularly well to data quality because so many issues are common to multiple companies. For example, if I come up with a clever way to compare employee records to a database of known felons, I should be able to share it and even charge for it. That isn’t possible when the market is spread across an assortment of small, closely held companies. If Talend can extend its TalendForge library to incorporate a robust collection of data quality components, it can make data cleansing practical and affordable to a much larger universe of companies.

The data quality problem is so pervasive that it demands a collaborative approach to a solution. This is a good start.

Google’s Chrome Is a Game-Changer

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Sometimes, innovation is in knowing what to take out as much as what to put in.

Case in point is Chrome, the new browser from Google. You would think the last thing the world needs is another browser, and you would be right if all that browser did was pile on more features. Google again bucks the conventional wisdom with Chrome, however. It’s fast, simple and designed for the way people use the Web today rather than how they used it a few years ago.

Google released the Chrome beta yesterday to immediate speculation that it’s the foundation for an Internet operating system.  The company vigorously denies these claims, but based on my own admittedly limited tests, I’d say Google’s protests ring hollow.  Chrome is clearly aimed at Microsoft’s jugular, and if it succeeds in gaining widespread adoption, it will hasten adoption of the whole software-as-a-service style of computing.

There is nothing particularly innovative about the Chrome interface, other than its stark simplicity and the clever way in which it integrates search and browsing history.  Simplicity comes from Chrome’s adherence to the Firefox interface and its minimalist features.  For example, the browser has no menu bar.

What blew me away about this early version of Chrome, though, is its speed. For starters, Chrome doesn’t burden users with the constant procession of warnings and dialog boxes that have made Internet Explorer almost unusable. That may change if Chrome becomes a target for hacker attacks, but for now, Google’s relative freedom from security threats and government scrutiny is a plus.

More importantly, Google has done some innovative work under the covers to enhance Chrome’s performance running AJAX applications. AJAX is the foundation of Web apps, but its surging popularity has caused some problems for users. That’s because programs written in JavaScript (the “J” in AJAX) seize control of the browser while they’re working. A single bad script can slow an entire computer to a crawl. Memory management limitations in Firefox have also hampered performance.

Chrome has several features to optimize JavaScript performance. In brief, each JavaScript program gets its own virtual machine instance and runs in isolation from the others. This lessens the likelihood that one runaway program will monopolize an entire session. It also means that a crash in one browser window doesn’t take down the whole browser.
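Chrome’s internals are far more sophisticated than anything I can show here, but the payoff of isolation is easy to demonstrate in miniature: give each task its own process, and a runaway task can be killed without disturbing its neighbors. The sketch below is a generic analogy, not Chrome’s code.

```python
# Generic analogy to per-page isolation: each "page script" runs in its own
# process, so a runaway script can be killed without touching the others.
import multiprocessing as mp
import time

def well_behaved(name):
    time.sleep(0.1)
    print(f"{name}: finished normally")

def runaway(name):
    while True:          # simulates a script stuck in an infinite loop
        pass

if __name__ == "__main__":
    pages = [
        mp.Process(target=well_behaved, args=("page-1",)),
        mp.Process(target=runaway, args=("page-2",)),
        mp.Process(target=well_behaved, args=("page-3",)),
    ]
    for p in pages:
        p.start()
    time.sleep(1)
    for p in pages:
        if p.is_alive():   # only the runaway script is still running
            p.terminate()  # kill it; the other pages finished untouched
    for p in pages:
        p.join()
```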

There are also some innovations in memory management and garbage collection that speed performance.  You can satisfy your inner techie by reading an extensive sequence of technical explainers presented in cartoon format at www.google.com/googlebooks/Chrome/.

Google has a self-interest in optimizing AJAX, of course: that’s the way it delivers its Documents line of office applications. Chrome is clearly optimized to work well with Google Documents. In my tests, a Google Document loaded faster than one launched with Microsoft Word or Excel. While Google Documents still don’t approach Microsoft’s functionality, their open architecture has been a magnet for independent developers, who will quickly add features the core applications lack. Google also recently made it possible for word processing documents to run offline using its Gears plug-in. Taken together, Google Documents on Chrome are a much more compelling alternative to Microsoft Office than they have been in the past.

Google now enjoys the pole position on the Internet and has an impressive suite of applications with the capacity to run them offline. Chrome still requires Windows to run, but we can expect that barrier to fall soon. Microsoft is about to fight back with Internet Explorer 8, but previews suggest the browser won’t break a lot of new ground. I suspect throats are tightening a little in Redmond, Wash.

Global Treasure Hunt Unleashes Innovation

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

The start of a week-long vacation has me in a somewhat festive mood today, so I thought I’d tell you about a new hobby I’ve been pursuing recently that’s also given me new insight into the capacity of IT to transform lives.

About two years ago, my wife and I took up geocaching. It’s a global game made possible by the Clinton administration’s 2000 decision to open the government’s Global Positioning System (GPS) satellites to full-accuracy civilian use. Geocaching is basically a worldwide treasure hunt. Players use handheld GPS receivers to find containers placed by other enthusiasts in locations ranging from city street corners to remote mountaintops.

A classic cache hide in the base of a tree.

In its simplest form, players provide each other with nothing more than the GPS coordinates for the treasures, which are typically containers filled with trinkets. Geocaches can be as small as a pencil eraser or as large as a suitcase. When players locate one, they note their visit in a log book that’s kept in the container and also post their “find” on the geocaching.com website.
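For the technically curious, the arithmetic behind a receiver’s “distance to cache” readout is just the great-circle distance between two coordinate pairs. Here is a minimal sketch using the haversine formula; the coordinates are made up for illustration.

```python
# Great-circle ("as the crow flies") distance between your position and a
# cache. The coordinates in the example are made up.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance in kilometers between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))   # 6371 km is the Earth's mean radius

# Hypothetical example: how far is a posted cache from a street corner?
print(round(haversine_km(40.7033, -74.0170, 40.7061, -74.0086), 3), "km")
```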

Global Craze

Geocaching has exploded in popularity. In January 2005, there were about 141,000 geocaches hidden in the world. Today, there are more than 640,000, and a half million “finds” are logged each week. There are geocaches all around you. Nearly 1,000 have been placed within a 10-mile radius of Dallas-Fort Worth airport alone. My wife and I were so taken with this phenomenon that we started to write a book about it – Geocaching Secrets – that will be published in the spring.

One thing that’s captivated us about geocaching is the transformative effect it has on the hobbyists who pursue it. We’ve spoken to several people who have logged more than 20,000 finds over the last eight years, or nearly 10 per day. Two years ago, one team spent months planning an attempt to find the most caches in a single day. They finished with over 300.

People report that geocaching has brought their families together, introduced them to dozens of new friends and led them to destinations that they never would have visited. One enthusiast credits the game with helping him shed 150 pounds and give up smoking. Several have said it saved their marriages. One disabled war veteran told me geocaching gave him a reason to live at a time when he was seriously contemplating suicide.

It’s also brought people outdoors and introduced them to new worlds they never knew. In seeking caches within my local area, I’ve visited dozens of parks, nature preserves and cityscapes that I never knew existed. One recent excursion took my family on a discovery tour through the streets of lower Manhattan, with each stop relating to a different Revolutionary War event.

Platform for Innovation

Container hidden as a sprinkler head.

I’ve been struck by the innovative spirit that this relatively simple idea has unleashed. The people who hide geocaches (some have stashed well over 1,000) come up with devilishly difficult tactics to disguise their treasures. One person specializes in hiding tiny containers inside of animal bones. Another constructs containers out of hollowed-out logs. Many others specialize in scrambling the geo-coordinates in elaborate puzzles and ciphers. One recent search directed my wife and me to watch a movie for a reference to obscure letters written by Benjamin Franklin. We then had to apply a cipher to the writings to zero in on individual words, the first letters of which spelled out the clue. That’s just the tip of the iceberg. Some cache owners devise riddles that would tax the skills of a mathematics Ph.D.

Educators use geocaching to teach their students about geology, mathematics and creative thinking. The Arkansas Parks Service stashed a cache in each of the state’s 52 parks this year in an effort to lure citizens into the outdoors. Hundreds of finds were reported in just the first two weeks.

Like the Internet, the geocaching community is entirely self-guided. There are no governing bodies and no formal leadership. The rules of the game are determined by the consensus of the players in a process that’s loosely democratic but without votes. Bad behavior is regulated by peer pressure.  People do the right thing in order to preserve the simple beauty of the game.

Geocaching is just one example of how technology unleashes creativity that changes people’s lives. It’s also a testament to how big ideas emerge when people are given tools and freedom to discard assumptions and invent new possibilities.

A Distant Mirror of Technology Change

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Reading a recent CIO Insight interview with Timothy Chou brought to mind an analogy to, of all things, the early days of railroading.

Chou is the kind of futurist who makes the guardians of the conventional wisdom cringe: an industry veteran who understands the way things have always been done and who now tours the country arguing for radical change.

After spending 25 years selling IT products to CIOs, Chou joined Oracle in 1999 to launch the company’s fledgling software as a service (SaaS) business. The experience was transformative. In 2004, he penned a provocative book called The End of Software in which he argued that the compelling benefits of SaaS would sweep over the enterprise software industry. He’s now self-published a second book, Seven, that challenges CIOs and software vendors to understand how important IT infrastructure is to their businesses and to make some tough decisions about where to invest. The title refers to seven software business models ranging from traditional licensing at one extreme to Internet-only software businesses like Amazon and Google at the other.

In an interview with Paula Klein, Chou is careful not to lecture CIOs about the choices they should make, but it’s pretty clear where he stands: most of them should be aggressively moving to hand over much of their software and infrastructure to someone else. While CIOs are ideally positioned to understand the business tradeoffs (“Ask any CIO who has completed an ERP implementation, and he or she will tell you more about how the business really runs than anyone on the executive staff,” he says), the decision is a political minefield where many CIOs still fear to tread.

Losing Control

One of the most often-cited reservations CIOs have about SaaS is that they lose control of their IT environment. Chou argues that this is a red herring. Most IT organizations aren’t particularly good at backup and recovery, for example, so the belief that they can do a better job than a commercial vendor is misplaced. “Many people think you have to pay more for reliability, but the opposite is true,” he says. “If you want to buy the most reliable car, don’t pick the $1 million handcrafted sports car; find a Toyota that’s been produced a million times.”

Addressing another common concern – that SaaS deprives IT organizations of the ability to customize their software environments – Chou argues that the tradeoff is usually worth it. Businesses, he says, “may have to forgo much of the customization that they’re used to in order to get simpler, cheaper, more reliable applications.”

His remarks reminded me of an insight I gained while on a recent visit to the Railroad Museum of Pennsylvania. There I learned that in the days before the railroads were built, it took Pennsylvania citizens two full weeks to travel from Philadelphia to Pittsburgh.  This forced communities to become self-sufficient because it was so difficult to obtain goods from the outside world.

The Triumph of Choice

Railroads ushered in the fruits of the Industrial Revolution, making it possible for mass produced goods to reach a wide audience. I’m sure that many people at the time mourned the loss of the local blacksmith, who crafted each horseshoe individually. Some probably chafed at the idea of molding their lives to the demands of a railroad timetable.

In the end, however, most people accepted the fact that low cost and wide selection were a reasonable alternative to expensive, customized goods. And as much as we gripe about the tyranny of the airlines today, we appreciate the fact that they can get us from New York to San Francisco for less than $300.

The alternative to complete control is choice, and the power shift from seller to buyer has brought dramatic benefits, Chou argues. In the past, “Enterprise software vendors sold only $1 million products, and the only channel to buyers was a human,” he tells CIO Insight. “Today, the Internet provides a low-cost, diverse channel for information that can be used to educate anyone. Reference calls have been replaced with hundreds of online forums that let you understand a diverse set of experiences with a product.”

In other words, competition and the rules of the market create a healthy atmosphere for diversity, which increases choice. Buyers may not be able to get exactly what they need, but they can come pretty close. For most of them, that’s a pretty good trade-off.

With Cloud Computing, Look Beyond the Cost Savings

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Back in the early days of data center outsourcing, some pioneer adopters got a rude surprise.  These companies had outsourced all or large parts of their data centers to specialty providers who bought their equipment, hired their staff and offered attractive contract terms that shaved millions of dollars in expenses in the first year.

The surprise was that the contract terms weren’t so attractive once the outsourcer became embedded in the client’s expense line. Customers found that nearly everything carried hefty escalator fees, ranging from unbudgeted capacity increases to software patches to staff overtime. But there was little customers could do. They were locked into the contractor, and the cost of unlocking themselves was prohibitive.

This story came to mind recently during a chat with Bob McFarlane, a principal at facilities design firm Shen Milsom & Wilke. McFarlane is an expert in data center design, and his no-nonsense approach to customer advocacy has made him a hit with audiences around the country.

McFarlane thinks the current hype around hosted or “cloud” computing is getting out of touch with reality.  Cloud computing, which I’ve written about before, involves outsourcing data processing needs to a remote service, which theoretically can provide world-class security, availability and scalability.  Cloud computing is very popular with startups these days, and it’s beginning to creep onto the agenda of even very large firms as they reconsider their data processing architectures.

The economics of this approach are compelling.  For some small companies in particular, it may never make financial sense to build a captive data center because the costs of outsourcing the whole thing are so low.  McFarlane, however, cautions that value has many dimensions.

What is the value, for example, of being able to triple your processing capacity because of a holiday promotion? Not all hosting services offer that kind of flexibility in their contracts, and those that do may charge handsomely for it.

What is the value of knowing that your data center has adequate power provisioning, environmentals and backups in case of a disaster? Last year, a power failure in San Francisco knocked several prominent websites offline for several hours when backup generators failed to kick in. Hosting services in earthquake or flood-prone regions, for example, need extra layers of protection.

McFarlane’s point is that you shouldn’t buy a hosting service based on undocumented claims or marketing materials. You can walk into your own data center and kick a power cord out of the wall to see what happens. Chances are you can’t do that in a remote facility. There are no government regulations for data center quality, so you pretty much have to rely on hosting providers to tell the truth.

Most of them do, of course, but even the truth can be subject to interpretation. The Uptime Institute has created a tiered system for classifying infrastructure performance. However, McFarlane recalls one hosting provider that advertised top-level Uptime Institute compliance but didn’t employ redundant power sources, which is a basic requirement for that designation.

This doesn’t mean you should ignore the appealing benefits of cloud computing, but you should look beyond the simple per-transaction cost savings. Scrutinize contracts for escalator clauses and availability guarantees.  Penalties should give you appropriate compensation.  While you won’t convince a hosting service to refund you the value of lost business, you should look for something more than a simple credit toward your monthly fee.

If you can, plan a visit to a prospective hosting provider and tour its facilities.  Reputable organizations should have no problem letting you inside the data centers and allowing you to bring along an expert to verify their claims. They should also be more than willing to provide you with contact information for reference customers. Familiarity, in this case, can breed peace of mind.

Packaged Innovation

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Can you bottle innovation?  Conventional wisdom says no; innovation comes from inspiration backed by knowledge.  It can’t be packaged or automated.

However, a Boston-based company is challenging the conventional wisdom. Invention Machine has developed technology that applies a semantic search engine to the task of mining possibilities for innovative new materials and procedures.

If the phrase “semantic search” means nothing to you, join the club. I got a telephone briefing on Invention Machine’s technology and couldn’t quite figure out what it did. So I stopped in for a visit and got one of the more impressive demos I’ve seen in recent years.

I’m a veteran of thousands of demos, so I’ve learned to be skeptical, but this was interesting stuff. Invention Machine’s customer list would indicate that the company is on to something.

Semantic search involves mining text documents not only for terms but for relationships between terms. Most search engines can’t do this. They can deliver some insight by finding words in close proximity to each other, but they don’t establish a clear relationship.

For example, if you search for “smoking” and “cancer” on Google, the results indicate there’s a relationship between the two, but the search engine won’t explicitly define that relationship. Semantic search goes a step further. It’s intended to deliver a small number of results but with terms that are specifically related to each other. For example, a semantic search engine might infer from its search that smoking and cancer are related and return documents that explain that relationship.
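A toy example makes the distinction concrete: a keyword engine only checks that two terms appear in the same document, while even a crude relation extractor looks for an explicit connecting phrase. The single pattern below is deliberately simplistic and bears no resemblance to Invention Machine’s actual engine; it only shows the difference in what gets returned.

```python
# Toy contrast between keyword co-occurrence and crude relation extraction.
# The lone "causes" pattern is purely illustrative.
import re

docs = [
    "Smoking causes lung cancer in long-term users.",
    "Cancer research funding and smoking-cessation programs both grew last year.",
]

def cooccurs(text, a, b):
    """Keyword-style match: both terms merely appear somewhere in the text."""
    t = text.lower()
    return a in t and b in t

def extract_relation(text, a, b):
    """Return a (subject, verb, object) triple only if an explicit link is stated."""
    m = re.search(rf"\b({a})\b\s+(causes|leads to|results in)\s+([\w\s-]*\b{b}\b)",
                  text.lower())
    return (m.group(1), m.group(2), m.group(3).strip()) if m else None

for d in docs:
    print(cooccurs(d, "smoking", "cancer"), extract_relation(d, "smoking", "cancer"))
```

Both documents pass the keyword test, but only the first yields an explicit (smoking, causes, lung cancer) relationship, which is the kind of result a semantic engine is after.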

The semantic engine is at the core of the company’s Goldfire product. A host of other features are wrapped around it, including a project workbench and a database of scientific and patent literature. The demo I saw showed one example of how innovation can be guided, if not packaged.

Suppose your company makes packaged food and you want to figure out a way to substitute artificial sweetener for sugar.  Engineers can use the workbench to deconstruct ingredients in the current product and then test the substitution of various artificial sweeteners.  Goldfire’s scientific database understands the characteristics of alternative ingredients, such as texture, taste, heat tolerance and chemical interactions.  A researcher could model the impact of substituting different artificial sweeteners and determine which ones are good candidates for a new recipe.  By querying on the attributes of potential substitutes, engineers could also discover new ingredients they hadn’t thought of.
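Strip away the domain knowledge and that kind of screening is essentially attribute matching against constraints. Here’s a simple sketch of the idea; the candidate attributes and thresholds are invented for illustration and are not drawn from Goldfire’s database.

```python
# Conceptual sketch of attribute-based substitution screening.
# Candidate attributes and requirements are invented for illustration.
candidates = {
    "sweetener_A": {"relative_sweetness": 200, "heat_stable": True,  "aftertaste": "low"},
    "sweetener_B": {"relative_sweetness": 600, "heat_stable": True,  "aftertaste": "medium"},
    "sweetener_C": {"relative_sweetness": 300, "heat_stable": False, "aftertaste": "low"},
}

requirements = {
    "min_relative_sweetness": 150,   # must be far sweeter than sugar by weight
    "heat_stable": True,             # the product is baked
    "allowed_aftertaste": {"low"},
}

def viable(attrs, req):
    """Keep only candidates whose attributes satisfy every constraint."""
    return (attrs["relative_sweetness"] >= req["min_relative_sweetness"]
            and attrs["heat_stable"] == req["heat_stable"]
            and attrs["aftertaste"] in req["allowed_aftertaste"])

shortlist = [name for name, attrs in candidates.items() if viable(attrs, requirements)]
print(shortlist)   # -> ['sweetener_A']
```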

The patent database comes into play when attempting to innovate on existing intellectual property.  For example, an automotive engineer could deconstruct the components of a patented turbocharger and test the impact of substituting different metal alloys.  This could lead to an improved design that doesn’t infringe on existing patents.  In fact, Invention Machine says this re-engineering of existing patents is one of the most popular applications of its product.

Goldfire isn’t a simple product to set up or use. Customers typically go through several days of training and setup to customize the software to their industry. It also isn’t cheap; installations run in the six figures. For the kinds of problems Goldfire is meant to solve, however, these costs aren’t surprising.

Goldfire is a difficult product to describe, but an easy one to understand once you see it in action.  The company provides several podcasts and videocasts that demonstrate how customers are applying the technology.  This isn’t innovation in a bottle, but it’s a pretty good start.

Incidentally, I have no financial interest in the company or its product. I just think this is a technology that deserves more attention.