All the Best, All For Free

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

The IT world is a better place because Ian Richards is in it.

A few years ago I stumbled across a website called “46 Best-Ever Freeware Utilities.” It contained a fantastic list of software covering many of a PC user’s basic needs, ranging from tune-up utilities to security packages, office programs, multimedia and more. The site was the work of a man who called himself Gizmo Richards. I quickly became hooked.

Gizmo’s site actually covered more than just 46 programs and his weekly “Support Alert” newsletter was a treasure trove of information about how to find free software that met or even exceeded the quality of commercial alternatives.  For two years, it was the only e-mail newsletter I paid for.

Support Alert ended its run as an independent publication last July, when Richards became a Senior Editor at Windows Secrets and merged Support Alert into that organization’s line of newsletters. But the mission that created the Web’s best source of information about free software lives on at Gizmo’s Tech Support Alert, a moderated wiki that’s carefully attended by Richards and a team of 60 volunteers.

I called Richards this week for his insight on the state of free software and found him to be quite unlike the person I imagined. Ian Richards is an affable 62-year-old Australian, a veteran of the mainframe world who found his calling in the early days of the microprocessor era and who stumbled upon celebrity when his informal list of freeware utilities, assembled on a lark, became a viral phenomenon. Tech Support Alert should be in the bookmark list of every PC enthusiast. Its more than 20,000 citations attest to its popularity.

Richards is passionate about giving his visitors a guide to all that is free and good in the software market. “Ninety-five percent of the software products that people need are available as a freeware version,” he says. “Free” means different things to different people, of course, but in the Tech Support Alert definition, it refers to products that provide the kind of quality and functionality that users might otherwise expect to pay for. In other words, crippled or limited function products need not apply. “Before you buy a product, you should routinely consider getting a freeware version,” he told me. You can listen to my 31-minute interview with Richards here.

Gizmo and his volunteers excel at finding sources of free downloads, sometimes digging through little-known download sites to find early versions of commercial programs that don’t carry a fee.  For example, in the category of free backup programs, they recommend WinBackup V1.86 from Uniblue Systems.  “Although it’s no longer available from the vendor’s site, the program file winbackupfreedr.exe can still be downloaded from a number of sites,” the editors note. “The vendor is offering the older version for free with the hope that users might upgrade at some later time…However, the old program is good enough that most users probably won’t need to.”

Backup is one of more than 200 categories of software that Tech Support Alert lists. Its freeware database runs into the thousands of programs, which is actually only a tiny fraction of the products available. Richards and his volunteer team test every package they can get their hands on and recommend only the handful that they judge the best. Reviewers “don’t have to be technical geniuses, but they do have to be technically competent and have a command of the English language,” he says.

The quality of free software has improved in the decade or so that Ian Richards has been covering the market.  Vendors have discovered that by offering scaled-down versions of their commercial products that meet the needs of the vast majority of customers, they can sell premium versions to those who require the very best.  Richards offers the example of AVG Technologies, a security software company whose antivirus and anti-spyware utilities have been running on my PCs for over two years.  AVG’s giveaway programs meet the needs of most home and small business users, but corporations will probably want to pay for the peace of mind and support that they get from the commercial versions.

AVG and others like it are blazing trails of a new kind of marketing innovation.  In the same way that America Online introduced hundreds of corporations to the power of the Internet, these companies are realizing that the technology that people use at home can create opportunities for commercial business. Tech Support Alert encourages this view by raising visitors’ awareness of the low-cost options that are available to them.  The site’s 100,000 daily page views testify to its value.

Gizmo Richards is at an age when many people think about retirement. Instead, he’s helping forge a new model for the software industry.  We’re lucky to have him.


Opportunity Amid the Rubble

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

During the great dot-com collapse of 2001/2002, I had the good fortune to work for a company that swam against the tide. The Internet startup was in its third year of fighting a battle against big and entrenched competitors. It was a David-and-Goliath story to begin with, and as the markets tanked, a sense of gathering gloom took hold among some people.

But their fears never came to pass. During a two-year period in which our competitors’ businesses shrank by over 40%, my company actually grew by half, albeit from a much smaller base. The reason is that our investors made a conscious decision to invest in the business at a time when that seemed like the last thing we should do. They believed that by providing customers with superior price/performance at a time of rampant budget-cutting, we would break from the pack. When the recovery arrived, we zoomed forward faster than we had ever thought possible.

It’s hard to think about investing right now. Conventional wisdom holds that a recession is a time to back off on investment and play it safe. Many big businesses will do that during the next few months.

But the problem with the conventional wisdom is that it’s so, well, conventional.  If everyone does the same thing, then nothing changes very much.  Markets are only transformed when businesses take advantage of competitive dislocation and redouble their efforts to change the rules.  The history of the IT industry is filled with examples.

In the last great US economic downturn in the early 1990s, companies like Oracle, Microsoft, Novell and Dell burst out of the pack with value propositions that were superior to those of much larger competitors. At the time, corporate America was growing weary of custom solutions and wanted to buy more technology off-the-shelf. Innovative companies answered that demand with attractively priced products that were acceptable, if not ideal, alternatives to the more expensive leaders. They timed the market right and quickly reached positions of leadership.

Beating the Bubble

The bursting of the Internet bubble a few years back was no fun for anyone, but it triggered the remarkable rise of Google and a host of web-based services that created new platforms for innovation. Customers had grown tired of maintaining shelves full of software they didn’t use, and on-demand platforms gave them an attractive alternative that was good enough and considerably cheaper than what they had previously used.

Even large companies can benefit from the self-examination that lean times bring. IBM confronted its own mortality in the recession of the early 90s and decided to transform its business. Sure, the company went through plenty of cost-cutting, but it also maintained its commitment not to draw down its investment in research and development. As the economy recovered in the mid-90s, those investments in innovation paid off in a stream of new products competitors couldn’t match. The company also took advantage of the recession to adjust its culture to delivering value in a few markets instead of dominating them all.

In all the cases above, there was one constant: hard times created a need for new value. The companies that benefited were the ones that invented ways to deliver “good enough” value at significantly lower cost. There’s no reason to believe this time will be any different. Citi Investment Research is forecasting that corporate IT spending in the U.S. will drop 10% to 20% in 2009. This pullback hurts established players most, but it’s an opportunity for upstarts.

If you compete with giants, I can guarantee you that they are now hunkered down in their foxholes waiting for this whole mess to end.  They’re playing defense, focusing on protecting their business and sustaining their profitability.  One thing they’re not focused on is you.  In fact, at times like these the last thing businesses tend to think about is their competitors and their customers.  That’s what makes now such a great time to move against them. For entrepreneurial businesses with the right attitude, the next year could be the best they’ve ever had.

I’m not trying to make a case to go on a spending spree. Hard times demand discipline. But they are also an opportunity to think of new ways to meet customers’ needs on a thinner budget. A rising tide lifts all boats, but a receding tide draws attention to the sailors who are bold enough to leave the harbor.

The Crime Economy

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Is access to your corporate Web server worth $740?  That’s the average price a computer criminal pays today for information about a security flaw at a specific financial institution, according to a new report from Symantec.  While some exploits command as much as $3,000, information about other corporate security flaws is being sold for as little as $100.

That’s not the only example of corporate security on sale.  Hackers can purchase links to webpages that have known security vulnerabilities for about 40 cents per link in bundles of 500.  Or they can buy their own remote file inclusion (RFI) scanner for about $25 and identify those PHP-induced flaws themselves.

This information and much more is contained in a new report entitled “Symantec Report on the Underground Economy” that can be freely downloaded from Symantec’s website.  The 84-page document paints a picture of a vast marketplace that traffics in the tools and the spoils of computer crime, creating a recursive ecosystem that feeds upon its own success.

The report is hair-raising, not so much because it identifies new vulnerabilities in corporate information systems but because it documents how efficient that market has become.

In this new underground economy, tens of thousands of anonymous entities advertise tools that can be purchased for modest sums and used to create spam attacks, phishing farms and direct assaults on corporate servers.  The people who buy these tools then sell the spoils of their work to brokers who remarket the information to other criminals.

Those groups may in turn produce bogus credit cards or orchestrate massive credit fraud and identity theft operations that cost businesses billions of dollars in losses.  One estimate put the cost of phishing attacks alone at $2.1 billion for US consumers and businesses in 2007.

Vulnerability for Sale


Source: Symantec

The electronic flea markets that enable this evil are networks of IRC servers and covert websites that visitors use to bid upon tools and information. The average price of a botnet, for example, is just $225, and a single botnet can be rented out for use by other criminals to produce an income stream or specialized service. Hackers can buy all the security-busting software they want for less than $500. Who needs to be a technical expert any more?

Skilled professional criminals can rake in unbelievable sums.  Symantec estimates that one organization that specializes in phishing made $150 million in 2006 from stealing bank credentials alone.  Another operation that mass-produced counterfeit credit cards was reportedly earning up to $100,000 a day.

The disheartening message in these statistics is that the enemy of corporate security managers is no longer a script kiddie working in his basement but a vast and faceless network of entrepreneurs and arbitrage experts cooperating in a strikingly efficient marketplace with total anonymity.  In a one-year period, Symantec observed nearly 70,000 advertisers on various underground economy servers hosting more than 44 million messages.  These criminals are so active because the system works.  Computer crime has become, in effect, a vast peer-to-peer network. And as the recording industry has learned painfully, peer-to-peer networks are nearly impossible to stamp out.

If you’re hoping to hear about the magic pill to cure this problem, you’re out of luck. The Symantec report offers no advice, either. Instead, it documents the sophistication of a distributed operation that is financially motivated to constantly attack the institutions of commerce and government. Our only defense is to be buttoned down, well-educated and prepared for a long struggle.

The Technology Whitewater

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

One of the most valuable newsletters I receive is called Knowledge@Wharton, an information-packed digest of the latest insights from the faculty and associates of the University of Pennsylvania’s famed business school.  An interview this week caught my eye because it deals with tactics for adapting to today’s very challenging business environment.

Gregory Shea and Robert Gunther have written a new book called Your Job Survival Guide: A Manual for Thriving in Change, in which they draw a wonderful analogy between today’s chaotic business world and whitewater kayaking. The authors describe the current business environment as one of permanent whitewater, in which mass confusion reigns and few safe havens appear to exist.

Whitewater is scary to the uninitiated, but to seasoned kayakers, it’s the height of exhilaration.  Whitewater is fun because, while frightening, it doesn’t have to be dangerous if you know what to do.

Experienced kayakers know not to dive into a rapid and ride it to the end.  Instead, they navigate the rapids in stages.  Between bursts of activity, they pull off to the side in little pools called eddies, catch their breath and prepare for the next stage in the journey. The trip is basically a process of navigating between eddies.

What a wonderful analogy! The technology world is unquestionably chaotic right now, even without the financial meltdown. Constant change frustrates predictability. The idea of building a career by mastering a single discipline and applying it for decades is as dead as the manual typewriter.  Programmers know this.  Anyone who has navigated the industry’s migration from Perl to Python to Ruby on Rails, for example, knows that expertise in one discipline doesn’t guarantee long-term career success. However, a core expertise in scripting is valuable in all scenarios.

The key is to understand your core skills and to learn to apply them in areas where there is market demand.  Seek the eddies while constantly scanning the horizon for the next set of challenges.

A Personal Story

I’ll tell a personal anecdote that’s relevant.  I was trained as a journalist because that field was a natural outlet for my skills as a writer and storyteller. Early in my career, I discovered that the technology field offered the best opportunity to apply those skills.  That served me well for nearly 20 years, but in the late 1990s, the market began to change.  Print publishing was dying in the technology market, so I jumped to an Internet startup and spent the next six years learning the unique demands of that medium.

Upon striking out on my own in 2005, I quickly discovered that yet another new opportunity was emerging in the field of social media.  Millions of people, and many thousands of businesses, were going online to become, in effect, publishers. It was quickly evident to me that the disciplines I had learned in 20 years of technology journalism were very relevant in this new world.  People had the capacity to publish, but most of them lacked the skill to communicate in compelling ways.  The skills I had learned in 25 years of publishing were still relevant, even though the medium and the audience had completely changed. Today, I’m applying those skills in a manner that I couldn’t have imagined a decade ago.

The process has been scary at times, but it’s also been rewarding and never boring.  That brings me back to the analogy of whitewater kayaking.

Learn to Roll

In the interview with Wharton, the two authors talk of the Eskimo roll, a maneuver that kayakers learn in order to adjust to tumultuous conditions.  Instead of fighting rough water, veterans can strategically capsize their craft to protect themselves against obstacles.  For most boaters, capsizing is a disaster. But for kayakers it’s an essential coping skill, as long as they can fight the natural human tendency to panic.

Many of us are feeling a little capsized right now, and the panic reflex is kicking in.  Keep in mind that we are all wrestling with the same uncertainties and trying to figure out survival strategies. No one has the answers.  Focus on keeping your core skills sharp and apply them creatively to whatever opportunities this tumultuous market presents.  You will come out OK, although it might not hurt to be ready to do a few Eskimo rolls along the way.

The Cloud As Platform

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Nearly a decade ago, a well-funded startup called Storage Networks promised to revolutionize the data center by moving enterprise storage into the cloud. Customers would keep their production data off-site in a highly secure facility and access it over the Internet. Unfortunately, the concept of cloud computing was unknown at the time, and the Internet itself was neither fast nor robust enough to permit large corporations to get comfortable with the idea. Storage Networks flamed out.

Now EMC is taking a run at a similar idea using the concept of cloud storage. Its technology, called Atmos, offers a glimpse of how far the cloud concept has come in a few short years and how its emergence as a new platform could drive a new wave of innovation.

As described by EMC, Atmos is a lot more than just a new breed of network storage.  The distributed technology uses an object model and inference engine to make intelligent decisions about where to store, copy and serve data.  With the world as its data center, Atmos is said to be able to flexibly move information to the point where it can be most efficiently served to the people who need it.  For example, if a cliffhanger election in Florida causes a surge of interest from local voters, election results data could be automatically routed to nearby servers.

Intelligent routing is just one of the intriguing ideas that the cloud supports, and it doesn’t have to apply just to storage.  In the future, virtual data centers will consist of computing resources spread around the globe.  Server power can be flexibly deployed to regions that need it. Backups could be administered at a high level. For example, an organization could specify dual redundant backups for some critical data but only a single backup for less important information.  When the entire fabric is virtualized, this kind of flexibility becomes part of the landscape.
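To make the idea concrete, policy-driven backup administration of this kind might look something like the following sketch. The names and structure here are purely illustrative assumptions, not EMC’s actual Atmos interface: the administrator declares how many copies each class of data needs, and the fabric decides where those copies live.

```python
# Hypothetical policy table: the administrator states intent,
# not placement. (Illustrative only -- not Atmos's real API.)
REPLICA_POLICY = {
    "critical": 2,   # dual redundant backups
    "standard": 1,   # a single backup is enough
}

def backup_targets(data_class, available_sites):
    """Choose as many distinct sites as the policy requires for this class."""
    copies = REPLICA_POLICY.get(data_class, 1)  # default to one copy
    return available_sites[:copies]

print(backup_targets("critical", ["us-east", "eu-west", "ap-south"]))
# -> ['us-east', 'eu-west']
```

The point of the sketch is the division of labor: policy is a few lines of declared intent, while placement is a decision the inference engine can revisit as conditions change.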

At this point, Atmos is still brochureware, and EMC isn’t sharing any customer experiences.  But I think the concept is more important than the product. Very large and distributed cloud networks can theoretically provide users with almost unlimited flexibility and economies of scale. Systems management, which is an expensive and technical discipline that very few companies do well, could be centralized and provided to all users on the network.  Customers should be able to define policies using a simple dashboard and let the inference engine do the rest.

We are only in the early stages of realizing these possibilities, but the emergence of real-world cloud computing platforms will usher in a new era of innovation.  Platform shifts invariably do that. Coincidentally, NComputing this week will announce an appliance that turns a single desktop PC into as many as 11 virtual workstations.  The company claims that the technology lowers the cost per workstation to about $70.

When applied to a cloud of servers, you can imagine technology like this scaling much higher.  Instead of having to run around supporting hundreds of physical workstations, IT organizations would only have to worry about a few powerful servers providing virtual PC experiences to users.  Move those servers into the cloud, and you can begin to apply best-of-breed security, resource and systems management to each user. The economies of scale become very compelling very fast.
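The scaling arithmetic behind that claim is simple. Taking the company’s figure of up to 11 virtual workstations per host at face value, the number of machines IT must physically manage shrinks by roughly that factor:

```python
import math

SEATS_PER_HOST = 11  # NComputing's claimed virtual workstations per PC

def hosts_needed(users):
    """How many physical hosts are required to serve a given user count."""
    return math.ceil(users / SEATS_PER_HOST)

print(hosts_needed(300))  # 28 hosts instead of 300 separate desktops
```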

The biggest leaps in technology innovation take place whenever platforms shift.  The cloud is now beginning to come into its own as a legitimate platform. Things should get pretty exciting from here.

Assessing the Candidates on Technology

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Tomorrow, Americans will choose between two presidential candidates who have very different ideologies. Although John McCain and Barack Obama both agree in principle on the need to improve the U.S.’s technology competitiveness, they disagree on some issues that are important to technology professionals. Here is an overview of their similarities and differences on some critical technology policy issues.

Technology education

Both candidates agree on the need to hire and train more teachers with technology skills as well as to improve the competitiveness of American students in science and technology.

McCain proposes to fully fund the Bush administration’s America COMPETES Act, which provides a variety of educational investments.

Obama supports doubling federal funding for basic research over ten years and promoting the National STEM Scholarship Database Act, which would create a database to coordinate information on financial aid opportunities available in science and technology.

Tax policy

Here, McCain is more specific than Obama.  His initiatives include:

  • Make the R&D tax credit permanent
  • Keep capital gains taxes low
  • Allow first-year expensing of new equipment
  • Oppose Internet taxation
  • Oppose higher taxes on wireless service
  • Lower the corporate tax rate to 25%

Obama also supports making the R&D tax credit permanent, but his tax plan is more oriented around individuals and families. He does support tax credits for small businesses and corporations that invest in jobs in the United States.

Government’s role

Both candidates believe government should be a standard-bearer for effective use of information technology, and each is quite specific about how he would get there.

McCain promises to create a nationwide public safety network by the end of his first term that would support first responders in emergencies. He wants to set up more Cooperative Research and Development Agreements (CRADAs), in which industry and government enter into public/private projects. He believes more government services should be available online and that the government should use videoconferencing and collaborative networks more effectively. Finally, he proposes to “ensure that Administration appointees across the government have adequate experience and understanding of science, technology and innovation.”

Obama focuses on accountability, which he says has been lacking in the Bush administration. He pledges to use “cutting-edge technologies” to create a new level of transparency and accountability for government and to appoint the nation’s first Chief Technology Officer (CTO) to coordinate infrastructure, policies and services across federal agencies. He also pledges to reinvigorate antitrust prosecution.

Obama also proposes specifically to invest $10 billion per year over the next five years to create standards-based electronic health information systems, including electronic health records. He also seeks to invest $150 billion over the next ten years to enable American engineers, scientists and entrepreneurs to advance alternative energy.

Internet access

The Obama campaign has probably made better use of the Internet as a campaign tool than any previous candidate’s.  Not surprisingly, Obama supports broad expansion of Internet access to every American. However, McCain’s objectives are similar in many ways.

McCain proposes to “encourage private investment to facilitate the build-out of infrastructure to provide high-speed Internet connectivity all over America…and allow local governments to offer such services, particularly when private industry fails to do so.” He also wants to establish a “People Connect Program” that rewards companies that offer high-speed Internet access services to low-income customers. He opposes “net neutrality” in favor of “an open marketplace with a variety of consumer choices.”

Obama proposes to provide “true broadband to every community in America through a combination of reform of the Universal Service Fund, better use of the nation’s wireless spectrum, promotion of next-generation facilities, technologies and applications, and new tax and loan incentives.” He also wants to give parents more control over information their children see on-line while vigorously enforcing laws against people who try to exploit children. He supports net neutrality.

Global trade

Both candidates want to see America become more competitive in overseas technology markets. Both support cracking down on intellectual property theft abroad.

McCain seeks to expand the number of H-1B visas to enable US companies to employ more foreign workers.

Reinventing U.S. Innovation

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

John Kao believes the United States has an innovation crisis, and he’s calling on today’s corps of young technology professionals to sound the alarm.

Citing technology pioneer Vannevar Bush’s assertion more than 60 years ago that “A nation that loses its science and technology will lose control of its destiny,” Kao said the U.S. is in peril of becoming a technology laggard.

“The US public education system is veering further away from preparing kids for the world,” the author of Innovation Nation: How America Is Losing Its Innovation Edge, Why It Matters, and What We Can Do to Get It Back told the MIT Enterprise Forum early this month. “We spend more on education than any country in the world, yet we’re between 24th and 29th in math performance.”

By contrast, Finland, a country that suffered a near economic collapse after the Soviet Union fell apart, today produces twice as many Ph.D.s per capita as the U.S. The Finns turned around their economy, in part, by creating a national design focused on science and technology education. As a result, “Two years ago, Finland was the number one competitive economy in the world, according to the World Economic Forum,” Kao said. “Its education system is rated the best in the world. People want to be teachers in Finland.”

We’ve heard this before, of course.  In the late 1980s, Japan famously challenged the US for leadership in technology innovation with initiatives like the Fifth Generation Computer Project and a nationwide commitment to developing artificial intelligence.  Those ambitious plans foundered, but Kao argues that this time is different.

Today, countries like Singapore and China are making technology innovation the centerpiece of a national strategy that’s backed by incentives, equipment and money. Singapore, for example, has set up the Biopolis, a $300 million biomedical research lab spread across a 10-acre campus. Singapore is allocating money to train expatriate scientists in other countries on the condition that they repay the government with six years of service. The country is also promising to remove the layers of bureaucracy and legal approvals that frustrate scientists in the U.S.

Singapore has convinced top researchers from MIT and the U.S. Centers for Disease Control to pull up stakes and move to the tiny nation-state with financial incentives and promises of academic freedom.

This last point is a key difference between the national technology policies of today and the failed models of two decades ago.  Thanks to the Internet and business globalization, countries now have the capacity to build very large local industries serving overseas customers. Kao told of a friend who’s building a global travel business for 1/10th of what it would have cost a decade ago. He farms out much of the development work overseas. “Countries want to ally with American intellectual capital,” he said.

Therein lies a challenge for US competitiveness. The United States has long been able to rely upon the global brain drain from other countries to fuel its innovation economy. Over half of the engineering Ph.D.s awarded in the U.S. now go to foreign-born students. Many of those people have traditionally settled in Silicon Valley or other technology-rich environments. But the lifestyle trade-offs aren’t as dramatic as they used to be. “Now there’s an Apple store and a Starbucks in Bangalore [India],” Kao said.

With overseas economies offering tax havens, comfortable salaries, research grants and other perks to technology achievers, some countries that used to lose talent to the US have actually reversed the migration.

What can US technology professionals do?  Well, on a purely selfish level there may be some attractive opportunities to pull up stakes and move overseas. Singapore, for example, has earmarked a half billion dollars to fund digital imaging research. But if you’re more interested in improving the domestic situation, then start by voting for candidates who have a vision and a plan for US technology competitiveness.

You can also go out into the classroom and share your own experiences with tomorrow’s innovators.  Many teachers would be glad for the help. In John Kao’s words, “Many people who teach math and science in U.S. public schools are forced to do it.” In contrast, “In China, people with master’s degrees in math willingly come in to teach in the schools.”

A Regulatory Boost for the Cloud

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

In a recent podcast interview on Tech Nation, Tim Sanders, author of Saving the World at Work, quotes a remarkable statistic: it’s been estimated that Google could save 750 megawatt-hours of electricity every year by changing the color of its ubiquitous homepage from white to gray.  That’s because monitors require more electricity to energize the brighter white phosphors.

The total cost savings of roughly $75,000 a year may not convince Google to overhaul its site design, but the statistic drives home the effect that economies of scale can have in computing.
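As a rough sanity check, the dollar and energy figures are consistent if the savings amount to 750 megawatt-hours per year and electricity is priced at about $0.10 per kilowatt-hour. The rate is my assumption for illustration, not a figure from the interview:

```python
energy_saved_mwh = 750        # estimated annual savings, in megawatt-hours
assumed_rate_per_kwh = 0.10   # assumed commercial $/kWh (not from the source)

# 1 MWh = 1,000 kWh, so convert before applying the rate.
annual_savings = energy_saved_mwh * 1000 * assumed_rate_per_kwh
print(f"${annual_savings:,.0f} per year")  # $75,000 per year
```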

There’s a lot of attention being paid to economies of scale these days as IT consumes a growing proportion of natural resources in the US. IT organizations are increasingly going to find themselves on the hot seat to go green, not just because it’s the nice thing to do but because it makes good sense for the bottom line.

Consider these facts:

  • The Environmental Protection Agency has estimated that data centers consume about 1.5% of all electricity in the US and that about a quarter of that energy is wasted;
  • Gartner estimates that more than 2% of global atmospheric carbon emissions can be traced to the IT industry;
  • Gartner expects that more than 925 million computers and one billion mobile phones will be discarded over the next two years;
  • The International Association of Electronics Recyclers estimates that about 400 million units of electronic refuse are generated annually. The actual amount of “e-trash” may be higher because nervous businesses have stockpiled old equipment rather than paying for disposal.
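To put the EPA estimate in perspective, here is a rough calculation. The total-consumption figure is my own assumption (roughly 3,900 terawatt-hours a year for the US in this period), not a number from the EPA:

```python
# Rough scale of US data-center energy use and waste, per the EPA estimate.
# Assumption (mine): total US electricity consumption of ~3,900 TWh/year.
us_total_twh = 3_900
datacenter_share = 0.015      # EPA: data centers use ~1.5% of US electricity
wasted_fraction = 0.25        # EPA: about a quarter of that energy is wasted

datacenter_twh = us_total_twh * datacenter_share
wasted_twh = datacenter_twh * wasted_fraction

print(f"Data centers: ~{datacenter_twh:.0f} TWh/year")
print(f"Wasted:       ~{wasted_twh:.0f} TWh/year")
```

Under that assumption, data centers draw on the order of 58 terawatt-hours a year, with roughly 15 terawatt-hours simply wasted.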

The twin effects of spiraling energy costs and environmental hazards are creating a double whammy. Underutilized computers are consuming increasingly expensive energy and also taking a greater toll on the environment. In Europe, businesses are working under a new standard called the Restriction of Hazardous Substances directive, which limits the use of dangerous chemicals in computers. US businesses are keeping a close eye on the directive, not only because it affects their European operations but because they see it as a model for similar legislation here.

Some very large businesses are beginning to make green computing part of their core corporate values. An article from the University of Pennsylvania’s Wharton School tells of Bangalore-based IT services firm Wipro Technologies’ energy-efficiency mandates. The company tracks metrics like carbon dioxide emissions and paper consumption per employee and is outfitting workers with a new kind of energy-efficient workstation.

In the US, the most promising option is hosted or “cloud” computing. This takes advantage of the excess capacity that exists in giant data centers run by companies like Amazon and IBM. Why let that spare power go up in smoke if there are small businesses that can tap into it?

Increasingly, they are. On Menlo Park’s Sand Hill Road venture capital corridor, technology entrepreneurs who propose to run their businesses on internal servers are said to have a “Hummer” strategy. In other words, they’re consuming far more power than they need to drive their computing environment. New software and services firms are bypassing captive data centers and opting to farm out everything to third parties. Technology entrepreneur John Landry recently told me that the cost of hosted computing is coming down so fast that it no longer makes sense for a new business to even consider investing in a captive data center. In other words, it will never be cheaper to manage the infrastructure yourself.

In a consolidated hosting environment, every tenant benefits from the economies of scale provided by the host. What’s more, the arrangement shifts responsibility for technology recycling and disposal to a central entity that has a vested interest in best practices. As the cost of energy continues its inevitable rise and legislators stump for stronger regulation, the appeal of the cloud only grows.

Crisis Tests IT’s Influence

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Between checking retirement portfolios and flipping over to news sites like Moneywatch every hour or two, a lot of people aren’t getting much work done this week. Who can blame them? Between the unprecedented bankruptcies of some of Wall Street’s biggest firms, the turmoil in the stock market and dire statements from top officials in the U.S. government, it’s easy to believe that the world is caving in.

I’m nervous, too, but I’m also resisting the urge to forecast disaster because I just don’t think the Great Depression can happen again. Part of the reason is information technology.

There are two ways in which IT can make important contributions to pulling the U.S. economy out of chaos: by empowering rapid decisions and enabling communication. As evidence of the first dynamic, look at the chart below. It shows growth in U.S. gross domestic product from 1930 through 2007. You don’t have to squint much to see where the trend has been going. The huge swings in GDP performance that occurred during the Depression and war years have gradually become gentler and more predictable. Negative growth, which was a regular occurrence during much of the 20th century, has only occurred once in the last 25 years. The further right you go on the chart, the more boring economic performance becomes. That’s precisely how businesses like it. Consistency sets the stage for more confident long-term planning.

Technology’s Shadow Role

There are multiple reasons why the economy has stabilized in recent decades; globalization and the Federal Reserve are certainly two of them. But I would suggest that information technology plays a shadow role. Note that the GDP numbers show a clear smoothing trend beginning around the middle 1960s, which was when computers began to make their way into back offices on a grand scale. The trend becomes even more refined in the late 1980s, when PCs started landing on every desktop.

I think it’s no coincidence that economic cycles became less volatile when managers and regulators began deploying sophisticated models to predict the path of business. Even as the economy has been roiled by financial crises, 9/11 and the bursting of the Internet bubble over the last two decades, recessions have tended to be shallow and brief, and recoveries have been smoother and more sustained than in previous cycles. One factor may be that economic players have more sophisticated means to model the impact of their decisions than they did before. That leads to better forecasting and quicker mid-course corrections, which makes for less volatility. No one’s suggesting that we aren’t in for some difficult times, but if the past is any indication, we’re better equipped to pull out of the tailspin today than ever before.

The other potentially positive IT influence on economic cycles is the Internet, and in particular Web 2.0. Within just the last five years, businesses have embraced robust new ways to communicate with their constituencies. As new economic surprises have turned up almost daily over the past few weeks, people have flocked to their Facebook groups, Twittered their concerns and voiced their opinions on news sites like never before. Smart business leaders should be tapping in to these conversations and using them to help guide their own decisions. If you want to learn how your customers are thinking about the latest dose of bad news these days, you need only to ask them or just listen. Trends that used to take months to identify can now be discerned in a few hours. It’s too early to know what impact this will have on economic performance, but it’s likely to encourage faster and more competent decision-making.

Web 2.0 also enables corporate leaders to communicate directly with their constituents to offer their own perspectives on unfolding events. Unfortunately, they aren’t doing much of that yet. A quick check of blogs operated by Chrysler, Marriott, McDonald’s, Whole Foods, Accenture, Boeing, Wal-Mart and Southwest Airlines shows that none has yet departed from delivering cheery good-news fare to comment upon the economic issues that weigh most heavily on American minds. Cheers to General Motors and PricewaterhouseCoopers for attempting to lend some of their perspective to the conversation. I only hope the others are too busy listening at the moment to make time to state their own views. America certainly wants to hear them.

Innovation From Below

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

On a visit to Compaq Computer (remember them?) several years ago, an executive told me proudly that Compaq had reinvented itself to avoid becoming what he referred to as “the Mercedes of PCs;” in other words, the luxury supplier in a market that was increasingly demanding low cost and high volume.

It turned out that Compaq succumbed to cheaper competition anyway. Its quality-at-all-costs culture couldn’t successfully accommodate the “good enough” demands of the emerging market.

This story came to mind this week as I noted the impending launch of the Tata Nano, a new low-cost automobile being introduced next month by India’s giant industrial conglomerate Tata Group. Priced at just $2,500, the Nano is a marvel of simplicity. At a time when the cheapest American car costs nearly $11,000, it’s a bid to rewrite the economics of the global auto industry in an effort to reach a giant emerging class of consumers.

The Nano is remarkable not for its technological sophistication but for what designers chose to leave out.  For example, the base model has no air conditioning, power brakes or radio.  Even its steering column is constructed of hollowed-out steel to save money on materials and reduce weight.  It has a top speed of just 65 mph from its tiny rear-mounted engine.

Tata made these trade-offs in order to accommodate other needs that it believes are more important to its audience. Specifically, the Nano can accommodate four passengers while puttering along at 50 miles per gallon. It meets all Indian emission, pollution and safety standards, although admittedly those are looser than in the US. It’s also designed for modular assembly, meaning that local entrepreneurs can add value by swapping in their own components for those provided by the manufacturer. From its lowest-cost perch, it can easily be scaled up without sacrificing its compelling price advantage.

An Open Source Automobile

The Nano may be the world’s first open-source car. Tata intentionally chose to design the vehicle around modular components and to single-source most of those parts in order to keep costs down. Modularity also creates an opportunity for innovation in manufacturing, because the Nano is designed to be assembled in small quantities. Entrepreneurs can set up boutique assembly shops to build the vehicles profitably for their local markets. They can plug in the components that their customers value most and still retain a significant cost advantage.

This kind of innovation goes against the grain of most American thinking.  In our technology-driven culture, we are wedded to the principle that sophistication and elegance define innovation.  We celebrate product over process and take pride in the number of patents our companies earn.

In contrast to Tata’s bare minimalist approach, American automakers have increasingly been outfitting their vehicles with navigation systems, video entertainment and computerized wizardry encoded in proprietary black-box designs. A high-end DVD entertainment system in an American car costs more than an entire Nano. There’s no question that these high-tech products are innovative, but they serve a narrow slice of the luxury market that is already saturated with choice.

Tata is focused instead on volume. It expects to sell over one million Nanos in the first year and to ramp up steeply from there. It’s targeting the 98 out of every 100 Indians who don’t own a car. The U.S., by comparison, has 76 vehicles for every 100 citizens.
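The scale behind those percentages is worth making concrete. A quick sketch, using my own rough population assumptions for the period (about 1.1 billion for India and 300 million for the US), not figures from Tata:

```python
# Rough comparison of the untapped Indian car market vs. the US installed base.
# Assumptions (mine): India ~1.1 billion people, US ~300 million, circa 2008.
india_pop = 1_100_000_000
us_pop = 300_000_000

india_carless = india_pop * 0.98   # 98 of every 100 Indians own no car
us_vehicles = us_pop * 0.76        # 76 vehicles per 100 US citizens

print(f"Potential first-time Indian buyers: ~{india_carless / 1e6:.0f} million")
print(f"Vehicles already on US roads:       ~{us_vehicles / 1e6:.0f} million")
```

Under those assumptions, the pool of Indians without a car is several times the entire existing US vehicle base, which is the opportunity Tata is chasing.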

There’s a lesson here for tech professionals. As I’ve noted before on this blog, disruptive market change almost always comes from below. In computers, consumer electronics, software and even retailing, low-cost providers who unlock new markets invariably displace high-margin boutique businesses at the high end.

Tata is targeting the hundreds of millions of consumers who will constitute the middle class of the future. These people have never owned a car and have no affinity to a brand. Most have also never owned computers, luxury electronics or online bank accounts. They are a green field of opportunity for companies that can meet their need for affordability and convenience. While Tata will no doubt face skeptics and setbacks, it has defined a clear vision and built a company around it.

Is your company positioned to meet this opportunity? Or is it destined to become the Mercedes of your market? Comment below and tell me whether you think low-cost markets are an important opportunity for you.