Innovation in Anonymity

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

I recently had two MRI scans on my back. Magnetic resonance imaging is a wonderful technology that enables doctors to see inside the body with depth and precision that conventional x-rays can’t match.

But MRIs are also mysterious and even frightening procedures for patients. A person is drawn inside a small cylinder and subjected to a series of loud noises for as much as 45 minutes. The attending radiologist told me that about 80% of patients experience some kind of claustrophobic stress, forcing technicians to pause the procedure frequently to calm them down.

I should have known about all this because my MRI provider’s website features a wonderful interactive experience that describes the benefits of MRI in a collage of high-resolution images and video tutorials. It also has a multimedia tour of the MRI experience that even includes samples of the odd sounds patients hear. This information would have been immensely useful to me if I’d known it existed, but I didn’t learn of the feature until weeks after the procedures, when I stumbled upon it in the context of a different discussion.

In fact, at no time during my interactions with people at the MRI center did anyone inform me that this resource existed. It wasn’t listed on the company’s letterhead or the preparatory documents sent to patients. A software project that had no doubt cost the company thousands of dollars was barely even referenced on the provider’s homepage.

Failure to Promote

This situation is all too common in businesses. Technology innovators dream up clever new ways to serve their customers and then don’t tell anyone about it. Customer service reps and automated voice response systems routinely refer visitors to generic homepages with meaningless statements like, “more information is available on our website.” But who has the time to go and find it?

Somewhere inside these companies a disconnect has occurred between the technologists and the people who interact with customers. Businesses assume it’s okay to hire service reps who haven’t a shred of technical expertise because those skills aren’t required to interact with the public. IT people are taught to do their jobs and then go home. Cooperating with others to promote the tools they build isn’t part of the job description.

But it should be. Today’s customers are too busy to spend time searching for resources they don’t know exist. The people who commission customer-facing projects may move on to other jobs or companies, leaving their creations without a sponsor.

IT people need to step up to the plate and promote the fruits of their labors because no one else is going to do it. Here are some steps my MRI provider could have taken:

Promote the resource in printed documents — Health-care providers produce lots of paper, yet none of the informational documents I received even mentioned the website experience.

Post signs — A poster in the lobby or window could have alerted me to the existence of this great application.

Train customer service personnel — In my multiple phone calls with the clerical staff, no one recommended that I even visit the website.

Set up a lobby demo — PCs are cheap; why not make it easy for customers in the waiting room to learn what the company has to offer?

This adds up to an opportunity missed for some innovative IT person whose creativity and hard work won’t receive the recognition it deserves. Don’t let your good work go to waste because you forgot to tell anyone about it.

Coaxing Web 2.0 Into the Enterprise

McKinsey has a new report on enterprise adoption of Web 2.0 technologies, and the findings should give pause to IT organizations planning to roll these tools out to their internal customers.

Overall, these technologies — which include wikis, blogs and social networks — are making steady progress into the organizations represented by the nearly 2,000 respondents to the survey.  What’s striking is the disparity between those companies that have made a commitment and those that are still skeptical. The companies that have drunk the Web 2.0 Kool-Aid report that it’s changing the very nature of their businesses and that they plan to expand their commitment this year.

Among early adopters, tools are being used to develop new products collaboratively, reinvent internal communications and transform the process of communicating with customers.  Only 8% of the executives who describe themselves as satisfied Web 2.0 users say the tools haven’t changed their organizations, compared to 46% of the self-described dissatisfied users.

However, the survey has a disquieting finding for IT organizations. Those companies that showed the least satisfaction with Web 2.0 tended to be the ones in which IT drove the initiative.  Companies that report the overall highest satisfaction with the tools and technologies are those in which IT plays NO role in selection and deployment. Conversely, those with the highest dissatisfaction levels are also the most likely to let IT lead the charge (see chart).

Why does this sad state of affairs exist?  I suspect it has much to do with internal culture and the ways in which the technology’s value proposition is defined for the ultimate users.

Taken at face value, the data suggests that IT is best left out of the Web 2.0 equation, but in my experience, technology groups play a vital role. One of the beauties of these new technologies is that they’re so simple and adaptable. Social networks, for example, can be used for anything from technical support to corporate knowledge management while wikis can perform at the project level or across an entire company. I’ve worked with several companies to implement Web 2.0 technologies, and the successful ones always go about it the same way.  A small number of enthusiastic users are given tools and the means to use them and then their creativity is allowed to filter through the organization.  IT plants the seed and then gives its customers the means to make the garden grow.

This process is invariably supported by managers who trust their people to do the right thing and who support experimentation and risk.  Conversely, I’ve never seen Web 2.0 succeed in companies that mandated it from the top or pushed it through the IT group.  Web 2.0 technology only works when people want to use it. Technology that enhances collaboration must necessarily be driven from the bottom up.

I cringe when I hear questions like this: “We want to start a corporate blog. What should we do with it?” If the technology doesn’t match a perceived need, no one is going to use it.

The best way to manage Web 2.0 adoption is to find those business-side sponsors who have the curiosity to experiment and give them the means for discovery. The McKinsey report demonstrates that they quickly figure out their own uses for the tools, and their enthusiasm becomes contagious. It’s fulfilling enough to plant the seed and nurture the flower as it takes root and grows.

All the Best, All For Free

The IT world is a better place because Ian Richards is in it.

A few years ago I stumbled across a website called “46 Best-Ever Freeware Utilities.” It contained a fantastic list of software covering many of a PC user’s basic needs ranging from tune-up utilities to security packages, office programs, multimedia and more.  The site was the work of a man who called himself Gizmo Richards. I quickly became hooked.

Gizmo’s site actually covered more than just 46 programs and his weekly “Support Alert” newsletter was a treasure trove of information about how to find free software that met or even exceeded the quality of commercial alternatives.  For two years, it was the only e-mail newsletter I paid for.

Support Alert ended its run as an independent publication last July, when Richards became a Senior Editor at Windows Secrets and merged Support Alert into that organization’s line of newsletters. But the mission that created the Web’s best source of information about free software lives on at Gizmo’s Tech Support Alert, a moderated wiki that’s carefully attended by Richards and a team of 60 volunteers.

I called Richards this week for his insight on the state of free software and found him to be quite unlike the person I imagined.  Ian Richards is an affable 62-year-old Australian, a veteran of the mainframe world who found his calling in the early days of the microprocessor era and who stumbled upon celebrity when his informal list of freeware utilities assembled on a lark became a viral phenomenon.  Tech Support Alert should be in the bookmark list of every PC enthusiast.  Its more than 20,000 citations on delicious.com attest to its popularity.

Richards is passionate about giving his visitors a guide to all that is free and good in the software market. “Ninety-five percent of the software products that people need are available as a freeware version,” he says. “Free” means different things to different people, of course, but in the Tech Support Alert definition, it refers to products that provide the kind of quality and functionality that users might otherwise expect to pay for. In other words, crippled or limited function products need not apply. “Before you buy a product, you should routinely consider getting a freeware version,” he told me. You can listen to my 31-minute interview with Richards here.

Gizmo and his volunteers excel at finding sources of free downloads, sometimes digging through little-known download sites to find early versions of commercial programs that don’t carry a fee.  For example, in the category of free backup programs, they recommend WinBackup V1.86 from Uniblue Systems.  “Although it’s no longer available from the vendor’s site, the program file winbackupfreedr.exe can still be downloaded from a number of sites,” the editors note. “The vendor is offering the older version for free with the hope that users might upgrade at some later time…However, the old program is good enough that most users probably won’t need to.”

Backup is one of more than 200 categories of software that Tech Support Alert lists.  Its freeware database runs into the thousands of programs, which is actually only a tiny fraction of the products available.  Richards and his volunteer team test every package they can get their hands on and recommend only the handful that they judge the best. Reviewers “don’t have to be technical geniuses, but they do have to be technically competent and have a command of the English language,” he says.

The quality of free software has improved in the decade or so that Ian Richards has been covering the market.  Vendors have discovered that by offering scaled-down versions of their commercial products that meet the needs of the vast majority of customers, they can sell premium versions to those who require the very best.  Richards offers the example of AVG Technologies, a security software company whose antivirus and anti-spyware utilities have been running on my PCs for over two years.  AVG’s giveaway programs meet the needs of most home and small business users, but corporations will probably want to pay for the peace of mind and support that they get from the commercial versions.

AVG and others like it are blazing trails of a new kind of marketing innovation.  In the same way that America Online introduced hundreds of corporations to the power of the Internet, these companies are realizing that the technology that people use at home can create opportunities for commercial business. Tech Support Alert encourages this view by raising visitors’ awareness of the low-cost options that are available to them.  The site’s 100,000 daily page views testify to its value.

Gizmo Richards is at an age when many people think about retirement. Instead, he’s helping forge a new model for the software industry.  We’re lucky to have him.

Listen to the interview.

Opportunity Amid the Rubble

During the great dot-com collapse of 2001/2002, I had the good fortune to work for a company that swam against the tide. The Internet startup was in its third year of fighting a battle against big and entrenched competitors. It was a David-and-Goliath story to begin with, and as the markets tanked, some people took on a sense of gathering gloom.

But their fears never came to pass. During a two-year period in which our competitors’ businesses shrank by over 40%, my company actually grew by half, albeit from a much smaller base. The reason is that our investors made a conscious decision to invest in the business at a time when that seemed like the last thing we should do. They believed that by providing customers with superior price/performance at a time of rampant budget-cutting, we would break from the pack. When the recovery arrived, we zoomed forward faster than we had ever thought possible.

It’s hard to think about investing right now. Conventional wisdom holds that a recession is a time to back off on investment and play it safe. Many big businesses will do that during the next few months.

But the problem with the conventional wisdom is that it’s so, well, conventional.  If everyone does the same thing, then nothing changes very much.  Markets are only transformed when businesses take advantage of competitive dislocation and redouble their efforts to change the rules.  The history of the IT industry is filled with examples.

In the last great US economic downturn in the early 1990s, companies like Oracle, Microsoft, Novell and Dell burst out of the pack with value propositions that were superior to those of much larger competitors.  At the time, corporate America was growing weary of custom solutions and wanted to buy more technology off-the-shelf.  Innovative companies answered that demand with attractively priced products that were acceptable, if not ideal alternatives to the more expensive leaders. They timed the market right and quickly reached positions of leadership.

Beating the Bubble

The bursting of the Internet bubble a few years back was no fun for anyone, but it triggered the remarkable rise of Google and a host of web-based services that created new platforms for innovation.  Customers had grown tired of maintaining shelves full of software they didn’t use, and on-demand platforms gave them an attractive alternative that was good enough and considerably cheaper than what they had previously used.

Even large companies can benefit from the self-examination that lean times bring. IBM confronted its own mortality in the recession of the early 90s and decided to transform its business.  Sure, the company went through plenty of cost-cutting, but it also maintained a commitment not to draw down its investment in research and development.  As the economy recovered in the mid-90s, those investments in innovation paid off in a stream of new products competitors couldn’t match. The company also took advantage of the recession to adjust its culture, delivering value in a few markets instead of trying to dominate them all.

In all the cases above there was one constant: hard times created a need for new value.  The companies that benefited were the ones that invented ways to deliver “good enough” value at significantly lower cost.  There’s no reason to believe this time will be any different. Citi Investment Research is forecasting that corporate IT spending in the U.S. will drop 10% to 20% in 2009. This pullback hurts established players most, but it’s an opportunity for upstarts.

If you compete with giants, I can guarantee you that they are now hunkered down in their foxholes waiting for this whole mess to end.  They’re playing defense, focusing on protecting their business and sustaining their profitability.  One thing they’re not focused on is you.  In fact, at times like these the last thing businesses tend to think about is their competitors and their customers.  That’s what makes now such a great time to move against them. For entrepreneurial businesses with the right attitude, the next year could be the best they’ve ever had.

I’m not trying to make a case to go on a spending spree. Hard times demand discipline. But they are also an opportunity to think of new ways to meet customers’ needs on a thinner budget. A rising tide lifts all boats, but a receding tide draws attention to the sailors who are bold enough to leave the harbor.

The Crime Economy

Is access to your corporate Web server worth $740?  That’s the average price a computer criminal pays today for information about a security flaw at a specific financial institution, according to a new report from Symantec.  While some exploits command as much as $3,000, information about other corporate security flaws is being sold for as little as $100.

That’s not the only example of corporate security on sale.  Hackers can purchase links to webpages that have known security vulnerabilities for about 40 cents per link in bundles of 500.  Or they can buy their own remote file inclusion (RFI) scanner for about $25 and identify those PHP-induced flaws themselves.

This information and much more is contained in a new report entitled “Symantec Report on the Underground Economy” that can be freely downloaded from Symantec’s website.  The 84-page document paints a picture of a vast marketplace that traffics in the tools and the spoils of computer crime, creating a recursive ecosystem that feeds upon its own success.

The report is hair-raising, not so much because it identifies new vulnerabilities in corporate information systems but because it documents the efficiency of the market that traffics in the tools and spoils of computer crime.

In this new underground economy, tens of thousands of anonymous entities advertise tools that can be purchased for modest sums and used to create spam attacks, phishing farms and direct assaults on corporate servers.  The people who buy these tools then sell the spoils of their work to brokers who remarket the information to other criminals.

Those groups may in turn produce bogus credit cards or orchestrate massive credit fraud and identity theft operations that cost businesses billions of dollars in losses.  One estimate put the cost of phishing attacks alone at $2.1 billion for US consumers and businesses in 2007.

Vulnerability for Sale

[Chart: vulnerability prices]

Source: Symantec

The electronic flea markets that enable this evil are networks of IRC servers and covert websites that visitors use to bid upon tools and information.  The average price of a botnet, for example, is just $225, and a single botnet can be rented out for use by other criminals to produce an income stream or specialized service. Hackers can buy all the security-busting software they want for less than $500. Who needs to be a technical expert any more?

Skilled professional criminals can rake in unbelievable sums.  Symantec estimates that one organization that specializes in phishing made $150 million in 2006 from stealing bank credentials alone.  Another operation that mass-produced counterfeit credit cards was reportedly earning up to $100,000 a day.

The disheartening message in these statistics is that the enemy of corporate security managers is no longer a script kiddie working in his basement but a vast and faceless network of entrepreneurs and arbitrage experts cooperating in a strikingly efficient marketplace with total anonymity.  In a one-year period, Symantec observed nearly 70,000 advertisers on various underground economy servers hosting more than 44 million messages.  These criminals are so active because the system works.  Computer crime has become, in effect, a vast peer-to-peer network. And as the recording industry has learned painfully, peer-to-peer networks are nearly impossible to stamp out.

If you’re hoping to hear about the magic pill to cure this problem, you’re out of luck. The Symantec report offers no advice, either. Instead, it documents the sophistication of a distributed operation that is financially motivated to constantly attack the institutions of commerce and government. Our only defense is to be buttoned down, well-educated and prepared for a long struggle.

The Technology Whitewater

One of the most valuable newsletters I receive is called Knowledge@Wharton, an information-packed digest of the latest insights from the faculty and associates of the University of Pennsylvania’s famed business school.  An interview this week caught my eye because it deals with tactics for adapting to today’s very challenging business environment.

Gregory Shea and Robert Gunther have written a new book called Your Job Survival Guide: A Manual for Thriving in Change, in which they draw a wonderful analogy between today’s chaotic business world and whitewater kayaking. The authors describe the current business environment as one of permanent whitewater, in which mass confusion reigns and few safe havens appear to exist.

Whitewater is scary to the uninitiated, but to seasoned kayakers, it’s the height of exhilaration.  Whitewater is fun because, while frightening, it doesn’t have to be dangerous if you know what to do.

Experienced kayakers know not to dive into whitewater and ride it to the end.  Instead, they navigate the rapids in stages.  Between bursts of activity, they pull off to the side in little pools called eddies, catch their breath and prepare for the next stage in the journey. The trip is basically a process of navigating between eddies.

What a wonderful analogy! The technology world is unquestionably chaotic right now, even without the financial meltdown. Constant change frustrates predictability. The idea of building a career by mastering a single discipline and applying it for decades is as dead as the manual typewriter.  Programmers know this.  Anyone who has navigated the industry’s migration from Perl to Python to Ruby on Rails, for example, knows that expertise in one discipline doesn’t guarantee long-term career success. However, a core expertise in scripting is valuable in all scenarios.

The key is to understand your core skills and to learn to apply them in areas where there is market demand.  Seek the eddies while constantly scanning the horizon for the next set of challenges.

A Personal Story

I’ll tell a personal anecdote that’s relevant.  I was trained as a journalist because that field was a natural outlet for my skills as a writer and storyteller. Early in my career, I discovered that the technology field offered the best opportunity to apply those skills.  That served me well for nearly 20 years, but in the late 1990s, the market began to change.  Print publishing was dying in the technology market, so I jumped to an Internet startup and spent the next six years learning the unique demands of that medium.

Upon striking out on my own in 2005, I quickly discovered that yet another new opportunity was emerging in the field of social media.  Millions of people, and many thousands of businesses, were going online to become, in effect, publishers. It was quickly evident to me that the disciplines I had learned in 20 years of technology journalism were very relevant in this new world.  People had the capacity to publish, but most of them lacked the skill to communicate in compelling ways.  The skills I had learned in 25 years of publishing were still relevant, even though the medium and the audience had completely changed. Today, I’m applying those skills in a manner that I couldn’t have imagined a decade ago.

The process has been scary at times, but it’s also been rewarding and never boring.  That brings me back to the analogy of whitewater kayaking.

Learn to Roll

In the interview with Wharton, the two authors talk of the Eskimo roll, which is a maneuver that kayakers learn to adjust to tumultuous conditions.  Instead of fighting rough water, veterans can strategically capsize their craft to protect themselves against obstacles.  For most boaters, capsizing is a disaster. But for kayakers it’s an essential coping skill, as long as they can fight the natural human tendency to panic.

Many of us are feeling a little capsized right now, and the panic reflex is kicking in.  Keep in mind that we are all wrestling with the same uncertainties and trying to figure out survival strategies. No one has the answers.  Focus on keeping your core skills sharp and apply them creatively to whatever opportunities this tumultuous market presents.  You will come out OK.  Although it might not hurt to be ready to do a few Eskimo rolls along the way.

The Cloud As Platform

Nearly a decade ago, a well-funded startup called Storage Networks promised to revolutionize the data center by moving enterprise storage into the cloud. Customers would keep their production data off-site in a highly secure facility and access it over the Internet. Unfortunately, the concept of cloud computing was unknown at the time, and the Internet itself was neither fast nor robust enough to permit large corporations to get comfortable with the idea. Storage Networks flamed out.

Now EMC is taking a run at a similar idea using the concept of cloud storage. Its technology, called Atmos, offers a glimpse of how far the cloud concept has come in a few short years and how its emergence as a new platform could drive a new wave of innovation.

As described by EMC, Atmos is a lot more than just a new breed of network storage.  The distributed technology uses an object model and inference engine to make intelligent decisions about where to store, copy and serve data.  With the world as its data center, Atmos is said to be able to flexibly move information to the point where it can be most efficiently served to the people who need it.  For example, if a cliffhanger election in Florida causes a surge of interest from local voters, election results data could be automatically routed to nearby servers.

Intelligent routing is just one of the intriguing ideas that the cloud supports, and it doesn’t have to apply just to storage.  In the future, virtual data centers will consist of computing resources spread around the globe.  Server power can be flexibly deployed to regions that need it. Backups could be administered at a high level. For example, an organization could specify dual redundant backups for some critical data but only a single backup for less important information.  When the entire fabric is virtualized, this kind of flexibility becomes part of the landscape.
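To make the idea concrete, here is a minimal sketch of what such a high-level backup policy might look like. The names and structure are entirely illustrative; EMC has published nothing suggesting Atmos works this way. The point is only that the administrator states an intent (how many copies each class of data deserves) and the platform works out the mechanics:

```python
# Toy illustration of intent-based backup policy, not an Atmos API.
# The administrator declares copies per data classification; the
# platform (here, a single lookup function) resolves everything else.

POLICY = {"critical": 2, "standard": 1}  # backup copies per class

def backup_copies(classification: str) -> int:
    """Return how many backup copies the policy requires.

    Unknown classifications fall back to the standard tier rather
    than failing, so new data classes get a safe default.
    """
    return POLICY.get(classification, POLICY["standard"])

print(backup_copies("critical"))  # 2 -- dual redundant backups
print(backup_copies("archive"))   # 1 -- falls back to standard tier
```

The appeal of this style is that the policy table is the entire administrative surface: changing a number in one place changes behavior across the whole virtualized fabric.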

At this point, Atmos is still brochureware, and EMC isn’t sharing any customer experiences.  But I think the concept is more important than the product. Very large and distributed cloud networks can theoretically provide users with almost unlimited flexibility and economies of scale. Systems management, which is an expensive and technical discipline that very few companies do well, could be centralized and provided to all users on the network.  Customers should be able to define policies using a simple dashboard and let the inference engine do the rest.

We are only in the early stages of realizing these possibilities, but the emergence of real-world cloud computing platforms will usher in a new era of innovation.  Platform shifts invariably do that. Coincidentally, NComputing this week will announce an appliance that turns a single desktop PC into as many as 11 virtual workstations.  The company claims that the technology lowers the cost per workstation to about $70.

When applied to a cloud of servers, you can imagine technology like this scaling much higher.  Instead of having to run around supporting hundreds of physical workstations, IT organizations would only have to worry about a few powerful servers providing virtual PC experiences to users.  Move those servers into the cloud, and you can begin to apply best-of-breed security, resource and systems management to each user. The economies of scale become very compelling very fast.

The biggest leaps in technology innovation take place whenever platforms shift.  The cloud is now beginning to come into its own as a legitimate platform. Things should get pretty exciting from here.

Assessing the Candidates on Technology

Tomorrow, Americans will choose between two presidential candidates who have very different ideologies. Although John McCain and Barack Obama both agree in principle on the need to improve the U.S.’s technology competitiveness, they disagree on some issues that are important to technology professionals. Here is an overview of their similarities and differences on some critical technology policy issues.

Technology education

Both candidates agree on the need to hire and train more teachers with technology skills as well as to improve the competitiveness of American students in science and technology.

McCain proposes to fully fund the Bush administration’s America COMPETES Act, which provides a variety of educational investments.

Obama supports doubling federal funding for basic research over ten years and promoting the National STEM Scholarship Database Act, which would create a database to coordinate information on financial aid opportunities available in science and technology.

Tax policy

Here, McCain is more specific than Obama.  His initiatives include:

  • Make the R&D tax credit permanent
  • Keep capital gains taxes low
  • Allow first-year expensing of new equipment
  • Oppose Internet taxation
  • Oppose higher taxes on wireless service
  • Lower the corporate tax rate to 25%

Obama also supports making the R&D tax credit permanent, but his tax plan is more oriented around individuals and families. He does support tax credits for small businesses and corporations that invest in jobs in the United States.

Government’s role

Both candidates believe government should be a standard-bearer for effective use of information technology, and each is quite specific about how to get there.

McCain promises to create a nationwide public safety network by the end of his first term that would support first responders in emergencies. He wants to set up more Cooperative Research and Development Agreements (CRADAs), in which industry and government enter into public/private projects. He believes more government services should be available online and that the government should use videoconferencing and collaborative networks more effectively. Finally, he proposes to “ensure that Administration appointees across the government have adequate experience and understanding of science, technology and innovation.”

Obama focuses on accountability, which he says has been lacking in the Bush administration. He pledges to use “cutting-edge technologies” to create a new level of transparency and accountability for government and to appoint the nation’s first Chief Technology Officer (CTO) to coordinate infrastructure, policies and services across federal agencies. He also pledges to reinvigorate antitrust prosecution.

Obama also proposes specifically to invest $10 billion per year over the next five years to create standards-based electronic health information systems, including electronic health records. He also seeks to invest $150 billion over the next ten years to enable American engineers, scientists and entrepreneurs to advance alternative energy.

Internet

The Obama campaign has probably made better use of the Internet as a campaign tool than any previous candidate.  Not surprisingly, Obama supports broad expansion of Internet access to every American. However, McCain’s objectives are similar in many ways.

McCain proposes to “encourage private investment to facilitate the build-out of infrastructure to provide high-speed Internet connectivity all over America…and allow local governments to offer such services, particularly when private industry fails to do so.” He also wants to establish a “People Connect Program” that rewards companies that offer high-speed Internet access services to low-income customers. He opposes “net neutrality” in favor of “an open marketplace with a variety of consumer choices.”

Obama proposes to provide “true broadband to every community in America through a combination of reform of the Universal Service Fund, better use of the nation’s wireless spectrum, promotion of next-generation facilities, technologies and applications, and new tax and loan incentives.” He also wants to give parents more control over information their children see on-line while vigorously enforcing laws against people who try to exploit children. He supports net neutrality.

Global trade

Both candidates want to see America become more competitive in overseas technology markets. Both support cracking down on intellectual property theft abroad.

McCain seeks to expand the number of H-1B visas to enable US companies to employ more foreign workers.

Reinventing U.S. Innovation

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

John Kao believes the United States has an innovation crisis, and he’s calling on today’s corps of young technology professionals to sound the alarm.

Citing technology pioneer Vannevar Bush’s assertion more than 60 years ago that “A nation that loses its science and technology will lose control of its destiny,” Kao said the U.S. is in peril of becoming a technology laggard.

“The US public education system is veering further away from preparing kids for the world,” the author of Innovation Nation: How America Is Losing Its Innovation Edge, Why It Matters, and What We Can Do to Get It Back told the MIT Enterprise Forum early this month. “We spend more on education than any country in the world, yet we’re between 24th and 29th in math performance.”

By contrast, Finland, a country that suffered a near economic collapse after the Soviet Union fell apart, today produces twice as many Ph.D.s per capita as the U.S. The Finns turned around their economy, in part, by creating a national agenda focused on science and technology education. As a result, “Two years ago, Finland was the number one competitive economy in the world, according to the World Economic Forum,” Kao said. “Its education system is rated the best in the world. People want to be teachers in Finland.”

We’ve heard this before, of course.  In the late 1980s, Japan famously challenged the US for leadership in technology innovation with initiatives like the Fifth Generation Computer Project and a nationwide commitment to developing artificial intelligence.  Those ambitious plans foundered, but Kao argues that this time is different.

Today, countries like Singapore and China are making technology innovation the centerpiece of a national strategy that’s backed by incentives, equipment and money. Singapore, for example, has set up the Biopolis, a $300 million biomedical research lab spread across a 10-acre campus. Singapore is allocating money to train expatriate scientists in other countries on the condition that they repay the government with six years of service. The country is also promising to remove the layers of bureaucracy and legal approvals that frustrate scientists in the U.S.

Singapore has convinced top researchers from MIT and the U.S. Centers for Disease Control to pull up stakes and move to the tiny nation-state with financial incentives and promises of academic freedom.

This last point is a key difference between the national technology policies of today and the failed models of two decades ago.  Thanks to the Internet and business globalization, countries now have the capacity to build very large local industries serving overseas customers. Kao told of a friend who’s building a global travel business for 1/10th of what it would have cost a decade ago. He farms out much of the development work overseas. “Countries want to ally with American intellectual capital,” he said.

Therein lies a challenge for US competitiveness. The United States has long been able to rely upon the global brain drain from other countries to fuel its innovation economy. Over half of the engineering Ph.D.s awarded in the U.S. now go to foreign-born students. Many of those people have traditionally settled in Silicon Valley or other technology-rich environments. But the lifestyle trade-offs aren’t as dramatic as they used to be. “Now there’s an Apple store and a Starbucks in Bangalore [India],” Kao said.

With overseas economies offering tax havens, comfortable salaries, research grants and other perks to technology achievers, some countries that used to lose talent to the US have actually reversed the migration.

What can US technology professionals do? Well, on a purely selfish level there may be some attractive opportunities to pull up stakes and move overseas. Singapore, for example, has earmarked a half billion dollars to fund digital imaging research. But if you’re more interested in improving the domestic situation, then start by voting for candidates who have a vision and a plan for US technology competitiveness.

You can also go out into the classroom and share your own experiences with tomorrow’s innovators. Many teachers would be glad for the help. In John Kao’s words, “Many people who teach math and science in U.S. public schools are forced to do it.” In contrast, “In China, people with master’s degrees in math willingly come in to teach in the schools.”

A Regulatory Boost for the Cloud

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

In a recent podcast interview on Tech Nation, Tim Sanders, author of Saving the World at Work, quotes a remarkable statistic: it’s been estimated that Google could save 750 megawatt-hours of electricity every year by changing the color of its ubiquitous homepage from white to gray. That’s because monitors require more electricity to energize the brighter white phosphors.

The total cost savings of roughly $75,000 a year may not convince Google to overhaul its site design, but the statistic drives home the effect that economies of scale can have in computing.
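The two figures line up with a quick back-of-envelope calculation, assuming a commercial electricity rate of roughly $0.10 per kilowatt-hour (the rate is an assumption for illustration; the article states only the totals):

```python
# Back-of-envelope check: does 750 MWh/year of saved electricity
# correspond to roughly $75,000/year in cost savings?
energy_saved_mwh = 750          # estimated annual savings, in megawatt-hours
rate_per_kwh = 0.10             # assumed commercial rate, dollars per kWh
kwh_per_mwh = 1000              # unit conversion

annual_savings = energy_saved_mwh * kwh_per_mwh * rate_per_kwh
print(f"${annual_savings:,.0f}")  # prints $75,000
```

At a dime per kilowatt-hour the numbers match exactly, which suggests that rate is close to what the original estimate assumed.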

There’s a lot of attention being paid to economies of scale these days as IT consumes a growing proportion of natural resources in the US. IT organizations are increasingly going to find themselves on the hot seat to go green not just because it’s the nice thing to do, but because it makes good sense for the bottom line.

Consider these facts:

  • The Environmental Protection Agency has estimated that data centers consume about 1.5% of all electricity in the US and that about a quarter of that energy is wasted;
  • Gartner estimates that more than 2% of global atmospheric carbon emissions can be traced to the IT industry;
  • Gartner expects that more than 925 million computers and one billion mobile phones will be discarded over the next two years;
  • The International Association of Electronics Recyclers estimates that about 400 million units of electronic refuse are generated annually. The actual amount of “e-trash” may be higher because nervous businesses have stockpiled old equipment rather than paying for disposal.

The twin effects of spiraling energy costs and environmental hazards are creating a double whammy. Underutilized computers are consuming increasingly expensive energy and also taking a greater toll on the environment. In Europe, businesses are working under a new standard called the Restriction of Hazardous Substances directive, which limits the use of dangerous chemicals in computers. US businesses are keeping a close eye on the directive, not only because it affects their European operations but because they see it as a model for similar legislation here.

Some very large businesses are beginning to make green computing part of their core corporate values. An article by the University of Pennsylvania’s Wharton School tells of Bangalore-based IT services firm Wipro Technologies’ energy-efficiency mandates. It tracks metrics like carbon dioxide emissions and paper consumption per employee and is outfitting workers with a new kind of energy-efficient workstation.

In the US, the most promising option is hosted or “cloud” computing. This takes advantage of the excess capacity that exists in giant data centers run by companies like Amazon and IBM. Why let that spare power go up in smoke if there are small businesses that can tap into it?

Increasingly, they are. On Palo Alto’s Sand Hill Road venture capital corridor, technology entrepreneurs who propose to run their businesses on internal servers are referred to as having a “Hummer” strategy. In other words, they’re consuming far more power than they need to drive their computing environment. New software and services firms are bypassing captive data centers and opting to farm out everything to third parties. Technology entrepreneur John Landry recently told me that the cost of hosted computing is coming down so fast that it no longer makes sense for a new business to even consider investing in a captive data center. In other words, it will never be cheaper to manage the infrastructure yourself.

In a consolidated hosting environment, every tenant benefits from the economies of scale provided by the host. What’s more, the arrangement shifts responsibility for technology recycling and disposal to a central entity that has a vested interest in best practices. As the cost of energy continues its inevitable rise and legislators stump for stronger regulation, the appeal of the cloud only grows.