Innovation Through Precision

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Does the following scenario sound familiar to you? An internal customer has come to you with a problem.  Her group is consistently missing deadlines because of poor communication.  E-mails frequently go unread for days, group members don’t respond to questions in a timely fashion and too many meetings are required to get everybody back on track.  The manager read in an airline magazine about wikis, and thinks they are the perfect solution.  She wants you to get one up and running as quickly as possible.

What do you do? Experienced project managers will tell you that the last thing you should do is install a wiki. Better solutions may be available, and your job as an IT professional is to analyze the needs the manager has described and identify the most appropriate solution.

Tony Ulwick and his team at Strategyn would tell you to take one more step back. They’d see all kinds of problems in the needs statement that was just presented. For example, words like “consistently,” “frequently,” “timely” and “too many” are vague and subjective. Furthermore, the solution that the manager seeks — better performance against deadlines — may fall far short of the bigger goal of improving group performance. You need to define the problem better before tackling a solution.

Optimizing inputs

Strategyn specializes in helping companies optimize customer inputs to improve innovation. Ulwick, who has published widely on this topic, believes that most projects fail not because customers don’t understand the problem but because the people trying to solve the problem don’t ask the right questions.  Strategyn’s methodology starts with helping stakeholders define needs very specifically so that vendors and internal service organizations can innovate from them. That means discarding adjectives and subjective statements, talking about jobs instead of outcomes and using very specific terms.

Over the next two blog entries, I’ll present an interview with Tony Ulwick and Lance Bettencourt, a senior adviser at Strategyn.  You can also find some helpful free white papers on this subject at the Strategyn website (you need to register to view them).

Q: You say businesses often respond to perceptions of customer need rather than actual defined needs. What are some governance principles you believe internal services organizations can embrace to address these needs? Is a structured approach to needs definition necessary?

Bettencourt: Businesses try to respond to too many customer needs because they don’t know which needs are most unmet, so they hedge their bets. An organization must have a clear understanding of what a need is. Without a clear understanding and a structured approach to needs definition, anything that the customer says can pass for a need.

Many organizations are trying to hit phantom needs targets because they include solutions and specifications in their needs statements, use vague quality descriptors and look for high-level benefits that provide little specific direction for innovation.

Customer needs should relate to the job the customer is trying to get done; they should not include a solution or features. They should not use ambiguous terms. They should be as specific and consistent as possible to what the customer is trying to achieve. It’s important that a diverse body of customers be included in this research because different needs are salient to different customers. The ultimate goal is to capture all possible needs.

Q: Your advice focuses a lot on terminology. At times, your recommendations are as much an English lesson as a prescription for innovation! Why the emphasis on terms?

Ulwick: What distinguishes a good need statement from a bad need statement is not proper English, but precision. If a so-called need statement includes a solution, for example, then it narrows the scope of innovation to something the customer is currently using rather than what the customer is trying to get done.

For example, if a need statement includes ambiguous words such as “reliable,” then it undermines innovation in multiple ways. Different customers won’t agree on what that means when, say, printing a document. Or they won’t agree on what “reliable” means in general. This leads to internal confusion and ultimately to solutions that may not address the actual unmet need. Customer need statements have to be precise and focused if you want to arrive at an innovative solution.

Q: You suggest focusing on the job. What’s the definition of “job” for these purposes?

Ulwick: We mean the goal the customer is trying to accomplish or the problem the customer is trying to solve. A job is an activity or a process, so we always start with an action verb when we’re creating a job statement. Positions can’t be jobs. In fact, there may be multiple distinct jobs associated with a given position. For innovation purposes, “job” is also not restricted to employees. We’re not just talking about people trying to get their work done, but all their daily activities.

The best way to get at a clear definition of the job is to begin with the innovation objectives of the organization or, in your example, the IT department. If the goal is to create new innovations in an area where there are already solutions in place, then the organization should understand what jobs the customer is trying to get done with current solutions. If the goal is to extend the product line into new areas, then the definition should begin by understanding what jobs the customer is trying to do in a different but adjacent space. Defining the job helps the organization to achieve its innovation objectives.

Next week, Tony Ulwick and Lance Bettencourt tell how development organizations can ask the right questions to assess customer needs.

Utility Computing Train is Coming, But It May Be Late to Your Station

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

The move to utility or “cloud” computing shows every sign of reaching critical mass over the next couple of years.  But it won’t be driven by corporate data centers.  The momentum, instead, is coming from two factors that increasingly dictate the pace of innovation: startups and fear.

In 1991, noted technology columnist Stewart Alsop wrote, “I predict that the last mainframe will be unplugged on 15 March 1996.”  Yet as of last year, there were still 10,000 mainframes running worldwide, according to IBM.  Was Alsop wrong? Technically, yes, but the shift that he foresaw is happening.  It’s just being driven by different factors than he expected.

Technology innovation today follows a strikingly consistent pattern. New companies with no legacy base make the switch first, while the people with the most to lose are the last to change; they jump on board only when they discover that the new technology addresses a significant pain point.

Both forces are evident today in utility computing. Robert Scoble wrote persuasively last November about the “serverless” Internet company. His comments were prompted by a meeting with the CEO of Mogulus, a streaming video firm that claims not to own a single server. What interested me most about Scoble’s post was the 65 comments that followed. Many are from other small companies that are building IT infrastructure from the ground up without servers. Some of these companies are offering high-bandwidth services on a very large scale, demonstrating that scalability and reliability aren’t a problem. In fact, any startup business today should look first at outsourcing its IT infrastructure before investing in a single square foot of computer room space.

Meanwhile, utility services are actually achieving critical mass in a corner of the mainstream corporate IT market: storage. Services like Amazon’s S3 now have well over 300,000 customers.  EMC just joined the fray by launching an online backup service and hiring a top former Microsoft executive to lead its cloud computing initiative.

The storage industry has been a technology innovator recently because storage is a major pain point for many companies.  With capacity requirements expanding at 30% to 50% annually, people are desperate to do something to manage that growth.

The rapid adoption of utility computing seems likely to continue, but with a curve that looks like a landscape of the Himalayan mountains.  In some segments of the market — like startups — utility infrastructures will become the status quo.  In others — like corporate data centers — adoption will come only as the technology addresses specific pain points.

This jagged adoption curve is why there’s so much debate today over the future of the cloud.  Contrast Scoble’s observations, for example, with a recent CIO Insight article in which a CTO outlines his reservations about cloud computing or a CIO Insight reader forum where IT managers take issue with Nicholas Carr’s forecast that IT will increasingly become a central utility.

This debate is happening because the need for utility computing is not perceived to be compelling in all cases. Perhaps this is why Gartner predicts that early technology adopters will purchase 40% of their IT infrastructure as a service by 2011, which means that the other 60% will still be acquired conventionally.

The utility computing train is coming, but its arrival won’t occur at the same time for all organizations. Check your local schedule.

The Connected Generation

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

The concept of “presence” is altering business communications. Will you be ready to speak to the next generation of professionals on their own terms?

The chart below should tell you a lot about the future of the workplace. For the past three years, the Associated Press and America Online have measured the use of e-mail and instant messaging (IM) by teens and adults.  In all three surveys, the results have been similar: usage patterns are nearly reversed between the two groups, with teens overwhelmingly preferring IM.

[Chart: e-mail vs. instant messaging usage among teens and adults. Source: Associated Press/America Online, Nov. 2007]

Why?  In part, teens admit, it’s to avoid confrontation and embarrassment by taking the face-to-face element out of awkward situations.  But equally important is that IM reflects teenagers’ always-connected lifestyles.  IM is instantaneous, requires little forethought and lends itself well to use with cell phones, the ubiquitous teen accessory.

Today’s young people expect that their friends will always be available to them, regardless of where they are. Teens are no less communicative than their parents; 50% of them use IM more than an hour a day, compared to 24% of adults, according to the poll. It’s just that the nature of their communications is different. They don’t have time to get to a computer or to carefully compose their thoughts before stating them. When they have something to say, they want to just say it.

Experts call this “presence”: our availability and our preferred communications media are a matter of record to the people who need to reach us, whether they’re family, colleagues or customers. Presence reflects the fact that people are no longer anchored to their desks. They work at home or on the road much of the time. Location is a critical element of presence. Increasingly, our online profiles will include up-to-date information on where we are and how available we want to be. Sophisticated cell phone tracking technology and global positioning systems may even make this transparent to us.

Presence will redefine workplace communications. The New York Times recently reported on the evolution of social networking to include cell phones, which are the primary Internet access points for most of the developed world.  Many of these services factor location into member profiles.  At work, we will need to broadcast our location constantly, since the hyperactive business world no longer tolerates delay.  Corporate directories are evolving to include rich information about people’s background and expertise, along with the means to tap into their knowledge whenever someone in the organization needs it.

The trick will be to balance our need for concentration with the requirement of availability.  I imagine many people blanch at the idea that they would be expected to turn on a dime anytime someone else in their organization needed a question answered. That’s a problem that will be addressed through standards of conduct that will emerge as the technology takes hold. However, the new rules are becoming clear. The next generation of business professionals won’t tolerate the delays that are inherent in business communications.  They are always online, and they expect their colleagues to be there as well.

Tips For a Greener – And Cheaper – Data Center

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

The EPA estimates that data centers eat up about 1.5% of all electricity in the United States and that nearly a quarter of that power is wasted.

As I noted last week, energy waste is one of the dirty little secrets of corporate data centers.  Add on to that the money lost due to PCs sitting idle overnight and the waste inherent in abandoned and underutilized servers and you have a lot of money just sitting out there waiting to be found.

Energy saving is mostly a matter of common sense. The simplest approach may be to start, literally, at the ground floor. If you peer under the raised flooring in your data center, you’ll probably find pockets of cables clustered together that could be inhibiting air flow. By re-cabling and deploying vented floor tiles in strategic locations, you can cut energy waste with almost zero capital expense. Many consultants now specialize in this area, and bringing in an expert can save time and money in the long run. This article tells the story of one company that saved $1 million annually in power costs through this simple measure.

The next step is to analyze your server use to determine what can be shut down or consolidated. One speaker at last fall’s Data Center World conference proposed a radical idea: if you don’t know who’s using a server, shut it down. Mark Monroe, Sun Microsystems’ director of sustainable computing, said that his group tried this approach and discovered that nearly 12% of its servers weren’t being used for anything. The application owners had moved on, and no one had bothered to shut the machines down.

Consolidate the servers you’re using into one location and direct your cooling resources to that hotspot. Use virtualization to pack more virtual servers onto fewer physical ones. By some estimates, about 70% of servers in the data center support only one application, and utilization rates of less than 15% on a single server are not uncommon. These are ideal candidates for virtualization.
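
To put rough numbers on that, here is a back-of-envelope sketch in Python. The figures are purely illustrative assumptions, not data from any study:

    import math

    def hosts_needed(server_count, avg_utilization, target_utilization=0.60):
        """Estimate physical hosts required if each lightly used server becomes
        a virtual machine and hosts are loaded to a target utilization."""
        return math.ceil(server_count * avg_utilization / target_utilization)

    # Forty servers idling at 15% utilization could, in principle, share ten hosts.
    # print(hosts_needed(40, 0.15))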

Air conditioning is responsible for the greatest energy waste. The problem is that most data center managers don’t know where all their hotspots are, so they take a brute-force approach and cool the entire data center to a uniform level. The reality is that the hottest servers probably occupy only a small fraction of the floor space.

While high-density servers can consume less power overall than the individual machines they replace, there’s no reason to structure your cooling plan around the needs of maybe 10% of your hardware. Several vendors now sell server racks that are optimized for cooling.  Also, the water cooling technique that was common in the mainframe days two decades ago is staging a revival as server consolidation comes back into vogue.

Crank up the heat

Once you’ve isolated your most power-hungry servers, turn up the thermostat for the rest of them. Most servers can operate perfectly well at temperatures as high as 100°F (be sure to check with your supplier before trying that, though), and each 1°F increase in temperature can save about 4% in energy costs, according to Sun’s Monroe.
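
As a rough illustration of that rule of thumb, here is a quick Python sketch. It treats the savings as simply linear per degree, which is a back-of-envelope assumption rather than a vendor formula:

    def cooling_savings(current_setpoint_f, new_setpoint_f, savings_per_degree=0.04):
        """Rough fractional savings in cooling energy from raising the thermostat,
        using the roughly-4%-per-degree figure cited above."""
        return max(0.0, new_setpoint_f - current_setpoint_f) * savings_per_degree

    # Raising the setpoint from 68F to 75F suggests on the order of 28% lower cooling cost.
    # print(cooling_savings(68, 75))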

You should also become familiar with the EPA’s Energy Star initiative. This program sets standards for efficient energy use and publishes lists of products that meet them. Needless to say, computers are a major Energy Star focus. Did you know, for example, that the EPA estimates enabling the basic power management features that come with the Windows operating system can save up to $75 per computer per year? While there are sometimes legitimate reasons to leave PCs on overnight, simple open-source network tools let systems managers shut down unused computers and still power them back on when needed. The Energy Star website has a list of software tools for remote power management as well as a power management savings calculator.
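
One common way those tools power machines back up over the network is Wake-on-LAN. Here is a minimal Python sketch of the standard magic packet, assuming the network cards have Wake-on-LAN enabled; the MAC address shown is a placeholder:

    import socket

    def send_wake_on_lan(mac, broadcast_ip="255.255.255.255", port=9):
        """Send a Wake-on-LAN magic packet: 6 bytes of 0xFF followed by the
        target MAC address repeated 16 times, broadcast over UDP."""
        mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
        if len(mac_bytes) != 6:
            raise ValueError("expected a 6-byte MAC address such as 00:11:22:33:44:55")
        packet = b"\xff" * 6 + mac_bytes * 16
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            sock.sendto(packet, (broadcast_ip, port))

    # Example: wake a PC that was shut down overnight (placeholder MAC address).
    # send_wake_on_lan("00:11:22:33:44:55")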

A somewhat more radical option is to outsource all or part of the data center.  While there are many factors involved in this decision, the potential energy savings of such a move shouldn’t be underestimated.  In the case of total data center outsourcing, contractors should be able to provide you with power savings estimates to factor into your calculations.  Amazon’s S3 storage service is one of many specialized offerings that are emerging. Amazon sells cheap off-site data storage. One of its appeals is that users don’t need to pay for — and cool — on-site storage area networks.
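
To give a sense of how little code off-site storage can involve, here is a hedged sketch using the AWS SDK for Python (boto3), a library that post-dates this article. The bucket name and paths are placeholders, and a real deployment would add encryption, retries and lifecycle policies:

    import boto3  # AWS SDK for Python

    def archive_backup(local_path, bucket, key):
        """Push a backup file to Amazon S3 instead of an on-site storage array."""
        s3 = boto3.client("s3")
        s3.upload_file(local_path, bucket, key)

    # Example (all names are placeholders):
    # archive_backup("/backups/nightly.tar.gz", "example-offsite-backups", "nightly/2008-03-14.tar.gz")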

Most technology vendors now have green initiatives, and you should become familiar with what your key vendors are doing. For example, IBM has made its own IT operations a showcase of energy efficiency. In the course of consolidating 155 data centers worldwide down to just seven, it’s cut operational costs by $1.5 billion. This podcast tells more.

What are you doing to save energy in your data center? Share your suggestions in the comments section below.

Computer Industry Finally Going Green

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

The graphic at right may look kind of cool, but it’s anything but. It’s actually a simulation of the heat distribution of a typical data center, prepared by Innovative Research, a computational fluid dynamics company. It demonstrates graphically what all data center managers already know: the data center is nearly impossible to keep cool.

Unfortunately, this fact is costing us a fortune.  As the price of oil breaches $100 a barrel, new attention is being focused on the possibilities of wringing big savings out of data centers by attacking their notoriously lousy energy efficiency.  Some stats:

  • The amount of electricity consumed by US data centers doubled between 2000 and 2006 and is expected to double again by 2011, according to the U.S. Environmental Protection Agency (EPA).
  • A typical 50,000-square-foot data center consumes about 57 barrels of oil per day.
  • Data centers consume 1.5% of all electricity in the U.S., the EPA says.
  • About 40% of the power used by data centers goes to cooling, according to several estimates. About 60% of that expense is wasted, however, because of what you see in the graphic to the left.  Data center heat distribution is extremely erratic and spot cooling is complicated. Instead, companies use brute force and over-cool most of their equipment just to be sure the hottest machines don’t melt.
  • Over half the power that companies use to run their desktop computers is wasted because the machines aren’t shut off overnight or don’t power down when not in use, according to ClimateSaversComputing.org.  Most companies could save between $10 and $50 per PC per year by using basic power management software, according to Greener Computing. That adds up.

These numbers are deplorable, but research by Network World identified an interesting explanation. Its survey found that 68% of IT manager respondents weren’t responsible for their energy bills. In most cases, those costs were paid by the facilities department. If IT never even sees the electric bill, it has no incentive to reduce it.

There is good news. Data centers are getting unprecedented attention right now as sources of significant cost savings, even if it’s only because there’s so much room for improvement. A recent PricewaterhouseCoopers study found that 60% of 150 senior executive respondents rated energy costs as a top priority, which means their IT managers will be getting an e-mail. IBM has made green data centers a key part of its marketing strategy. Dell recently launched an international competition to design technology products with a greener focus. Then there’s ClimateSaversComputing.org, an initiative sponsored by Google and Intel in which technology providers agree to hit certain energy-consumption targets.

Members of the Technology CEO Council were in Washington just a few weeks ago to pitch the idea that investments in IT can save energy. While their agenda was self-serving, there’s no question that the industry as a whole is turning its attention to fixing this mess.

And it’s such an obvious mess to fix. Whether your motivations are the rapid payback, the positive environmental impact or the simple satisfaction of knowing that you’re not flushing money down the drain, why wouldn’t you want to make your IT operation more power-efficient? Next week, we’ll look at a few ideas for just how to do it.

Reinventing U.S. Innovation

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

John Kao believes the United States has an innovation crisis, and he’s calling on today’s corps of young technology professionals to sound the alarm.

Citing technology pioneer Vannevar Bush’s assertion more than 60 years ago that “A nation that loses its science and technology will lose control of its destiny,” Kao said the U.S. is in peril of becoming a technology laggard.

“The US public education system is veering further away from preparing kids for the world,” the author of Innovation Nation: How America Is Losing Its Innovation Edge, Why It Matters, and What We Can Do to Get It Back told the MIT Enterprise Forum early this month. “We spend more on education than any country in the world, yet we’re between 24th and 29th in math performance.”

By contrast, Finland, a country that suffered a near economic collapse after the Soviet Union fell apart, today produces twice as many Ph.D.s per capita as the U.S. The Finns turned their economy around, in part, by creating a national design focused on science and technology education. As a result, “Two years ago, Finland was the number one competitive economy in the world, according to the World Economic Forum,” Kao said. “Its education system is rated the best in the world. People want to be teachers in Finland.”

We’ve heard this before, of course.  In the late 1980s, Japan famously challenged the US for leadership in technology innovation with initiatives like the Fifth Generation Computer Project and a nationwide commitment to developing artificial intelligence.  Those ambitious plans foundered, but Kao argues that this time is different.

Today, countries like Singapore and China are making technology innovation the centerpiece of a national strategy that’s backed by incentives, equipment and money. Singapore, for example, has set up the Biopolis, a $300 million biomedical research lab spread across a 10-acre campus. Singapore is allocating money to train expatriate scientists in other countries on the condition that they repay the government with six years of service. The country is also promising to remove the layers of bureaucracy and legal approvals that frustrate scientists in the U.S.

Singapore has convinced top researchers from MIT and the U.S. Centers for Disease Control to pull up stakes and move to the tiny nation-state with financial incentives and promises of academic freedom.

This last point is a key difference between the national technology policies of today and the failed models of two decades ago.  Thanks to the Internet and business globalization, countries now have the capacity to build very large local industries serving overseas customers. Kao told of a friend who’s building a global travel business for 1/10th of what it would have cost a decade ago. He farms out much of the development work overseas. “Countries want to ally with American intellectual capital,” he said.

Therein lies a challenge for US competitiveness. The United States has long been able to rely upon the global brain drain from other countries to fuel its innovation economy. Over half of the engineering Ph.D.s awarded in the U.S. now go to foreign-born students. Many of those people have traditionally settled in Silicon Valley or other technology-rich environments. But the lifestyle trade-offs aren’t as dramatic as they used to be. “Now there’s an Apple store and a Starbucks in Bangalore [India],“ Kao said.

With overseas economies offering tax havens, comfortable salaries, research grants and other perks to technology achievers, some countries that used to lose talent to the US have actually reversed the migration.

What can US technology professionals do? Well, on a purely selfish level, there may be some attractive opportunities to pull up stakes and move overseas. Singapore, for example, has earmarked half a billion dollars to fund digital imaging research. But if you’re more interested in improving the domestic situation, then start by voting for candidates who have a vision and a plan for US technology competitiveness.

You can also go out into the classroom and share your own experiences with tomorrow’s innovators. Many teachers would be glad for the help. In John Kao’s words, “Many people who teach math and science in U.S. public schools are forced to do it.” In contrast, “In China, people with master’s degrees in math willingly come in to teach in the schools.”

Wisdom of Crowds, Yes; Democratic Innovation, No

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Technology makes it possible to involve customers intimately in product development, but experts must still make the decisions.

In 2003, two Australian entrepreneurs accomplished something no one thought was possible. Knowing nothing about the business of brewing and distributing beer, they successfully penetrated the duopolistic Australian beer market and, in four years, built a base of 50,000 customers in 46 nations. Brewtopia, which is now a publicly traded company, has since expanded into bottled water and soft drinks.

The secret to their success was their customers. The two founders set up a Web site and invited beer enthusiasts to vote on everything from the style and alcoholic content of the beer to the design of the labels.

Their inspiration was the story of PK-35, a Finnish soccer team. PK-35’s coach tried an experiment, asking fans to vote on nearly every aspect of the team’s operations, even its on-field strategy. What Brewtopia’s founders didn’t know was that the results of the soccer experiment were so bad that the coach was fired and the idea was scrapped after just one season.

Both of these stories are related in an inspiring and entertaining new book, We Are Smarter Than Me, by Barry Libert, Jon Spector “and thousands of contributors.” Using anecdotes and homespun logic, the authors make a compelling case for involving customers directly in a business’s product design and strategic direction. This idea is all the rage today, thanks to visible initiatives like Procter & Gamble’s pledge to derive half of its new product ideas from its customers by the end of the decade.

IT’s Central Role

IT organizations will increasingly find themselves at the center of these customer campaigns. That’s because only robust technology can effectively harness the contributions of thousands — or millions — of voices.

This is an exciting place for IT folks to be: at the center of corporate strategy. But it’s also an arena that demands discipline. As the soccer experiment demonstrated, community governance is not always the best strategy.

Many business executives will be enchanted by the concepts described in this book and will quickly ask their technology groups to set up forums and voting sites to accept customer contributions. The technology side of this challenge is fairly straightforward, but the business implications aren’t so simple.

Customer input is always desirable, but management by consensus or election isn’t. We know, for example, that democracy is a superior form of government, yet voters sometimes elect terrible leaders.

Businesses aren’t governments. They don’t exist to serve the public good, and they need to make difficult decisions in order to remain viable. Customer input needs to be tested and weighed against business objectives.

Companies that are successfully experimenting with this “crowd-sourcing” concept are finding ways to achieve that balance. Dell Computer, for example, has set up a Web site called IdeaStorm, which invites comments from customers and applies community voting tactics and discussion forums to developing innovations.

P&G has tapped into idea networks like Innocentive to get advice and solve problems. It has had spectacular results, but ideas from the community are still vetted by experts within the company for viability and relevance.

Customer innovation is an exciting opportunity for IT organizations to contribute to the business, but it’s important to resist the temptation to simply throw technology at the problem. Innovative organizations will seek to stretch the limits of what technology can do in order to sift through a mountain of suggestions. But at some point, human beings must still step in and make decisions.

What We Can Learn From Web 2.0 Innovation

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Few icons of the so-called Web 2.0 revolution are as visible – or as controversial – as Wikipedia.org. This massive (11th most popular site on the Internet, according to Alexa.com) collectively edited encyclopedia has been hailed as both a shining example of collective wisdom and a chaotic cesspool of half-truths and misinformation.

One thing is for sure: Wikipedia is a bold experiment in transparent product design, and its lessons shouldn’t be lost on business. Whether you like Wikipedia or not, you can’t deny that its open-development model is just one very visible example of changes that are sweeping business.

David Weinberger spoke last week of a remarkable irony about Wikipedia: by admitting to the shortcomings of its community-edited content model, the encyclopedia actually makes itself more credible. Weinberger is a co-author of The Cluetrain Manifesto, a 1999 essay that is generally accepted as the Declaration of Independence for Web 2.0. In a keynote address to the New Communications Forum in Las Vegas, he praised the culture of transparency that pervades Wikipedia, open-source software and the emerging open-development paradigm.

Wikipedia’s flaws enhance its credibility because it is so open about them, he said. Every entry into Wikipedia – including changes to facts, grammar and punctuation – is logged and frequently commented upon in the accompanying documentation pages. These history logs can be epic; for example, more than 1,000 revisions have been made to the entry on Saddam Hussein just since the first of this year. But the result is a window on the product development process that attempts to obscure nothing. Any flaws are right out there for anyone else to find and correct.

Wikipedia “is more interested in informing us than speaking as the voice of God,” Weinberger said, referring to the opaque process by which traditional publishers create their products. You’ll never see error logs or disclaimers in a daily newspaper or printed encyclopedia, he noted, because to expose mistakes implies fallibility. However, “The attempt to be infallible drives out credibility.” Human beings instinctively relate to human foibles. An organization that exposes its weaknesses and seeks help is more credible than one that covers them up.

You don’t have to be a publisher to see wisdom in these words or the creative potential that interactive media can unlock. Not long ago, software developers built their products under a shroud of secrecy and non-disclosure agreements. It was as if only a select few people in the inner circle had the wisdom to innovate.

Today, the new breed of software developers is learning that their “public beta” programs inspire customers to contribute useful suggestions to a process that is never ending and to create products that constantly improve. For example, Ambient Devices, a maker of what it calls “glanceable” information displays, publishes detailed technical specifications of its products online and invites customers to improve its products. The company will even tell you how to build its product from scratch without paying Ambient a dime.

This kind of openness is an early-stage trend being pioneered by the technology markets, but it’s not hard to see the idea spreading into other spheres. Who will be the first auto maker to create a beta program for a new line of cars by posting specs and asking for input? For that matter, why would anyone want to try to keep new products secret any more, when so much creative energy exists out in the field?

Businesses have all but lost the ability to keep secrets. Why not consider turning a problem into a virtue by inviting comments on your ideas? You might find it enhances your credibility.

Customer Innovation Can Spearhead New Product Ideas

Go to Google Maps Mania and prepare to be amazed. This labor-of-love site, which is run by Mike Pegg, a Canadian software sales manager, has logged thousands of innovative mashups based on Google Maps. It’s an impressive testament to Google Maps’ flexibility and a great ad for the service.

But you don’t have to be a high-tech company to tap into customer enthusiasm. Karmaloop is a Boston-based maker of ultra-hip street clothes. It has a program that enlists enthusiastic customers to become reps for the company, earning points for referring friends to buy clothes. The company also encourages reps to snap photos of cool new fashion ideas that they see on the street and upload them to a website. In making that feature available, Karmaloop actually offloads some of its product development costs to its customers.

Welcome to the new world of customer engagement. Author and consultant Patty Seybold calls it Outside Innovation, and she’s written a book by the same title. Seybold believes that it’s increasingly practical and desirable for businesses to encourage customers to innovate around their products using rich interactive media. She believes that many companies should be managing as much as half of their new product development this way.

That may be a difficult concept for a lot of businesses to accept, but once you get your brain around the idea, it’s exciting. In her book, Seybold cites Lego Mindstorms, a line of programmable robots. “Within two weeks after the retail product hit the market in 1998, adult hackers reverse-engineered the firmware and developed a number of additional software programs that could be used to program these robots,” she wrote on her blog. “And, a small industry emerged of sensors and peripherals that could be added to these robots. Lego encouraged the customer-extensions to the product line, giving hackers a license to extend its software and firmware and encouraging a healthy ecosystem.”

You don’t have to be a software company to involve your customers. Fidelity Investments maintains a site, FidelityLabs.com, where customers can try out services the company is considering, including a search engine tuned for financial content and a service that finds free checking accounts. Fidelity is being widely praised as an innovator for opening the corporate kimono in this way.

It takes guts to let customers hack your inventions, and the idea doesn’t sit well with every company. Toyota Prius customers have hacked the car’s software to drive fuel efficiency as high as 100 miles per gallon. They’ve also come up with ways to work around Toyota factory settings that, for example, disable the visual navigation system while the car is in motion.

Toyota has discouraged this practice and, given that the company’s products are made to move at 70 mph, you can understand their concern. Nevertheless, I’d be surprised if the company’s engineers haven’t learned a few things from the hacks applied by customers.

The culture of secrecy and proprietary thinking that has pervaded corporations for many years is finally giving way to a new realization that companies don’t have a monopoly on innovative ideas. In fact, those innovations often come from customers themselves. This attitude is epitomized in the practices of Web 2.0 software companies, many of which openly encourage customers to enhance and extend their products. But as the examples of Lego, Fidelity and Karmaloop demonstrate, even mature businesses can find gold in customer innovation.

What do you think? How can your company deputize customers to help you develop new ideas? Contribute your ideas in the comments section below.

Tips and Tricks for Raising Your Online Visibility

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

One of the least expensive and most effective ways to boost your market presence these days is through search engine optimization (SEO). SEO involves using a few basic tricks to make it easier for search engines like Google and Yahoo! to find you and elevate your site in their search results. SEO should be a check-off item for any business; however, a lot of people barely even know what it is. Here are some basics.

It helps to understand first why SEO is important. When people search on terms that are relevant to your business, they’re demonstrating an active interest in something you sell. What better way to identify prospects than by popping up on their screens at exactly the moment they express interest in your product or service?

We’ll use Google as an example, but all major search engines use the same basic tactics these days. Google’s ranking algorithm, of which PageRank is the best-known component, is proprietary, but some basics are understood about it. For one thing, it looks hard at page titles, the labels that appear in your browser’s title bar. The more specific the title, the better the engine likes it. This means that, all other things being equal, a webpage titled “All About Bowling Balls” will perform better on a query about bowling balls than one titled “Resources for the Avid Bowler,” even if both pages have exactly the same content.
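
Because the page title lives in the HTML rather than in the visible page body, it’s worth checking what your own pages actually declare. Here is a small Python sketch, using only standard-library modules, that pulls the title text a crawler would see; the example URL is a placeholder:

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class TitleParser(HTMLParser):
        """Collect the text inside the first <title> element of a page."""
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self.in_title = True

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data

    def page_title(url):
        """Fetch a page and return the title text a search engine would index."""
        parser = TitleParser()
        with urlopen(url) as response:
            parser.feed(response.read().decode("utf-8", errors="replace"))
        return parser.title.strip()

    # Example (placeholder URL):
    # print(page_title("https://example.com/all-about-bowling-balls"))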

This importance of page titles is one of the reasons blogging can be so effective for your business. Most blog software uses the title of an individual blog posting as the page title, and blogs are collections of individual pages.  This means that if you are careful to label your entries appropriately, you can move very high in the rankings very fast. If you work in a very focused field –  nanotubes, for example – and you know your customers are searching on that term a lot, you should be sure to include “nanotubes” in as many page titles as possible.

Google also gives priority to pages that it believes to be explanatory or educational in nature over product listing pages. This is why your online catalog may perform very poorly in search results, while an article about how to use your product may do quite well.  Keep this in mind when developing site content.  A nice collection of “how to“ articles will serve you well, helping to drive traffic to your product pages.

Perhaps the best-known innovation in Google is link popularity. All major search engines now use this technique in somewhat different forms. This proprietary and frequently changing algorithm assigns extra weight to pages that are linked to by a lot of other pages. Link popularity is an imperfect formula that lends itself to manipulation (a Googlebomb is one of the more creative exploits), and it is constantly changed by the search engine companies for that reason.

Nevertheless, some basic principles are common. A page that is linked to by many other pages in different domains (links within a domain aren’t counted) will rise in the rankings above a page with fewer inbound links. The quality of the link matters, too: spam blogs are bogus sites set up specifically to influence link popularity rankings, and search engines are learning to filter out this junk quickly. Your best bet is always to post something that people in your field will find valuable, and then alert other site owners to it and ask for a link. The new breed of blog search engines like Technorati rely heavily on this metric as an indication of a blog’s popularity.
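
The published PageRank paper gives a flavor of how link popularity can be computed, even though the production algorithms are proprietary and far more elaborate. Here is a deliberately simplified Python sketch of the idea, run on a toy link graph rather than anything a search engine actually uses:

    def pagerank(links, damping=0.85, iterations=50):
        """Toy PageRank: 'links' maps each page to the pages it links to.
        Each page shares its score among its outbound links; the damping factor
        models a reader who occasionally jumps to a random page."""
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
            for page, outbound in links.items():
                if not outbound:
                    continue  # this toy version ignores pages with no outbound links
                share = rank[page] / len(outbound)
                for target in outbound:
                    if target in new_rank:
                        new_rank[target] += damping * share
            rank = new_rank
        return rank

    # Toy example: "a" is linked to by both "b" and "c", so it ends up ranked highest.
    # print(pagerank({"a": ["b"], "b": ["a"], "c": ["a"]}))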

Finally, you can improve your site’s search ranking by commenting on blogs, support forums and newsgroups. Search engines routinely index these busy venues and pick up on keywords or URLs that appear there. If your people are busy contributing to the community in which they work, the benefits will come back to you.

A note of caution: There are many shady operators who will promise to optimize your site through tactics like spam blogs, link farms and comment spam. Avoid these shysters. Not only are their tactics disruptive and annoying, but search engine providers often blacklist businesses that employ these devious tactics. Keep your SEO efforts positive and above-board and you’ll enjoy a much better quality of result.

What tricks do you use to improve your search engine rankings? Share a tip with your colleagues below.