Attacking the Data Center Energy Crisis

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

For more than 15 years, Ken Brill has preached the gospel of optimizing data center availability. Now he’s evangelizing a more urgent message: data centers must reduce their environmental footprint or risk letting their spiraling power demands run away with our energy future.

Brill’s Uptime Institute serves Fortune 100-sized companies with multiple data centers that represent the largest consumers of data processing capacity in the world. A typical member has 50,000 square feet of computer room space and consumes “a small city’s worth” of utility power. For these companies, uptime has historically been the brass ring; their need for 24 x 7 availability has trumped all other considerations. That has created a crisis that only these very large companies can address, Brill says.

Basically, these businesses have become power hogs. They already consume more than 3% of all power generated in the US, and their escalating data demands are driving that figure higher. A research report prepared by the Uptime Institute and McKinsey & Co. found that about one third of all data centers are less than 50% utilized. Despite that fact, the installed base of data center servers is expected to grow 16% per year while energy consumption per server is growing at 9% per year.
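
To see why those two growth rates are so alarming, it helps to compound them. Here is a rough back-of-the-envelope sketch in Python; the 16% and 9% rates come from the report cited above, while the starting values are normalized placeholders rather than real measurements:

    # Back-of-the-envelope projection of total data center energy use.
    # Server count grows 16%/year, energy per server grows 9%/year,
    # so total energy grows roughly 26% per year.
    servers = 1.0            # installed base, normalized to today
    energy_per_server = 1.0  # per-server draw, normalized to today

    for year in range(1, 6):
        servers *= 1.16
        energy_per_server *= 1.09
        total = servers * energy_per_server
        print(f"Year {year}: total energy = {total:.2f}x today")

    # Prints roughly 1.26x after one year and about 3.2x after five:
    # at these rates, data center energy use doubles in about three years.

At those compounding rates, the capacity question Brill raises stops being hypothetical very quickly.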

“During the campaign, John McCain talked about building 45 nuclear power plants by 2030,” Brill says. “At current growth rates, that capacity will be entirely consumed by the need for new data centers alone.”

Potential for Abuse

That might not be so bad if the power was being well used, but Uptime Institute and its partners believe that data centers are some of the country’s worst power abusers. In part, that’s because uptime is no longer the key factor in data center performance, even though data center managers treat it that way. Data centers used to serve the needs of mission-critical operations like credit card processing. Response times and availability were crucial and companies would spend lavishly to ensure perfection.

Today, many data centers serve non-critical application needs like search or large social networks. Uptime isn’t a top priority for these tasks, and spending on redundancy and over-provisioning is a waste if the only impact on the user is a longer response time for a search result.

The move to server-based computing as a replacement for mainframes has actually increased data center inefficiency. Whereas mainframes historically ran at utilization rates of 50% or more, “Servers typically operate at less than 10% utilization,” Brill says, “and 20% to 30% of servers aren’t doing anything at all. We think servers are cheap, but the total cost of ownership is worse than mainframes.”

Wait a minute: not doing anything at all? According to Brill, large organizations have been lulled into believing that servers are so cheap that they have lost control of their use. New servers are provisioned indiscriminately for applications that later lose their value. These servers take up residence in the corner of a data center, where they may chug away for years, quietly consuming power without actually delivering any value to the organization. In Brill’s experience, companies that conduct audits routinely find scores or hundreds of servers that can simply be shut off without any impact whatsoever on the company’s operations.

And what is the implication of leaving those servers on? Brill compares the power load of a standard rack-mounted server to stacking hairdryers in the same space and turning them all on at once. Not only is the power consumption astronomical, but there is a corresponding need to cool the intense heat generated by this equipment. That can consume nearly as much power as the servers themselves.

The strategies needed to combat all this waste aren’t complicated or expensive, he says. Begin by auditing your IT operations and turning off servers you don’t need. Then consolidate existing servers to achieve utilization rates in the 50% range. He cites the example of one European company that merged more than 3,000 servers into 150, achieving a 90% savings in the process. “People think this is an expensive process, but it actually saves money,” he says.

In the longer term, making data center efficiency a corporate goal is crucial to managing the inevitable turnover that limits CIOs to short-term objectives. The commitment to environmental preservation and power efficiency needs to come from the top, he says.

Uptime Institute has white papers, podcasts and other resources that provide step-by-step guides to assessing and improving data center efficiency. This is a goal that every IT professional should be able to support.

Nailing the Interview, Web 2.0 Style

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Job-hunting using Web 2.0 technologies can simplify the job discovery process and enable you to unearth opportunities that aren’t advertised. But to nail down that offer, you sometimes have to employ tried-and-true tactics.

Interviews are key to the process. They establish a personal connection between the job seeker and the hiring manager and solidify your image as a poised and competent professional. Your credentials will get you through the screening process, but you won’t nail down the position without establishing rapport with your future boss.

The key to success in the interview process is to be memorable. Hiring managers may interview 10 or 20 candidates for a position, and believe me when I say that recollections tend to run together in their minds after a while. Anything you can do to stand out from the crowd increases your chances of making the final consideration list.

Start by cleaning up any online tracks you’ve left that may come back to embarrass you. Get rid of the photos on your Facebook account from the senior year beer blast or tailgate party. Check any accounts you have on photo sharing or video sharing sites and delete anything you wouldn’t want an employer to see. Google yourself and look up your profile on Zoominfo. You may think you control what’s in your profiles, but human resources people have ways of getting around firewalls.

Create a profile on LinkedIn. This is the professional network of choice for business people and having your profile there shows that you’re serious about your career. Use a professional photo, post a well-edited resume and reach out to others to post their recommendations.

Create an online resume. Include links to as many of your accomplishments as can be displayed online. Bonus points if you can work in a video, slideshow or PowerPoint summary of your experience. Make it personal. Showcase hobbies and interests that portray you as energetic, multifaceted and fun. Each item should communicate something positive about you.

When you get that interview, research the company. As a longtime hiring manager, I’ve often been amazed at how many people show up for interviews without having done any background research on the company or the position they’re interviewing for. There is simply no excuse for not knowing some basic facts about your potential employer. Having this information shows that you are serious about the position and not just casting around for anything.

You get bonus points for researching the manager who’ll interview you. Many of us leave interesting facts about ourselves scattered around the Internet these days, even if we don’t publish the information ourselves. If you can learn that the hiring manager is a marathon runner, fishing enthusiast or skilled horseman, that’s a great conversation icebreaker. Be careful not to get too invasive with personal information – discussion of the person’s recent divorce is off-limits – but remember that interviews are personal interactions. If the interviewing manager knows you’ve gone to the trouble to find out about her as a person, you get credit for resourcefulness.

Create a remarkable resume. People differ on whether this is a good idea, but as someone who’s hired more than 200 candidates, I believe anything you can do to stand out works in your favor. Hiring managers are overwhelmed with resumes that look more or less the same. Dress up yours with a photo, some nice typography, an unusual design or color. This demonstrates your creative side and makes you more memorable.

You get bonus points if you create an online message that’s unique to the job. Don’t skip the traditional cover letter, but point a potential employer to something you’ve created just for them. Maybe it’s a song, a short work of poetry or a clever video. You have so many ways to show your creativity these days, it’s a shame not to take advantage of them. I wouldn’t necessarily recommend the approach used by Michael Spafferty, but his video certainly is creative. For the right employer, it could be the dealmaker.

Follow up. Send a thank-you e-mail within a few hours of your interview. Reserve one small accomplishment to note in this letter. This makes the communication memorable and actionable. Then follow up with a handwritten note. Yes, that personal touch is essential. The very fact that something arrives in the mail these days with an address written in cursive script is noteworthy.

How about you? How have you used Web 2.0 to help in your job search? Share your good ideas below.

Job-Hunting 2.0

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Traffic to the popular business social networking site LinkedIn.com doubled following the stock market meltdown of last fall, according to media reports. That’s not surprising, given LinkedIn’s utility as a way to create and nurture business relationships. Business professionals have a wide variety of job-hunting tools available to them today, and in an economy like this one, it behooves you to use as many as you can.

I’ve hired more than 200 people in my 20 years of management, and I’ve learned what makes a candidate stand out as memorable and potentially hirable. Here are some ideas for incorporating Internet services into your search.

Get recommendations — One of LinkedIn’s more intriguing features is the ability to ask business colleagues for recommendations. You should do this on an ongoing basis, not only when you’re looking for a job. The best time to ask is when a person’s impressions of you are still fresh. They’re more likely to give you an enthusiastic endorsement if you’ve just helped them with a big project. Always give back a recommendation as a way of thanking them for their time.

Find jobs that aren’t advertised — One of the coolest features of social networks is their continuous status updates. Whenever someone in your circle of contacts gets promoted or takes a new job, you can find out immediately. Remember that when someone assumes a new job, it usually creates an opening in their old one. That’s an opportunity for you. Also, when a person assumes a new role, they often want to hire people they trust to work under them. Be sure to send a congratulatory note and let them know you’re available.

Research opportunities — Even when I worked for an Internet company hiring people who were supposedly Web-savvy, I was often stunned by how few job candidates showed up with any knowledge of the company or job they were interviewing for. There simply is no excuse for that today. Before you arrive for your interview, be sure you spend at least a half hour learning about the company’s business, its objectives, competition and challenges. Be ready to tell the hiring manager what you can do to help. Believe me, they will remember that.

Research people — People reveal lots of information about themselves in social networks, blogs and online profiles these days. Even if they don’t volunteer that information, you can often learn about them from the groups and organizations that they frequent. Spend some time learning about the person you’ll meet in your interview. Mine some personal nuggets that can help you establish a more personal relationship. Perhaps you share an interest in a particular author, film genre or sport. That’s a basis for discussion outside of the business context. Anything you can do to personalize the engagement will help your chances.

Make yourself memorable — I can’t emphasize this enough. Hiring managers often interview 30 or 40 candidates before making a selection. Names and faces tend to run together, so anything you can do to distinguish yourself will increase your chances of making the cut. Create a video or a screencast demonstrating some special skill you bring to the assignment. If you’re musically inclined, send an audio clip of yourself singing a song of introduction. Write a personalized letter describing three ways you can address a challenge the company faces. Show that you’ve invested some time and brain power to apply for the job.

Next week, we’ll look at how the Web can help you nail down the position and how to prepare for future job hunts. We’ll also talk about how you sometimes need to abandon the keyboard to cement those personal connections.

Web 2.0 Carrots

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Back in the dark ages of the early Internet, some colleagues and I got hooked on instant messaging. We loved its immediacy, and IM quickly replaced e-mail as the preferred way to communicate among our far-flung staff. This frustrated our IT organization, which didn’t even know about our activities for over a year. IT briefly tried to restrict IM use but ultimately gave up and just shrugged its shoulders.

The group didn’t have time to wrestle with the problem. It was too busy trying to shove a corporate-mandated group collaboration package down our throats. This expensive and over-engineered solution had been selected by someone at the corporate level without any input from the people who would have to use it. For two years, our IT organization tried to teach users how to tap the software’s powerful but Byzantine capabilities with little success. By the time I left the company, the collaboration suite was basically a bloated e-mail client. Meanwhile, IM flourished.

Mandates From Above

Top-down implementation comes naturally to IT organizations. Much of what they’ve been tasked to do over the years has involved driving technology into their organizations to achieve executive mandates for efficiency. But the new breed of Web 2.0 tools presents a new challenge.

New research by McKinsey reveals that Web 2.0 tools are turning in decidedly mixed results in organizations that are experimenting with them. Half of the 50 respondents to a detailed set of interviews indicated that they are dissatisfied with the performance of these collaborative tools so far.

Successful innovators are learning that a “high degree of participation” is required to make the tools pay off. This involves not only grassroots activity but also a different leadership approach: senior executives often become role models and lead through informal channels.

That’s a cultural disconnect for the traditional command-and-control approach to IT management. Big hairy systems projects like enterprise resource planning, supply chain management and customer relationship management have always been mandated from the top in the name of efficiency and cost reduction. The technology didn’t work unless everyone used it, so employees had no choice.

But knowledge capture tools like wikis and social networks don’t succeed unless users embrace them, researchers found. “Efforts go awry when organizations try to dictate their preferred uses of the technologies…rather than observing what works and then scaling it up.”

In fact, Web 2.0 initiatives often yield results where least expected. McKinsey researchers cite the example of one company that put software in place to quickly train new hires. The package failed in that context, but the company’s human resources people discovered that the same application was effective in sharing information about job candidates. They turned out to be the ultimate end users.

Culture of Sharing

Web 2.0 technologies excel at helping people capture and share information, but that process works best when the motivation comes from within. The “give to get” culture of the new interactive web has tapped this human compulsion in a powerful way. It turns out that the desire to help one’s peers is more powerful than the motivation to fulfill a management mandate.

Not surprisingly, McKinsey found that incentives work better than commands in making organizations successful with Web 2.0. Steel producer ArcelorMittal, for example, “found that when prizes for contributions were handed out at prominent company meetings, employees submitted many more ideas for business improvements than they did when the awards were given in less-public forums,” the report says. Celebrating the generosity of individual employees was also effective in stimulating activity by their peers.

Which means that when it comes to Web 2.0 technology adoption, the carrot proves far more effective than the stick.

Netbooks Attack From Below

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

It’s been about a decade since someone first placed a tablet computer in my hands and waxed eloquent about its wonders. I remember thinking, as I cradled the 4 pound object with its delicate screen close to my chest, “I just hope I don’t drop it.”

I’ve toyed with many tablet computers in the years since, and my reaction has always been about the same.  While there’s no debating the value of a system that doesn’t require a keyboard and that can capture and even recognize handwriting, the size, weight and fragility of these machines has always made me uneasy.  And then there’s the price, which has typically been higher than that of a comparably equipped laptop.

The arrival of the Kindle 2, the latest version of Amazon’s reader, has me reconsidering my reservations, though. With its sleek design, crisp electronic-ink display and built-in networking, the Kindle is the kind of limited-use device that has the potential to accomplish what the tablets couldn’t. At $359, it also threatens to disrupt portable computing in a more fundamental way.

Here Come Netbooks

The Kindle 2 is one of a proliferating breed of limited-function portable computers called “netbooks” that have taken the market by storm over the last two years. ABI Research estimates that some 35 million netbooks will ship this year, rising to nearly 140 million in 2013. Netbooks are positioned as essentially slimmed-down laptops at a lower price, but I don’t think those devices are what will ultimately remake portable computing. The Kindle 2 is different: it attacks the portability problem from below. History has shown that disruptive innovation almost never comes from scaling down existing technology but rather from growing the power and functionality of low-cost, limited-use devices. The personal computer is the classic example.

Critics have been quick to jump all over the earlier Kindle’s limitations, including a closed architecture and limited application support.  This is to be expected.  Disruptive products are almost always dismissed in their early forms.  The reason Kindle is different is that it possesses the essentials of what users need in a portable computer: a readable display, excellent battery life, a usable if limited user interface and seamless connectivity.  Amazon has had no incentive to open up the product to third-party developers, but as competition emerges, believe me, it will.

The Kindle is one of a crowd of low-end computers that have emerged in the wake of the One Laptop per Child project, which I’ve written about before. Founded in 2005, the project attempted to innovate from below by delivering a basic level of computer functionality at a very low cost for people who could never otherwise dream of affording a conventional laptop. This is a completely different approach to innovation from the one usually practiced in the US: instead of trying to cram more features of questionable marginal value into the same space, OLPC tries to deliver core functions at the lowest possible price.

Attack From Below

The intended audience is third world countries, but the approach will pay dividends on US shores as well.  The weight, limited battery life and awkward hinged keyboard of conventional laptop computers have always made them a weak solution to the portability problem. Apple’s iPod has demonstrated that people are willing to trade off some laptop conveniences for portability.  Kindle will squeeze its way into the middle of this market.

Although currently positioned as strictly a reader, the Kindle will evolve rapidly as market forces demand. I believe it could turn out to fulfill the promise of netbooks, particularly as Amazon peels back the proprietary shell that limits its potential. This race has only begun, and past experience suggests that the Kindle, coming from the low end of the market, will be in the pole position.

Flushed With Success

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Although he was awarded more than 1,000 patents during his life, Thomas Edison famously said that he had failed more than 10,000 times in the process.

Inventors know that success usually comes only after running down many blind alleys. Failure can’t and shouldn’t be avoided, but if inventors can be guided down the most promising paths, they can dramatically cut down on time, cost and frustration.

When David Pearson was presented with the challenge of reinventing one of the world’s most common household objects recently, he applied automation to the task of guiding him toward the goal. The result: a better product, designed in half the time and at less than half the cost.

Pearson works for the Manufacturing Advocacy & Growth Network (Magnet), a Cleveland-based nonprofit organization that incubates inventors’ ideas and helps bring them to market using local resources. Pearson’s task was to take a novel idea by inventor Wally Berry to reinvent a core component of the everyday toilet and turn it into reality.

Flushing the Old

The average toilet can stand an overhaul: it is one of the chief sources of water waste in the US. In the south-central and southwestern US, where water supplies are dwindling and costs are soaring, the steady drip of aging toilets has become a major concern for financially strapped municipalities. Berry had an idea to build a better toilet.

He knew that one of the chief causes of water waste is a device called the flapper valve. That’s a spring-loaded plate that permits fresh water to flow into the tank but blocks backflow. Most conventional flapper valves have a rubber seal that is prone to erosion by chemicals in the water supply and household cleaning agents. As tiny cracks develop in the flapper valve, toilets can start leaking gallons of water each day without their owners even knowing. Multiply that by an estimated 220 million toilets in US households, and it adds up to a lot of wasted water.
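
To put that in perspective, here is a hedged back-of-the-envelope calculation in Python. Only the 220 million figure comes from the paragraph above; the share of toilets leaking and the per-toilet leak rate are purely illustrative assumptions:

    # Illustrative estimate of water wasted by leaking flapper valves.
    # Only the toilet count comes from the article; the other two numbers
    # are assumptions chosen solely to show the scale involved.
    toilets_in_us = 220_000_000
    share_leaking = 0.05            # assume 5% of toilets leak at any time
    gallons_per_leak_per_day = 10   # assume a modest 10-gallon daily leak

    wasted = toilets_in_us * share_leaking * gallons_per_leak_per_day
    print(f"{wasted / 1_000_000:.0f} million gallons wasted per day")
    # With these assumptions: about 110 million gallons every day.

Even with deliberately modest assumptions, the waste runs to tens of millions of gallons a day, which is why municipalities care.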

Berry had conceived of the idea of a leak-proof toilet while solving a problem in his own home. He set out to reinvent the flapper valve and submitted his idea to Magnet, which chose to fund it. Pearson’s job was to make the concept practical, economical and capable of being mass-produced.

For help, he turned to Goldfire, a software suite from Boston-based Invention Machine that applies automation to the inspiration side of product design. Goldfire can be programmed to tap into databases of prior art, products and materials and extract their essential elements. The software prompts designers through the process of deconstructing a problem into its most basic components. Goldfire then helps the designer probe for alternative technologies or materials that can solve the problem better or more cheaply.

For Pearson, designing a new flapper valve involved describing the function it fulfilled in the most basic terms. He built a model in Goldfire that removed the existing flapper valve and replaced it with a description of the function it performed. Instead of looking for a better valve, “I needed to find a flexible element that changes with fluid level,” he said. Once he discarded the idea of building a better valve, alternatives opened up.

The most intriguing idea was a spiral-wound hose, which could expand and compress with fluid levels. The problem with spirals, however, is that they spin when compressed. That wouldn’t work within the confines of a small water tank. So Pearson defined the characteristics he needed and sent Goldfire out to look for spiral-wound hoses that wouldn’t spin.

It turns out such a technology is used in surgical applications. Bingo. Pearson incorporated the surgical hose into his model. He applied the same discipline to several other components of the flush valve and began converging the elements, using the software to search for weak points in the model at every step. “I basically narrowed the design down to the few basic elements that I needed,” he says.
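
Goldfire’s internals are proprietary, but the basic pattern Pearson describes (state the required function as a set of properties, then search a catalog of known components for anything that satisfies them) can be sketched generically. The toy Python below invents a three-item catalog and its property names purely for illustration; it is not Goldfire’s actual data or matching logic:

    # Toy illustration of function-driven component search, not Goldfire.
    # The catalog, property names and values are invented for this example.
    catalog = [
        {"name": "rubber flapper valve", "flexes_with_level": True,
         "spins_when_compressed": False, "chemical_resistant": False},
        {"name": "standard spiral hose", "flexes_with_level": True,
         "spins_when_compressed": True, "chemical_resistant": True},
        {"name": "surgical spiral hose", "flexes_with_level": True,
         "spins_when_compressed": False, "chemical_resistant": True},
    ]

    # Describe the function in its most basic terms, as a set of required
    # properties, rather than asking for "a better flapper valve."
    required = {"flexes_with_level": True,
                "spins_when_compressed": False,
                "chemical_resistant": True}

    matches = [part["name"] for part in catalog
               if all(part.get(prop) == want for prop, want in required.items())]
    print(matches)  # -> ['surgical spiral hose']

The point of the exercise is the reframing: once the requirement is stated as properties rather than as an improved version of the existing part, unexpected candidates such as the surgical hose surface on their own.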

A Simpler Approach

What was left after the entire cycle had been completed was a valve retrofit kit that was vastly simpler than the original equipment it replaced. A standard toilet has about a dozen discrete parts; Siphon Flush has just four.

Simplicity also yielded better reliability, enabling American Innovative Products to go to market with an unusual 20-year guarantee. And there was an unanticipated side-benefit: The more efficient flushing mechanism is able to flush twice as much matter with one-quarter less water, further improving its economy.

American Innovative Products will begin shipping Siphon Flush in early March at a price of $27.95. The small firm already has 10,000 orders in its pipeline. Siphon Flush is a radical new design of a product that’s more than 100 years old. Without Goldfire’s guided innovation, “I’m not going to say I couldn’t have done it, but it would have cost twice as much and taken two to three years,” says Pearson.

American Innovative’s Berry says the software cut development costs by 75%, and that’s assuming the product could have been developed in the first place. “Without Goldfire’s definition of what we needed, we’d probably still be looking,” he says.

The Virtue of Failure

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

Is your company tuned to do things wrong – a lot?

Few organizations are. Mistakes are considered to be something to avoid, an embarrassing lapse in judgment or misreading of the market. Bonuses are lost and heads may roll. But in an Internet market that seems to move faster by the day, a tolerance for trial and error is actually becoming a virtue.

The New York Times wrote last week about Google’s decision to shut down four services, including a virtual world called Lively and a Twitter alternative called Jaiku. The article has some interesting insight into one of the world’s most admired companies and its institutional tolerance for getting things wrong.

Dog-Fooding

Google tests its products internally with employees as a kind of first-tier reality check, a process it calls “dog-fooding.” It then rolls them out to the market with its famous “beta” label and aggressively listens. Its blogs are a water cooler around which customers cluster and comment. The company also monitors prominent bloggers to gauge their reactions.

Google doesn’t pursue projects unless they’re remarkable. That means they must inspire not only willingness but enthusiasm among employees, developers and, ultimately, customers. It even has fun with its shortcomings. While using the new offline feature in Gmail recently, I was confronted with this message: “Sorry, but Gmail offline doesn’t support attachments yet. We know this is lame, but consider that the first version didn’t even have folders.” How many companies can make fun of themselves that easily?

This cycle of open review and frequent adjustment is becoming the way fast companies do business in rapidly changing markets. Contrast that with the accepted standard of a decade ago, when product development involved nondisclosure agreements, locked rooms and media silence.

And contrast the results. In the old days, hackers and the media gleefully jumped on the new release of any technology product in a rush to find flaws. Software companies actually failed because their products were too buggy.

How that equation has flipped. Today, innovative companies make public bug hunts a core part of the development process. Instead of allowing flaws to be cast in a negative light, they become part of the process of continual improvement. It turns out that when you don’t hide from your mistakes, people actually are happy to help you fix them.

Do It Wrong

In his recent book Do It Wrong Quickly, Mike Moran argues that informed trial and error is essential for modern companies. And it’s not limited to technology. Moran believes that marketers need to conduct their business this way as well. With dozens of avenues now available to them to deliver a message, the only way to succeed is to constantly try new ideas. The ability to double down on winners and quickly scrap losers is more important than methodically studying every option in advance.

This is easier said than done. Over the past 20 years, nearly every region in the US has tried to duplicate Silicon Valley’s remarkable success at incubating new technologies. All these efforts have failed. In my view, a major reason is that Silicon Valley companies have a constructive attitude toward failure. Good ideas fail for all kinds of reasons. Sometimes the value isn’t clearly defined or the timing is simply bad. Most Valley millionaires have a few flops on their resume. The difference is that the culture of the region doesn’t penalize them for that. After all, we learn more from our failures than we do from our successes.

Democratized Insight

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

One of the few segments of the IT industry that has stubbornly resisted the efficiencies of Moore’s Law is research. The services provided by big analyst firms like Gartner and Forrester Research constitute a $3 billion industry that still conducts business pretty much the same way it did 20 years ago. High-priced analysts use the phone and the speaking circuit to tap into what’s on the minds of their IT management customers. Clients pay five- and six-figure annual fees for access to their insight. A few prominent opinion-leaders affect the path of billions of dollars in IT investment.

Now David Vellante is disrupting that model. His Wikibon.org is the kind of Web 2.0 project that just might cause the big players to re-evaluate their value propositions. That could be very good for customers.

Vellante knows the research business. For years he ran the largest division of International Data Corp., a market intelligence firm whose opinions can  make or break companies. Vellante left IDC a few years ago to start Barometrix, an advisory firm focused on IT investment optimization. That team started Wikibon as an experiment nearly two years ago.

Wikibon uses Web 2.0 technology to turn the IT research model on its head. Its collaborative wiki engine makes it easy for a vast community of practitioners to share expertise and experience. It turns out that when you roll up all that information, you have a resource that helps people make the kinds of decisions that used to involve expensive analysts. And it’s all free.

Research Goes Open Source

Call it open source advice.  The first Wikibon community is centered on data storage, and more than 3,000 people have joined.  A core group of 30 to 40 independent consultants and experts use the site to share their advice with the broader community. Before Wikibon, they had no way to reach that audience of storage specialists. Now they give away advice in hopes of winning consulting business. Members get the benefit of their years of experience for free.

The bottom-up model is incredibly cost-efficient. Wikibon has just three employees. Quality control is outsourced to the members, who have contributed some 20,000 articles and edits to the archive. This democratized approach “hasn’t been as much of a limitation as I expected,” Vellante told me.

And the value of the information is evidenced by the time members spend on the site. “It’s Facebook-like,” Vellante says.  “We’re getting 20 to 30 page views per visitor.”

Wikibon has now grown to the point that the team is beginning to carve the library into subsections; one new area focuses on data protection and another on storage networks.  The small company hopes to monetize its business through value-added programs, such as a new service that helps vendors qualify for energy rebate programs.

Wikibon epitomizes the innovative power of Web 2.0. In the traditional model, insight was communicated from the top down because that was the only affordable way to do it.  With thousands of experts now able to assemble an advisory resource of their own, the opportunity exists to flip that model.  “It feels disruptive,” Vellante says.

Communities like Wikibon won’t put Gartner out of business, but they provide an affordable alternative that will pressure the market leaders to innovate.  That’s the kind of disruption that we can all feel good about.

Innovation in Anonymity

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

I recently had two MRI scans on my back. Magnetic resonance imaging is a wonderful technology that enables doctors to see inside the body with depth and precision that conventional x-rays can’t match.

But MRIs are also mysterious and even frightening procedures for patients. A person is drawn inside a small cylinder and subjected to a series of loud noises for as much as 45 minutes. The attending radiologist told me that about 80% of patients experience some kind of claustrophobic stress, forcing technicians to pause the procedure frequently to calm them down.

I should have known about all this because my MRI provider’s website features a wonderful interactive experience that describes the benefits of MRI in a collage of high-resolution images and video tutorials. It also has a multimedia tour of the MRI experience that even includes samples of the odd sounds patients hear. This information would have been immensely useful to me if I’d known it existed, but I didn’t learn of the feature until weeks after the procedures, when I stumbled upon it in the context of a different discussion.

In fact, at no time during my interactions with people at the MRI center did anyone inform me that this resource existed. It wasn’t listed on the company’s letterhead or the preparatory documents sent to patients. A software project that had no doubt cost the company thousands of dollars was barely even referenced on the provider’s homepage.

Failure to Promote

This situation is all too common in businesses. Technology innovators dream up clever new ways to serve their customers and then don’t tell anyone about it. Customer service reps and automated voice response systems routinely refer visitors to generic homepages with meaningless statements like, “more information is available on our website.” But who has the time to go and find it?

Somewhere inside these companies a disconnect has occurred between the technologists and the people who interact with customers. Businesses assume it’s okay to hire service reps who haven’t a shred of technical expertise because those skills aren’t required to interact with the public. IT people are taught to do their jobs and then go home. Cooperating with others to promote the tools they build isn’t part of the job description.

But it should be. Today’s customers are too busy to spend time searching for resources they don’t know exist. The people who commission customer-facing projects may move on to other jobs or companies, leaving their creations without a sponsor.

IT people need to step up to the plate and promote the fruits of their labors because no one else is going to do it. Here are some steps my MRI provider could have taken:

Promote the resource in printed documents — Health-care providers produce lots of paper, yet none of the informational documents I received even mentioned the website experience.

Post signs — A poster in the lobby or window could have alerted me to the existence of this great application.

Train customer service personnel — In my multiple phone calls with the clerical staff, no one recommended that I even visit the website.

Set up a lobby demo — PCs are cheap; why not make it easy for customers in the waiting room to learn what the company has to offer?

This adds up to an opportunity missed for some innovative IT person whose creativity and hard work won’t receive the recognition it deserves. Don’t let your good work go to waste because you forgot to tell anyone about it.

Coaxing Web 2.0 Into the Enterprise

From Innovations, a website published by Ziff-Davis Enterprise from mid-2006 to mid-2009. Reprinted by permission.

McKinsey has a new report on enterprise adoption of Web 2.0 technologies, and the findings should give pause to IT organizations planning to roll these tools out to their internal customers.

Overall, these technologies — which include wikis, blogs and social networks — are making steady progress into the organizations represented by the nearly 2,000 respondents to the survey.  What’s striking is the disparity between those companies that have made a commitment and those that are still skeptical. The companies that have drunk the Web 2.0 Kool-Aid report that it’s changing the very nature of their businesses and that they plan to expand their commitment this year.

Among early adopters, tools are being used to develop new products collaboratively, reinvent internal communications and transform the process of communicating with customers.  Only 8% of the executives who describe themselves as satisfied Web 2.0 users say the tools haven’t changed their organizations, compared to 46% of the self-described dissatisfied users.

However, the survey has a disquieting finding for IT organizations. The companies that report the least satisfaction with Web 2.0 tend to be the ones in which IT drove the initiative, while those that report the highest overall satisfaction with the tools are the ones in which IT played no role in their selection and deployment.

Why does this sad state of affairs exist?  I suspect it has much to do with internal culture and the ways in which the technology’s value proposition is defined for the ultimate users.

Taken at face value, the data suggests that IT is best left out of the Web 2.0 equation, but in my experience, technology groups play a vital role. One of the beauties of these new technologies is that they’re so simple and adaptable. Social networks, for example, can be used for anything from technical support to corporate knowledge management while wikis can perform at the project level or across an entire company. I’ve worked with several companies to implement Web 2.0 technologies, and the successful ones always go about it the same way.  A small number of enthusiastic users are given tools and the means to use them and then their creativity is allowed to filter through the organization.  IT plants the seed and then gives its customers the means to make the garden grow.

This process is invariably supported by managers who trust their people to do the right thing and who support experimentation and risk.  Conversely, I’ve never seen Web 2.0 succeed in companies that mandated it from the top or pushed it through the IT group.  Web 2.0 technology only works when people want to use it. Technology that enhances collaboration must necessarily be driven from the bottom up.

I cringe when I hear questions like this: “We want to start a corporate blog. What should we do with it?” If the technology doesn’t match a perceived need, no one is going to use it.

The best way to manage Web 2.0 adoption is to find those business side sponsors who have the curiosity to experiment and give them the means for discovery. The McKinsey report demonstrates that they quickly figure out their own uses for the tools, and their enthusiasm becomes contagious. It’s fulfilling enough to plant the seed and nurture the flower as it takes root and grows.