Bulldog Reporter’s Faux Pas Shows Why Not to Take Research at Face Value

This lead from a recent Bulldog Reporter case study on business blogging certainly caught my attention:

“Recent research reveals that 64% of American companies will launch their own corporate blogs in 2014, and the average budget for corporate blogging will increase by nearly one-sixth. What’s more, 12% of American companies plan to hire a full-time blogger in 2014.”

Holy cow! Blogging is one of the oldest forms of social media and is not generally considered a high-growth field. In fact, statistically valid research conducted by the University of Massachusetts at Dartmouth over the past several years has documented that only about one-third of the Fortune 500 and fewer than half of the Inc. 500 have public blogs, and those numbers aren’t growing very fast. What new research now predicts this kind of mind-blowing growth?

It turns out to be research that’s not very good. A little background checking revealed that the numbers cited by Bulldog Reporter came from a study conducted by a company called DeskAlerts, which makes messaging software for use inside organizations. In a press release, the company summarized its methodology this way: “DeskAlerts asked businesses around the US a single question: what would inspire you to create a corporate blog?”

That’s all. Nothing about how the survey was conducted, who the respondents were or how many people responded. This is kind of critical information to know if you’re going to cite the results in a responsible publication.

From Russia With Love

[Infographic: Rise of corporate blogging]

I tried to reach the contact listed on the press release, whose name is Natasha Chudnova. I e-mailed Ms. Chudnova via PRWeb but got no response. I couldn’t find a direct e-mail address for her on the company website or anywhere else. Her LinkedIn profile says she’s in the Russian Federation, which isn’t surprising given that DeskAlerts’ website says that’s where its development is done. The headquarters are listed as being in Alexandria, Va., but when I tried to call the company using the phone number listed on the website, I got a recording saying only that I had reached a voicemail box. The recording didn’t even identify the name of the company.

So I’m having my doubts about the quality of this research. But you don’t have to do any detective work to figure out that these numbers are suspicious. The most obvious question is how DeskAlerts derived so much data from a question that didn’t ask for any. There is simply no way that responses to a single verbatim question could be interpreted to reach these stunning conclusions.

That’s assuming the question is valid, which it isn’t. A professional researcher would never use a word like “inspire” in a survey because it creates bias. It’s like asking, “What would cause you to take on the drudgery of creating a corporate blog?” The word “you” is also ambiguous: does it refer to the person or the person’s company? Even if the research had been conducted with a statistically valid sample, the results would be meaningless with the question worded that way.

But the most damning evidence that the research is flawed is the data itself. If we accept the UMass research as a baseline, then DeskAlerts is telling us that virtually 100% of American companies will be blogging by the end of this year: roughly one-third already do, and another 64% will supposedly launch blogs in 2014. Um, no, they won’t. Then there’s the statistic that 12% of companies will hire a full-time blogger in 2014. Given that there are about six million employers in the U.S., this would represent the addition of more than 700,000 skilled jobs to the workforce. If that were true, the President would be holding a press conference to declare victory over unemployment.
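For anyone who wants to check that arithmetic, here is a quick back-of-envelope sketch in Python. The one-third baseline and the six-million-employer count are the rough approximations used above, not exact figures:

```python
# Back-of-envelope check of DeskAlerts' claims against the rough baselines above.
existing_share = 1 / 3        # approx. share of companies already blogging (UMass baseline)
launching_share = 0.64        # DeskAlerts: share launching a corporate blog in 2014
us_employers = 6_000_000      # approx. number of U.S. employer firms
hiring_share = 0.12           # DeskAlerts: share hiring a full-time blogger in 2014

total_blogging = existing_share + launching_share
implied_new_jobs = us_employers * hiring_share

print(f"Implied share of companies blogging by year-end: {total_blogging:.0%}")  # ~97%
print(f"Implied new full-time blogging jobs: {implied_new_jobs:,.0f}")           # 720,000
```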

Despite all these problems, I don’t blame DeskAlerts for releasing bogus research into the wild or for producing the obligatory infographic above. Bad data is only a problem if people believe it. The real problem is when respected brands like Bulldog Reporter put the badge of legitimacy on information that is so clearly wrong. Publishers owe it to their readers to at least run a basic reality check before validating third-party research, particularly when it’s from an unknown party. Bulldog Reporter publishes a lot of good information, but it dropped the ball on this one.