ComMetrics on Crowdsourcing Innovation: You’re Doing It Wrong

ComMetrics is a social media analytics company, a division of CyTRAP Labs GmbH. It is well known in the industry, notably for its FT ComMetrics Blog Index.

The company published a useful piece, Crowd-wisdom fails businesses, whose basic premise is that crowds do not innovate. It's useful because it contains both truths and misconceptions about the role of communities in the innovation process.

Let’s break it down.

Innovation via a stadium crowd?

Photo credit: Ian Ransley

The initial point of the post is that “Crowds Innovate – NOT”. And it’s true in its literal sense.

This may be one of my favorite misconceptions about the role of communities in innovation: that crowdsourcing is some sort of mind meld where innovations spring from a collective brain wave.

This quote by ComMetrics sums up both the truth and the common misconception:

It seems a bit naive to think that going to Dodger Stadium or the LA Coliseum in the hope that crowdsourcing will show people exhibiting the above [innovation] behaviors, and therefore help us innovate faster…

Really now…

Actually, it wouldn’t be naïve if you were soliciting the stadium crowd’s feedback on ways to improve the sporting event experience. Understand the different “jobs” the sporting event is supposed to do:

  • Outlet for aging or non-practicing athletes
  • Family adventure
  • Business social events and networking

You mean you wouldn’t solicit the stadium crowd for ideas related to what they’d like to see on those fronts? How about their feedback on the stadium management’s and others’ ideas?

The stadium example is a good one, because it offers a chance to parse out the role of crowdsourcing into three dynamics:

  1. Crowdsourcing involves collecting ideas in aggregate
  2. Community feedback brings a diversity of viewpoints to the ideas
  3. Crowdsourcing does not mean 100% of the world’s population

Collecting ideas in aggregate. Stop for a moment and consider that. I’m contrasting that view of crowdsourcing with the hivemind singularity that operates off a single brain wave. While the employees of a business have more of a vested interest in its success, the actual users of a product or service have a pretty good sense of what they want to accomplish.

Diversity of feedback. Research demonstrates the power of information diversity in increasing the quality of ideas. And crowdsourcing is a marvelous way to capture a broad spectrum of opinion and understanding. If you’re going to get a range of opinions, including wild cards you weren’t expecting, soliciting a community’s feedback is a powerful approach.

Crowdsourcing doesn’t mean the whole world. When I read the stadium crowd quote, I get a subtle ‘dis’ in it. Namely, that there are some serious nimrods in the crowd, and what the hell would they know about your business? But that’s a stereotype. For instance, look at the open source operating system Linux. Linux is a great example of crowdsourcing. But you’re not going to find me contributing anything there. I have no knowledge, opinion or interest in it. Crowdsourcing attracts parties interested in the product/service being examined. It’d be too demanding to participate otherwise.

The problems with popularity

The ComMetrics post makes two separate points about the problems with popularity. First is the issue of superusers having too much control over crowd opinion:

The notion that a book might be a must-read because it is highly ranked by many on Amazon does not make it Nobel prize material. The earth did not stand still just because Galileo fell out of favor, nor has evolution been shown to be false due to the faith of believers.

Hence, product reviews driven by superusers and crowds who follow just means that the wisdom of crowds can only be conventional. Volume against quality.

The second point is that simple votes don’t provide enough input on an idea’s value:

Thumbs Up or Down works but fails to explain why: Crowds do not drive and bring innovation to successful fruition in the form of a marketable product. Nor are they the best source for assessing quality – the one that shouts the loudest is heard the most.

Nevertheless, crowds can tell you if they like or dislike something.

There are truths in both of these observations. Amazon superusers are the modern equivalent of tastemakers in pre-Internet society: the people the crowd followed to find the best of things, often read in the newspapers. There are cases where the opinion of an A-Lister can have too much sway.

One key difference is this: today, people have to re-earn their influence over time. If someone falls behind over a sustained period and no longer seeks out the fresh and the new, they lose their influence. The crowd moves on to someone else at the leading edge. Humans have a natural affinity for the new.

Perhaps more importantly, it’s wrong to claim that no one has solid authority over a particular innovation domain. We don’t all wake up as blank slates every morning, having to relearn expertise during that day’s work cycle. There are bona fide, honest-to-goodness authorities on subjects who are motivated to drive improvement.

Which brings me to the second point about simple up-down votes. These votes do provide valuable feedback. You get an early read on what is resonating with the crowd, which is a valuable filter. But they lack the nuance that can help identify the best among the ideas that are resonating.

Microsoft’s Wilson Haddow’s observation is spot-on. Companies ought to leverage both the wisdom of the crowd for feedback and the opinion of authorities. Going back to what I wrote earlier…

  • The crowd can provide ideas in aggregate
  • The crowd can collectively weigh in on ideas’ merits
  • Individual authorities are generally needed at later stages of evaluation

And the role of these authorities should include finding valuable ideas the crowd overlooks.

In the blog post Corporate Innovation Is Not a Popularity Contest, I argue that binary feedback mechanisms – up-down votes – fall short. They are valuable, but not enough. And this is something Spigit does with its integration of reputation scores into the innovation process.
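To make the idea concrete, here is a minimal sketch of reputation-weighted voting. To be clear, this is an illustrative assumption, not Spigit's actual algorithm: the function name, the vote format, and the weights are all hypothetical. The point is simply that a thumbs-up from a contributor with a proven track record can count for more than an anonymous one.

```python
# Hypothetical sketch of reputation-weighted voting (not Spigit's real
# implementation). Each vote is (voter_id, +1 or -1); a reputation table
# maps voters to weights, with a default weight for unknown voters.

def weighted_score(votes, reputation, default_rep=1.0):
    """Sum vote directions, each scaled by the voter's reputation weight."""
    return sum(direction * reputation.get(voter, default_rep)
               for voter, direction in votes)

votes = [("alice", +1), ("bob", +1), ("carol", -1)]
reputation = {"alice": 3.0, "carol": 2.0}  # bob is new: default weight 1.0

print(weighted_score(votes, reputation))  # 3.0 + 1.0 - 2.0 = 2.0
```

Under a raw up-down count this idea scores +1; with reputation weighting it scores 2.0, because the experienced supporter outweighs the experienced detractor. That extra nuance is exactly what binary tallies alone can't provide.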

ComMetrics makes good points here. And kudos to ComMetrics for taking the time to weigh in on this topic. Their post provides a good framework for considering both the problems and opportunities of working with communities in the innovation process.


About Hutch Carpenter
Chief Scientist, Revolution Credit

9 Responses to ComMetrics on Crowdsourcing Innovation: You’re Doing It Wrong

  1. productfour says:

I like to say that it's not the wisdom of crowds, but the aggregated wisdom of individuals. Thanks for fleshing out these important issues.

  2. Pingback: Daily Digest for December 10th | My Blog

  3. Pingback: ComMetrics Crowdsourcing #FailBucket « The SiliconANGLE

  4. Dear Hutch

    I am humbled by the fact that you took the time to write such a great post and were a bit influenced by my thoughts on my ComMetrics blog.

    Thank you. I might have another post coming up soon that interests you – just sign up for my RSS feed or subscribe :-).

    In the meantime, why not register yourself at and benchmark your blog.

    Hope you will leave a comment on my blog soon. Thank you for sharing these GREAT insights.



  5. Pingback: ComMetrics on Crowdsourcing Innovation: You’re Doing It Wrong | CloudAve

  6. Good discussion, thanks for sharing!

  7. Pingback: £1m prize for citizen participation platform | Talkin' bout a revolution
