
Four categories of enterprise gamification

When you think of gamification, what are the common things that come to mind? Points, badges, leaderboards. These items are in the cognitive toolkit. But looking at the sheer variety of game mechanics, you can see that it’s a much broader field than that:

Game mechanics list

These 48 different mechanics (via SCVNGR and Badgeville) aren’t the complete list, but they provide a sense of the possibilities. However, the sheer quantity of game mechanics makes it difficult to coherently analyze which, if any, are relevant for an initiative. I found myself facing that in some work I was preparing for a client. My job-to-be-done? Provide an accessible way to understand the different gamification techniques relevant to crowdsourced innovation.

Having done some gamification work previously as a product manager, I called on that experience and various research on the topic. The following are the categories that made sense to me in the context of the enterprise environment:

Gamification categories

You might notice that I’ve couched the descriptive statement of each in the first person. That fits the nature of gamification, which is about the motivations of individuals: what matters to each of us. Here’s a bit more about each.

Achievement: I work to attain an objective. This category calls on the desire many of us have for mastery. To be well-versed and proficient in something. There is a sort of competition, but it’s against a standard, a benchmark. Not others.

Recognition: My contribution is acknowledged. Recognition is a form of feedback, an affirmation of one’s capabilities or position and a manifestation of status among peers. Recognition strikes me as the most powerful form of motivation.

Competition:  I compete for a limited number of awards. These gamification techniques appeal to the desire to compete. They can elevate people to moments of excellence in their participation (think of sports you’ve participated in previously). Powerful when used in an appropriate context.  But it’s a category that needs to be treated with care. Clumsy implementation of competition gamification can poison an initiative.

Valuables: I want to secure something of value. Valuables can address avoiding the loss of something or gaining something new. Valuables include the things you might expect: points-based rewards systems. But they can include countdowns to do something (I need to do something before I lose the opportunity), or competition to win funding for an idea, for example. Very useful, but Valuables need to be handled with care to avoid unintended consequences (e.g. high volumes of low value contributions; mindset that participation only happens when there’s a reward).

I’ve applied these different gamification categories to different innovation scenarios in my new post: The gamification framework for business innovation. I also look at the purpose of gamification there, some common misperceptions about it, and five key design principles.

I’m @bhc3 on Twitter, and I’m a Senior Consultant with HYPE Innovation.


Gmail offers surprising innovation lessons for the Fortune 500

If you’re familiar with the story of Gmail, you know – for a fact – that it was a 20% time employee project by Paul Buchheit. A little bottom-up experimentation that grew into something big.

Surprise! That story is wrong.

It was a desire by Google, the company, to offer its own email. From Harry McCracken’s great piece How Gmail Happened: The Inside Story of its Launch 10 Years Ago:

Gmail is often given as a shining example of the fruits of Google’s 20 percent time, its legendary policy of allowing engineers to divvy off part of their work hours for personal projects. Paul Buchheit, Gmail’s creator, disabused me of this notion. From the very beginning, “it was an official charge,” he says. “I was supposed to build an email thing.”

Gmail’s creation has more in common with innovation inside large enterprises than it does with the start-up world. Read on if you recognize these:

  1. Job-to-be-done thinking
  2. Reports of the death of company innovation are greatly exaggerated
  3. Corporate antibodies are everywhere
  4. Senior executive support
  5. Big Innovation takes time

Job-to-be-done thinking

Yahoo email screenshot

Image via Variable GHZ, “Why Yahoo Mail is Still an Epic Catastrophe”

Anyone remember life before Gmail? We had low storage limits. ‘OK’ search. Poor spam control. Yahoo, one of the dominant players at the time, pursued the freemium strategy that required paying for more storage and better controls. Which isn’t unheard of, mind you.

It’s just…

Think of the core job-to-be-done: When I want to update others, I want to send and receive communications. Some key job tasks that define that job include:

  • Easily send pictures to others
  • Read emails from real people and organizations that I care about
  • Find old emails when I need them
  • Expand my usage of email economically

Yahoo, Hotmail and AOL were fine as far as they went, but each was challenged on these key job tasks. Back when I had a Yahoo email, I remember the spam being awful, and it seemed impossible to control.

Google looked at the offerings in the market, and recognized an opportunity to better satisfy people’s expectations for these important job tasks. Gmail delivered larger storage limits, stellar spam control, excellent search and ongoing improvements through Gmail Labs.

Lesson: ABI (Always Be Improving) on the customers’ jobs-to-be-done. Think of the entire job flow and determine which areas are ripe for a better service and experience. Big companies can too easily focus on executing what they have rather than thinking about what customers need.

Reports of the death of company innovation are greatly exaggerated

Image via Family Life Resources

Somewhere along the line, a narrative has emerged that pretty much every big company cannot innovate its way out of a bag. Admittedly, the increasingly rapid turnover of the S&P 500 and the fast rise and decline of companies fuels this narrative. But it’s glib to say companies just don’t do it.

Google’s 20% time is held up as the antidote to this issue. Middle management stifling innovation? Let everyone experiment on their own. But Gmail wasn’t a 20% time project. It was something planned and resourced by the organization at large.

This is an important point. If companies set their mind to innovate in an area, people will contribute and provide fantastic ways to get there. Tony Vengrove advised on a key element for success here:  “A compelling vision statement describes what the company wants to become in the future. It not only needs to inspire but ideally it should inform the innovation agenda.”

Lesson: Innovation is not dead inside companies. It does require leadership to set a vision that employees can focus on.  

Corporate antibodies are everywhere

Google is rightly perceived as one of the most innovative companies on the planet. Given that, one might assume that the innovation wheels are well greased there. But I was struck by these quotes from McCracken’s story about the birth of Gmail:

“A lot of people thought it was a very bad idea, from both a product and a strategic standpoint,” says Buchheit of his email project. “The concern was this didn’t have anything to do with web search. Some were also concerned that this would cause other companies such as Microsoft to kill us.”

Within Google, Gmail was also regarded as a huge, improbable deal. It was in the works for nearly three years before it reached consumers; during that time, skeptical Googlers ripped into the concept on multiple grounds, from the technical to the philosophical. It’s not hard to envision an alternate universe in which the effort fell apart along the way, or at least resulted in something a whole lot less interesting.

Inquisitor vs. Corporate Antibody

In those two quotes, you see critiques that aren’t really about specific elements of Gmail, the concept.

In Four Personality Types that Determine Innovation Success or Failure, a distinction is drawn between Inquisitors, who reflect thoughtfully on issues facing an idea, and Corporate Antibodies, who just want the idea dead. Here are hypothetical responses to Gmail by the two different personality types:

Inquisitor: “Won’t we spook people when they see ads related to the email they’re reading?”

Corporate Antibody: “Email has become a commodity. There are other products we should be building.”

Lesson: Corporate antibodies will always be with us. Recognize legitimate probing for faults versus efforts to undermine the idea in total. Spend time figuring out how to get around Corporate Antibodies, not appeasing them.

Senior Executive Support

Senior executives matter in innovation

In a land of radical transparency and holacracy, the traditional top-level support needed for initiatives is a thing of the past. Alas, we are not in that land. For the 99.9% of people who live with today’s reality, top-down support continues to be the effective way things get done.

It does put pressure on top executives then. They are held accountable by the C-suite, the Board and shareholders. Already in this post, senior executives are called on to ensure innovation moves forward in two different ways.

Set the innovation course: Leadership – be it in business, community, military – has a role in establishing the objectives for people. Indeed, set objectives and get out of the way. In Gmail’s case, Larry Page and Sergey Brin saw a future that extended beyond just search. Paul Buchheit was charged to figure out what a Google email app would look like.

Remove obstacles to innovation: We saw previously that Corporate Antibodies are alive and well. But they didn’t stop Gmail’s progress. From McCracken’s article: “Fortunately, the doubters didn’t include Google’s founders. ‘Larry [Page] and Sergey [Brin] were always supportive,’ Buchheit says. ‘A lot of other people were much less supportive.’ “

Lesson: If senior management isn’t paying attention to innovation, it’s a safe bet no one in the company is either. Employees respond to the agenda set by executives. Organic growth comes from a clear focus that involves executives and employees.

Big Innovation takes time

One of my favorite perspectives on innovation comes from Jeff Bezos. In an interview on Harvard Business Review:

ADI IGNATIUS: Jeff, you’ve said that you like to plant seeds that may take seven years to bear fruit. Doesn’t that mean you’ll lose some battles along the way to companies that have a more conventional two or three-year outlook?

JEFF BEZOS: Well, maybe so, but I think some of the things that we have undertaken I think could not be done in two to three years. And so, basically if we needed to see meaningful financial results in two to three years, some of the most meaningful things we’ve done we would never have even started. Things like Kindle, things like Amazon Web Services, Amazon Prime. The list of such things is long at Amazon.

Note that he’s referencing Big Innovation. Concepts that are market changers. There are plenty of opportunities for small-ball innovation (or improvements). But for the really big stuff, executives need to back away from the notion that it can be done in one year.

This was seen with Gmail as well. It was in the works for three years before it was launched to consumers. Continual effort was applied to the product features, the user experience, the business model and the infrastructure to support it. During this time, the project was assailed internally, but as noted previously, senior management supported its ongoing development. Similar to the way Bezos sticks with groundbreaking projects for the long term.

Lesson: Senior management must recognize the magnitude of the innovation it seeks and commit the right time horizon, resources and support to it. This applies to both small-ball innovation and Big Innovation.

Google, of course, is now a HUGE company, on par with the biggest in the world. Its Gmail experience provides valuable lessons for Fortune 500 firms seeking to innovate.

I’m @bhc3 on Twitter, and I’m a Senior Consultant with HYPE Innovation.

Is it innovation or just an improvement? Does it matter?

On the LinkedIn Front End of Innovation group, I saw this post:

Interesting (and heated) discussions @ Unleashing Innovation Summit in Amsterdam earlier in the month: Incremental innovation is NOT innovation – it’s just marketing. REAL innovation is breakthrough/transformational… Agree or not?

I’ve seen this debate before. Attempts to finally, once-and-for-all establish just where improvement ends and innovation begins. People end up with a Maginot Line that fails to defend the sanctity of innovation. Quick: Amazon 1-click purchasing…improvement or innovation?

Does it matter that we define innovation? I once collected a bunch of people’s definitions of innovation to celebrate the multiple ways people think about it. It was divergence, not convergence.

But there are times people want a clearer line between innovation and improvement. Let’s see how some smart folks have articulated the difference.

Perspectives on defining innovation

Scott Berkun: Innovation is significant positive change. This is a high bar, and it should be. What does significant mean? I’d start with the invention of the light bulb, constitutional governments, wireless radio and maybe web browsers. Perhaps you could say significant is a 30% or more improvement in something, like the speed of an engine or the power of a battery. If you know the history of your profession you know the big positive changes people made over the last 50 years, giving you perspective on the scale of brilliance you need to have to be worthy of that word. (#)

Alan Lepofsky: Both innovation and improvement are important concepts, but unfortunately the two terms are often used interchangeably. Innovation reimagines an existing process or market, or creates a brand new one. Improvement enhances an existing process or market, but does not create disruption.  (#)

Chris Andrews: I think your point highlights something important: there’s a pretty fine line between business-as-usual product improvements and real innovation, and it’s important not to confuse the two. (#)

Jon Van Volkinburg: I don’t see innovation as something that merely creates value for a customer and/or for the provider. Expanding or adding a service, feature, or function is not innovation, and these things create value. These things are growth, novelty, and invention. They are great, necessary, and can lead to innovation if the environment and timing is right. If you want, I guess you can call it incremental innovation… but I wouldn’t, to me the term “incremental innovation” is an oxymoron. (#)

What if we actually settled this debate once and for all?

Assume for a moment that we, as a society, agree on what constitutes Innovation. Then what? What is the logical flow of events and decisions that follow such a conclusion?

The reality is that it doesn’t matter what something is called ex post facto. It only matters what impact it has on the consumers of the improvation.

Innovation or improvement comic

Before seeking improvation:

  • Understand the problem you’re addressing (no easy task itself)
  • Develop a sense of the magnitude of what’s required (shave off a few thousand dollars? develop a $1 billion market?)
  • Be prepared to follow through on the ideas generated at a level commensurate with their scale

Here, it is important to understand how you define what you’re seeking. And it doesn’t matter whether you call it an improvement or an innovation. Afterwards, once the idea has become real? Again, it doesn’t matter what anyone calls it. It’s about how well it addresses the job-to-be-done. Call it what you want.

You like to-may-to,
And I like to-mah-to…

I’m @bhc3 on Twitter, and I’m a Senior Consultant with HYPE Innovation.

 

How deep does crowdsourced problem-solving go?

On the recent post, Why crowdsourcing works, Michael Fruhling of BFS Innovations asked:

A couple of related questions: for most current crowd sourced problem solving endeavors, how “deep” does the problem solving routinely go? And do the results meaningfully change if incentives are introduced?

It was a good, thoughtful question. I answered it in the comments there, and wanted to make the answer into its own blog post, below.


Tim O'Reilly tweet on crowdsourcing

The depth of the problem-solving in a crowdsourcing endeavor is wholly dependent on:

  • The question that is asked
  • The engagement of the question sponsor
  • Who is asked to participate
  • Why people would want to participate

A few points on each of those factors.

Question that is asked

As you can imagine, the question impacts the depth of problem-solving. In-depth question = in-depth problem-solving. The more specific the question, the better the quality of people’s contributions. “Specific” here doesn’t mean asking a tactical, low-level question. Rather, it means clearly delineating what is sought in a way that people can relate to.

Engagement of the question sponsor

Crowdsourcing works best (obviously?) when solving a specific problem that someone has. People will respond to the question with different concepts and questions. The feedback of the question asker (aka “sponsor”) provides the back-n-forth that breaks through initial responses to build a deeper response.

Who is asked to participate

Getting cognitive diversity is the key, as described in the post. But you also want people who have some connection to, and interest in, the question. Think holistically about that. Upstream, downstream, adjacent fields. Problem-solving depth requires matching a question with people who will give a damn.

Why people would want to participate

The question of “why” is closely related to the preceding question of “who”. If a question’s answer potentially affects a person, there is built-in motivation to participate: steer things in a way that makes sense to you. This works well for internal employee-based crowdsourcing. However, there are certainly questions where the personal impact may be less acute. Other incentives come into play. Engagement with a sponsor – with attendant acknowledgments, thank-yous and feedback – is a great incentive. The opportunity to see an idea through is a powerful stimulant. And prizes have great power. Prizes work best when they establish an opportunity to see an idea one is passionate about become real (e.g. investment funds). Or when the question is not one that directly impacts you. In such a case, they are compensation for putting your brainpower to work problem-solving.

I’m @bhc3 on Twitter, and I’m a Senior Consultant with HYPE Innovation.

The Journey of an Idea

I’d bet most of us understand that the initially proposed idea and its ultimate implementation are going to differ. Ideas are cheap, as they say. It’s what happens after the idea is proposed where success or failure is determined. Typically, the “after proposal” focus is on the execution of the idea. But there’s a phase between the idea proposal and the execution of it. It’s a phase where the idea is molded and sharpened.

An idea essentially goes through a journey prior to its implementation:

The probability of an idea becoming reality is affected by different types of participation. Four different personalities act on the idea during its journey:

  1. Creator
  2. Inquisitor
  3. Helper
  4. Doer

On the HYPE Innovation blog, I’ve written about them in: Four personalities that determine innovation success or failure.

I’m @bhc3 on Twitter.

Will customers adopt your innovation? Hope, fear and jobs-to-be-done

When will a customer decide your innovative product or service is worth adopting? It’s a question that marketers, strategists and others spend plenty of time thinking about. The factors are myriad and diverse. In this post, let’s examine two primary elements that influence both whether an innovation will be adopted, and when that would happen:

  1. Decision weights assigned to probabilities
  2. Probability of job-to-be-done improvement

A quick primer on both factors follows. These factors are then mapped to the innovation adoption curve. Finally, they are used to analyze the adoption of smartwatches and DVRs.

Decision weights assigned to probabilities

Let’s start with decision weights, as that’s probably new for many of us. In his excellent book, Thinking, Fast and Slow, Nobel laureate Daniel Kahneman describes research he and a colleague did that examined the way people think about probabilities. Specifically, given different probabilities for a gain, how do people weight those probabilities?

Why?

Classic economics indicates that if an outcome has a 25% probability, then 25% is the weight a rational person should assign to that outcome. If you’ve taken economics or statistics, you may recall being taught something along these lines. However, Kahneman and his colleague had anecdotally seen evidence that people didn’t act that way. So they conducted field experiments to determine how people actually incorporated probabilities into their decision making. The table below summarizes their findings:

Decision weights vs probability

The left side of the table shows that people assign greater weight to low probabilities than they should. Kahneman calls this the possibility effect. The mere fact that something could potentially happen has a disproportionate weight in decision-making. Maybe we should call this the “hope multiplier”. It’s strongest at the low end, with the effect eroding as probabilities increase. When the probability of a given outcome increases to 50% and beyond, we see the emergence of the uncertainty effect. In this case, the fact that something might not happen starts to loom larger in our psyche. This is because we are loss averse. We prefer avoiding losses to acquiring gains.

Because of loss aversion, an outcome that has an 80% probability isn’t weighted that way by people. We look at that 20% possibility that something will not happen (essentially a “loss”), and fear of that looms large. We thus discount the 80% probability to a too-low decision weight of 60.1.
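
For readers who want to play with the idea, below is a minimal Python sketch of the probability weighting function Tversky and Kahneman fit to this kind of data, w(p) = p^g / (p^g + (1-p)^g)^(1/g) with g of roughly 0.61 for gains. The function and parameter come from their prospect theory work, not from this post, but the sketch reproduces the same pattern as the table: small probabilities get overweighted, large ones get underweighted.

```python
# Illustrative sketch of Tversky & Kahneman's probability weighting function.
# gamma ~ 0.61 is their published estimate for gains; treat it as an assumption here.
def decision_weight(p: float, gamma: float = 0.61) -> float:
    """Map a stated probability p (between 0 and 1) to a decision weight."""
    num = p ** gamma
    den = (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)
    return num / den

if __name__ == "__main__":
    for p in (0.01, 0.25, 0.50, 0.80, 0.99):
        print(f"probability {p:.0%} -> decision weight {decision_weight(p):.1%}")
    # Roughly: 1% -> 5.5%, 25% -> 29%, 50% -> 42%, 80% -> 61%, 99% -> 91%
```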

Probability of job-to-be-done improvement

A job-to-be-done is something we want to accomplish. It consists of individual tasks and our expectation for each of those tasks. You rate the fulfillment of the expectations to determine how satisfied you are with a given job-to-be-done. This assessment is a cornerstone of the “job-to-be-done improvement” function:

Job-to-be-done improvement function

Dissatisfaction: How far away from customers’ expectations is the incumbent way that they fulfill a job-to-be-done? The further away, the greater the dissatisfaction. This analysis is really dependent on the relative importance of the individual job tasks.  More important tasks have greater influence on the overall level of satisfaction.

Solution improvement: How does the proposed innovation (product, service) address the entirety of the existing job? It will be replacing at least some, if not all, of the incumbent solution. What are the better ways it fulfills the different job tasks?

Cost: How much does the innovation cost? There’s the out-of-pocket expense. But there are other costs as well. Learning costs. Things you cannot do with the new solution that you currently can. The costs will be balanced against the increased satisfaction the new solution delivers.

These three elements are the basis of determining the fit with a given job-to-be-done. Because of their complexities, determining precise measures for each is challenging. But it is reasonable to assert a probability. In this case, the probability that the proposed solution will provide a superior experience to the incumbent solution.
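
To make the three elements concrete, here is a hypothetical scoring sketch in Python. The post doesn’t prescribe a formula, so the 0-to-10 scales, the importance weights and the way the pieces combine are illustrative assumptions only; the example tasks echo the Gmail story earlier.

```python
# Hypothetical sketch only: scales, weights and the combination rule are assumptions.
from dataclasses import dataclass

@dataclass
class JobTask:
    name: str
    importance: float              # relative importance, weights roughly summing to 1
    incumbent_satisfaction: float  # 0-10, how well today's solution does
    proposed_satisfaction: float   # 0-10, expected with the new solution

def improvement_score(tasks: list[JobTask], cost_penalty: float) -> float:
    """Importance-weighted satisfaction gain, minus a cost penalty (also on a 0-10 scale)."""
    gain = sum(t.importance * (t.proposed_satisfaction - t.incumbent_satisfaction)
               for t in tasks)
    return gain - cost_penalty

tasks = [
    JobTask("find old emails", 0.4, 4, 9),
    JobTask("send pictures easily", 0.3, 5, 8),
    JobTask("control spam", 0.3, 3, 9),
]
print(f"{improvement_score(tasks, cost_penalty=1.0):.1f}")  # positive = likely a better fit
```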

Mapping decision weights across the innovation adoption curve

The decision weights described earlier are an average across a population. There is variance in those. The decision weights for each probability of gain in job-to-be-done will differ by adoption segment, as shown below:

Decision weights across innovation adoption curve

The green and red bars along the bottom of each segment indicate the different weights assigned to the same probabilities for each segment. For Innovators and Early Adopters, any possibility of an improvement in job-to-be-done satisfaction is overweighted significantly. At the right end, Laggards are hard-pressed to assign sufficient decision weights to anything but an absolutely certain probability of increased satisfaction.

Studies have shown that our preferences for risk-aversion and risk-seeking are at least somewhat genetically driven. My own experience also says that there can be variance in when you’re risk averse or not. It depends on the arena and your own experience in it. I believe each of us has a baseline of risk tolerance, and we vary from that baseline depending on circumstances.

Two cases in point: smartwatches and DVRs

The two factors – decision weights and probability of improved job-to-be-done satisfaction – work in tandem to determine how far the reach of a new innovation will go. Generally,

  • If the probability of job-to-be-done improvement is low, you’re playing primarily to the eternal optimists, Innovators and Early Adopters.
  • If the probability of improvement is high, reach will be farther but steps are needed to get later segments aware of the benefits, and to even alter their decision weights.

Let’s look at two innovations in the context of these factors.

Smartwatches

Smartwatch

Smartwatches have a cool factor. If you think of a long-term trend of progressively smaller computing devices – mainframes, minicomputers, desktops, laptops, mobile devices – then the emergence of smartwatches is the logical next wave. Finally, it’s Dick Tracy time.

The challenge for the current generation of smartwatches is distinguishing themselves from the incumbent solution for people: smartphones. Not regular wristwatches, but smartphones. How much do smartwatches improve the jobs-to-be-done currently fulfilled by smartphones?

Some key jobs-to-be-done by smartphones today:

  • Email
  • Texting
  • Calls
  • Social apps (Facebook, Twitter, etc.)
  • Navigation
  • Games
  • Many, many more

When you consider current smartphone functionality, what job tasks are under-satisfied? In a Twitter discussion about smartwatches, the most compelling proposition was that the watch makes it easier to see updates as soon as they happen. Eliminate the pain of taking your phone out of your pocket or purse. Better satisfaction of the task of knowing when, who and what for emails, texts, social updates, etc.

But improvement in this task comes at a cost. David Breger wrote that he had to stop wearing his smartwatch. Why? The updates pulled his eyes to his watch. Constantly. To the point where his conversational companions noticed, affecting their interactions. What had been an improvement came with its own cost. There are, of course, those people who bury their faces in their phones wherever they are. The smartwatch is a win for them.

If I were to ballpark the probability that a smartwatch will deliver improvement in its targeted jobs-to-be-done, I’d say it’s 20%. Still, that’s good enough for the Innovators segment. I imagine their decision weights look something like this:

Decision weights - Innovators

The mere possibility of improvement drives these early tryers-of-new-things. It explains who was behind Pebble’s successful Kickstarter campaign. But the low probability of improving the targeted jobs-to-be-done dooms the smartwatch, as currently conceived, to the left side of the adoption curve.

DVRs

DVR

Digital video recorders make television viewing easier. Much easier. Back when TiVo was the primary game in town, early adopters passionately described how incredible the DVR was. It was life-changing. I recall hearing the praise back then, and I admit I rolled my eyes at these loons.

Not so these days.

DVRs have become more commonplace. With good reason. They offer a number of features which improve  various aspects of the television viewing job-to-be-done:

  • Pause a live program
  • Rewind to watch something again (your own instant replay for sports)
  • Set it and forget it scheduling
  • Easy playback of recorded shows
  • Easy recording without needing to handle separate media (VCR tape, DVD)

But there are costs. If you’ve got a big investment in VCR tapes or DVDs, you want to play those. It does cost money to purchase a DVR plan. The storage of the DVR has a ceiling. You have to learn how to set up and work with a DVR. It becomes part of the room decor. What happens if the storage drive crashes?

My estimate is that the DVR has an 80% probability of being better than incumbent solutions. Indeed, this has been recognized in the market. A recent survey estimates U.S. household adoption of DVRs at 44%. Basically, knocking on the door of the Late Majority. I imagine their decision weights look like this:

Decision weights - Late Majority

On the probability side of the ledger, the Late Majority will need to experience DVRs themselves to understand the potential. This happens through their Early Majority friends, who expose them to the innovation. They become aware of how much an innovation can improve their satisfaction.

On the decision weight side, vendors must do the work of addressing the uncertainty that comes with the innovation. This means understanding the forces – allegiance to the incumbent solution, anxiety about the proposed solution – that must be overcome.

Two influential factors

As you consider your product or service innovation, pay attention to these two factors. The first – jobs-to-be-done – is central to getting adoption of anything. Without the proper spadework there, you will be flying blind into the market. The second factor is our human psyche, and how we harbor hope (possibility) and fear (uncertainty). Because people are geared differently, you’ll need to construct strategies (communication channels, messaging, product enhancements) that pull people toward your idea, overcoming their natural risk aversion.

I’m @bhc3 on Twitter, and I’m a Senior Consultant with HYPE Innovation.

I’m joining HYPE to help companies get more value from innovation

HYPE Innovation logo

It is my pleasure and honor to announce that today I’ve joined HYPE Innovation as a full-time Senior Consultant. HYPE provides an enterprise innovation management software platform – HYPE Enterprise – used by large companies around the globe. In my consulting role, I’ll be working hands-on with customers across the phases of innovation maturity:

  • Beginning the journey toward a more collaborative innovation approach
  • Expanding usage as they gain experience and see results
  • Developing advanced ecosystems to drive next generation business models and products

This role is a change for me, moving from product to consulting.  But it’s one I embrace and I’m looking forward to. I’ve talked a lot here about the need to understand customers’ jobs-to-be-done. By working side-by-side with organizations, I’m going to have a deep understanding of their jobs-to-be-done for innovation and problem-solving. And even better, an opportunity to help make them successful.

HYPE is headquartered in Bonn, Germany, and I’ll be working from San Francisco. In this post, I want to cover two areas:

  1. State of the innovation management market
  2. What makes HYPE special

State of innovation management market

Enterprise traction

Over the past five years, I’ve worked with a number of customers and thought leaders in the innovation management space. People that are committed to and passionate about this. The first thing to know is that enterprises are actively exploring ways to be better at innovating. Many, many of the companies you know and buy products and services from. From its roots as online suggestion boxes, innovation management has become a full-fledged corporate discipline. In fact, research firm IDC forecasts that by the end of 2016, 60% of the Fortune 500 will be using social-enabled innovation management solutions. Which, if you follow the innovation diffusion lifecycle, means we’ll start to see the late majority taking it up.

Focused ideation

When I began working in the innovation field, the primary use case for innovation management software was to be an open suggestion box, equipped with social features (visibility, commenting, voting). Anytime someone had an idea, they had a place to post it. Unfortunately, that approach proved limited in engagement and value. Thus, that model has changed significantly over the past few years. Organizations are now running campaigns that target narrow, specific topics. They are time-boxed events, which in a broad sense is a form of game mechanic that spurs greater participation. Campaigns offer these advantages:

  • Ready recipients – campaign sponsors – who engage with, elaborate on and select ideas
  • Continual refreshing of the program, giving people new reasons to participate
  • A focus on specific organizational needs

Beyond innovation

Innovation – however you define it – continues to be a prominent use case. And with good reason, as CEOs rate it a top priority. There are multiple disciplines that address innovation: crowdsourcing, design thinking, TRIZ, incubators, lean startup, etc. Generally, innovation is understood as creating something new that adds value.

But I’m seeing signs that crowdsourcing  is being applied in other ways outside the traditional view of innovation. Here are three examples:

  • Problem-solving: An example of this is cost-saving initiatives. People out on the front lines are seeing opportunities for improvement that are hidden from decision-makers in the headquarters.
  • Positive deviance: In every large organization, there are people who have figured out a different, better way to do something. Crowdsourcing helps find these people, and their novel approaches can be identified and shared.
  • Trend-spotting: With an army of employees out in the field, organizations have a ready way to canvass an area. People can post what they’re seeing, a valuable source of raw insight.

Idea development, evaluation and selection take center stage

When I talk with people not familiar with the innovation management field, I find their understanding often to be, “Oh, so it’s an idea collection app.” That is a necessary feature of course – no ideas, no innovation. But it’s a comical under-representation of what innovation management is. As Professor Tim Kastelle notes:

“Generating ideas is the easiest part. Most organisations already have enough ideas. The challenge for them is not generating more but implementing their existing ideas more effectively.”

As the market matures, companies are seeking ways to better advance the most promising ideas. This is where the puck’s heading.

Innovation becomes part of the purposeful collaboration canon

In the broader enterprise 2.0 social business market, the integration of ‘social’ into core business functions has emerged as the basis of value. This is a change from the movement’s early roots. Constellation Research VP Alan Lepofsky nicely illustrates this evolution to Generation 3 as follows:

Alan Lepofsky socbiz generations

Innovation is a prominent use case that benefits from the application of social and collaboration. You can see more in Alan’s Slideshare presentation on innovation and purposeful collaboration.

What makes HYPE special

From my experience in the industry and in my meetings with the team, three things about HYPE stand out in the innovation management field:

  1. Singular focus on customers’ innovation jobs-to-be-done
  2. Market leadership
  3. Demonstrated customer excellence

Singular focus on customers’ innovation jobs-to-be-done

HYPE has over a decade of experience in the innovation market. Its roots were in the R&D world, with a deep emphasis on how to maximize the value of ideas. In industry parlance, this is sometimes called the “back-end” of innovation. It’s a sophisticated activity with variance in process for each organization. Through the years of working with customers, HYPE has become adept at handling this phase of innovation. I know it’s not easy – I did some initial product work myself in this realm previously. Success here hinges on understanding what customers seek to achieve, and acting on it.

With the rise of social business and increased interest in better utilizing the collective smarts of employees, HYPE moved forward to the “front-end” of innovation. Powerful features include campaign development, participation management, idea surfacing, collaboration and evaluation. With this investment of time and effort, HYPE offers the most functional full-cycle innovation process in the industry:

HYPE - full lifecycle innovation process

With deep expertise built throughout the platform, HYPE is well-positioned to address organizations’ innovation jobs-to-be-done.

Market leadership

Forrester Wave - Innovation Management 3Q13 - rotated

In the past few years, HYPE has increased its presence in the market, following an investment from ViewPoint Capital Partners. From its roots in Germany, the company has become the leader in Europe. It is now seeing good growth in broader EMEA, the United States and South America.

Recently, Forrester published its Wave for Innovation Management Tools. Analyst Chip Gliedman reviewed 14 of the most significant vendors in the space.  The analysis included:

  • Innovation lifecycle: the components of a complete cycle
  • CIO concerns: governance, security, architecture, integration
  • Product roadmap
  • Management team
  • Vision

HYPE achieved the top overall ranking, the coveted “top right” position of the Wave.

Demonstrated customer excellence

HYPE Customers

HYPE has over 170 customers from around the world. Consistent with my experience, the industries are varied. Some representative names are shown to the left. This is something one sees when it comes to innovation: everyone does it. There’s really not a specific sector that pursues innovation and problem-solving more than others.

HYPE has a number of long-term relationships. And it’s fair to say that once you’re a client of HYPE, you’ll be happy, satisfied and get results. Annual churn is less than 4%. On a monthly basis, that’s roughly 0.3%, at the magic level for enterprise software companies.
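
For the curious, the annual-to-monthly conversion is straightforward; here is a quick back-of-the-envelope check in Python using the 4% figure quoted above (the figure is the only input, and the simple and compounding variants are just two ways of slicing it).

```python
# Back-of-the-envelope: convert a 4% annual churn rate to an equivalent monthly rate.
annual_churn = 0.04
monthly_simple = annual_churn / 12                     # ~0.33% if churn accrues evenly
monthly_compound = 1 - (1 - annual_churn) ** (1 / 12)  # ~0.34% on a compounding basis
print(f"{monthly_simple:.2%} simple, {monthly_compound:.2%} compounded")
```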

That level of customer satisfaction doesn’t “just happen”. Rather, it comes from being dedicated to customers’ success and working to make them successful at their jobs-to-be-done.

That HYPE logo?

Finally, about the HYPE logo. I actually do not yet know the background on it. But take a look at it. See some similarities to different hand gestures?

HYPE logo meaning

I’m looking forward to joining the team.

I’m @bhc3 on Twitter.

When customers want a product roadmap, do this instead

Product roadmaps suck.

Roadmap

There, I said it. <exhales>

OK, let’s explain that. Roadmaps that are real, living documents representing what you will deliver…are awesome. But that may not be the case for you; it wasn’t for me. Instead, a product roadmap was at its core a sales document for a prospect call. A lot of effort, with various people weighing in on what should show up there. Ginning up dates over the next 24-36 months for when features will be delivered. A visually lickable timeline.

And it’s defunct as soon as it’s published. Poor roadmap, it never had a chance. If anyone actually remembers what was in the roadmap months later, you’re left explaining that, “um…yeah…things changed”.

To be fair, this happens more in industries where the level of uncertainty is high. You’re assembling the future, learning as you go along and making adjustments. Industries with stability can put a roadmap out there and stick to it. But if your industry has a lot of fluctuation in its future, roadmaps are an  exercise in futility.

Given this, what’s the point of creating them? I needed a better way to handle the inevitable roadmap requests: internally with client-facing peers, and externally with sales prospects and current clients. I took the view that the customer’s roadmap request was essentially about these three questions:

  1. Where will your development resources be focused over the next 12-36 months?
  2. Does your view of what’s needed for successful outcomes match mine?
  3. What are the core values of your platform philosophy?

In other words, knowing that X feature would be rolled out in 12 months wasn’t really what influenced the customer. It wasn’t as if they said, “Oh, that feature will be there in a year? I’ll pay $X for your platform today and begin to use it once that feature is ready.”

I wanted to find a better way. Answer the questions the customer has while avoiding unrealistic commitments and schedules.  So I developed a different approach to requests for a roadmap. It focuses on two core elements:

  • Product themes
  • How we’ll work with the client

Themes are the future the customer is buying. How we’ll work with the client describes the ongoing interactions around product design. Both are part of the decision calculus of the customer. Should I go forward with this company or not?

Product themes

Product themes are the core areas that are the means to the outcomes customers seek. When I worked at Spigit, I developed five core themes (conceptualized in the graphic below):

Themes

Themes are the broad areas in which the platform needs to excel. They are selected because they are key to satisfying high-level jobs-to-be-done. They will vary by product. An accounting app might have themes around ‘accuracy’, ‘sync with GAAP’ and ‘integration with other apps’. A supplier of chemicals might need to concern itself with ‘potency of compounds’ and ‘safety’.

Themes are where an analytical approach meets a flair for artistry. Internally, they are great for organizing future release efforts. I would actually grade the platform on the themes, using the A to F scale, to help prioritize future effort.
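
As a small illustration of that grading exercise, here is a hypothetical sketch; the themes borrow from the accounting-app example above and the grades are made up.

```python
# Hypothetical: grade each theme A-F, then focus future releases on the worst grades first.
GRADE_ORDER = {"A": 0, "B": 1, "C": 2, "D": 3, "F": 4}

theme_grades = {
    "accuracy": "B",
    "sync with GAAP": "D",
    "integration with other apps": "C",
}

# Sort worst grade first: these themes get priority in upcoming release planning.
priorities = sorted(theme_grades, key=lambda t: GRADE_ORDER[theme_grades[t]], reverse=True)
print(priorities)  # ['sync with GAAP', 'integration with other apps', 'accuracy']
```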

For customers, themes provide a peek into what makes your platform special. You’re communicating a promise for what future releases will address. Customers develop a sense of the platform today, and the platform of the future.

Past + possible features = proof

For the themes, plan on doing more than stating them. Bring them to life by talking features. Yes, this sounds like the roadmap rat-hole. But it’s a different way to do that:

Theme + features

Past features are proof that you are focused on the themes, and they illustrate how you have approached enhancing the themes for clients thus far. They connect the experience of your product today to the themes.

Possible features are a source of excitement, and proof that you’re focused on the themes in future development. They’re not supposed to be a committed list of features over the next 3 years. Rather, they provide a sense for how you’re approaching fulfillment of customers’ jobs-to-be-done. This gives you the chance to talk about some of the ideas floating around in your organization while avoiding the farce of putting dates on when (and if) they’ll be delivered. When asked, I put it to them straight: “These are several ideas we currently have for this theme. What are your thoughts on them?”

Which leads nicely into the other major point to cover…

How we’ll work with the client

In the B2B market, customers want to have direct input into the product design process. Not so much in the consumer market, where we simply stop buying something if it doesn’t satisfy us. But the dollars and reputation that can be on the line in the corporate market translate into greater interest in where the product is going.

To address this desire, communicate how you will work with your customer in the product design process. I would talk about three areas:

Customer insight in product design

Jobs-to-be-done: Ongoing learning about the different things customers seek to accomplish, what they rank as most important and their level of satisfaction with achieving those goals. This is a deeper dive into motivations, how outcomes will be measured and current pain points.

Ideas: As the most active users of your product (often more than you), customers will see opportunities for improvement.  Maintain a site for ongoing suggestions as they occur, and run targeted ideation campaigns for specific areas of development.

Design feedback: Prior to committing to production of a product, run several designs by them. The designs will emphasize different functions and looks, and customers give an early read on how they will be received.

The combination of themes and the ways you’ll work with customers answers the key questions they have. It actually goes way beyond the normal roadmap, providing philosophical underpinnings for your product.  And for the product manager, it’s something you can discuss with integrity and enjoyment.

I’m @bhc3 on Twitter.

Prescribing Success with Disruptive Innovation

This is a guest post by Michael Mayers, an experienced innovation & new product development leader who has launched successful products in financial services, marketing services and health & wellness.  He tweets about innovation, entrepreneurship, and the joy of being a Brit in NYC at @mikemayers25.

At the time of writing, Amazon lists 932 books released in the past 90 days under “innovation.”  Many of these will espouse a theory of one sort or another that seeks to systematize the process of successfully bringing new products to market.  And while most will deliver a superficially cogent model for the limited historic cases provided, nearly all of them will fail to deliver a prescription for future success that works in practice. (That’s right, Stage Gate, I’m looking at you!)

Innovators Solution

But some time-tested theories do have prescriptive value.  In this blog post I take one such canonical model – Professor Clayton Christensen’s theory of Disruptive Innovation – and use three of its defining characteristics to help identify entrepreneurial opportunities and kick-start successful innovation strategy.

Look for 5 blade razors.

As markets mature, industry leaders seek to capture increasing value from their best customers in an effort to drive profitable growth.  M&A aside, this is typically achieved through the development of incremental, sustaining innovations – a process which most businesses become adept at executing.  The inevitable challenge comes when the pace of these innovations oversupplies product performance for even the best and most demanding customers.  This oversupply, easily measured, is a key indicator that a sector is ripe for disruption.

This tendency is surely no more obvious than in Gillette’s flagship razor: the Fusion ProGlide with its parody-inducing 5 blades.  Arguably the maximal desired performance from the humble safety razor was surpassed decades ago with the twin-blade Trac II and the addition of a lubricating strip.

Enter Dollar Shave Club who, in 2011, recognized this performance oversupply and launched an innovative business model to attack the razor blade industry’s most over-served customers.  Its first offering was a twin-blade razor – 70s “technology” in blade terms – sold online via a monthly subscription model at a fraction of the price of its big brand competitors.  And the attack plan is working:  The company raised $12MM in a Series B financing in October last year.

Ignore the best customers.

Take an industry’s best customers – those premium, big ticket whales with the eye-watering margins – and forget about them.  Successful disruptive innovations typically target current category low-end or non-consumers, and for two very good reasons:

Firstly, when a new entrant targets non-consumption, the competitive response from industry incumbents is often muted to the extent that they may be ignored altogether; just as the personal computer industry was by mainframe manufacturers in the 1980s.  And if the interloper seeks to serve an incumbent’s low-end, low-margin customers, they may even be happy to give up that pesky, profit-dilutive segment in the pursuit of better financial ratios; just as US car manufacturers gave way to Toyota in the 1980s in the compact economy segment.  And let’s face it, who wants to sell low-margin compacts like the Corolla when you’ve earned the right to sell the S-Class?

Secondly, the initial performance of new technologies usually falls short of the demands of an industry’s best and most sophisticated customers:  PC performance was a joke for mainframe computer buyers.  Either way, a new entrant attack on an industry’s best customers is ill-advised.  Either a crippling competitive response or sophisticated customer demands will bring the upstart to its knees.  Much better to compete for low-end markets or outright non-consumption where your product can be the best alternative to nothing at all.

Classic examples of this approach include Sony, who offered a generation of new consumers access to personal, portable transistor radios while RCA doggedly stuck to heavy vacuum tube technology which afforded significantly better sound quality for the most discerning customer.  Or Nucor’s electric arc furnace, suitable only for the production of rebar at its outset, versus Armco’s integrated steel mills which produced much higher grade sheet steel.  Or Netflix’s video streaming, utterly at the mercy of fickle bandwidth in those early days, versus Blockbuster’s DVD rental business with its significantly higher fidelity picture quality.

When each of these technologies first launched they did not satisfy the current needs of their industry’s best customers.  Crucially – and fatally for the businesses who ignored them – each quickly shed their growing pains and enjoyed a higher performance gradient over time than existing ones.  The consequence?  Having established a beachhead amongst non- or low-end consumers those nascent technologies intercepted and then surpassed the performance of prior technologies.  Each successively picked off “low-value” customer segments from the bottom up until they adequately addressed the performance requirements of the entire market.  And it’s worth noting that Dollar Shave Club now offer 4- and 6-blade razor lines on their subscription model:  The move upmarket has begun.

Understand why the product is hired.

Why did you hire that non-fat latte you had this morning?  It’s an odd turn of phrase but you did, in a sense, hire it to do some jobs.  Maybe it was to give you a boost of energy.  Or to stave off hunger pangs until lunch.  Perhaps it was just a useful distraction or a focal point around which to socialize with colleagues.  Maybe it was all of those things.  Once you accept that we hire products and services to perform “jobs-to-be-done” (JTBD) on functional, emotional and social dimensions a whole world of insight will open up about consumer motivations and the building blocks of competition and product performance from the consumer perspective.

The story of Febreze illustrates this well.[1]  Now a $1B product, Febreze was very nearly a total failure for P&G.  Functionally, it performs an obvious JTBD:  It neutralizes odors in household fabrics.  Early marketing efforts focussed heavily on this performance dimension but sales were muted.  Both the R&D and marketing teams were bewildered as to why such an obviously useful product was failing to gain traction.

Post-launch research highlighted an interesting but troubling phenomenon:  Customers were often desensitized and oblivious to even the most pungent odors in their own homes.  Functionally, Febreze solved an issue for a home’s visitors rather than its owners. 

But the P&G team had a stroke of luck:  They observed a few customers who, having worked hard to clean their homes, were then liberally applying Febreze with a final flourish.  Diving deeper into this behavior, the researchers found that some customers were hiring the product to provide a visceral emotional reward at the end of a period of cleaning – a means to enhance the satisfaction of having cleaned a room.  The penny dropped and Febreze was quickly repositioned for its emotive, rather than functional, qualities and sales soared.

This story is often positioned as a flash of marketing positioning brilliance; a Eureka! moment that’s difficult to replicate.  But when viewed through the lens of JTBD it becomes clear that a more systematic understanding of the emotional dimensions to cleaning a home may have saved P&G from considerable heartache.  And while Febreze is now a roaring success one has to wonder how many potential category killers have been pulled from the market for want of a better understanding of what the customer was actually trying to achieve?

Flip the coin.

I’ve been looking at this in terms of using the Disruptive Innovation framework to find innovation opportunities.  However, if you are an enterprise manager working in an established industry then you can easily turn this around to find corresponding threats.  Ask yourself whether you are oversupplying your customers and delivering your own version of the 5 blade razor.  For a moment, put your best customers out of your mind and think about how traditional non-consumers and low-end markets might be better served by competing technologies.  And put the product marketing brochures to one side and ask what jobs-to-be-done you are satisfying amongst your customer base.  And finally, if and when you identify a threat or technology that is picking off your least profitable customers, don’t flee upmarket:  Stand, fight and innovate right back.

[1] This story is recounted from “The Power of Habit: Why We Do What We Do in Life and Business” by Charles Duhigg.

Decision flow for customer feature requests

If you manage a product or service in the business-to-business (B2B) market, customer requests for features will be a regular part of your work. Requests come in through the sales team, service reps, and senior management, as well as directly from customers themselves. Each one is a disruptive insertion of new items into your agenda. That disruption isn’t necessarily bad, but it does distract you from other planning and execution you’re working on.

Reflecting on my own experiences here, I realized that each request needs to go through a series of decisions. These decisions make sure you know why you would agree to or decline the request, and are aware of the bigger picture effects of your decision. They make up the customer request decision flow:

B2B customer request decision flow

The flow is a series of decisions, in priority order. My perspective is product management, but they apply to other areas as well (service, contracting processes, etc.).

Firm request from a priority customer?

This decision point is made up of two criteria: priority customer and firm request.

Priority customer

The first decision point may be somewhat off-putting, especially if you operate in the small business or consumer markets. It matters who makes the request. In the enterprise market, just a few customers will be a significant share of your revenue. These customers’ revenue helps you meet payroll. They help keep the lights on. If you’re public, they help keep the stock price up.

In addition to high revenue, some customers are also valuable for non-monetary reasons. Lighthouse customers are important for establishing credibility with other companies.

Whether based on revenue or marketing value, some companies will be priority customers. They are a reality in every B2B company. Keeping them happy is part of the job.

Firm request

Sometimes a request is urgent, and vitally important to the customer. Other times, it’s merely a suggestion, a minor nit or a fleeting idea. It’s important to understand the difference.

Firm requests often come freighted with emotional terms, or subtle threats. “We really need this to make sure our sponsors continue to support you.” When they’re firm, pay attention, immediately.

Not all requests are firm. The customer may couch the request with wiggle room. Or directly say “it’s not a big deal”. Often, they have bigger things they want to tackle (on the product, on processes, on strategy) and look at their request as a suggestion-in-passing.  They will move on to the bigger items and not focus on the request.

The ability to recognize the difference gets better with experience.

Multiple similar requests?

If the request is not a firm one from a priority customer, the next decision point is: are multiple customers asking for the same feature? What the request lacks in priority, it may make up in commonality. If multiple customers are requesting a similar feature, you’ve got a pain point on your hands that needs to be addressed.

A key issue is this: how do you know multiple customers have the same request? A common way is to use software that allows customers to post ideas, suggestions and requests. There are idea management providers that are good for this. Or you can use customer feedback sites. These asynchronous, always-on, open-to-all sites are well-suited for capturing suggestions.

In addition, you may need to check other areas. Messy as it is, your email often contains customer suggestions. Or you may have a service ticket database to check. Relevant knowledge will also be in the heads of the people who work directly with customers.

Once you know where to look, the process of determining commonality has two steps:

  1. Identify all similar requests that have been made by different customers
  2. Find all signals of support from customers

If you’re using an ideas or feedback site, finding similar requests is easier. Search on terms that relate to the request. Also, look at the ‘Likes’ and comments the suggestions have. I look at the number of companies represented in these signals of interest.

After gathering this information, you will have a sense of how wide the support is for the suggestion. If it’s sufficient, consider adding the request to your roadmap.
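
As a rough illustration, here’s a minimal sketch, in Python with made-up data, of tallying the breadth of support behind a suggestion. The signal records and company names are hypothetical, not from any particular feedback tool:

```python
from collections import Counter

# Hypothetical signals gathered for one suggestion from an ideas/feedback
# site: each record is (company, signal_type).
signals = [
    ("Acme Corp", "similar idea"),
    ("Acme Corp", "comment"),
    ("Globex", "like"),
    ("Initech", "like"),
    ("Initech", "comment"),
]

companies = {company for company, _ in signals}
by_type = Counter(signal_type for _, signal_type in signals)

print(f"{len(companies)} companies behind {len(signals)} signals of interest")
print(dict(by_type))
```

The number of distinct companies, not the raw count of likes, is the figure that speaks to commonality.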

Meaningfully enhances outcomes?

Assume the request is not a firm one from a priority customer, nor one that multiple customers have asked for. There’s one final decision point: will the suggested feature meaningfully enhance customers’ outcomes?

Outcomes have a specific meaning here: an outcome defines when a job task has been satisfied, and it should reflect the customer’s expectations. Remember, customers only agree to use (and pay for) your product because you’re making them successful.

To apply this criterion effectively, you need working knowledge of what customers want to get done, and where they’re falling short. If you can see that the request will improve outcomes for a significant number of customers, it should be addressed.

Committed to maintaining feature?

For each of the previous three decision points, if the answer is ‘yes’, there is one more decision to make. Are you committed to maintaining the feature? While this may seem like a simple enough question, there are a number of considerations to it. Below are six factors to consider before answering ‘yes’.

Economics: What are the costs to build and maintain the feature? The expected upside of the feature should cover these. Upside is a holistic concept: revenue directly attributable to the new feature, new sales contracts and renewals won because of it, and increased customer satisfaction that translates into informal marketing for your company.

Release velocity: Every new feature added to a product increases the complexity of future releases. In software, a given configuration can have ongoing downstream impacts. Yammer’s VP of Engineering Kris Gale sees the additional complexity as a tax on product velocity: your ability to release quality products quickly is impacted with each new feature. It can be worth it to add features, but think carefully about the velocity impact.

User experience: The ability to use the product or service effectively is a core requirement for customers. If they find it too complex, they will not fulfill their jobs-to-be-done. Joshua Porter nicely summarizes the issue of feature creep: “No single feature addition is a big deal, but taken together change everything.” The value of the request must be greater than any negative effects on user experience.

Tip of the iceberg: Sometimes a request is a “jump” from the current product or service, and it’s only part of a broader offering needed to really address the need. You can look at a request and see how additional features will be needed over time to make it deliver value. And that may take the product in a direction you don’t want to go. Understand the longer-term plan related to the request.

Mass market: You’re building a product or service for the mass market. It needs to address a large swath of customers’ needs. In that light, look at the current request. Is it the umpteenth time that this customer, or one of a handful of customers, has requested something? Too many ‘outside-the-market’ requests can undermine your broader strategy. You win the battle for the lighthouse customer, but lose the war with the broader market.

Outcome prioritization: Smart product management is organized according to customers’ jobs-to-be-done and expected outcomes. Some outcomes may currently be underserved. Customers’ expectations are not being met, and that needs to be addressed. The new request will delay the implementation of features that address these outstanding pain points. Determine whether the new request outweighs the currently underserved outcomes.

Decide on the request

Decline the request

If the request cannot cleanly get through the six criteria of the “Committed to maintaining feature?” decision point, it is reasonable to decline the request. Indeed, you now have specific reasons for doing so. That alone is a big improvement versus what often happens: the request sits in the equivalent of a “dead letter” file. Or if there is a response, there’s only a vague, “we can’t do that right now.”

Address the request

If the request makes sense, then it’s full steam ahead. However, notice I’ve used the term “address the request”. This is different than “implement the request”. Maintain a philosophy that:

 Customers know their jobs-to-be-done better than you, but you will know potential solutions better than them.

That’s not to say the customer hasn’t provided a specific feature solution that is right. But avoid simply passing through exactly what was requested without giving thought to different ways the job-to-be-done can be addressed.
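
To pull the decision points together, here’s a minimal sketch of the flow in Python. The boolean fields are hypothetical stand-ins; in practice each one is a judgment call, not a flag in a database:

```python
from dataclasses import dataclass

@dataclass
class FeatureRequest:
    # Hypothetical flags; each represents a human judgment.
    is_firm: bool
    from_priority_customer: bool
    multiple_similar_requests: bool
    meaningfully_enhances_outcomes: bool
    committed_to_maintaining: bool  # survives the six maintenance factors

def decide(req: FeatureRequest) -> str:
    """Walk a customer feature request through the decision flow."""
    qualifies = (
        (req.is_firm and req.from_priority_customer)
        or req.multiple_similar_requests
        or req.meaningfully_enhances_outcomes
    )
    if not qualifies:
        return "decline, and record the specific reason"
    if not req.committed_to_maintaining:
        return "decline, citing the factor that failed (economics, velocity, UX, ...)"
    # "Address", not "implement": design against the job-to-be-done,
    # not necessarily the exact feature the customer described.
    return "address the request"
```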

Customer requests will be a constant in the B2B product manager’s life. Knowing how you’re going to handle them is key to the success of the product and the business.

I’m @bhc3 on Twitter.

Collecting and analyzing jobs-to-be-done


I’ve previously written about collecting jobs-to-be-done from customers. Because I was analyzing a broad topic across the entire innovation lifecycle, it was a good way to get a breadth of insight. However, it doesn’t work as well in the more common situation for product managers and innovators: analyzing a specific flow. In that case, there are three requirements for collecting jobs-to-be-done:

  • Comprehensive capture of job elements
  • Map collection as closely as possible to the actual job flow
  • Understand importance and satisfaction of individual tasks

Comprehensive capture is important, because you can’t address what you don’t know. A limitation of my previous effort was that it was not comprehensive. The actual job flow is a powerful framework: needs captured in context are more valuable, and it’s critical to follow the steps in the job. Importance and satisfaction become the basis for prioritizing effort.

To address these requirements, I’ve put together a process to understand customers’ jobs-to-be-done. The major elements are:

  1. Job flow
  2. Job task
  3. Collect job tasks per activity
  4. Job canvas
  5. Task importance & dissatisfaction
  6. Number of customer interviews
  7. Create affinity groups
  8. Label the groups
  9. Calculate group importance and dissatisfaction

For purposes of this write-up, assume you’re an automotive product manager. You’re tasked with understanding people’s needs to get work done on the commute to the office. Note this is a job that becomes more readily enabled by self-driving cars.

Start with the job flow

A job-to-be-done has a flow. For example, take this job:

When I commute to the office, I want to get work done.

A job flow consists of the job’s major activities, in sequence. The job flow looks something like this:

Job flow

The purpose of the flow is to provide a framework for capturing specific tasks. Putting this together is primarily the responsibility of the product manager (or innovation team). By stating the major activities that define the job, you can expect a much more comprehensive capture of all the job elements.

Job task

Each activity consists of a series of tasks. Tasks are what the customer actually does. They are independent of specific features, although the two may often be intertwined. Here’s an example task:

Job task

Previously, I’d focused on including context in job statements. But when these tasks are organized according to the job flow, the context is readily known. So task statements don’t include a context element.

But they do include an expectation statement. For every task we do, we have an expectation for it. It defines whether we consider the current experience wonderful or painful. This expectation is formalized for each task, captured in the customer’s own words. It’s valuable to know what the customer expects, as that becomes the basis of design.
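
To keep the pieces straight, here’s a minimal sketch of how an activity and its tasks might be represented. The field names and the example activity name are my own illustration, not a formal JTBD notation; the task and expectation come from the interview example further below:

```python
from dataclasses import dataclass, field

@dataclass
class JobTask:
    statement: str    # what the customer is trying to get done
    expectation: str  # in the customer's own words, what "done well" means

@dataclass
class Activity:
    name: str                      # one major activity in the job flow
    tasks: list[JobTask] = field(default_factory=list)

# Job: "When I commute to the office, I want to get work done."
# Hypothetical activity name for illustration:
get_set_up = Activity("Get set up to work in the car")
get_set_up.tasks.append(JobTask(
    statement="Connect to email and the web",
    expectation="Always-on internet",
))
```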

Collect job tasks for each activity

The next step is to conduct the actual customer interviews. Whether done in the customer’s environment (best) or via a web conference call (acceptable), the job flow provides a familiar framework to the customer.

Job activity + tasks

When I worked at eFinance, I conducted brown paper sessions with clients to understand their commercial credit processes. A staple of the Capgemini consulting model, the brown paper is a step-by-step process flow of what the customer does today. Collecting the job tasks is similar here. Similarities and differences:

  • Brown papers are conducted with groups of people together. Job-to-be-done capture will more often be solo interviews.
  • Brown papers are done in a strict step-by-step flow, captured visually on a wall. If doing this for job-to-be-done interviews works for you, go for it. But a simpler post-it note capture style works as well.
  • After capturing the steps in a brown paper, the group is invited to post stickies describing points where improvement is needed. In the job-to-be-done interviews, each task includes a statement of what the customer expects for it.

A key element of the interview process is to probe the customer’s responses. In a perfect world, they will lay out the individual tasks and easily express their expectations. More likely, customers will talk a lot about features, which is valuable in its own right. But the objective here is to capture what they are trying to get done. So apply the simple question why. Not in a robotic way, but enough to probe past the expression of features. These are the tasks – versus features – to place on the job canvas.

Here’s an example of the approach:

Customer: I want a 4G internet card.
Interviewer: Why do you need that?
Customer: So I can connect to email and the web.
Interviewer: What is your expectation for connecting to email and the web?
Customer: Always-on internet.

One tip: Use different color sticky notes for each major activity’s group of tasks. This color coding will help later in identifying where the tasks occur in the job flow.

Job canvas

For each major activity, job tasks are collected onto that customer’s job canvas. An example (with fewer tasks than would actually be there):

Job flow + tasks

In reality, there will be a large number of tasks per customer interviewed. Strategyn’s Tony Ulwick states that between 50 and 150 outcomes will be collected from multiple customer interviews. Gerry Katz of Applied Marketing Science sees 100 or more needs collected as well. Sheila Mello of Product Development Consulting says it’s not unusual to extract several hundred images from the customer interviews.

Top tasks by importance and dissatisfaction

Once the job tasks have been captured, the customer selects the tasks:

  • That are most important
  • That are least satisfied

The customer will select the 3-5 tasks that are most important for each major activity in the flow (e.g. there are 3 major activities shown in the job canvas above). These tasks will be assigned points. For example, assume three tasks are identified as important. The most important task would receive 3 points, the next most important 2 points, and so on.

The customer will also select the 3-5 tasks that are least satisfied for each major activity in the flow. Assuming three selected tasks, the task that is least satisfied receives 3 points, the next least satisfied task receives 2 points, and so on.

Job tasks - importance and satisfaction

Keep this insight handy, but separate from the collected stickies (or however you’ve collected the job tasks). We’ll come back to how to use this information.

Note: it will help to apply unique numbers to the individual job tasks. You’re going to want to know the most important and least satisfied tasks across multiple customers later in the process.
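
Here’s a minimal sketch of the point assignment, assuming each job task already carries a unique ID and the customer has ranked their top three per activity. The IDs and rankings below are hypothetical:

```python
def assign_points(ranked_task_ids: list[str]) -> dict[str, int]:
    """Most important (or least satisfied) task gets the most points.

    Three ranked tasks yield 3, 2 and 1 points respectively.
    """
    n = len(ranked_task_ids)
    return {task_id: n - rank for rank, task_id in enumerate(ranked_task_ids)}

# One customer's picks for one major activity:
importance = assign_points(["T7", "T2", "T11"])      # {'T7': 3, 'T2': 2, 'T11': 1}
dissatisfaction = assign_points(["T2", "T9", "T7"])  # {'T2': 3, 'T9': 2, 'T7': 1}
```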

Number of customer interviews

A general rule of thumb is that 15-20 customer interviews will provide solid coverage of customers’ needs. You can take it further: George Castellion advocates 40 interviews. Each interview starts with a blank canvas containing only the major activities.

Create affinity groups

After conducting multiple interviews, you will have a large number of job tasks, with information on which ones are most important and least satisfied. Working with a large number of statements from people is a challenge others have faced before. The key is to reduce that large number to a manageable set of insights. There’s a proven approach, the KJ-Method, for systematically abstracting hundreds of statements into a few key groups.

UX expert Jared Spool provides a detailed series of steps to run the K-J Method. I’ll use his description here.

Bring together a group of people to do the affinity grouping

The first step is to determine who will do the affinity grouping with you. Try to keep this group at 5 people or fewer. Draw on people from different disciplines.

Put all the job tasks on a wall

In a single space, all the job tasks should be visible and accessible. They need not be laid out in the job flow, which might introduce a bias to the grouping. The color of the stickies will be the basis for knowing where the tasks fall in the flow.

Group similar items

The next step is for the team members to group like job tasks together. The process is one of looking at pairs of tasks and determining if they share characteristics. Here’s how Jared instructs clients to do this:

“Take two items that seem like they belong together and place them in an empty portion of the wall, at least 2 feet away from any other sticky notes. Then keep moving other like items into that group.”

“Feel free to move items into groups other people create. If, when reviewing someone else’s group, it doesn’t quite make sense to you, please feel free to rearrange the items until the grouping makes sense.”

“You’re to complete this step without any discussion of the sticky notes or the groups. Every item has to be in a group, though there are likely to be a few groups with only one item.”

Label the groups

Each participant then gets to label each group. This entails looking at the grouped job tasks and determining the common theme. Again, here’s how Jared instructs teams on this process:

“I want you to now give each group a name. Read through each group and write down a name that best represents each group on the new set of sticky notes I just gave you.”

“A name is a noun cluster, such as ‘Printer Support Problems’. Please refrain from writing entire sentences.”

“As you read through each group, you may realize that the group really has two themes. Feel free to split those groups up, as appropriate.”

“You may also notice that two groups really share the same theme. In that case, you can feel free to combine the two groups into one.”

“Please give every group a name. A group can have more than one name. The only time you’re excused from giving a group a name is if someone has already used the exact words you had intended to use.”

Note that part of the exercise in this step is to give one more consideration to the groupings. If, upon trying to determine a label, one finds that a group doesn’t actually make sense, it can be split up as needed.

I’ll add one caveat to Jared’s instructions. For purposes of this affinity group work, generating lots of different labels for each group of tasks is not important. It’s OK to go with one person’s good label for a group; that’s worth emphasizing.

Here’s an example of labeling a group of job tasks:

Job tasks grouped

Once done, you’ve organized a large set of job tasks into major themes for what customers are trying to do.

Calculate the importance and dissatisfaction score for each group

Remember asking the customers to rate the three most important and three least satisfied job tasks? Now it’s time to use those ratings. In each group, calculate the following for both importance and dissatisfaction:

  • Total points
  • Average points per task

For each group, you’ll have something like this:

Job task groups with scores

The Total score gives a sense for where the customer energy is. Large scores on either metric will demand attention. The Average score is good for cases where a group has only a few, highly scored job tasks. It ensures they don’t get overlooked.
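
Here’s a minimal sketch of the group calculation, assuming each task carries the points accumulated across interviews plus a group label from the affinity exercise. The labels and numbers are hypothetical:

```python
from statistics import mean

# Hypothetical: task_id -> (affinity group label, importance pts, dissatisfaction pts)
tasks = {
    "T2":  ("Stay connected", 5, 6),
    "T7":  ("Stay connected", 4, 1),
    "T11": ("Manage interruptions", 1, 3),
}

def group_scores(tasks: dict) -> dict:
    """Total and average importance/dissatisfaction points per affinity group."""
    groups: dict[str, list[tuple[int, int]]] = {}
    for group, imp, dis in tasks.values():
        groups.setdefault(group, []).append((imp, dis))
    return {
        group: {
            "importance_total": sum(i for i, _ in pts),
            "importance_avg": round(mean(i for i, _ in pts), 1),
            "dissatisfaction_total": sum(d for _, d in pts),
            "dissatisfaction_avg": round(mean(d for _, d in pts), 1),
        }
        for group, pts in groups.items()
    }

print(group_scores(tasks))
```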

Prioritize roadmap

You now have major groups scored with customers’ view of importance and dissatisfaction. Within each group are the tasks and expectations that customers have. This is the good stuff, the insight that fuels design efforts. It’s also the data-driven, factually based input that helps clear the fog when tackling a new area for development.

The expressed customer insight – what they want to do, what is important, what is not satisfied – becomes the foundation for constructing a roadmap. The team can layer on other considerations: business strategy, adjacent initiatives that impact the effort, internal priorities. Balance these with what customers actually value. Anything that ignores this hard-won customer insight needs a compelling reason, and an understanding of the higher risk it entails.

I’m @bhc3 on Twitter.

The Folly of Inside-Out Product Thinking

Inside out jacket

Inside out just doesn’t fit right

Ever run into this deductive reasoning?

  1. Customers like our existing products and our company
  2. We are building a new product that reflects the priorities of a company executive
  3. Therefore, customers will like our new product

It’s a clear violation of the First Law of Product: Customers decide what products they like, not companies.

Inside-out thinking is a situation where the wrong reasons are applied to decide which products are to be developed:

  • That market is so big, let’s build something for it
  • My intuition says this is the next big thing
  • This new product will position our company for what is important to us

Those reasons aren’t necessarily fatal to success on their own. The things that define truly inside-out thinking are (i) an impulse guided by a “we need”, not a “the customer needs”, mentality; and (ii) skipping customer validation or ignoring troubling feedback from customers during validation. When you see those two dynamics at play, you’ve left the realm of sophisticated decision-making. You’re in the land of gambling with shareholders’ money. Sure, some inside-out products will succeed. But that’s analogous to saying that some lottery ticket holders win too. It’s a sucker’s bet.

Inside-out thinking is a pervasive thing. I came across this table in Gerry McGovern’s book, The Stranger’s Long Neck. McGovern surveyed SMB users of a website Microsoft runs – Pinpoint – that helps find IT solutions built on Microsoft technologies. The SMBs were asked what their top tasks were when they visited Pinpoint. McGovern then did something interesting: he asked the Microsoft team what they thought users’ top tasks were.

The table below outlines the results:

Customers' top tasks         | What Microsoft thought
Internet security            | Customer relationship management
Backup and recovery          | Internet marketing
Security                     | Network management
Desktop support              | Sales/lead generation
Data/document management     | Billing

That’s a stark difference between what users value and what Microsoft thought they did. Or perhaps what Microsoft wished users valued. As McGovern notes, “And just like every other organization on the planet, what Microsoft wants is not always what the customer wants.”

This isn’t to pick on Microsoft; it really is the case at companies everywhere. Microsoft just happens to have been open enough to share their own experience here.

You can recognize it when it happens. Here are the Top 3 signs of inside-out thinking:

  • The spreadsheet says it will be big!
  • I don’t need customer validation, they don’t know what they want anyway
  • The Board/CEO/other senior executive is pressuring us to do this

Inside-out thinking is poor decision-making: it’s a bet with terrible odds, and it wastes resources. It’s tough to understand how we can be so methodical with other operations in the organization, yet go seat-of-the-pants in this area.

Update: I hadn’t seen his tweet at the time I published this post, but Box founder/CEO Aaron Levie offers another consequence of inside-out thinking here:

I’m @bhc3 on Twitter.
