How I chose my first MVP

The Lean Startup advocates an iterative approach to building a business, with each iteration teaching you something actionable. An important tool in this iterative process is the Minimum Viable Product (MVP).

This post explains how I identified what my first MVP should be; if you're thinking of building your own application, it should make interesting reading.

In his book The Lean Startup, Eric Ries briefly defines an MVP:

A minimum viable product (MVP) helps entrepreneurs start the process of learning as quickly as possible. It is not necessarily the smallest product imaginable, though; it is simply the fastest way to get through the Build-Measure-Learn feedback loop with the minimum amount of effort.

Contrary to traditional product development, which usually involves a long, thoughtful incubation period and strives for product perfection, the goal of the MVP is to begin the process of learning, not end it. Unlike a prototype or concept test, an MVP is designed not just to answer product design or technical questions. Its goal is to test fundamental business hypotheses.

Note that MVPs don't need to be actual products; an MVP is simply the quickest thing you can make to learn about your next most pressing hypothesis. In his article Signs you aren't really building a Minimum Viable Product, Anthony Panozzo argues that most people who claim they're building an MVP actually aren't:

If you are building out a half of a product as your first stab, you might as well just call it version one or iteration zero or something like that. No sense in polluting the MVP term.

This makes a lot of sense. Having read it, I felt strongly that I ought to be able to demonstrate whether or not there was significant demand for Agile Planner without actually having to build it.

Anthony's article goes on to lay out three key questions that he puts to people who tell him about their MVP:

  1. What are you trying to learn with this particular MVP?
  2. What data are you collecting about your experiment?
  3. What determines the success or failure of the experiment?

It's a great article, and I recommend you read it.

So what happened when I applied Anthony's questions to Agile Planner? The answer to question 1 (what am I trying to learn?) is easy:

I want to learn whether there's a market for a less opinionated agile app, with a simple UI based on the metaphor of the index card.

Questions 2 and 3 are harder to answer, but provide a good context in which to evaluate your MVP.

After a week or so of research (and some serious thought), I had three candidate MVPs:

  • A landing page explaining the product, a "Buy" button, and an AdWords campaign.
  • A survey designed to identify how many people would pay for my app.
  • A demo video highlighting the differences between existing products and Agile Planner.

How do these ideas fare in the light of Anthony's three questions?

Idea 1: A landing page, a Buy button and an AdWords campaign

The landing page approach is straightforward: you set up a simple website explaining your concept and add a call to action. You can ask people to sign up to your mailing list if they're interested, or take it a step further and offer them the chance to sign up for a paid product.

Anthony's second question is "what data are you collecting?". If you're building a B2B (business-to-business) product it's important to prove that people are prepared to pay for your product; it's not enough that they like it (see The Order of AARRR by Brant Cooper). I felt my call to action should validate people's intent to purchase, so I'd set up a "Buy" button rather than just a mailing list.
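To make this concrete, here's a rough sketch of the kind of click tracking I have in mind. Everything in it is hypothetical: the /track endpoint, the event name and the button's id are made up for illustration (an off-the-shelf analytics service would do the same job).

    // Record intent to purchase when the "Buy" button is clicked.
    // "/track" is a stand-in for whatever analytics backend you use.
    function track(event: string): void {
      // sendBeacon keeps working while the browser navigates away.
      navigator.sendBeacon("/track", JSON.stringify({ event, ts: Date.now() }));
    }

    const buyButton = document.querySelector<HTMLButtonElement>("#buy");
    buyButton?.addEventListener("click", () => {
      track("buy-clicked");
      // There's no product to sell yet, so be honest about it and
      // offer a mailing list signup instead.
      window.location.href = "/coming-soon";
    });

Collecting the clicks is the easy part; interpreting them is harder.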

Anthony's third question (what determines success or failure?) is the most difficult to answer. I've no idea what percentage of my visitors I should expect to try and buy the app. If I got 100 clicks, and only one person tried to buy it, what would that tell me?
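One way to see the problem is to put a confidence interval around the conversion rate. This is a standard Wilson score calculation (nothing specific to my experiment), sketched in the same hypothetical vein as the tracking code above:

    // 95% Wilson score interval for an observed conversion rate.
    function wilsonInterval(successes: number, trials: number, z = 1.96): [number, number] {
      const p = successes / trials;
      const centre = p + (z * z) / (2 * trials);
      const spread = z * Math.sqrt((p * (1 - p)) / trials + (z * z) / (4 * trials * trials));
      const denom = 1 + (z * z) / trials;
      return [(centre - spread) / denom, (centre + spread) / denom];
    }

    // One buyer in 100 clicks: the plausible conversion rate is
    // anywhere from roughly 0.2% to 5.4%, far too wide to act on.
    console.log(wilsonInterval(1, 100));

In other words, one purchase in 100 clicks is consistent with both a dud and a viable business.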

Lots of attempts to purchase would confirm demand for the app. But if nobody clicked, it could just mean that I hadn't explained the concept particularly well.

This would have been an experiment that couldn't fail. The results wouldn't be actionable, as I'd have felt there was plenty of mileage in my ideas even if nobody tried to buy it.

It was tempting to give up on MVPs at this point and just get on with building version 1, but I'd well and truly tethered my horse to the "maximum validated learning for minimal effort" train, and I was damned well going to apply it.

Idea 2: A survey

Let's recap. What am I trying to learn?

I want to learn whether there's a market for a less opinionated agile app, with a simple UI based on the metaphor of the index card.

Question 2 (what data are you collecting?) seems straightforward for a survey; you look at the answers and categorise them according to how likely you feel each respondent is to use your product.

Question 3 (what determines success or failure?) appears easy on the face of it; you could define a minimum number of positive responses before you start. And yet, as far as applying the scientific method goes, there's a problem.

If your results can be skewed by the questions you ask, how do you determine whether you've biased them, and by how much? And how can you reliably make decisions based on data that you know may be inaccurate?

I might be able to write a survey that would reveal whether respondents were unhappy with existing products, but I couldn't see a reliable way to find out whether or not they'd want to pay for my app. You could argue that proving that there's discontent would be enough, but that wouldn't tell me anything I didn't already know (developers are a vocal bunch).

Idea 3: A demo video

In The Lean Startup you can read about Dropbox's MVP. Drew Houston (the CEO of Dropbox) made a four-minute video that highlights the main features of Dropbox. It really gets across how simple Dropbox is to set up and use, and it's no surprise that they had 70,000 people join their waiting list within days.

This approach really appeals to me; there's no easy way to explain the intangible aspects of how Agile Planner will work in static words and pictures. It's a little bit different to the competing products, and people familiar with the existing tools are going to need to see it in action to fully understand where I'm coming from.

It's not enough for them to decide to buy it because they think it looks pretty; I need to know whether they still want to buy it after they've appreciated what it won't do.

On the face of it, it sounds good. Let's ask Anthony's questions again...

What data are you collecting?

With a video you can track (see the sketch below):

  • how many people visit the page,
  • how many start watching the video,
  • how many stop watching the video part way through (and if you use Wistia, you can see how far they get), and
  • how many people go on to try and purchase your app.
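Here's a sketch of how that tracking might be wired up, assuming a plain HTML5 video element rather than Wistia (which collects this viewing data for you), and reusing the hypothetical track() helper from the earlier sketch:

    declare function track(event: string): void; // from the earlier sketch

    const video = document.querySelector<HTMLVideoElement>("#demo");
    if (video) {
      video.addEventListener("play", () => track("video-started"), { once: true });

      // Report each quarter of the video watched, so drop-off points
      // show up in the data.
      const reported = new Set<number>();
      video.addEventListener("timeupdate", () => {
        const quarter = Math.floor((video.currentTime / video.duration) * 4);
        if (quarter > 0 && !reported.has(quarter)) {
          reported.add(quarter);
          track(`video-watched-${quarter * 25}pc`);
        }
      });
    }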

What would determine success or failure?

I had plenty of confidence that a well-made video would make it clear how Agile Planner would behave. If few people watched the entire video or tried to buy the app, I'd be confident that there was a problem with the basic premise of the idea.

I knew I could post the video to Hacker News and ask for feedback from other entrepreneurs (Daniel Tenner has some good advice on how to go about it). The Hacker News audience isn't really an agile audience, but I felt there'd be enough overlap, and the feedback from other entrepreneurs would be useful.

I realised I could also ask for feedback on agile forums, LinkedIn groups, etc. I'd track the conversion rates from these different sources independently, in case a high number of visitors from Hacker News skewed my results.
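Tracking the sources independently needn't be complicated; tagging the links posted to each community and recording the tag with every event would do. As a sketch (the src parameter name is made up), extending the earlier track() helper:

    // Links posted to each community carry a source tag, e.g.
    //   https://example.com/?src=hacker-news
    //   https://example.com/?src=linkedin
    // Recording the tag alongside each event makes it possible to
    // compare conversion rates per source later.
    const source = new URLSearchParams(window.location.search).get("src") ?? "direct";

    function track(event: string): void {
      navigator.sendBeacon("/track", JSON.stringify({ event, source, ts: Date.now() }));
    }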

I decided that if at least 50 people tried to purchase the app, I'd be justified in building version one. If I couldn't get 50 attempted purchases, my next task would be to work out why.

So how do you make a demo like Drew's as quickly as possible? That's a story for another post...

About the author

Graham Ashton is an experienced agile team leader and project manager, and the founder of Agile Planner. You can follow him at @grahamashton on Twitter.