You can keep your MVP focused on a specific problem and deliver it quickly. But you must take the time to learn from your customers’ reactions and iterate on that information. This week’s resources give you ways to test your assumptions so you can make sure you’re building something people want, not merely going through the motions.
Testing pricing and willingness to pay in the easiest possible way. Cory Zue created an MVP of his printable place card maker and tested it with Google Ads, trying to kill his dream of creating a profitable product as fast as possible. Cory wanted to find out if he could get people to give him their email address to use his product. Based on the 20% of people who did, his test was a success, sort of. He knew his site was worth an email address, but he didn’t know if it was worth actual dollars. Cory followed up with a second test, again doing as little work as possible to kill the dream as quickly as possible. Here’s Cory’s description of the process and what he learned from the results so far. (via @czue)
What we learned by testing our MVP on Medium. Paul Graham advises that you should “Make something people want.” Great advice, but how exactly can you do that? A minimum viable product can help you determine what people want, but sometimes you want to figure that out with as little work as possible, which often means not actually building a product. Mithun Madhusudan suggests that you should think of “an MVP as a hypothesis that you need to test and validate with a degree of statistical significance,” a test that can be accomplished with zero lines of code. To give you an idea of how to do that, Mithun shared the story of how his team tried to validate his product Cubeit and what they learned. (via @mythun)
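To make “validate with a degree of statistical significance” concrete, here is a minimal sketch of a two-proportion z-test comparing signup rates of two hypothetical landing-page variants. The numbers and the helper function are invented for illustration; they are not from the Cubeit experiment.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.

    Returns (z, p_value). Uses the pooled-proportion normal
    approximation, which is reasonable for large samples.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant A converted 40/200 visitors to
# email signups, variant B converted 62/210.
z, p = two_proportion_z_test(40, 200, 62, 210)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If p falls below your chosen threshold (0.05 is conventional), the difference between the variants is unlikely to be noise, and the data supports acting on the better one.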
MVP Testing — how to drive traffic to your website. Eze Vidra and Jalin Somaiya gave a presentation to a group of 20 entrepreneurs at Launchpad about MVP testing. They wanted to help early-stage startups focus on their most immediate goal and optimize for it through testing and iteration, while leveraging the basics of online marketing. They wanted to encourage startups to spend some time working with an MVP so that they can check their customers’ reaction and demand and avoid building features no one wants. The presentation is self-explanatory and complements the Lean Startup methodology and customer development, both of which encourage you to validate, test, and iterate. (via @ediggs)
The art of MVP testing. Quotes such as “Done is better than perfect,” “Don’t worry, be crappy,” and “Fail fast and fail often” are good in sentiment, but can lead to undesirable results if taken too literally. You want a speedy build-measure-learn cycle, but you don’t want to produce something so crappy that you can’t learn anything from it. Jan Jones suggests that even if you practice automated testing religiously, you should still go through your product firsthand and check each feature to make sure it works. (via @dynjo)
Riskiest Assumption Testing. Some teams rely on hope when introducing a new product – as in “I hope people buy this!” Other teams have found the Minimum Viable Product (MVP) to be a helpful starting point. Thomas Nagels would tell you that the MVP “may not be the best tool for selecting a starting point for your innovation.” He has found that the Riskiest Assumption Test (RAT) may be a better alternative for making product decisions. The RAT is “the process of finding the riskiest assumptions in your (business) model and validating them. High risk assumptions have two traits: a high probability of being wrong and significant impact when they are.” (via @thomas_nagels)
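Nagels’s two traits — the probability an assumption is wrong and the impact if it is — suggest a simple way to decide which assumption to test first. The assumption list and the 1–5 scoring scale below are invented for illustration:

```python
# Rank assumptions by risk = probability of being wrong x impact if
# wrong, the two traits that define a high-risk assumption in the RAT
# approach. Scores (1 = low, 5 = high) and the assumptions are made up.
assumptions = [
    ("People will pay for this",       4, 5),
    ("Users prefer a mobile app",      3, 2),
    ("We can acquire users via ads",   2, 4),
]

ranked = sorted(assumptions, key=lambda a: a[1] * a[2], reverse=True)
for name, p_wrong, impact in ranked:
    print(f"risk={p_wrong * impact:>2}  {name}")
```

The top-ranked assumption is the one to test before building anything else, since it is the most likely to invalidate the whole model.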