Experiment Library

Selecting a technique

Picking the right experiment technique:

Consider your needs

  1. Type of hypothesis: some techniques produce better evidence for desirability; others work better for viability, feasibility, or usability.

  2. Level of uncertainty: what strength of evidence do you need to move forward? When you know only a little, your goal is to produce evidence that points you in the right direction; quick and cheap techniques will suffice even though the evidence is weak. The more you know, the stronger the evidence you need to build confidence, which is usually achieved through more expensive techniques.

  3. Urgency: when is the next key decision point? You might need strong evidence to gain exec support or funding for your idea. Or perhaps you can run cheaper experiments across multiple perspectives to support decision makers.

Experiment selection tips:

  • Start cheap and fast. When you don't know much, your only job is to get a signal on the right direction. You will test more later, so don't overthink it.

  • Run multiple tests on the hypothesis set to strengthen evidence. Try to learn as quickly as possible, then run more experiments to produce stronger evidence and build confidence. Beware of making important decisions based on one-off or weak evidence; if you do, you're probably relying on luck to succeed.

  • Get the biggest bang for your buck. Always design the strongest experiment you can while respecting your context. Remember, when you don't know much (your context), you should go cheap and fast, but that doesn't rule out producing strong signals (see the sketch after these tips).

  • You can learn a lot without building anything. Not convinced? Understanding customers' jobs, gains, and pains can be done without writing any code.
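To make the cost-versus-evidence trade-off concrete, here is a minimal sketch in Python. The Technique class, the catalogue entries, the scores, and the pick_technique helper are all hypothetical assumptions made for illustration; the point is simply that you filter by hypothesis type and required evidence strength, then take the cheapest option that qualifies.

  # Minimal sketch: pick the cheapest technique that still meets the evidence
  # strength a decision requires. All names and scores are illustrative
  # assumptions, not prescribed values.
  from dataclasses import dataclass
  from typing import Optional

  @dataclass
  class Technique:
      name: str
      cost: int                # relative cost/effort, 1 (cheap) to 5 (expensive)
      evidence_strength: int   # relative strength of evidence, 1 (weak) to 5 (strong)
      hypothesis_types: tuple  # which hypothesis types it speaks to

  CATALOGUE = [
      Technique("Customer Interviews", 1, 2, ("desirability",)),
      Technique("Landing Page", 2, 3, ("desirability",)),
      Technique("Presale", 3, 5, ("desirability", "viability")),
      Technique("Tech Spike", 2, 4, ("feasibility",)),
  ]

  def pick_technique(hypothesis_type: str, required_strength: int) -> Optional[Technique]:
      """Return the cheapest technique that addresses the hypothesis type and
      meets the required evidence strength, or None if nothing qualifies."""
      candidates = [
          t for t in CATALOGUE
          if hypothesis_type in t.hypothesis_types
          and t.evidence_strength >= required_strength
      ]
      return min(candidates, key=lambda t: t.cost, default=None)

  # Early on, a weak signal is enough, so the cheapest option wins:
  print(pick_technique("desirability", required_strength=2).name)  # Customer Interviews
  # Ahead of a funding decision you need stronger evidence, so cost goes up:
  print(pick_technique("desirability", required_strength=5).name)  # Presale

The same trade-off applies whatever scoring you use: early on you accept weak-but-cheap signals, and you only pay for expensive, strong-evidence techniques once the decision at hand demands them.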

Discovery Techniques

  • Competitor Test

  • Basic Prototype (non-interactive)

  • Data Sheet

  • Link Tracking

  • Customer Interviews

  • Search Analysis

  • Customer Support Analysis

  • Online Ads

  • Sales Force Feedback

  • Discussion Forums

  • Email Campaign

  • Customer Survey

  • Explainer Video

  • Traffic Analysis

  • Referral Program

Validation Techniques

  • Tech Spikes (for feasibility)

  • Letter of Intent

  • Landing Page

  • Interactive Prototype

  • Concierge

  • Split Test

  • Presale

  • Mock Sale

  • Wizard of Oz

  • Single Feature MVP

Discovery Playbooks

We know different experiments are appropriate for different situations. Techniques also feed into one another, so it makes sense to string together multiple experiment techniques to gain momentum and build strong evidence over time (a sketch after the list below makes this concrete).

Playbook starting points for different types of products:

  • B2B (SaaS): Discussion Forums > Sales Force Feedback > Customer Interview > Competitor Test > Interactive Prototype > Presale > Single Feature MVP

  • B2B (service): Expert Stakeholder Interviews > Customer Support Analysis > Brochure > Presale > Concierge

  • B2C Software: Customer Support Analysis > Customer Interview > Online Ads (demand test) > Landing Page Test > Email Campaign > Interactive Prototype > Mock Sale > Wizard of Oz

  • B2B2C: Sales Force Feedback > Customer Interview > Online Ads (demand test) > Landing Page Test > Explainer Video > Presale > Concierge > Buy a Feature > Data Sheet > Partner & Supplier Interviews > Letter of Intent
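As a rough illustration of stringing techniques together, the sketch below encodes the B2B (SaaS) starting point above as an ordered sequence. The Playbook class and its methods are hypothetical, made up for this example; a real playbook would pause after each step to review the evidence before deciding whether to continue, pivot, or stop.

  # Minimal sketch: a playbook as an ordered sequence of techniques, run
  # cheapest-and-fastest first, with each step feeding the next. Class and
  # method names are illustrative, not from any framework.
  from dataclasses import dataclass, field

  @dataclass
  class Playbook:
      name: str
      steps: list
      completed: list = field(default_factory=list)

      def next_step(self):
          """Return the next technique to run, or None when the playbook is done."""
          remaining = [s for s in self.steps if s not in self.completed]
          return remaining[0] if remaining else None

      def record(self, step):
          """Mark a technique as run so the playbook advances."""
          self.completed.append(step)

  # The B2B (SaaS) starting point from the list above:
  b2b_saas = Playbook(
      name="B2B (SaaS)",
      steps=[
          "Discussion Forums",
          "Sales Force Feedback",
          "Customer Interview",
          "Competitor Test",
          "Interactive Prototype",
          "Presale",
          "Single Feature MVP",
      ],
  )

  while (step := b2b_saas.next_step()) is not None:
      # In practice you would stop here, review the evidence, and decide
      # whether to continue, pivot, or abandon before running the next step.
      print(f"Run: {step}")
      b2b_saas.record(step)

Treat these starting points as defaults to adapt, not scripts to follow: swap techniques in or out based on what each step teaches you and how much evidence the next decision requires.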