Customer validation

Validate before you build

The most expensive startup mistake is building something nobody wants. CB Insights found it is the number one reason startups fail. Harvard Business School research shows 75% of venture-backed startups fail to return investor capital, often because the core customer assumption was wrong. The fix is not better engineering. It is validating your assumptions with real customers before you commit to building.

Why founders believe their idea is validated when it is not

The problem is not that founders skip validation. It is that they do validation wrong and mistake encouragement for evidence.

Rob Fitzpatrick, in The Mom Test, demonstrated why most customer conversations give founders false positives. The rules are simple: talk about their life, not your idea. Ask about specifics in the past, not opinions about the future. Talk less and listen more.

Most founders do the opposite. They describe their idea enthusiastically and interpret polite encouragement as validation. “Would you use a tool that does X?” is a hypothetical question. Humans are terrible at predicting their own future behavior. The answer is almost always “Sure, maybe!” which tells you nothing.

Confirmation bias compounds the problem. Daniel Kahneman documented in Thinking, Fast and Slow that once humans form a hypothesis, they test it by seeking confirming evidence rather than disconfirming evidence. As Richard Feynman put it: “The first principle is that you must not fool yourself, and you are the easiest person to fool.”

The result: founders build for months based on what they think customers want, supported by ambiguous signals they interpreted as validation. The real feedback arrives only after launch, when the product meets indifference.

What happens when you build first and validate second

The four failures below were not scrappy garage startups. They had world-class teams, massive budgets, and brilliant technology. They still got the customer wrong.

01

Webvan: $800 million spent scaling before validating unit economics

Webvan raised $375 million in its IPO and built massive automated warehouses for online grocery delivery across 10 cities. They scaled before proving the model worked in even one city.

The assumption that a significant portion of consumers were ready to buy groceries online and pay for delivery was never validated at small scale. Webvan burned through $800 million and went bankrupt in 2001. The market eventually proved real, but Instacart and Amazon Fresh succeeded 15 years later with incremental scaling and validated demand.

02

Segway: a $5,000 solution to a problem pedestrians did not prioritize

Pre-launch hype was enormous. Steve Jobs reportedly said it was "as big as the PC." An investor predicted $1 billion in sales faster than any company in history. Dean Kamen expected 10,000 units per week. They sold roughly 30,000 in the first two years.

The team never seriously validated whether city pedestrians wanted to replace walking with a $5,000 gyroscopic scooter, whether they would be comfortable riding one in public, or whether cities would allow them on sidewalks. The technology was brilliant. The customer validation was absent.

03

Google Glass: technically impressive, socially unacceptable

Google validated that the technology worked. They did not validate that consumers wanted to live with the social consequences. Wearers were called "Glassholes." People near Glass wearers felt surveilled.

The product solved hands-free information access, a problem most consumers did not prioritize enough to justify looking like a cyborg. Google pulled the consumer edition in 2015. A later enterprise pivot found better problem-solution fit, proving the technology was never the issue. The customer assumption was.

04

Quibi: $1.75 billion assuming people needed premium short-form content

Jeffrey Katzenberg assumed people needed premium 10-minute content for "in-between moments." The job of "entertain me for 5-10 minutes" was already done by TikTok, YouTube, and Instagram, for free.

Katzenberg blamed COVID for the failure. But TikTok exploded during the same period. The problem was not the pandemic. It was the assumption about the customer need. $1.75 billion in funding cannot overcome a hypothesis that was never validated with real customers.

The exponential cost of late validation

A wrong assumption caught at the idea stage costs almost nothing to fix. The same assumption caught after 18 months of development costs the entire investment.

Stage                      Cost to pivot    Time lost
Idea / hypothesis          $0 – $1K         Days
Prototype / wireframe      $1K – $10K       Weeks
MVP                        $10K – $100K     1–3 months
Post-launch (pre-scale)    $100K – $1M      3–6 months
Post-scale                 $1M – $50M+      6–18 months

Steve Blank estimates that the cost of a pivot after launch is typically 10–100x the cost of a pivot during customer discovery. Alberto Savoia, Google’s first Engineering Director, coined “pretotyping” to test market appeal before building even a prototype, estimating it reduces validation cost by 100–1000x. The math is clear: the earlier you validate, the less it costs when you are wrong.

How Bandos makes validation part of the process, not an afterthought

Not a separate research sprint. Not a survey tool you configure on your own. Validation built into the same session where you define your direction.

Generate a survey from any node on your map

A persona, an opportunity, or a solution direction. At any point you can generate a research-grade, public-facing survey and share the link. No survey design expertise required. The questions are structured to avoid the Mom Test pitfalls: they ask about past behavior and current pain, not hypothetical future actions.

Real responses flow back into your session

Customer responses are not trapped in a separate tool. They flow directly back into your Bandos map. You see how real customers respond to your assumptions in the context of the direction you are building.

Wrong assumptions generate a corrected path

If real customers invalidate your current direction, Bandos flags the path and generates a new one built from the validated data. You do not start over. You pivot from evidence, not from gut feeling.

Anonymous responses eliminate social desirability bias

Survey respondents do not know who built the product. There is no founder in the room to be polite to. The responses reflect what people actually think, not what they think you want to hear. This is the Mom Test applied at scale.

Stop assuming. Start validating.

Define your direction and test it with real customers in the same session. Know what is worth building before you build it.

Get Started for Free