## What Is a Fake Door Test?
A fake door test measures real user interest in a product or feature that doesn't exist yet. You create the appearance of a product — typically a landing page or button — and track how many people try to use it. The click-through or sign-up rate tells you whether demand is real or imagined, without building anything.
The concept is simple. Instead of spending months coding a product, you spend a weekend building a page that describes it. You drive traffic. You measure who signs up. If nobody bites, you just saved yourself from building something nobody wants.
Fake door tests go by several names — painted door tests, smoke tests, or 404 tests — but the mechanics are identical. Show the door. See who tries to open it. Decide whether to build what's behind it.
If you want the broader end-to-end process, start with our guide to validating startup ideas.
## Why Fake Door Tests Beat Asking People
42% of startups fail because nobody wants what they built — not because of bad execution, funding, or competition. CB Insights' analysis of 111+ startup post-mortems found "no market need" is the single most common reason startups die. And yet most founders skip demand validation entirely.
The problem with surveys and interviews is what Rob Fitzpatrick calls the "Mom Test" failure: people lie to be polite. Ask someone "Would you use this?" and they'll almost always say yes. It costs them nothing to be encouraging.
Behavior doesn't lie. When someone enters their email address on a landing page, they're making a micro-commitment. When they click a "Buy Now" button for a product that doesn't exist yet, that's a signal worth 100 survey responses.
| Validation Method | Measures | Reliability | Cost |
|---|---|---|---|
| Surveys | Opinions | Low — people say what you want to hear | Free |
| User interviews | Recalled behavior | Medium — depends on question quality | Free |
| Fake door test | Actual behavior | High — real clicks, real emails | $50-100 |
| Full MVP | Usage + retention | Highest — but takes months | $5,000-50,000+ |
The fake door test sits in the sweet spot: high-reliability signal at minimal cost. Amplitude's research confirms that behavioral data from fake door tests consistently outperforms stated-preference methods for predicting actual product adoption.
## How to Run a Fake Door Test in 5 Steps
A complete fake door test takes 1-2 weeks and costs $50-100 in ad spend. Here's the process, step by step.
### Step 1: Write a Testable Hypothesis
Don't test "Will people like my idea?" That's too vague. Write a specific, falsifiable statement:
- "At least 5% of visitors to my landing page will enter their email"
- "Freelancers in r/freelance will click a 'Get Early Access' button at a rate above 3%"
- "B2B founders will sign up for a compliance tool waitlist at twice the rate of a generic project management waitlist"
Your hypothesis should include a target audience, a specific action, and a minimum threshold for success.
### Step 2: Build the Door
Create a landing page with three elements:
- Problem statement — Describe the pain in your audience's words
- Solution preview — What you're building to solve it (keep it simple)
- Call to action — Email capture, waitlist button, or "Get Early Access"
You don't need design skills. Carrd ($19/year), Framer (free tier), or even a simple HTML page will work. The page needs to be clear, not beautiful.
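If you'd rather skip the page builders entirely, the whole door can be a single static file. The sketch below writes a minimal page with the three elements (problem statement, solution preview, CTA) and can be previewed with Python's built-in web server. The product name, copy, and `/signup` endpoint are all placeholders, not recommendations; swap in your own audience's words.

```python
# A minimal fake-door landing page as one static HTML file.
# All copy and the "WidgetPilot" name are illustrative placeholders.
from pathlib import Path

PAGE = """<!doctype html>
<html>
<head><meta charset="utf-8"><title>WidgetPilot (example)</title></head>
<body>
  <!-- 1. Problem statement, in the audience's words -->
  <h1>Tired of spending 3 hours a week manually updating widgets?</h1>
  <!-- 2. Solution preview: what you're building, kept simple -->
  <p>WidgetPilot watches your sources and updates every widget for you.</p>
  <!-- 3. Call to action: email capture posting to an endpoint you control -->
  <form action="/signup" method="post">
    <input type="email" name="email" placeholder="you@example.com" required>
    <button type="submit">Get Early Access</button>
  </form>
</body>
</html>
"""

Path("index.html").write_text(PAGE)
```

Preview it locally with `python -m http.server 8000` from the same directory, then point any static host (Netlify, GitHub Pages, etc.) at the file when you're ready for real traffic.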
### Step 3: Drive Targeted Traffic
A landing page without traffic tells you nothing. You need 500-1,000 visitors for statistically meaningful results.
Organic (free but slower):
- Post in relevant subreddits (share your story, don't spam)
- Answer questions on Quora or Twitter related to the problem
- Share in Slack/Discord communities where your audience hangs out
Paid ($50-100):
- Google Ads targeting problem-aware keywords
- Reddit Ads for hyper-targeted community placements
- Meta Ads for broad audience testing
GrowthMentor's validation guide recommends budgeting $200-500 for comprehensive testing, but you can get meaningful signal with as little as $50 if you target well.
### Step 4: Measure What Matters
Track these metrics:
- Conversion rate — Signups divided by unique visitors (the primary metric)
- Bounce rate — Are people leaving immediately? Your messaging might be off
- Time on page — Longer = more engaged, even if they don't convert
- Traffic source performance — Which channel sends the most engaged visitors?
Wait until you have at least 500 visitors before drawing conclusions. UXtweak's fake door testing guide recommends 1,000+ views or 100+ clicks for reliable results.
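The arithmetic behind these metrics is simple enough to sanity-check by hand. Here's a small sketch computing conversion and bounce rates per traffic source, plus the overall conversion rate; the visitor counts are made-up example numbers, not benchmarks, so plug in your own analytics export.

```python
# Per-source and overall metrics from raw counts (example numbers only).
visits = {
    "reddit": {"visitors": 420, "signups": 23, "bounces": 260},
    "google": {"visitors": 310, "signups": 9,  "bounces": 240},
}

for source, d in visits.items():
    conversion = d["signups"] / d["visitors"]  # the primary metric
    bounce = d["bounces"] / d["visitors"]      # leaving immediately?
    print(f"{source}: conversion {conversion:.1%}, bounce {bounce:.1%}")

total_visitors = sum(d["visitors"] for d in visits.values())
total_signups = sum(d["signups"] for d in visits.values())
overall = total_signups / total_visitors
print(f"overall conversion: {overall:.1%}")
```

Breaking the numbers out per source matters because a blended rate can hide one strong channel behind a weak one.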
### Step 5: Decide — Green, Yellow, or Red
| Signal | Conversion Rate | What It Means | Next Step |
|---|---|---|---|
| Green | Above 5% | Strong demand signal | Build an MVP |
| Yellow | 2-5% | Interest but hesitation | Test different positioning |
| Red | Below 2% | Weak demand | Pivot or kill this angle |
These benchmarks come from aggregated data across hundreds of landing page tests. Your specific niche may vary — a 3% conversion rate in enterprise B2B is excellent, while consumer products often need 8%+ to signal real traction.
## 3 Famous Fake Door Tests That Built Billion-Dollar Companies
The most successful tech companies in history validated demand before writing production code. These aren't edge cases — they're the playbook.
### Buffer: A Two-Page Website That Launched a $100M+ Company
In 2010, Joel Gascoigne had an idea for a social media scheduling tool. Instead of building it, he created a two-page website. Page one described the product. A "Plans & Pricing" button led to page two — which simply said the product was still in development and asked for an email address.
People signed up. Gascoigne added a third page showing actual pricing tiers. People still signed up. That was enough validation to start building. Buffer is now used by millions and generates over $100M in annual revenue.
### Dropbox: A 3-Minute Video That Got 75,000 Signups Overnight
Drew Houston couldn't demo Dropbox without building the entire sync engine first. So in 2007, he recorded a simple explainer video and posted it to Hacker News with a waitlist form.
The result: the beta waitlist jumped from 5,000 to 75,000 people overnight. Houston had his validation — people desperately wanted seamless file sync. Dropbox went on to IPO at a $12B valuation.
### Robinhood: A Waitlist That Hit 1 Million Before Launch
Robinhood's founders put up a landing page with one promise: commission-free stock trading. No app, no platform, just an email signup. Within a year, over 1 million people had joined the waitlist. That pre-launch demand gave them the leverage to raise funding and build the product.
## What a Fake Door Test Costs in 2026
A complete fake door test costs $50-100 and 1-2 weekends of effort. Compare that to the average failed startup, which burns through $29,000 before shutting down.
| Approach | Cost | Time | What You Get |
|---|---|---|---|
| DIY (Carrd + Google Ads) | $50-100 | 1-2 weekends | Conversion data, email list |
| No-code tool (Framer + Reddit Ads) | $100-200 | 1 weekend | Same + better design |
| StartupWorkshop validation | 1 credit | Hours | Landing page + hosting + analytics |
| Hiring a freelancer | $500-2,000 | 2-4 weeks | Custom design, but slower |
| Skipping validation entirely | $5,000-50,000+ | 3-6 months | A product nobody might want |
The math is straightforward. Spending $100 to discover that an idea has no demand is roughly 500 times cheaper than spending $50,000 to discover the same thing after building a product.
## Common Mistakes That Ruin Fake Door Tests
Most fake door tests fail not because the method is flawed, but because of avoidable execution errors. Here are the ones we see most often.
### Leading with the solution, not the problem
If your landing page leads with "Introducing FooBar, the AI-powered widget optimizer," you're testing whether people understand your jargon — not whether they have the problem you're solving. Lead with the pain: "Tired of spending 3 hours a week manually updating widgets?"
### Not enough traffic
50 visitors is not a test. It's a coin flip. You need 500-1,000 visitors minimum. If you can't drive that much traffic in a week, that itself is a signal — your audience might be too small or too hard to reach.
### Drawing conclusions too early
A 10% conversion rate from 20 visitors means nothing. Wait for statistical significance. A simple rule of thumb: don't make decisions until you have at least 100 clicks on your CTA, or 500+ unique visitors.
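One way to see why small samples mislead is to put a confidence interval around the observed rate. This sketch uses the Wilson score interval (a standard formula for binomial proportions, computable with the stdlib alone, though it isn't the only valid choice) to compare the same 10% conversion rate at two sample sizes.

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Approximate 95% Wilson score interval for a conversion rate."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# Same observed 10% rate, very different certainty:
print(wilson_interval(2, 20))    # roughly (0.028, 0.301): almost no information
print(wilson_interval(50, 500))  # roughly (0.077, 0.129): a real signal
```

With 20 visitors the true rate could plausibly be anywhere from under 3% (red) to over 30% (deep green), which is exactly why the thresholds in Step 5 only mean something once the interval tightens.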
### Testing with friends and family
Your mom will sign up for your landing page. Your college roommate will too. These aren't valid data points. Drive traffic from strangers who have no social obligation to support you.
### Forgetting the "what happens next" page
When someone clicks your CTA, they should land on a page that says something like: "Thanks for your interest! We're still building [product]. We'll email you when it's ready." Being transparent builds trust and avoids feeling deceptive.
## Frequently Asked Questions
### Is fake door testing ethical?
Yes, when done transparently. The best fake door tests tell visitors the product is "coming soon" and invite them to a waitlist. You're not charging money or making false promises — you're gauging interest. Chameleon's guide recommends always including a clear disclosure message.
### How many visitors do I need for reliable results?
Aim for 500-1,000 unique visitors. At 500 visitors, a 5% conversion rate gives you 25 signups — enough to see a pattern. Below 200 visitors, the variance is too high to draw meaningful conclusions.
### Can I run a fake door test for a B2B product?
Absolutely. B2B fake door tests often perform better because the audience is more targeted. Use LinkedIn Ads or post in industry-specific Slack communities. Conversion benchmarks are lower for B2B (3-5% is strong) because decision cycles are longer.
### What's the difference between a fake door test and an MVP?
A fake door test measures demand — do people want this? An MVP measures usability and retention — can people use this, and do they come back? The fake door test comes first. Only build the MVP after you've confirmed demand exists.
### How long should I run the test?
One to two weeks is the sweet spot. Shorter and you won't get enough traffic. Longer and you risk losing momentum. If you're running paid ads, 7 days gives most platforms enough time to optimize delivery.
Ready to validate your next idea? StartupWorkshop generates landing pages and tracks conversions automatically — so you can test demand in hours, not weeks.