Design & Validate · Before the roadmap turns into sunk cost

Product Validation Consultant

Before you spend another quarter building, find out whether the direction makes sense to real users.

You don’t need more internal confidence. You need outside signal. Product validation work turns assumptions into evidence while the change is still cheap.

If you built this, it would save us a week per month.

Customer, Engineering firm user, The Two-Year Mistake
Signals from shipped work
Design Sprint · Dashboard · SaaS

Admicom

From Uncertainty to Validated Prototype

Validated dashboard concept in 5 days

Read the case study
Workshop or product outcome from previous work
Project context

Artifacts, interfaces, and workshop material from the kind of work this page is about.

Vitali Gusatinsky working with a team
Who leads it

Vitali facilitates the room, frames the decision, and keeps the work close to the evidence instead of presentation theatre.

Engineering Software Company

1 week saved per user per month

The Two-Year Mistake

Admicom

5-day validation

From Uncertainty to Validated Prototype

FCG / Kuntarekry

500,000 applications/year

Design Sprint for 1.5 Million Users


Where this starts to hurt

What starts showing up

These are the patterns that usually appear before a team admits the direction is under-questioned.

01. The roadmap is moving, but nobody can answer why users will care.

02. Support, sales, and product all describe the problem differently.

03. Design reviews keep circling the same debate because there is no user evidence to end it.

04. Teams are shipping from conviction, not validation.

Fit check

This is for the team that wants a real answer

The work is useful when there is an expensive decision ahead and enough honesty in the room to let evidence change direction.

Good fit

+ You have a feature, workflow, or product direction with real delivery cost attached to it.

+ You can get the right stakeholders in the room for a focused decision cycle.

+ You are willing to hear that the current direction is wrong, incomplete, or aimed at the wrong user.

+ You want tested direction, not another slide deck full of abstract recommendations.

Not the right format

- You already know what to build and only need production execution.

- The real decision-makers will not participate in the work.

- You want research theatre that confirms what leadership has already decided.

What changes

Outcomes you can point to

The point is not abstract insight. It is a smaller and more confident next move.

01. A tested prototype or workflow concept that real users have reacted to.

02. Clear guidance on what to build, what to drop, and what still needs investigation.

03. Shared confidence across product, design, and leadership, because everyone saw the same evidence.

04. Faster execution afterward, because the strategic debate gets resolved early.

How the work moves

A short decision cycle, not a research maze

This is structured to surface signal early, while the cost of changing course is still low.

Step 1. Frame the actual decision. Not a vague ambition. The real question that is holding the team up.

Step 2. Turn the riskiest assumption into something people can react to.

Step 3. Test with the right users while the work is still easy to change.

Step 4. Synthesize into clear decisions, not generic findings.

Quick fit check

If this page sounds uncomfortably familiar, take the quiz before you commit more budget.

The quiz is the fastest way to tell whether this is the right format, whether another route makes more sense, or whether the team simply needs execution support.


Proof

Evidence from shipped work

These offers are anchored in actual projects, real stakeholder rooms, and visible change afterward.

Admicom

From Uncertainty to Validated Prototype

The existing dashboard lacked clarity. Users juggled multiple roles and confusing transitions between views. Marketing, product management, and customer service each had different ideas about what mattered most. Without user validation, they risked building on assumptions.

Pre-sprint user research interviews with actual software users
5-day design sprint co-led with Lauri Lännenmäki (UX engineer)
Cross-functional participation from marketing, product management, and customer service
See the full breakdown
If you built this, it would save us a week per month.
Customer, Engineering firm user, The Two-Year Mistake
Expectations were high, and they were met. The Design Sprint went really well. We received feedback internally that it was exceptionally well facilitated.
Samuli Rantanen, Product Designer, Tocoman/Admicom, From Uncertainty to Validated Prototype
The Design Sprint clearly defined our service direction and unified departmental goals.
Jari Lepistö, Product Manager, FCG, Design Sprint for 1.5 Million Users
Deeper read

What this looks like in practice

Below is the fuller breakdown of where this format helps, what gets tested, and how a team leaves with a decision instead of more theatre.

Product validation sounds obvious until you watch how most teams actually make product decisions.

Someone has a strong opinion. A roadmap window opens. Design starts filling screens. Engineering estimates the work. By the time anybody asks users, the organization is already emotionally invested in the answer. At that point, the work is no longer validation. It is damage control.

That is the reason I do this work.

I have seen teams spend years building something people do not understand. Not because the team was weak. Not because the technology was bad. Because the core direction was never tested with the people who were supposed to use it.

The engineering-company case study is the cleanest example. Two years of development. Launch-day confusion. Then a user showed up with their own slides, mockups, and a better flow. They had been thinking about the problem for years. Nobody had created a process where that knowledge could surface. One comment from the right user changed the direction of the product and exposed how much waste had built up around untested assumptions.

That is what product validation consulting is really for. Not polishing concepts. Not making research look respectable. Creating a decision-making process where real user signal shows up before the expensive commitment happens.

What gets validated

Usually, not the whole product.

The useful question is narrower:

  • Is this workflow understandable?
  • Does this feature solve the problem we think it solves?
  • Is this the right ordering of steps?
  • Are we targeting the right user and the right use case?
  • Is the value proposition strong enough to justify building this at all?

Teams often come in asking for “feedback on the concept.” That is still too vague. Good validation work sharpens the question until it becomes testable.

At Admicom, the issue was not “do users like dashboards?” It was whether the product family could move toward a clearer dashboard concept that helped people switch contexts without getting lost. In five days, the team had a user-tested prototype and a cleaner product direction. That is the power of focus. The sprint did not solve every future product decision. It solved the decision that mattered now.

What usually goes wrong internally

The information is already in the organization, but it is trapped in the wrong places.

Support hears the recurring friction. Sales hears why deals stall. Customer success hears what people actually expected. Product sees tickets and requests, but not always the deeper pattern behind them. Leadership sees the strategic pressure, but not the daily reality of use.

Without a structured validation process, those perspectives never become one shared picture.

Instead, teams get:

  • fragmented opinions
  • design work done too early
  • overconfidence from inside the building
  • false certainty because nobody wants to slow momentum

That is why I prefer working with a specific decision and a cross-functional group. The output is not “research says maybe.” The output is “we tested this, here is what happened, here is what changes now.”

What the work looks like

Sometimes it is a one-day sprint around a single blocking question. Sometimes it is a three-day or five-day sprint with stakeholder alignment, prototype work, and user sessions. The format matters less than the sequence:

  1. Frame the decision correctly.
  2. Identify the riskiest assumption.
  3. Build the lightest useful thing to test that assumption.
  4. Put it in front of real people.
  5. Decide while the evidence is still fresh.

The reason this works is not speed for its own sake. It works because it compresses the gap between assumption and reality.

FCG / Kuntarekry is a good example of the organizational side. Eleven people. Different functions. A service used by 1.5 million people and handling half a million applications a year. The design sprint did not just produce a validated prototype. It aligned the room. That alignment would have taken months through ordinary committee behavior.

What you leave with

You should leave with something sharper than “insights.”

You should leave with:

  • a tested concept
  • evidence your team can replay and refer back to
  • a smaller, clearer build scope
  • a more honest understanding of user priorities

Sometimes the biggest win is discovering what not to build. That sounds negative until you price the alternative. Killing the wrong idea early is one of the best ROI moves a product team can make.

This is also why I do not position validation as a decorative discovery phase. It is operational risk reduction. It is how you avoid turning conviction into sunk cost.

Why bring in an external consultant

Because internal teams are rarely short on intelligence. They are short on distance.

You cannot be objective about your own product politics when you are living inside them. You also cannot facilitate a hard conversation well if you are personally attached to one of the outcomes.

An external product validation consultant changes that dynamic in three ways:

  • I can frame the uncomfortable question without defending past decisions.
  • I can facilitate a room where different functions are heard without the usual hierarchy taking over.
  • I can move from ambiguity to prototype quickly because I have seen the pattern before.

That outside perspective is not about “fresh ideas.” It is about removing the friction that keeps teams from hearing what is already true.

If the next product decision is expensive, validation is not extra work. It is the work that prevents the wrong investment.

FAQ

Questions that usually come up

The practical questions tend to be less about process and more about timing, scope, and how much certainty a team actually needs.

Curious if we're a fit?

A short quiz. Takes 2 minutes. Helps us both figure out what kind of help might work for your situation.

If there's a fit, you'll be able to book a time immediately. Sometimes the answer is "you don't need me" — and I'll tell you that too.