

Totango Blog


Lean Startups are Not About Learning

Lean startups are not about learning — or at least not just about it. Learning is not enough because translating what you’ve learned into value is just as important and often no less difficult.

Measure-Learn-Act is a key tenet of lean startups. We are big believers in lean. We run Lean Startup Israel and frequently blog and talk about our experiences of running a lean company.

Measure-Learn-Act is also a core part of our company vision. We believe measuring, learning and acting on usage-data is the right way to build, scale and operate any SaaS and web-business — and we’re building Totango to help companies do just that.

In this post, we’ll talk about how we “eat our own dogfood” and use this principle, and the Totango product, to drive our own product’s evolution.

How do we measure progress?

In his remarkable work on lean startups, Eric Ries uses the following definition for measuring progress at a startup:

Definition of progress for a startup: validated learning about customers

We found that to be a limiting concept.

We discovered that when we defined learning as our core objective, we ended up spending too much energy on our own learning (running A/B tests, minimum viable versions to learn about market interest, etc.) and not enough on leveraging our learning to deliver value to end-users and customers.

In other words, learning is not enough and is not the end goal. The real test of a startup is whether it builds a service of value — one that people want to use and can’t live without.

So our definition of progress is:

Definition of progress for a startup: validated value delivered to customers

‘Value’ means we released a product or service enhancement that helps customers accomplish something better or faster. ‘Validated’ means we have a way to quantify the value delivered, usually through a positive change to one of the key usage metrics we track.

If, and only if, we reach that point do we declare progress. All the rest are considered internal milestones along the way.

How do we measure validated value to customers?

Here is a set of product-level metrics we monitor on an ongoing basis: signups, activations, and engaged organizations, among others.

Each product or service improvement we undertake needs to ultimately manifest itself as an improvement to one of these metrics. For example, changes to our signup process need to yield an improvement in signup numbers; a better customer onboarding process should result in a higher and faster rate of activations.
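As a rough illustration of the signup-to-activation part of such a funnel, here is a minimal sketch that counts distinct accounts reaching each stage from an event log. The event names and log schema are hypothetical, not Totango's actual data model:

```python
from collections import defaultdict

# Hypothetical event log: (account_id, event_type) pairs.
events = [
    ("acme", "signup"), ("acme", "activation"),
    ("globex", "signup"),
    ("initech", "signup"), ("initech", "activation"),
]

def funnel_metrics(events):
    """Count distinct accounts at each funnel stage and the conversion rate."""
    stages = defaultdict(set)
    for account, event_type in events:
        stages[event_type].add(account)
    signups = len(stages["signup"])
    activations = len(stages["activation"])
    return {
        "signups": signups,
        "activations": activations,
        # Guard against division by zero when there are no signups yet.
        "activation_rate": activations / signups if signups else 0.0,
    }

print(funnel_metrics(events))
```

Counting distinct accounts (rather than raw events) keeps the metric honest when an account fires the same event more than once.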

Our most interesting metric is ‘Engaged Organizations’. Our goal here is to have a quantifiable way to determine if accounts that went through the signup and activation process are actually deriving value from the solution. We measure this by counting the number of days each user performed meaningful interactions with our solution.

‘Meaningful interactions’ are not generic: each service naturally has a different set (in our case it can be usage of our inbox capabilities or interacting with an activity stream).

They also change over time as new functionality is added or product pivots are made. The point is that they provide a solid way to validate whether users are seeing value in our solution, which in turn helps us determine if changes we make have a positive impact.
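To make the "count the days with meaningful interactions" idea concrete, here is a minimal sketch. The action names, log schema, and two-day threshold are illustrative assumptions, not Totango's actual definitions:

```python
from datetime import date

# Hypothetical set of "meaningful" actions; stand-ins for things like
# inbox usage or activity-stream interactions mentioned in the post.
MEANINGFUL = {"inbox_open", "stream_interact"}

# Hypothetical usage log: (account_id, day, action) records.
usage = [
    ("acme", date(2011, 11, 1), "inbox_open"),
    ("acme", date(2011, 11, 1), "login"),            # not meaningful
    ("acme", date(2011, 11, 3), "stream_interact"),
    ("globex", date(2011, 11, 2), "login"),          # not meaningful
]

def engaged_days(usage):
    """Distinct days on which each account performed a meaningful interaction."""
    days = {}
    for account, day, action in usage:
        if action in MEANINGFUL:
            days.setdefault(account, set()).add(day)
    return {account: len(d) for account, d in days.items()}

def engaged_accounts(usage, min_days=2):
    """Accounts engaged on at least `min_days` distinct days."""
    return [a for a, n in engaged_days(usage).items() if n >= min_days]
```

Because the set of meaningful actions is just data here, redefining it after a pivot or a new feature release changes the metric without touching the counting logic.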

Learning as a means to delivering value

Validated-value-delivered is the way we measure progress, but that doesn’t mean we don’t spend a lot of time trying to learn what customers want. In fact, we’ve found it’s the only way to consistently add value. Otherwise, you spend too much time on bad ideas and get overly invested in product directions that ultimately prove incorrect.

We try to build an MVP (minimum viable product) as the first step for every product idea or feature request we handle. If we get good validation on the need, we know it’s a good place to spend more time.


Forcing ourselves to scientifically validate our assumptions about customer needs not only helps us reach results faster, it frees us from the need to argue to death over the merits of certain product directions. Rather than argue about them, we find a quick way to validate things as a precursor to spending more time on them.

But we don’t stop there, and neither should you. Measure value delivered.

It’s the best way to keep yourself true to the core mission of creating value, and make sure your product is on track.

Guy Nirpaz

Guy Nirpaz is a Silicon Valley-based Israeli entrepreneur and CEO of Totango, a Customer Success software platform. A pioneer in the Customer Success field, Guy established the Customer Success Summit and is a well-regarded industry speaker and community contributor. Guy loves people and technology and has dedicated his career to improving the way in which business is done through innovation. Fun facts: Guy moonlights as the lead guitarist in a rock band based out of his garage in Palo Alto, used to command a tank, and once grew oranges.

  • Kleine2

    I think this is a helpful clarification but not necessarily contradictory to Eric’s intention (validated could be interpreted as validated by customers paying for this functionality).

    Some learning is negative. For example, after we implemented feature x, we learned that customers don’t care at all about x, and in fact don’t care about the whole area of X. This is also progress.

    There is something to be said for getting the result of the learning into the hands of the customer, but this is not the only goal. You might not be able to deliver on everything you learn right away – you prioritize. Also, some of the things you talk about, like signups, are more optimizations than learning, which IMO are really more relevant to a later stage of the company; focusing too much on later-stage metrics might detract from the focus on the validated learning, or the validated value as you define it.

    • Learning that users don’t care about X is great. But it’s only real progress if you can take the resources saved by this learning and apply them someplace that users *do* care about. That is the core of our argument — learning is a necessary and important step, it tells you what to do and what not, but it is still an intermediate step towards delivering value.

      Agreed that picking the right metric to work on is important. Generally speaking, before product/market fit, most of the attention should be on user-engagement rather than steps higher up the acquisition funnel… doesn’t mean you shouldn’t measure those too though 🙂

      thanks for the comments!


  • Pingback: Best of 2011: SaaS Best Practices and Customer Analytics

  • Pingback: WEDNESDAY 120208 : LeanStartupWOD
