Win/Loss Reviews


Win/Loss reviews are critical to continuous improvement. Do you conduct them?

Amazingly, for as much effort as we put into winning or losing a deal, I see too many organizations being very casual about analyzing actual performance and outcomes. I seldom see win reviews conducted. Sometimes I see loss reviews conducted, but most often it's not a loss review at all: it's a reason code in a CRM system, and most of the time the reason given is price or product.

Sometimes I see managers conducting loss reviews less to learn from what happened than to beat up a salesperson: "How could you screw up so badly?"

When we do conduct win/loss reviews, too often we get only part of the picture. We may talk to the sales team only, never going to the customer and asking for their feedback. Or we look at things on a deal-by-deal basis but don't step back to look at the picture more broadly, trying to detect patterns in why we win or lose. "Are we chasing the right customers?" "Are we vulnerable in certain situations or with certain competitors?" "Are we good at small deals but unable to compete well in large deals?" The list can go on.

But we can’t improve performance, we can’t identify or fix problems, we can’t leverage certain classes of opportunities unless we have a complete picture of why we win and lose.

We need to understand at a deal level what happened and why. We need to interview not only salespeople, but partners and customers. In those reviews we have to be open to hearing what people are telling us, not listening with an agenda. Even if they are telling us bad things ("your products suck, your service sucks, we don't like you"), it's the only way we can learn and identify issues.

We need to look at a complete analysis, trying to assess patterns. How many deals did we compete in? What was the win/loss rate, and how has it changed over time? What's our win/loss rate for large deals? Mid-sized deals? Small deals? For certain product categories or mixes, against certain competitors, in certain segments? Get very granular in the analysis. For example: for deals between $1-5M, what's our win rate, what's the average size of a win, and what's the average cycle time? Likewise for losses, and likewise for deals of $5-10M, and so on.
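The granular cut described above is straightforward to automate once deal data is exported from a CRM. Here's a minimal Python sketch, assuming a simple deal record with a size, an outcome, and a cycle time; the bucket boundaries and field names are my own illustrative assumptions, not from any particular system:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Deal:
    value: float      # deal size in dollars
    won: bool         # True if we won the deal
    cycle_days: int   # length of the sales cycle in days

# Hypothetical size buckets: (label, low, high), low inclusive, high exclusive
BUCKETS = [
    ("<$1M",   0,          1_000_000),
    ("$1-5M",  1_000_000,  5_000_000),
    ("$5-10M", 5_000_000, 10_000_000),
]

def win_loss_summary(deals):
    """Per-bucket win rate, average win/loss size, and average cycle times."""
    summary = {}
    for label, low, high in BUCKETS:
        in_bucket = [d for d in deals if low <= d.value < high]
        if not in_bucket:
            continue  # no deals in this size range
        wins = [d for d in in_bucket if d.won]
        losses = [d for d in in_bucket if not d.won]
        summary[label] = {
            "deals": len(in_bucket),
            "win_rate": len(wins) / len(in_bucket),
            "avg_win_size": mean(d.value for d in wins) if wins else None,
            "avg_loss_size": mean(d.value for d in losses) if losses else None,
            "avg_win_cycle": mean(d.cycle_days for d in wins) if wins else None,
            "avg_loss_cycle": mean(d.cycle_days for d in losses) if losses else None,
        }
    return summary
```

Comparing `avg_win_size` against `avg_loss_size`, and `avg_win_cycle` against `avg_loss_cycle`, within each bucket is exactly what surfaces the kind of hidden pattern described in the next paragraph.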

Recently, I analyzed several thousand deals an organization had competed in. Their win ratio for their largest category of deals was OK. As a consultant, I'm supposed to always say you can do better, but it was OK. When I looked deeper, though, the average win in this category was roughly half the size of the average loss. And when I looked at sales cycle time, it took them twice as long to lose a large deal as it did to win one.

Where they thought they were reasonably good at winning large deals, based on win ratio alone, they were actually very good at winning the smallest of the large deals and extremely bad at the biggest. We went on to discover a number of other things about those types of deals.

It was only after understanding the complete picture that we could put our finger on their problems and start to fix them.

So are you doing win/loss analysis? Are you really doing it: not only understanding things deal by deal, but seeing the patterns of where you are strong and where you are weak? Are you really trying to understand why, not just in your own opinion, but from the customer's point of view?

Win/loss reviews and analysis are very powerful. Done correctly, they can correct terrible misperceptions we may have about our business. Done correctly, they will always reveal ways to improve performance.

Republished with author's permission from original post.

Dave Brock
Dave has spent his career developing high performance organizations. He worked in sales, marketing, and executive management capacities with IBM, Tektronix and Keithley Instruments. His consulting clients include companies in the semiconductor, aerospace, electronics, consumer products, computer, telecommunications, retailing, internet, software, professional and financial services industries.


  1. Dave: part of the discovery you describe could have been realized earlier had the analysis not been called "Win/Loss." I made this change several years ago when I found that some "wins" could not cleanly be defined as wins, and the same was true of "losses."

    Is a deal that closed six months after the forecast close date for half the expected revenue a “win?” At the very least, it doesn’t rightly fit in the same category as a customer order that met every forecast target. Similarly, should a customer purchase be classified as a win when the victory is Pyrrhic? These are just a few of the nuances that are lost at the outset when the Win/Loss label is applied.

    As an antidote, I recommend that my clients use the terms "after-event review" or "post-action review," so that the discussion can be more open and honest about what went right and what went wrong. In wins and losses, there's normally both.

    Here’s the set of questions I use with my projects, which I wrote about in a 2010 blog, “Sweep Your Sales Problems Under the Rug. Your Competitors Will Love You For It.”

    Every After-Event Review must ask and answer six questions:

    1. What was the intended outcome?
    2. What actually happened?
    3. What was learned?
    4. What do we do now?
    5. Who should we tell?
    6. How do we tell them?

    When these questions can be answered freely and objectively, the right information and insights will be provided to the people within the company who need to know.

    (By the way, this set of questions came from a colleague who is a military veteran. He used them in the US Army for organizational learning. Poignantly, the reasons for asking these questions in a military setting are far more dire than anything you or I deal with. For this reason, one overarching expectation is that nothing is off the table, and nobody can pull rank. So if someone believes the commanding officer committed an error, that's part of the discussion.)

  2. Thanks Andy, I'm glad you brought up the military after-action reviews. What's really interesting about them is the focus on learning and improvement rather than assigning blame. Too often, win/loss reviews are conducted to determine what the salesperson did wrong and to assign blame.

    I was also trying to address a broader perspective. Too often, we look at things on a situation-by-situation, deal-by-deal basis, but fail to step back and look at patterns across all the situations. The situation I described in the post could not have been identified, at least not its real root causes, without looking at hundreds or thousands of deals and noticing patterns across various situations.

    Most companies are simply ignoring this.

