Win/loss programs fail the same way: the interviews happen, the notes go into a folder, the folder doesn't get read, and the next deal runs as if the last one never closed. The instrument below is designed for the opposite — a fifteen-minute structured interview per deal, a fixed tagging taxonomy, and one sheet that turns the qualitative into patterns the team can act on.
The template covers the minimum viable loop. If you're running a dedicated win/loss program, you need more depth. If you're running nothing today, this is enough to start.
The five questions
Every interview, same order, no deviation. Keep the questions fixed because the whole value is in comparing answers across deals; reword a question and the comparison is gone.
The interview protocol
Run each interview in fifteen minutes. Record it (with permission). Put a second person on the call to take notes; the AE who lost the deal should not be the interviewer.
The tagging taxonomy
Every deal gets tagged against the same five categories. Consistent tags are the difference between a folder of interviews and a data set.
Five tags per deal, one coachable moment. Ten minutes of work after the interview, no longer.
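If the team keeps the sheet in a script rather than a spreadsheet, the post-interview record can be as small as the sketch below. The field names and example values are assumptions for illustration; the template doesn't prescribe a storage format, and only a few of the category names appear in this article.

```python
from dataclasses import dataclass


@dataclass
class DealReview:
    """One row of win/loss data: exactly five tags plus one coachable moment."""
    deal_id: str
    tags: dict[str, str]  # category -> value; the taxonomy fixes five categories
    coachable_moment: str

    def __post_init__(self):
        # Enforce the "five tags per deal" rule so the data set stays comparable.
        if len(self.tags) != 5:
            raise ValueError(f"expected 5 tags, got {len(self.tags)}")


# Hypothetical example; category names here are illustrative, not the template's.
review = DealReview(
    deal_id="Q3-017",
    tags={
        "primary_reason": "competitor displacement",
        "competitor": "Acme",
        "positioning_delta": "security story landed flat",
        "product_gap": "SSO",
        "deal_stage_lost": "technical evaluation",
    },
    coachable_moment="Demo led with features, not the buyer's problem.",
)
```

The validation is the point: a record that refuses a fourth or sixth tag keeps the taxonomy from drifting deal by deal.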
The one sheet
One row per deal, one column per tag. Weekly, the PMM sorts by primary reason and surfaces the three patterns most common in the last twelve deals. Monthly, the same view goes to sales leadership. Quarterly, it feeds the positioning audit.
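The weekly pass is a sort and a tally, so it fits in a few lines of Python. This is a sketch under assumptions: each row is a dict with a `primary_reason` field, and rows are kept in deal order; the field name is a placeholder.

```python
from collections import Counter


def top_patterns(rows, window=12, n=3):
    """Return the n most common primary reasons across the last `window` deals."""
    recent = rows[-window:]  # most recent deals, assuming rows are in deal order
    counts = Counter(r["primary_reason"] for r in recent)
    return counts.most_common(n)


# Hypothetical rows; in practice these come straight off the one sheet.
rows = (
    [{"primary_reason": "competitor displacement"}] * 5
    + [{"primary_reason": "product gap"}] * 4
    + [{"primary_reason": "positioning delta"}] * 3
)
print(top_patterns(rows))
# [('competitor displacement', 5), ('product gap', 4), ('positioning delta', 3)]
```

The twelve-deal window matters more than the code: it keeps the patterns recent enough to act on while smoothing out any single anomalous loss.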
What to cut from the template
There's a version of this instrument with forty fields and a weighted scoring rubric. Skip it. The failure mode of elaborate win/loss templates is the AE refuses to fill them out and the program dies. Five questions, five tags, one coachable moment. If a deal produces more information than fits, write a separate memo — don't expand the template.
The discipline is the loop, not the instrument. A team running this exact template every week, talking about the patterns every month, will learn more in a quarter than a team running a twenty-page deep-dive on three deals will learn in a year.
How this feeds other capabilities
The win/loss sheet is an input to three other places: the battle cards (every "competitor displacement" tag updates the named competitor's card), the positioning audit (every "positioning delta" tag scores against message consistency), and the roadmap (every "product gap" tag with the same feature named twice gets a product-review slot). Stratridge's Win/Loss Review runs this loop automatically against your CRM; the template above is the manual version, and the one worth starting with.
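The three hand-offs above can be sketched as one small routing function. The tag names follow the article ("competitor displacement", "positioning delta", "product gap"); everything else, field names included, is an assumption about how the sheet is stored.

```python
from collections import Counter


def route_tags(reviews):
    """Route tagged deals to battle cards, the positioning audit, and roadmap review."""
    battle_card_updates = []  # competitor cards to refresh
    audit_items = []          # positioning deltas to score for message consistency
    gap_counts = Counter()    # feature names from "product gap" tags

    for r in reviews:
        if r.get("primary_reason") == "competitor displacement":
            battle_card_updates.append(r["competitor"])
        if r.get("positioning_delta"):
            audit_items.append(r["positioning_delta"])
        if r.get("product_gap"):
            gap_counts[r["product_gap"]] += 1

    # A feature named on two or more deals earns a product-review slot.
    review_slots = [feat for feat, n in gap_counts.items() if n >= 2]
    return battle_card_updates, audit_items, review_slots


# Hypothetical reviews for illustration.
reviews = [
    {"primary_reason": "competitor displacement", "competitor": "Acme",
     "product_gap": "SSO"},
    {"primary_reason": "product gap", "product_gap": "SSO",
     "positioning_delta": "pricing story unclear"},
    {"primary_reason": "product gap", "product_gap": "audit logs"},
]
cards, audit, slots = route_tags(reviews)
print(cards, audit, slots)  # ['Acme'] ['pricing story unclear'] ['SSO']
```

Note the threshold on product gaps: a feature mentioned once is an anecdote, twice is a pattern worth a review slot, which is exactly the anecdote-versus-data-set distinction the taxonomy exists to enforce.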