A weekly competitor review that produces no decisions is worse than no review at all: it consumes thirty minutes of PMM time and thirty minutes of sales-leader time, and it creates the ambient expectation that competitor intelligence is being converted into action when it isn't. The version below produces decisions. Same cadence, same duration; different shape. The shape is what makes the review worth holding.
The prerequisite: work has to happen before the meeting
The meeting is not where the monitoring work gets done. The monitoring work — scanning pricing pages, reading careers pages, reviewing CEO LinkedIn activity — happens during the week, produces the signal log, and is synthesized into four artifacts circulated 24 hours before the meeting. Meetings that attempt the monitoring work in real time fail; meetings where finished work is presented and decisions are made succeed.
The four artifacts, circulated 24 hours before
The 30-minute agenda
The attendee list
Three to four people. More than four and the meeting dilutes into broadcast. The fixed attendees: a PMM (owner), a sales leader or senior AE, and one of CS or product (rotating by week, based on which function is most relevant). Optionally the CMO for weeks with Tier 1–3 escalations.
Not invited: the whole sales team, the marketing team, product managers below director level, executives below the CMO. The meeting's outputs route to those teams; the meeting itself stays tight.
The four disciplines that separate structure from status-update
What to do when the meeting stops producing decisions
The failure mode is usually one of three. Watching for each, and correcting early, is what keeps the review from degrading into a status update that wastes everyone's time.
Failure 1: The signal log is consistently empty or thin. If the PMM arrives with 0–2 signals most weeks, the monitoring work isn't producing enough volume. The fix is upstream — more time allocated to monitoring, better tier-1 competitor coverage, or better use of tooling. The review can't compensate for a weak signal pipeline.
Failure 2: The decisions don't ship as promised. The meeting produces a Respond-level decision; the battle card doesn't update within two weeks. After three or four missed commitments, attendees stop investing in the meeting because the decisions are theater. The fix is executive air cover: the CMO has to hold attendees to their commitments, or the commitments become suggestions.
Failure 3: The same signal appears in four consecutive weeks. The team has repeatedly deferred the decision. This usually indicates a political or organizational issue, not a monitoring issue. The fix is naming the deferral explicitly: "We've had this signal on four agendas. Either we Respond this week or we Ignore it. Which?" Forcing the choice ends the drift.
The meeting note
A one-page note goes to a named distribution list within two hours of the meeting. The note has four sections: decisions made (with owner and date), signals logged as Ignore, Monitor items extended (with new review date), and next-meeting agenda items.
Distribution: the meeting attendees, the CMO, and any team leader whose domain was affected by a decision (sales enablement for battle-card updates, content for blog-response decisions, product for roadmap-pressure signals). Not the whole company — wider distribution reduces candor.
The note, circulated weekly, builds the institutional memory that makes the monitoring program durable. Six months in, the note archive is the most useful competitive-intelligence artifact the company has. Without the note, each week's meeting is a standalone event; with the note, the weekly meetings compound into a running record of how the competitive landscape shifted and what the company decided in response. That compounding is the entire point.