Most compliance dashboards measure activity. Number of controls. Number of tickets closed. Number of evidence items collected. These are useful for an operations review. They are nearly useless for answering the question an executive, a board, or a regulator is actually asking, which is: 'Is the program working?' This article is a practical framework for choosing metrics that describe posture instead of activity, with specific examples for program health, control operating effectiveness, evidence quality, audit readiness, and risk exposure — and an honest list of the vanity metrics to stop reporting.
Activity metrics vs. posture metrics
An activity metric counts what the program did. A posture metric describes the state of the program. 'Number of access reviews completed this quarter' is an activity metric. 'Percentage of in-scope systems with a current access review in place' is a posture metric. The first tells you the team was busy. The second tells you whether the program is in the state it is supposed to be in. Most dashboards skew heavily toward activity because activity is easy to count. Posture metrics require a canonical model of 'what the program should look like right now' — which is exactly what a compliance program is supposed to maintain anyway. If your system cannot produce posture metrics, that is itself a diagnostic signal.
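To make the distinction concrete, here is a minimal sketch in Python of both metrics computed from the same inventory. The `System` record, its field names, and the ninety-day cadence are illustrative assumptions, not a real schema; the point is that the activity metric counts events, while the posture metric compares the inventory against the state it is supposed to be in.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical inventory record; the field names are illustrative, not a real schema.
@dataclass
class System:
    name: str
    in_scope: bool
    last_access_review: date | None  # None if the system has never been reviewed

def reviews_completed(systems: list[System], since: date) -> int:
    """Activity metric: how many access reviews happened in the period."""
    return sum(1 for s in systems
               if s.last_access_review is not None and s.last_access_review >= since)

def access_review_posture(systems: list[System], cadence_days: int = 90,
                          today: date | None = None) -> float:
    """Posture metric: fraction of in-scope systems with a current access review."""
    today = today or date.today()
    scoped = [s for s in systems if s.in_scope]
    if not scoped:
        return 1.0  # vacuously compliant; an empty scope is its own warning sign
    current = sum(1 for s in scoped
                  if s.last_access_review is not None
                  and (today - s.last_access_review).days <= cadence_days)
    return current / len(scoped)
```

Note that the posture function needs the full in-scope inventory, not just the completed reviews; that is the canonical-model requirement described above, in miniature.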
A five-category framework for compliance metrics
A useful dashboard covers five categories:
- Program health: does every control have a current owner and a current definition, and has that owner reviewed the control in the last ninety days?
- Control operating effectiveness: for each control, how many of the expected evidence events occurred on time, how many were late, and how many were missed?
- Evidence quality: what fraction of evidence meets the four audit-grade properties (complete, contemporaneous, attributable, tamper-evident)?
- Audit readiness: what fraction of your in-scope population has evidence that would survive a sample today?
- Risk exposure: what is the current state of your risk register, how many high risks are accepted vs. treated, and what is the trend?

These five categories cover almost everything an executive, a board, or an auditor cares about.
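The operating-effectiveness category, in particular, reduces to a simple tally over expected evidence events. A minimal sketch follows, assuming each control's cadence has already been expanded into dated expectations; the `EvidenceEvent` record and the five-day grace window are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical expected-evidence record; the field names are illustrative.
@dataclass
class EvidenceEvent:
    due: date               # when the control's cadence expected evidence
    collected: date | None  # None if the event never produced evidence

def operating_effectiveness(events: list[EvidenceEvent],
                            grace_days: int = 5) -> dict[str, int]:
    """Tally expected evidence events as on-time, late, or missed."""
    tally = {"on_time": 0, "late": 0, "missed": 0}
    for e in events:
        if e.collected is None:
            tally["missed"] += 1
        elif e.collected <= e.due + timedelta(days=grace_days):
            tally["on_time"] += 1
        else:
            tally["late"] += 1
    return tally
```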
Specific metrics to adopt
A non-exhaustive list of metrics that actually mean something:
- Control coverage: percentage of canonicalized requirements with at least one control mapped to them. Target: 100%. Anything less is unmapped scope.
- Control currency: percentage of controls whose definition was reviewed in the last review cycle (quarterly or annually, depending on the program). Target: 100% by the end of each cycle.
- Evidence freshness: median and 95th-percentile age of the most recent evidence for each control, measured against the control's expected cadence. Target: median within cadence; tail within cadence plus one reporting period.
- Sample-ready percentage: fraction of controls for which a randomly selected sample from the current population would have compliant evidence available in under an hour. Target: 95% at any point in the observation period.
- Finding velocity: average time from a finding being opened to being closed, trended over quarters. Remediation discipline is a leading indicator of audit outcomes.
- Risk acceptance rate: fraction of identified risks that are accepted rather than treated, trended over time. Rising acceptance is almost always a signal of under-investment, not of newfound risk appetite.
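Two of these metrics are simple enough to sketch directly. The following assumes a canonical set of requirement IDs, a mapping from each control to the requirements it satisfies, and a per-control date of most recent evidence; all of the shapes are illustrative, not a real schema.

```python
from datetime import date
from statistics import median, quantiles

def control_coverage(requirements: set[str],
                     control_map: dict[str, set[str]]) -> float:
    """Control coverage: fraction of canonicalized requirements with at least
    one control mapped to them."""
    if not requirements:
        return 1.0
    mapped = set().union(*control_map.values()) if control_map else set()
    return len(requirements & mapped) / len(requirements)

def evidence_freshness_days(latest_evidence: dict[str, date],
                            today: date | None = None) -> tuple[float, float]:
    """Evidence freshness: median and 95th-percentile age, in days, of each
    control's most recent evidence. Needs at least two controls for the tail."""
    today = today or date.today()
    ages = sorted((today - d).days for d in latest_evidence.values())
    p95 = quantiles(ages, n=20)[-1]  # last of 19 cut points is the 95th percentile
    return median(ages), p95
```

Reporting the 95th percentile alongside the median is deliberate: it keeps one badly stale control from hiding inside a healthy-looking average.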
The metrics to stop reporting
A short list of vanity metrics that should disappear from compliance dashboards:
- Raw control count. It is a proxy for complexity, not for strength. Large control counts almost never correlate with stronger posture; they usually correlate with duplication across frameworks.
- Evidence artifact count. 'We collected 1,200 evidence items this quarter' is an operations metric, not a posture metric.
- Ticket throughput. A high close rate on low-quality tickets is worse than a low close rate on high-quality ones.
- Dashboard greenness. A dashboard that is 100% green reflects either exceptional discipline or, far more often, thresholds set low enough that failures never register.

Any of these metrics, reported on its own without posture context, is at best noise and at worst actively misleading.
The executive summary dashboard
The dashboard you show the CEO or the board should fit on one page. It has five numbers, one per category: program health, control operating effectiveness, evidence quality, audit readiness, risk exposure. Each number has a trend arrow against the prior quarter. Each number has a one-line narrative explanation of its drivers. Below the summary, three or four open items, each with an owner and a close date. That is the whole page. Sophisticated executives read this and have a useful conversation about investment and accountability in under fifteen minutes. Unsophisticated dashboards produce forty slides and no decisions.
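One way to assemble that page, sketched in Python; the `CategoryScore` record and the plain-text rendering are illustrative assumptions, not a prescribed format. The substance is the shape: five scores, a prior-quarter comparison for the arrow, and a one-line driver each.

```python
from dataclasses import dataclass

# Illustrative one-page summary model; the names and fields are assumptions.
@dataclass
class CategoryScore:
    name: str       # one of the five categories
    current: float  # this quarter's posture score, 0.0-1.0
    prior: float    # prior quarter's score, for the trend arrow
    driver: str     # one-line narrative explanation of the movement

def render_summary(scores: list[CategoryScore]) -> str:
    """Render the five numbers, trend arrows, and one-line drivers as plain text."""
    rows = []
    for s in scores:
        arrow = "↑" if s.current > s.prior else "↓" if s.current < s.prior else "→"
        rows.append(f"{s.name:<32} {s.current:>4.0%} {arrow}  {s.driver}")
    return "\n".join(rows)

print(render_summary([
    CategoryScore("Program health", 0.97, 0.94,
                  "Two orphaned controls reassigned after an owner departure."),
    CategoryScore("Audit readiness", 0.91, 0.93,
                  "Evidence gaps in two newly in-scope systems; owners assigned."),
]))
```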
The operations dashboard
Separate from the executive dashboard is the operations dashboard, which the compliance team uses to run the program day-to-day. This dashboard is allowed to contain activity metrics — control-by-control drill-downs, ticket queues, upcoming cadences, evidence collection progress. It is the instrument panel for the team, not the external scorecard. Keep the two dashboards distinct. Confusing them is how activity metrics end up in board decks, and how activity metrics end up in board decks is how compliance programs lose executive attention.
Key takeaways
- Activity metrics count what you did. Posture metrics describe the state of the program. Posture metrics are what executives, boards, and auditors actually care about.
- A useful dashboard covers five categories: program health, control operating effectiveness, evidence quality, audit readiness, and risk exposure.
- Adopt metrics like control coverage, control currency, evidence freshness, sample-ready percentage, finding velocity, and risk acceptance rate.
- Stop reporting raw control count, evidence artifact count, ticket throughput, and dashboard greenness. They are vanity metrics.
- The executive dashboard is one page with five numbers and a short narrative. Keep it separate from the operations dashboard the team uses day-to-day.