When MGM Resorts got hit with a devastating social engineering attack in September 2023, it wasn't a firewall failure. It wasn't a zero-day exploit. A threat actor called the help desk, impersonated an employee, and walked right through the front door. The estimated cost? Over $100 million. And here's the uncomfortable truth — most organizations have no idea whether their people would fall for the same trick, because they aren't tracking the right security awareness metrics.
I've spent years helping organizations build security awareness programs. The single biggest gap I see isn't the training itself — it's the measurement. Leaders pour budget into awareness initiatives, then can't answer the CEO's simplest question: "Is it working?" This post gives you the exact metrics that answer that question with data, not gut feelings.
Why Most Security Awareness Programs Can't Prove Their Value
Here's what actually happens in most organizations. Someone buys a training platform, rolls out an annual compliance module, checks the box, and moves on. When the board asks about human risk, the CISO shows a completion rate — 94% of employees finished the training — and everyone nods.
That number is almost meaningless. Completion rates tell you who clicked "next" enough times. They tell you nothing about behavior change, risk reduction, or whether your $200,000 investment moved any needle that matters.
The 2023 Verizon Data Breach Investigations Report found that 74% of all breaches involved a human element, including social engineering, errors, and misuse. That number has hovered in the same range for years. If completion rates were a meaningful security awareness metric, that number would be dropping. It isn't.
The problem isn't awareness training itself. It's that organizations measure activity instead of outcomes. Let's fix that.
The Security Awareness Metrics That Actually Matter
Effective measurement requires layering leading indicators (behavior signals) with lagging indicators (incident data). Here are the metrics I track with every program I advise on, broken into categories.
Phishing Simulation Performance
This is the most direct behavioral measure you have. Run regular phishing simulations and track three numbers:
- Click-through rate: The percentage of employees who click a simulated phishing link. Industry benchmarks suggest initial click rates often land between 10% and 20% before training. Your goal is sustained single digits.
- Report rate: The percentage of employees who use the phishing report button to flag the simulation. This is more important than click rate. A high report rate means your people are actively defending the organization, not just avoiding mistakes.
- Time-to-report: How quickly do the first reports come in? A fast median time-to-report means your human detection layer is working in near-real-time, which complements your technical controls.
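These three numbers fall straight out of raw simulation logs. A minimal sketch, assuming each log entry records whether the recipient clicked, whether they reported, and how many minutes passed before their report (the field names are illustrative, not any particular platform's export format):

```python
from dataclasses import dataclass
from statistics import median
from typing import Optional

@dataclass
class SimEvent:
    clicked: bool
    reported: bool
    minutes_to_report: Optional[float]  # None if the user never reported

def simulation_metrics(events):
    """Return (click rate %, report rate %, median time-to-report in minutes)."""
    n = len(events)
    click_rate = 100 * sum(e.clicked for e in events) / n
    report_rate = 100 * sum(e.reported for e in events) / n
    times = [e.minutes_to_report for e in events if e.reported]
    median_ttr = median(times) if times else None
    return click_rate, report_rate, median_ttr
```

Median time-to-report is deliberately used instead of the mean, since one employee reporting a day later would otherwise swamp the signal from your fast responders.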
Track these monthly, not quarterly. Quarterly simulations give threat actors a 90-day window of unmeasured risk. If you need a platform to run simulations at scale, our phishing awareness training for organizations is built specifically for this kind of continuous measurement.
Training Engagement Depth
Move beyond completion rates. Measure how deeply people engage:
- Assessment scores (pre vs. post): A 15-point average improvement post-training is a solid benchmark. Anything under 5 points suggests the content isn't landing.
- Knowledge retention at 30/60/90 days: If you only test immediately after training, you're measuring short-term memory. Send micro-assessments at intervals to track actual retention.
- Repeat failure rate: What percentage of employees who failed a phishing simulation fail again within 60 days? This is your most honest metric. A persistently high repeat failure rate among a specific group tells you exactly where to focus remediation.
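Repeat failure rate is easy to get wrong if you just count total failures. A sketch that counts an employee as a repeater only when two failures land within the window, assuming you can export per-employee failure dates from your simulation platform:

```python
from datetime import date, timedelta

def repeat_failure_rate(failure_dates, window_days=60):
    """failure_dates: dict mapping employee id -> list of simulation
    failure dates. Returns the percentage of employees who failed at
    least once and then failed again within window_days."""
    window = timedelta(days=window_days)
    per_employee = [sorted(d) for d in failure_dates.values() if d]
    if not per_employee:
        return 0.0
    repeaters = sum(
        any(later - earlier <= window
            for earlier, later in zip(dates, dates[1:]))
        for dates in per_employee
    )
    return 100 * repeaters / len(per_employee)
```

Note the denominator: it is employees who failed at least once, not the whole workforce, which keeps the metric honest as your overall click rate drops.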
Real-World Incident Indicators
The lagging indicators connect your program to actual security outcomes:
- Credential theft incidents: Track the number of confirmed credential compromises per quarter. A mature awareness program should correlate with a downward trend over 12-18 months.
- Reported suspicious emails (real, not simulated): A rising number of organic reports from employees is one of the strongest signals your security culture is improving. I've seen organizations go from 20 reports a month to over 300 within a year of launching continuous training.
- Incidents caused by human error: Track misconfigured systems, accidental data exposure, and policy violations. This connects awareness to operational risk.
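Whether these lagging indicators are actually trending the right way can be checked with a simple least-squares slope over equally spaced reporting periods. A sketch, assuming one data point per month or quarter:

```python
def trend_slope(series):
    """Least-squares slope of a metric across equally spaced periods.
    A negative slope on credential-theft incidents, or a positive slope
    on organic suspicious-email reports, is the direction you want."""
    n = len(series)
    mean_x = (n - 1) / 2
    mean_y = sum(series) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(series))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den
```

A slope is crude but defensible in a board deck; it answers "up or down, and how fast" without requiring anyone to trust a single quarter's noise.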
What Are the Most Important Security Awareness Metrics?
The most important security awareness metrics are phishing simulation report rate, repeat failure rate, and real suspicious email reports. These three numbers, tracked together, tell you whether employees are recognizing threats, whether training is changing behavior long-term, and whether your organization's overall security culture is improving. Click-through rates and completion rates are secondary — behavior change is what reduces breach risk.
Building a Security Awareness Metrics Dashboard
Data without structure is just noise. Here's how I build a dashboard that a CISO can present to the board in under five minutes.
Tier 1: Executive Summary (3 Numbers)
Executives don't want twelve charts. Give them three:
- Human Risk Score: A composite of phishing click rate, report rate, and repeat failure rate, normalized to a 0-100 scale. Higher is better. Update monthly.
- Year-over-year incident reduction: Total security incidents attributed to human factors, compared to the same quarter last year.
- Training coverage: Percentage of workforce that has completed current-quarter training and participated in at least one simulation. This replaces the old "completion rate" with something more rigorous.
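One way to compute the composite described above. The weights, and the inversion of the two "lower is better" rates, are illustrative assumptions rather than a standard formula; tune them to your own risk appetite:

```python
def human_risk_score(click_rate, report_rate, repeat_failure_rate,
                     weights=(0.40, 0.35, 0.25)):
    """Composite 0-100 score; higher is better. All inputs are
    percentages. Click rate and repeat failure rate are inverted so
    that lower risk scores higher. Weights are illustrative."""
    w_click, w_report, w_repeat = weights
    score = (w_click * (100 - click_rate)
             + w_report * report_rate
             + w_repeat * (100 - repeat_failure_rate))
    return round(score, 1)
```

Whatever weights you choose, freeze them: a composite that is re-weighted every quarter tells executives nothing about trend.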
Tier 2: Operational Detail (For Security Teams)
Your SOC and security team need granular data:
- Click rates broken down by department, role, and seniority level
- Simulation difficulty progression — are you increasing complexity over time?
- Correlation between training completion and incident involvement (do trained employees generate fewer tickets?)
- Mean time to detect social engineering attempts reported by users vs. technical controls
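The departmental breakdown comes from the same simulation events as the top-line rates. A sketch using plain dictionaries, assuming each event carries the grouping fields; in practice a pandas groupby does the same job at scale:

```python
from collections import defaultdict

def click_rate_by_group(events, key):
    """events: iterable of dicts such as {"dept": "Finance", "clicked": True}.
    key: the field to group by ("dept", "role", "seniority").
    Returns {group: click-through percentage}."""
    clicks, totals = defaultdict(int), defaultdict(int)
    for e in events:
        totals[e[key]] += 1
        clicks[e[key]] += e["clicked"]
    return {g: 100 * clicks[g] / totals[g] for g in totals}
```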
Tier 3: Trend Analysis (Quarterly Review)
Plot your key metrics on a 12-month rolling basis. You're looking for:
- Sustained improvement, not spikes after training that decay quickly
- Departments or locations that diverge from the organizational trend
- Seasonal patterns (phishing click rates often spike after holidays)
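A trailing rolling average is what makes sustained improvement distinguishable from post-training spikes that decay. A minimal sketch; for histories shorter than the window it simply averages whatever data exists:

```python
def rolling_mean(series, window=12):
    """Trailing rolling average over the last `window` periods.
    Smooths single-month noise so the 12-month trend is visible."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```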
If you want foundational training content to underpin these metrics, our cybersecurity awareness training program covers the core topics — from credential theft to ransomware — that every employee needs.
The $4.88M Benchmark You Can't Ignore
IBM's 2023 Cost of a Data Breach Report pegged the global average cost of a breach at $4.45 million. In the United States, it hit $9.48 million. But here's the number that matters for awareness metrics: organizations with high levels of security skills shortage (which includes undertrained employees) paid an average of $4.88 million — $760,000 more than those with low skills shortages.
That delta is your ROI argument. If your security awareness metrics can demonstrate a measurable reduction in human-caused incidents and a measurable improvement in detection speed through employee reporting, you have a financial case that directly maps to breach cost reduction.
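To turn that delta into a single number, a simple expected-value calculation works: estimated annual savings from reduced breach likelihood, minus program spend. Every input here is an assumption to replace with your own incident data and benchmarks, not a prediction:

```python
def awareness_program_roi(avg_breach_cost, breach_prob_reduction,
                          program_cost):
    """Expected-value sketch of program ROI in dollars.
    avg_breach_cost: benchmark breach cost (e.g. IBM's figures).
    breach_prob_reduction: your estimated drop in annual breach
    probability attributable to the program (an assumption).
    program_cost: annual program spend."""
    expected_savings = avg_breach_cost * breach_prob_reduction
    return expected_savings - program_cost
```

Even a conservative probability reduction usually clears the bar; the point of the model is to force the CFO conversation onto explicit, challengeable inputs.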
Frame it like this for your CFO: "Our phishing report rate increased from 12% to 47% over 12 months. Employees now flag credential theft attempts an average of 4 minutes after delivery. Based on industry benchmarks, this detection speed reduces incident cost by an estimated 30%."
That's a conversation executives understand.
How to Start Tracking Security Awareness Metrics This Quarter
You don't need a six-figure analytics platform to start. Here's a practical 90-day plan.
Month 1: Baseline
- Run an unannounced phishing simulation across the entire organization. Use a moderate-difficulty template — something that mimics a credential theft attempt, not a Nigerian prince email.
- Record click rate, report rate, and time-to-report.
- Deploy a baseline knowledge assessment covering social engineering, multi-factor authentication, ransomware identification, and zero trust concepts.
Month 2: Train and Simulate
- Roll out targeted training. Focus on the weakest topics from your baseline assessment.
- Run a second simulation with a different template and slightly higher difficulty.
- Compare Month 1 and Month 2 results. Look for early movement.
Month 3: Measure and Report
- Compile your first dashboard. Even a spreadsheet works at this stage.
- Calculate your initial Human Risk Score.
- Present to leadership with a 12-month improvement target.
- Identify your highest-risk departments for focused intervention in Q4.
This isn't a theoretical framework. I've seen organizations cut their phishing click rates by 60% in six months using exactly this cadence. The key is consistency — monthly simulations, monthly measurement, monthly visibility.
The Metrics Trap: What to Stop Measuring
Some metrics actively mislead. Drop these from your reporting:
- Training completion rate as a standalone KPI: It measures compliance, not competence. Keep it as a coverage metric, but never let it headline your dashboard.
- Total training hours: More hours doesn't mean better outcomes. A 5-minute microlearning module that changes behavior beats a 60-minute lecture that employees click through while checking email.
- Number of policies published: Policies matter, but publishing them doesn't equal reading them, and reading them doesn't equal following them.
Every metric on your dashboard should answer one question: "Is our workforce getting better at recognizing and responding to threats?" If a metric doesn't contribute to that answer, remove it.
Connecting Metrics to a Zero Trust Framework
If your organization is moving toward a zero trust architecture — and in 2023, you should be — your security awareness metrics feed directly into the "never trust, always verify" model. Zero trust assumes breach. It assumes insiders can be compromised. Your awareness metrics measure exactly how likely that compromise is and how fast your people can detect it.
CISA's Zero Trust Maturity Model explicitly calls out workforce identity and behavior as foundational pillars. Your phishing simulation data, incident reporting rates, and credential hygiene metrics provide evidence for maturity assessments across those pillars.
The NIST Cybersecurity Framework similarly emphasizes awareness and training under its Protect function. Your metrics map directly to PR.AT (Awareness and Training) subcategories, giving you audit-ready documentation.
The Metric That Predicts Your Next Breach
If I could only track one security awareness metric, it would be repeat phishing failure rate. Employees who click a simulated phish, receive training, and click again within 60 days represent your most concentrated human risk. In my experience, this group is typically 3-5% of the workforce — and they're disproportionately targeted by real threat actors who profile organizations through LinkedIn, social media, and prior breaches.
Identify this group. Give them one-on-one coaching, not another video. Restrict their access privileges if the pattern continues. This is where security awareness intersects with access management, and it's where your metrics create the most immediate risk reduction.
Track what matters. Measure behavior, not busywork. And use the data to build a security culture where every employee is part of the defense — not just the IT department. The organizations that take security awareness metrics seriously are the ones that stop making headlines for the wrong reasons.
For additional data on the threat landscape driving these metrics, the FBI IC3 2022 Internet Crime Report documents over $10.3 billion in reported losses — a number that underscores exactly why measuring your human risk isn't optional anymore.