In 2023, MGM Resorts lost an estimated $100 million after a threat actor social-engineered the company's IT help desk with a single phone call. The attackers didn't exploit a zero-day vulnerability. They exploited a person. That incident should make every security leader ask a blunt question: would my employees have fallen for the same trick? And more importantly — would I even know?
If you're running a security awareness program but can't prove it's working, you're essentially flying blind. Knowing how to measure security awareness training is the difference between a checkbox exercise and a program that actually reduces risk. This post gives you the specific metrics, tools, and benchmarks I use — and that I've seen work across organizations of every size.
Why "We Did the Training" Is Not a Metric
I've audited security programs where the only evidence of training effectiveness was a spreadsheet showing 94% completion. That number tells you almost nothing. Completion rates measure compliance, not competence. They tell you who clicked "Next" enough times, not who can spot a credential theft attempt in a real inbox.
The 2024 Verizon Data Breach Investigations Report found that 68% of breaches involved a human element — phishing, pretexting, misuse, or error. That number has barely budged in years. If your training were working, you'd expect to see behavioral change. If you're not measuring behavior, you can't claim progress.
So let's talk about what actually works.
The Metrics That Actually Answer "How to Measure Security Awareness Training"
There's no single number that captures training effectiveness. You need a dashboard of leading and lagging indicators. Here are the ones I track in every program I advise on.
1. Phishing Simulation Click Rates
This is the most direct behavioral metric you have. Run regular phishing simulations and track the percentage of employees who click malicious links, open attachments, or — worst of all — submit credentials on fake login pages.
According to the 2024 Verizon DBIR, the median time for a user to fall for a phishing email — click the link and enter their data — was less than 60 seconds. Your simulations need to capture that urgency. Track click rates monthly or quarterly. A meaningful program should show a downward trend over 6-12 months.
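As a minimal sketch, the click-rate calculation and trend check might look like this. The campaign numbers here are hypothetical, purely for illustration:

```python
def click_rate(clicked: int, delivered: int) -> float:
    """Percentage of delivered simulation emails that were clicked."""
    if delivered == 0:
        raise ValueError("no emails delivered")
    return 100.0 * clicked / delivered

# Hypothetical quarterly campaign results: (clicked, delivered)
campaigns = [(64, 200), (51, 210), (38, 205), (27, 198)]
rates = [click_rate(c, d) for c, d in campaigns]

# A healthy program trends downward quarter over quarter
trending_down = all(a > b for a, b in zip(rates, rates[1:]))
print([round(r, 1) for r in rates], "downward trend:", trending_down)
```

Plotting these per-quarter rates alongside your training calendar makes the cause-and-effect case far more convincing than a single point-in-time number.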
If you need a structured simulation program, the phishing awareness training for organizations at phishing.computersecurity.us provides scenario-based exercises designed for exactly this kind of measurement.
2. Reporting Rates
Click rates tell you who failed. Reporting rates tell you who fought back. This is the metric most programs ignore — and it's arguably more important than click rates.
When an employee spots a suspicious email and reports it through your phishing report button or to your SOC, that's a human sensor firing. Track the ratio of reports to simulations sent. A mature organization should see reporting rates climb into the 40-60% range over time.
If employees aren't reporting, they either don't recognize threats or don't know how to report them. Both are training failures.
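One sketch of the calculation, using hypothetical campaign figures. The "resilience ratio" (reports divided by clicks) is an assumed derived metric, not something mandated by any standard, but it compactly shows whether defenders outnumber victims:

```python
def reporting_rate(reports: int, delivered: int) -> float:
    """Share of simulation recipients who reported the email (%)."""
    return 100.0 * reports / delivered if delivered else 0.0

# Hypothetical campaign: 500 delivered, 85 reported, 110 clicked
delivered, reported, clicked = 500, 85, 110
rate = reporting_rate(reported, delivered)

# Resilience ratio > 1 means more people reported than clicked
resilience = reported / clicked
print(f"reporting rate {rate:.1f}%, resilience ratio {resilience:.2f}")
```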
3. Time to Report
Speed matters. A phishing email reported in 2 minutes gives your incident response team a fighting chance. One reported after 4 hours? The damage may already be done. Measure the median time between simulation delivery and first employee report. Track this alongside your click rate — you want click rates falling and reporting speed increasing.
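A quick sketch of the median-time calculation, with made-up report timestamps. The median is the right statistic here because one slow outlier report shouldn't mask an otherwise fast-responding workforce:

```python
from datetime import datetime, timedelta
from statistics import median

def median_minutes_to_report(sent_at: datetime, report_times: list) -> float:
    """Median minutes between simulation delivery and each report."""
    deltas = [(t - sent_at).total_seconds() / 60 for t in report_times]
    return median(deltas)

sent = datetime(2025, 3, 3, 9, 0)
# Hypothetical report timestamps from five employees
reports = [sent + timedelta(minutes=m) for m in (2, 5, 11, 40, 240)]
print(f"median time to report: {median_minutes_to_report(sent, reports):.0f} min")
```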
4. Repeat Offender Rate
Some employees click every simulation. I call them your "frequent flyers." Track repeat clickers as a percentage of total employees. These individuals represent your highest human risk and need targeted intervention — not just another generic training module.
A well-run program should reduce repeat offenders by 50% or more within two simulation cycles. If it doesn't, your training content isn't landing with that audience.
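Identifying repeat clickers is a simple tally across simulation cycles. A sketch, with hypothetical employee IDs and headcount:

```python
from collections import Counter

# Hypothetical click logs: employee IDs who clicked, per simulation cycle
cycle_clicks = [
    {"ava", "ben", "cal", "dee"},   # cycle 1
    {"ben", "dee", "eli"},          # cycle 2
]
headcount = 50

clicks = Counter()
for cycle in cycle_clicks:
    clicks.update(cycle)

# Anyone who clicked in two or more cycles is a repeat offender
repeaters = {emp for emp, n in clicks.items() if n >= 2}
repeat_rate = 100.0 * len(repeaters) / headcount
print(f"repeat offenders: {sorted(repeaters)} ({repeat_rate:.1f}% of staff)")
```

Feeding this list into a targeted-intervention workflow, rather than re-enrolling the whole company, is where the metric earns its keep.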
5. Knowledge Assessment Scores
Pre- and post-training quizzes give you a snapshot of knowledge retention. They're not sufficient on their own — someone can ace a quiz and still click a phishing link — but they help you identify specific knowledge gaps. Are employees confused about multi-factor authentication? Do they understand what social engineering looks like beyond email?
Use assessments strategically. Baseline before training, reassess 30 and 90 days after. Look for sustained retention, not just post-training spikes. Our cybersecurity awareness training course at computersecurity.us includes assessment components designed to measure exactly these gaps.
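The baseline / 30-day / 90-day comparison can be reduced to three deltas. This sketch uses invented scores and an assumed rule of thumb (flag a cohort if more than a third of the initial gain has faded by day 90):

```python
# Hypothetical assessment scores (%): baseline, 30-day, 90-day
scores = {"baseline": 58, "day30": 86, "day90": 81}

post_gain = scores["day30"] - scores["baseline"]   # immediate lift
retention = scores["day90"] - scores["baseline"]   # sustained lift
decay = scores["day30"] - scores["day90"]          # fade-out

# Assumed threshold: refresher needed if over a third of the gain is lost
needs_refresher = decay > post_gain / 3
print(f"gain {post_gain}, retained {retention}, decay {decay}, "
      f"refresher: {needs_refresher}")
```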
6. Real Incident Data
This is the lagging indicator that matters most. Track the number of actual security incidents caused by human error — clicked real phishing emails, credential compromises, business email compromise losses, ransomware infections from user actions. Compare year-over-year or quarter-over-quarter.
If your awareness program is effective, you should see a measurable reduction in human-caused incidents. If you don't have this data, work with your SOC or MSSP to start categorizing incidents by root cause.
Building a Measurement Framework That Holds Up
Individual metrics are useful. A framework that connects them to business risk is powerful. Here's the structure I recommend.
Step 1: Establish Your Baseline
Before you change anything, measure where you are. Run an unannounced phishing simulation. Conduct a knowledge assessment. Pull your incident data for the past 12 months. Document everything. This is your "before" snapshot.
Too many organizations skip this step and then can't demonstrate improvement to leadership. You need the baseline to prove ROI.
Step 2: Define Your KPIs and Targets
Pick 4-6 metrics from the list above. Assign specific targets. For example:
- Reduce phishing simulation click rate from 32% to under 15% within 12 months
- Increase phishing email reporting rate from 12% to 40% within 12 months
- Reduce repeat offender rate by 50% within 6 months
- Achieve 80%+ score on post-training knowledge assessments
- Reduce human-caused security incidents by 25% year-over-year
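Targets like these are easier to report on if each KPI carries its baseline, current value, and target together. A sketch of one way to track progress, with hypothetical numbers and an assumed "at least halfway there" on-track rule:

```python
from dataclasses import dataclass

@dataclass
class Kpi:
    name: str
    baseline: float
    current: float
    target: float
    lower_is_better: bool = True

    def on_track(self) -> bool:
        """On track = at least halfway from baseline to target."""
        total = self.baseline - self.target
        progress = self.baseline - self.current
        if not self.lower_is_better:
            total, progress = -total, -progress
        return total != 0 and progress / total >= 0.5

kpis = [
    Kpi("click rate %", baseline=32, current=21, target=15),
    Kpi("reporting rate %", baseline=12, current=22, target=40,
        lower_is_better=False),
]
for k in kpis:
    print(k.name, "on track" if k.on_track() else "behind")
```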
These targets should be realistic but ambitious. Benchmark against your industry if possible — the CISA cybersecurity best practices resources provide useful reference points.
Step 3: Measure Continuously, Not Annually
Annual training paired with a single yearly phishing test is a compliance exercise, not a security program. Run phishing simulations at least monthly, varying templates and attack vectors. Rotate through credential harvesting, malware attachment, SMS phishing, and voice-based social engineering scenarios.
Continuous measurement smooths out anomalies and gives you trend data. It also keeps security awareness top-of-mind for employees — which is the entire point.
Step 4: Segment Your Data
Organization-wide averages hide your real risk. Break your metrics down by department, role, location, and seniority. In my experience, finance and HR teams get targeted disproportionately by business email compromise schemes. Executive assistants are prime targets for whaling attacks.
When you segment, you can target training where it matters most. A blanket approach wastes resources on low-risk groups and under-serves high-risk ones.
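A sketch of the segmentation itself, grouping hypothetical per-employee simulation results by department to surface the highest-risk cohort:

```python
from collections import defaultdict

# Hypothetical per-employee results: (department, clicked?)
results = [
    ("finance", True), ("finance", True), ("finance", False),
    ("hr", True), ("hr", False),
    ("engineering", False), ("engineering", False), ("engineering", True),
]

by_dept = defaultdict(lambda: [0, 0])  # dept -> [clicks, recipients]
for dept, clicked in results:
    by_dept[dept][0] += clicked
    by_dept[dept][1] += 1

segment_rates = {d: 100.0 * c / n for d, (c, n) in by_dept.items()}
# Highest-risk segment gets targeted training first
riskiest = max(segment_rates, key=segment_rates.get)
print(segment_rates, "-> prioritize", riskiest)
```

The same grouping works for role, location, or seniority; the key is that the intervention decision falls out of the data rather than a hunch.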
Step 5: Report to Leadership in Business Terms
Your CISO cares about risk reduction. Your CFO cares about cost avoidance. Your board cares about liability. Translate your metrics accordingly.
IBM's 2024 Cost of a Data Breach Report pegged the global average cost at $4.88 million. If your program demonstrably reduced human-caused incidents by 30%, you can frame that as risk reduction against a multi-million dollar exposure. That's a conversation leadership will engage with.
What Does a Good Phishing Click Rate Look Like?
This is the question I get asked most often, so here's a direct answer. Industry benchmarks vary, but here's a general guide based on aggregated data from multiple sources:
- First simulation (no prior training): 25-35% click rate is typical
- After 6 months of training + simulations: 15-20% is achievable
- Mature program (12+ months): below 10%, ideally under 5%, is the target
If you're consistently above 20% after a year of active training, something in your program isn't working — the content, the frequency, the relevance, or all three. Reassess your approach. Consider whether your simulations are realistic enough and whether your training addresses the actual attack techniques threat actors are using in 2025.
The Metrics That Don't Matter (As Much As You Think)
Training Completion Rates
Yes, you need them for compliance. No, they don't measure effectiveness. A 98% completion rate with a 30% phishing click rate means your training is thorough but ineffective.
Employee Satisfaction Scores
"Did you enjoy the training?" is a customer service question, not a security question. Entertaining training is nice. Training that changes behavior is essential. Measure behavior, not sentiment.
Number of Courses Completed
More training ≠ better security. I've seen organizations assign 12 modules per year and wonder why employees rush through them with zero retention. Quality and relevance beat volume every time.
Advanced Measurement: Culture and Behavior Indicators
Once your basic metrics are solid, start looking at culture indicators. These are harder to quantify but incredibly telling.
Unsolicited Reporting of Real Threats
When employees start reporting suspicious emails that aren't simulations, you've achieved something meaningful. Track the volume of organic reports to your SOC or IT help desk. This is your strongest indicator of a security-aware culture.
Policy Adherence Audits
Spot-check behaviors like screen locking, password manager usage, removable media handling, and multi-factor authentication enrollment. These observable behaviors tell you whether training is translating into daily habits.
Shadow IT Reduction
Employees who understand security risks are less likely to use unauthorized apps and services. If your shadow IT discoveries are declining, your awareness program may be a contributing factor.
Tying It All Together: The Quarterly Security Awareness Scorecard
Here's what I recommend presenting to leadership every quarter:
- Phishing simulation click rate — trend over time, segmented by department
- Phishing reporting rate — trend over time
- Repeat offender count — with intervention status
- Knowledge assessment average — pre vs. post, with retention data
- Real incident count — human-caused incidents, quarter-over-quarter
- Risk score — a composite number based on weighted metrics above
One page. Six metrics. Clear trends. That's what gets budget renewed and programs expanded.
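The composite risk score at the bottom of the scorecard is just a weighted sum of normalized metrics. The weights and quarterly figures below are illustrative assumptions, not prescribed values; the one structural choice that matters is orienting every input so that higher means riskier (which is why the reporting rate is inverted):

```python
# Assumed weights for the composite score (must sum to 1.0)
weights = {"click_rate": 0.35, "non_report_rate": 0.25,
           "repeat_rate": 0.20, "incident_rate": 0.20}

# Hypothetical current-quarter metrics, each on a 0-100 scale
# where higher = riskier; reporting rate (34%) is inverted
metrics = {"click_rate": 18.0, "non_report_rate": 100 - 34.0,
           "repeat_rate": 6.0, "incident_rate": 12.0}

risk_score = sum(weights[k] * metrics[k] for k in weights)
print(f"composite risk score: {risk_score:.1f} / 100")
```

Track this single number quarter over quarter on the scorecard; it gives leadership one trend line while the five underlying metrics explain any movement.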
Start Measuring What Matters
Knowing how to measure security awareness training isn't optional anymore. Regulators expect it — the FTC's Safeguards Rule explicitly requires monitoring and testing of security programs. Cyber insurers ask about it on every application. And threat actors are counting on the fact that most organizations still don't do it well.
If you're building or rebuilding your program, start with a solid training foundation. The cybersecurity awareness training at computersecurity.us gives your employees the knowledge base, and the phishing simulation training at phishing.computersecurity.us gives you the behavioral data to prove it's working.
Measure behavior, not just completion. Track trends, not just snapshots. Report in business terms, not technical jargon. That's how you build a security awareness program that actually protects your organization — and how you prove it to everyone who needs convincing.