The Clock Starts Ticking the Moment You Discover a Breach

In December 2020, FireEye disclosed that it had been breached by a sophisticated threat actor, a revelation that quickly unfolded into the massive SolarWinds supply chain compromise, in which up to 18,000 organizations, including multiple U.S. government agencies, had downloaded a trojanized software update. The question every security team faced wasn't just "how bad is it?" It was "who do we notify, how fast, and in what order?"

If you're searching for how to report a data breach, you're likely facing one of two scenarios: you've already discovered a breach and need to act now, or you're building an incident response plan before disaster strikes. Either way, this guide walks you through every step — from federal reporting obligations to state notification laws to the practical mechanics of getting it done right.

I've helped organizations navigate breach reporting more times than I'd like to count. The process is stressful, confusing, and littered with legal landmines. But it's also entirely manageable if you know the playbook.

Step 1: Confirm the Breach and Preserve Evidence

Before you report anything, you need to confirm what actually happened. Not every security incident qualifies as a reportable data breach. A breach typically involves unauthorized access to, or acquisition of, personally identifiable information (PII), protected health information (PHI), or financial data.

What Counts as a Reportable Breach?

A reportable data breach generally means that sensitive personal data was accessed, stolen, or exposed by an unauthorized party. This includes Social Security numbers, financial account information, medical records, login credentials, and in many states, even email addresses combined with passwords.

The moment you suspect a breach, preserve everything. Don't wipe systems. Don't reboot servers. Document what you know, screenshot alerts, and isolate affected systems from the network. Your forensic evidence is critical for both the investigation and any regulatory inquiries that follow.
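Chain of custody starts with integrity proof. As a minimal sketch (the file paths are illustrative, not prescriptive), you can build a SHA-256 manifest of collected evidence files so investigators and regulators can later verify nothing was altered:

```python
import hashlib
from pathlib import Path

def hash_evidence(paths):
    """Build a manifest mapping each evidence file to its SHA-256
    digest, so file integrity can be verified later."""
    manifest = {}
    for p in paths:
        digest = hashlib.sha256()
        with open(p, "rb") as f:
            # Read in chunks so large log files don't exhaust memory.
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        manifest[str(p)] = digest.hexdigest()
    return manifest

# Example (illustrative path): hash every log under /var/log
# manifest = hash_evidence(Path("/var/log").rglob("*.log"))
```

Store the manifest somewhere the attacker can't reach (offline media or a separate account), and record who collected what and when.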

Engage Your Incident Response Team Immediately

If you have an internal incident response team, activate them now. If you don't, engage an outside forensics firm immediately. According to IBM's 2020 Cost of a Data Breach Report, organizations with an incident response team that tested its plan incurred an average of $2 million less per breach than those with neither. That number isn't theoretical — it reflects real legal fees, regulatory fines, and remediation costs.

Your legal counsel should be involved from minute one. Attorney-client privilege can protect sensitive communications during the investigation — but only if counsel is directing the forensic work.

Step 2: Report to Federal Law Enforcement

Here's where most organizations stumble. They focus entirely on notification laws and forget that reporting to law enforcement is a separate — and critical — step.

File a Report with the FBI's IC3

The FBI's Internet Crime Complaint Center (IC3) is the primary federal intake point for cybercrime. You can file a complaint at ic3.gov. In 2020, the IC3 received 791,790 complaints with reported losses exceeding $4.2 billion — a 69% increase in complaints from 2019.

Filing with IC3 does more than check a box. It feeds into FBI investigations that may already be tracking your threat actor. In ransomware cases especially, early reporting can connect you with decryption resources or intelligence about your attacker's methods.

Contact CISA or the U.S. Secret Service

For breaches involving critical infrastructure, payment card fraud, or nation-state activity, contact the Cybersecurity and Infrastructure Security Agency (CISA) at cisa.gov/report. CISA can provide technical assistance and help contain the threat, particularly if you're dealing with a credential theft campaign or supply chain compromise.

The Secret Service handles financial crimes and has cyber fraud task forces in major cities. If your breach involves stolen financial data, they want to hear from you.

Step 3: Determine Your State Notification Obligations

This is where knowing how to report a data breach gets complicated fast. All 50 U.S. states, the District of Columbia, Guam, Puerto Rico, and the U.S. Virgin Islands have enacted data breach notification laws. And none of them are identical.

The Patchwork Problem

Each state defines "personal information" slightly differently. Each has different notification timelines. Some require notification within 30 days (like Florida). Others use vague language like "in the most expedient time possible" (like California). Some states require you to notify the state attorney general. Others require notification to consumer protection agencies.

You don't just notify the state where your organization is based. You notify every state where affected individuals reside. A breach affecting customers in 20 states means complying with 20 different notification frameworks simultaneously.

Key State Requirements You Can't Ignore

  • California (Civil Code § 1798.82): Notify affected residents "in the most expedient time possible and without unreasonable delay." Written notice must include specific elements, and the California Attorney General must be notified if more than 500 residents are affected. (The CCPA separately gives consumers a private right of action over breaches.)
  • New York (SHIELD Act): Expanded the definition of private information in 2020 to include biometric data and account credentials. Notification to the AG, Department of State, and Division of State Police is required.
  • Texas: Notification within 60 days. AG notification required if more than 250 residents are affected.
  • HIPAA (federal): If PHI is involved, notify HHS within 60 days for breaches affecting 500+ individuals. Smaller breaches may be logged and reported annually, within 60 days of the end of the calendar year.

I strongly recommend consulting the National Conference of State Legislatures' breach notification law database for current requirements in every jurisdiction.
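When dozens of state deadlines apply at once, the one that binds you is the shortest. A minimal sketch of that logic, with an illustrative (and deliberately incomplete) deadline table that you should verify against the current NCSL database before relying on it:

```python
from datetime import date, timedelta

# Illustrative notification windows in days. Verify every entry
# against the current NCSL database before relying on it.
STATE_WINDOWS = {
    "FL": 30,    # Florida: 30 days
    "TX": 60,    # Texas: 60 days
    "CA": None,  # California: "most expedient time possible" -- no fixed day count
}

def binding_deadline(discovered: date, affected_states):
    """Return the earliest fixed statutory deadline across all states
    where affected individuals reside, or None if every applicable
    state uses open-ended language."""
    windows = [STATE_WINDOWS[s] for s in affected_states
               if STATE_WINDOWS.get(s) is not None]
    if not windows:
        return None
    return discovered + timedelta(days=min(windows))
```

For a breach discovered March 1 affecting Florida and Texas residents, Florida's 30-day window controls, so individual notices must go out by March 31.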

Step 4: Notify Affected Individuals

Notification to affected individuals is where your organization's reputation is either preserved or destroyed. I've seen companies send vague, legalistic letters that infuriate customers. I've also seen companies that communicate clearly, take responsibility, and actually strengthen trust.

What Your Notification Must Include

While requirements vary by state, most notifications should include:

  • A description of the incident and the date it occurred
  • The types of information that were exposed
  • Steps the organization is taking to address the breach
  • Steps individuals can take to protect themselves (credit monitoring, password changes, fraud alerts)
  • Contact information for questions
  • Contact information for the FTC, relevant state AG, or credit reporting agencies

Be direct. Be specific. Don't hide behind passive language like "information may have been accessed." If you know data was stolen, say so.

Timing Matters More Than You Think

Delayed notification erodes trust and invites regulatory scrutiny. The Equifax breach in 2017 exposed 147 million consumers' data, but the company waited six weeks to disclose it — a delay that fueled congressional investigations and a $575 million FTC settlement. The lesson: move fast, communicate clearly, and don't let your legal team turn the notification into an incomprehensible document.

Step 5: Report to Industry-Specific Regulators

Depending on your industry, you may have additional reporting obligations beyond state laws.

Financial Services

Banks and financial institutions must report breaches to their primary federal regulator (OCC, FDIC, or Federal Reserve) and potentially to FinCEN if the breach involves suspicious financial activity. The Gramm-Leach-Bliley Act (GLBA) has its own notification requirements.

Healthcare

HIPAA-covered entities must notify the Department of Health and Human Services (HHS) via their breach portal. For breaches affecting 500 or more individuals, HHS must be notified within 60 days, and local media must also be notified. This is not optional — the HHS Office for Civil Rights actively investigates and fines organizations for non-compliance.

Payment Card Industry

If payment card data was compromised, you must notify your acquiring bank and the card brands (Visa, Mastercard, etc.) per PCI DSS requirements. This typically triggers a forensic investigation by a PCI Forensic Investigator (PFI). The card brands may impose fines or require you to fund card reissuance.

What Happens If You Don't Report a Data Breach?

This is the question I get most often, and the answer is simple: it gets worse. Much worse.

The FTC has enforcement authority over companies that fail to protect consumer data or fail to notify consumers of breaches. The FTC's action against Uber in 2018 — after the company concealed a 2016 breach affecting 57 million users and paid the hackers $100,000 to delete the data — resulted in an expanded consent decree and two decades of mandatory privacy audits.

State attorneys general are equally aggressive. In 2020, the California AG's office made clear that CCPA enforcement was a priority. New York's AG has brought multiple actions against companies for delayed or inadequate breach notification.

Criminal penalties exist too. In some jurisdictions, willfully concealing a data breach can result in personal liability for executives.

Build Your Reporting Playbook Before You Need It

The worst time to figure out how to report a data breach is during an active incident. Here's what your preparation should look like right now:

  • Document your notification obligations for every state where you have customers, employees, or data.
  • Pre-identify your legal counsel — both internal and external breach counsel with forensics experience.
  • Create template notifications that can be customized quickly. Have legal review them in advance.
  • Run tabletop exercises annually. Simulate a ransomware attack or credential theft scenario and walk through every reporting step.
  • Train your employees on recognizing social engineering and phishing attacks — the most common breach vectors. Our cybersecurity awareness training course covers exactly these scenarios in practical, actionable modules.
  • Test your phishing defenses regularly with realistic simulations. Our phishing awareness training for organizations helps your team recognize and report phishing attempts before they become breaches.

The Reporting Timeline: A Quick Reference

Here's the condensed version of your reporting sequence:

  • Hour 0-4: Confirm the breach. Preserve evidence. Engage incident response team and legal counsel.
  • Hour 4-24: Contain the breach. Begin forensic investigation. Identify what data was affected and how many individuals are impacted.
  • Day 1-3: File reports with FBI IC3 and CISA. Notify your cyber insurance carrier. Begin drafting individual and regulatory notifications.
  • Day 3-30: Complete state AG notifications per applicable deadlines. Send individual notification letters. Notify industry-specific regulators (HHS, card brands, banking regulators).
  • Day 30+: Cooperate with law enforcement investigations. Implement remediation measures. Conduct a post-incident review and update your security awareness program.

These timelines compress significantly if you're subject to HIPAA, the GDPR's 72-hour supervisory authority deadline (if EU residents are affected), or state laws with aggressive deadlines. Know your obligations before the clock starts.
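The quick-reference sequence above is easier to act on as concrete calendar dates. A small sketch that converts a discovery timestamp into milestone due dates (the offsets mirror the sequence above and are planning targets, not legal deadlines; adjust them to your actual obligations):

```python
from datetime import datetime, timedelta

# Offsets mirror the quick-reference sequence; they are planning
# targets, not statutory deadlines -- adjust per your obligations.
MILESTONES = [
    ("contain_and_investigate", timedelta(hours=4)),
    ("file_ic3_and_cisa", timedelta(days=1)),
    ("state_ag_notifications", timedelta(days=3)),
    ("individual_notices_complete", timedelta(days=30)),
]

def build_schedule(discovered: datetime):
    """Map each reporting milestone to a concrete due datetime."""
    return {name: discovered + offset for name, offset in MILESTONES}
```

Generating this schedule in the first hour gives the response team a shared clock instead of a vague sense of urgency.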

Prevention Is Still Cheaper Than Reporting

The Verizon 2020 Data Breach Investigations Report found that 22% of breaches involved phishing and 37% involved stolen or compromised credentials. These aren't exotic attack vectors. They're preventable with consistent training, multi-factor authentication, and a zero trust approach to access management.

Every dollar you invest in security awareness training and phishing simulation saves multiples in breach costs, legal fees, and regulatory penalties. The IBM 2020 report pegged the global average cost of a data breach at $3.86 million. For U.S. organizations, it was $8.64 million.

Knowing how to report a data breach is essential. Knowing how to prevent one is better. Start with your people — because that's where most breaches start too.