In February 2024, Change Healthcare suffered a ransomware attack that exposed the protected health information of approximately 190 million people — making it the largest healthcare data breach in U.S. history. The fallout wasn't just the breach itself. It was the weeks of confusion about who had been notified, which regulators had been contacted, and whether the company had met its legal obligations. That chaos is what happens when an organization doesn't know how to report a data breach before one actually hits.
If you're reading this, you either just experienced a breach, you're preparing for one, or you're trying to understand your obligations. I've walked organizations through this process dozens of times, and I can tell you: the reporting phase is where most companies make their costliest mistakes. Not the technical remediation — the notification timeline, the regulatory filings, the communication to affected individuals. That's where lawsuits start and regulators sharpen their teeth.
This guide walks you through every step — who to contact, when, and how to document everything so you don't compound a security incident with a compliance failure.
What Counts as a Reportable Data Breach?
Not every security incident triggers a reporting obligation. A reportable data breach typically involves unauthorized access to, or acquisition of, unencrypted personal information that compromises its confidentiality, integrity, or security. The specifics vary by jurisdiction and industry, but the threshold is lower than most people think.
If a threat actor accessed a database containing names paired with Social Security numbers, financial account information, medical records, or login credentials — you almost certainly have a reportable breach. Even if you believe the data wasn't actually exfiltrated, many state laws use an "access" standard, not an "acquisition" standard.
The Gray Areas That Trip Organizations Up
I've seen companies convince themselves a breach isn't reportable because the data was "encrypted." But if the encryption keys were also compromised — and they often are — that exception evaporates. Another common mistake: assuming that because only employee data was exposed (not customer data), there's no reporting obligation. Employee personal information triggers the same notification laws in every U.S. state.
When in doubt, treat it as reportable. Over-reporting costs you an awkward conversation. Under-reporting costs you an FTC enforcement action.
Step 1: Activate Your Incident Response Team Immediately
Before you report anything externally, you need to understand what happened. Your incident response team — whether internal or a retained third party — should be activated within hours, not days. Their job is to contain the breach, preserve forensic evidence, and scope the impact.
Document everything from the moment of discovery. The clock for your legal notification deadlines starts ticking the moment you "knew or should have known" about the breach. In many jurisdictions, that's the discovery date, not the date you finish your investigation.
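One practical way to enforce "document everything from the moment of discovery" is an append-only log where every action gets a UTC timestamp automatically. This Python sketch is illustrative only: the class and field names are my own assumptions, not drawn from any regulation or incident response framework.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class IncidentLog:
    """Append-only record of breach-response actions.

    Illustrative sketch; field names are assumptions, not
    regulatory terms. Deadlines run from the "knew or should
    have known" moment, so every entry is timestamped in UTC.
    """
    entries: list = field(default_factory=list)

    def record(self, actor: str, action: str) -> dict:
        # Timestamp is captured at write time, never supplied by
        # the caller, so the timeline can't be backdated.
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "actor": actor,
            "action": action,
        }
        self.entries.append(entry)
        return entry

log = IncidentLog()
log.record("SOC analyst", "Anomalous database access detected; IR team paged")
log.record("IR lead", "Containment started; forensic imaging of affected host")
```

A real deployment would persist this to write-once storage, but even a simple structure like this gives counsel a defensible timeline when regulators ask when you knew what.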
What Your Response Team Needs to Determine
- What systems were compromised and how
- What types of personal information were exposed
- How many individuals are potentially affected
- Whether the threat actor still has access
- Whether data was exfiltrated, encrypted by ransomware, or both
- The timeline of the intrusion — when it started, when it was detected
This scoping exercise feeds directly into every notification you'll file. Regulators will ask these exact questions, and "we don't know yet" has a very short shelf life before it becomes "you didn't investigate adequately."
Step 2: Know Your Legal Notification Deadlines
This is where knowing how to report a data breach gets legally complex. The United States doesn't have a single federal breach notification law for all industries. Instead, you're dealing with a patchwork of 50 state laws, plus sector-specific federal requirements.
State Breach Notification Laws
All 50 U.S. states, the District of Columbia, Guam, Puerto Rico, and the U.S. Virgin Islands have enacted breach notification laws. Deadlines range from fixed windows of 30 days (Florida and Colorado) or 45 to 60 days in other states, to the open-ended "without unreasonable delay" standard that many states use. Some states, including Maine, also require notification to the state attorney general within 30 days.
You must comply with the law of every state where an affected individual resides — not just the state where your company is headquartered. If you have customers or employees in 40 states, you potentially have 40 different sets of requirements.
Federal Requirements by Sector
- Healthcare (HIPAA): Notify individuals within 60 days of discovery. Notify HHS (within 60 days for breaches affecting 500 or more individuals; within 60 days after year-end for smaller breaches). If 500+ individuals in a state are affected, also notify prominent media outlets in that state.
- Financial Services (GLBA/Interagency Guidance): Notify your primary federal regulator. The FTC's updated Safeguards Rule requires non-banking financial institutions to notify the FTC as soon as possible, and no later than 30 days after discovery, if 500+ consumers are affected.
- Public Companies (SEC): Since December 2023, material cybersecurity incidents must be disclosed on Form 8-K within four business days of determining materiality.
- Critical Infrastructure: CISA's Cyber Incident Reporting for Critical Infrastructure Act (CIRCIA) will require reporting within 72 hours once final rules take effect.
The FTC's Health Breach Notification Rule also applies to entities not covered by HIPAA — including health apps and connected fitness devices — and the FTC has been actively enforcing it since 2023.
Step 3: Report to Federal Agencies
Beyond your specific regulatory obligations, you should report the breach to relevant federal agencies. This isn't always legally required, but it's almost always strategically smart.
FBI Internet Crime Complaint Center (IC3)
File a report at ic3.gov. This is especially important if ransomware, credential theft, business email compromise, or any financially motivated threat actor is involved. The FBI's IC3 received over 880,000 complaints in 2023 with potential losses exceeding $12.5 billion. Your report contributes to pattern analysis that can actually lead to arrests and infrastructure takedowns.
CISA
If your organization is in critical infrastructure — or even if it's not — report the incident to CISA at cisa.gov/report. CISA can provide technical assistance and share threat intelligence that helps you understand whether you're dealing with an isolated incident or part of a broader campaign.
State Attorneys General
Most states require direct notification to the Attorney General's office, often through an online portal. Many states publish their notification forms online. Don't skip this step — it's one of the most common compliance failures I see. An organization will notify individuals but forget the AG filing, and that omission alone triggers an enforcement inquiry.
Step 4: Notify Affected Individuals
This is the notification most people think of first, and it's the one that demands the most care. A poorly written breach notification letter can trigger panic, confusion, lawsuits, and media attention you don't want.
What Your Notification Must Include
Requirements vary by state, but most breach notification letters must contain:
- A description of the incident in general terms
- The types of personal information involved
- Steps the organization is taking in response
- Steps individuals can take to protect themselves
- Contact information for your organization
- Contact information for the relevant state AG and the FTC
- Information about credit monitoring services, if offered
Send the notification by first-class mail unless you meet the threshold for substitute notice (typically for very large breaches or when you lack mailing addresses). Email notification is allowed in some states if you have prior consent.
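If you pre-draft notification templates (as recommended in the checklist below), it's worth making the required elements machine-checked so an incomplete letter can't go out. This Python sketch assembles a notice from the elements listed above; the element names, example text, and phone number are all hypothetical.

```python
# Elements most state statutes require (names are illustrative).
# Credit monitoring is conditional, so it's handled separately.
REQUIRED_ELEMENTS = [
    "incident_description", "data_types", "our_response",
    "protective_steps", "company_contact", "regulator_contact",
]

def build_notice(fields: dict) -> str:
    """Assemble a plain-language notice, failing loudly if any
    required element is missing or empty."""
    missing = [k for k in REQUIRED_ELEMENTS if not fields.get(k)]
    if missing:
        raise ValueError(f"Notice incomplete, missing: {missing}")
    lines = [
        f"What happened: {fields['incident_description']}",
        f"What information was involved: {fields['data_types']}",
        f"What we are doing: {fields['our_response']}",
        f"What you can do: {fields['protective_steps']}",
        f"For more information: {fields['company_contact']}",
        f"Regulator contacts: {fields['regulator_contact']}",
    ]
    if fields.get("credit_monitoring"):
        lines.append(f"Free credit monitoring: {fields['credit_monitoring']}")
    return "\n".join(lines)

notice = build_notice({
    "incident_description": "Unauthorized access to a customer database.",
    "data_types": "Names and Social Security numbers.",
    "our_response": "We secured affected systems and engaged investigators.",
    "protective_steps": "Place a fraud alert and monitor your accounts.",
    "company_contact": "1-800-555-0100 (hypothetical number)",
    "regulator_contact": "Your state Attorney General and the FTC.",
})
```

The "What happened / What we are doing / What you can do" structure puts the key information first, which is exactly the readability point made in the next section.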
The Tone Matters More Than You Think
I've reviewed hundreds of breach notification letters. The ones that generate the fewest complaints are direct, specific, and empathetic — without being defensive or legalistic. Say what happened, what you're doing about it, and what the recipient should do. Don't bury the key information in paragraph six of a three-page letter.
Step 5: Notify Third Parties and Business Partners
If the breached data includes information you process on behalf of another organization — a client, a vendor relationship, a partnership — you likely have contractual obligations to notify them. Review your data processing agreements. Most modern contracts include breach notification clauses with timelines of 24 to 72 hours.
If you're a service provider, this notification should happen in parallel with your regulatory filings, not after. Your client needs to assess their own reporting obligations, and every hour you delay puts them further behind their own deadlines.
The $4.88M Lesson Most Organizations Learn Too Late
According to IBM's 2024 Cost of a Data Breach Report, the global average cost of a data breach hit $4.88 million — the highest ever recorded. But here's the number that matters for this conversation: organizations that contained a breach in under 200 days saved an average of $1.02 million compared to those that took longer.
Speed of response — including speed of reporting — directly correlates with cost reduction. Not just regulatory fines, but litigation costs, customer churn, and remediation expenses. The organizations that fare best are the ones that practiced their breach response plan before they needed it.
This is why cybersecurity awareness training for your entire organization isn't optional. When employees can identify a phishing email or a social engineering attempt before it becomes a breach, you avoid the reporting process entirely. Prevention is cheaper than notification by orders of magnitude.
How to Report a Data Breach: Quick Reference
If you landed here looking for a concise answer, here it is:
To report a data breach, you must: (1) investigate and scope the incident immediately, (2) notify affected individuals within the timeframe required by the applicable state law (typically 30-60 days), (3) file a report with the state Attorney General in every state where affected individuals reside, (4) report to applicable federal regulators (HHS for healthcare, FTC for financial or health data, SEC for public companies), and (5) file a complaint with the FBI IC3 at ic3.gov and report to CISA if the breach involves a cyberattack. Documentation of every step is critical for demonstrating compliance.
Build the Muscle Before You Need It
The organizations that handle breach reporting well aren't the ones that panic-Google "how to report a data breach" at 2 AM on a Saturday. They're the ones that already have an incident response plan, have tested it through tabletop exercises, and have trained their employees to recognize threats before they escalate.
Phishing remains the most common initial attack vector in data breaches. The 2024 Verizon Data Breach Investigations Report found that the human element was involved in 68% of breaches. That means your employees are both your biggest vulnerability and your strongest defense — depending on whether they've been trained.
Invest in phishing awareness training for your organization that includes realistic phishing simulations. When your staff can spot a credential theft attempt or a social engineering lure, you reduce your breach risk dramatically. Pair that with multi-factor authentication and a zero trust architecture, and you've addressed the root causes behind most reportable breaches.
Your Breach Response Checklist
- Maintain a written incident response plan — review it quarterly
- Assign roles: incident commander, legal counsel, communications lead, forensics lead
- Keep a current inventory of what personal data you hold and where
- Map your notification obligations by state and sector before a breach occurs
- Pre-draft notification letter templates with legal review
- Run tabletop exercises at least annually — include executives and legal
- Train every employee on security awareness and phishing recognition
- Establish relationships with law enforcement contacts (FBI field office, CISA regional staff) before you need them
Don't Let the Breach Define You — Let the Response
Every organization of meaningful size will face a data breach at some point. That's the reality of operating in 2025. What separates the organizations that survive from the ones that end up in congressional hearings and class action lawsuits is not whether the breach happened — it's how they responded.
Report quickly. Report completely. Report to every entity that has a legal right to know. Document every decision and every timeline. And start building your defenses and your response capabilities today, before you need them.
The breach itself is a bad day. A botched notification is a bad year. Know the difference, and plan accordingly.