◼ Dossier

Mark Zuckerberg

Founder and CEO, Meta Platforms (Facebook, Instagram, WhatsApp).

Net worth: ~$213 billion (Forbes, May 2026)

Built the world's largest surveillance infrastructure under the banner of "connecting people." Let a genocide happen in Myanmar because content moderation would have cost money. Knew his platform was psychologically damaging teenage girls and chose engagement over their safety. Violated a 2012 FTC consent decree four years after signing it. Collected biometric data from millions of users without consent. Built ad tools that let landlords discriminate by race. The consistent pattern: internal research surfaces harm, harm gets suppressed, regulators act too late, settlements close without admission, cycle repeats.

6 documented violations

1. Federal privacy law violation / consumer deception

Settled

Cambridge Analytica: 87M users' data harvested without consent — $5B FTC fine, $100M SEC settlement

2018–2019

Cambridge Analytica harvested the personal data of up to 87 million Facebook users without informed consent, using it to build psychological profiles for political advertising in the 2016 US election and Brexit. This violated the 2012 FTC consent decree Facebook had signed after prior privacy abuses. The FTC imposed a $5 billion penalty — the largest civil fine in FTC history at the time. The SEC separately fined Facebook $100 million for misleading investors about data misuse risks.

  • A personality quiz app called "This Is Your Digital Life" accessed not only quiz-takers' data but the data of all their Facebook friends — without those friends' knowledge or consent.
  • This was not a breach. Facebook's API was designed this way. The company permitted it.
  • The 2012 FTC consent decree required Facebook to obtain affirmative consent before sharing data with third parties. It did not comply.
  • 87 million users' psychological profiles were used to micro-target political advertising for Ted Cruz, Donald Trump, and Brexit campaigns.
  • FTC vote to impose $5 billion penalty: July 2019. Largest FTC civil penalty in history at the time.
  • SEC settlement: $100 million for misleading investors about the risk that user data had been misused.
  • Australia's Information Commissioner separately sued Facebook in 2020, estimating 311,000+ Australians were affected.
  • Facebook did not admit wrongdoing.
FTC press release — $5 billion Facebook penalty, July 24, 2019

2. Facilitation of mass atrocity (moral crime)

Active / ongoing

UN: Facebook played "determining role" in Myanmar genocide — $150B class action filed

2013–2017

Facebook was the internet in Myanmar. The Myanmar military used the platform systematically to spread dehumanizing hate speech against the Rohingya Muslim minority — fabricated atrocity stories, calls for ethnic cleansing, coordinated incitement. The UN's Independent International Fact-Finding Mission on Myanmar (2018) found Facebook played a "determining role" in spreading hatred. An estimated 10,000–25,000 Rohingya were killed. 700,000+ were displaced. Civil society organizations had warned Facebook about the incitement since 2013.

  • Facebook entered Myanmar's market as smartphone penetration went from near-zero to near-universal. For many users, Facebook was their only internet.
  • The Tatmadaw (Myanmar military) ran coordinated campaigns on Facebook — fake accounts, fabricated stories of Rohingya violence against Buddhists, explicit incitement to killing.
  • Facebook had a single Burmese-language content moderator for years while the campaigns ran.
  • Civil society researchers and organizations warned Facebook repeatedly from 2013 onward. The warnings were not acted on.
  • UN Fact-Finding Mission on Myanmar, August 2018: Facebook "played a determining role in spreading hate speech." Mission called on Facebook to investigate its role.
  • Facebook acknowledged in November 2018: "we weren't doing enough to help prevent our platform from being used to foment division and incite offline violence."
  • The company did not remove the military's accounts until after the UN report was published.
  • 2021: A $150 billion class action lawsuit was filed in California by Rohingya refugees.
UN Independent International Fact-Finding Mission on Myanmar — 2018 Report

3. Suppression of internal safety research / harm to children

Active / ongoing

Facebook Files: internal research showed Instagram harming teens — Zuckerberg personally blocked fixes

2019–2026

Former Facebook product manager Frances Haugen leaked tens of thousands of internal documents in October 2021 — the Facebook Files. They showed that Facebook's own research found Instagram made body image worse for 1 in 3 teenage girls, that 13.5% of teen girls said Instagram made suicidal thoughts worse, and that the company chose not to act. Court documents unsealed in 2024 showed Zuckerberg personally rejected proposals to improve teen mental health on the platform. In March 2026, New Mexico won civil penalties of $375 million and $372 million against Meta for harming children.

  • Internal Facebook research (2019–2020): "We make body image issues worse for one in three teenage girls."
  • Internal finding: 13.5% of teen girls on Instagram said the platform made suicidal thoughts worse.
  • 14% of teenage boys in the US reported negative social comparison effects from Instagram.
  • Frances Haugen went public on 60 Minutes (October 3, 2021): "Facebook, over and over again, chose to optimize for its own interests, like making more money." Two days later she testified before the Senate Commerce Committee.
  • The algorithm was known internally to amplify anger and outrage. This was a deliberate design choice, not a bug.
  • Court documents (2024): Zuckerberg personally rejected Meta's proposals to improve teenagers' mental health.
  • New Mexico civil penalties: $375 million and $372 million (March 2026) for harms to children's mental health.
  • A New Mexico lawsuit separately alleged Meta failed to restrict harmful AI bot interactions with minors, with evidence Zuckerberg approved access despite safety warnings.
Frances Haugen Senate testimony — Protecting Kids Online, October 5, 2021

4. State privacy law violation (biometric data)

Settled

Collected biometric data from millions of Texans without consent — $1.4B settlement

2022–2024

For more than a decade, Meta collected facial geometry data from photos tagged on Facebook in Texas, via features such as Tag Suggestions, without obtaining user consent as required by Texas's Capture or Use of Biometric Identifier (CUBI) Act. The data powered Meta's facial recognition features until the company shut the system down in late 2021. In July 2024, Meta agreed to pay $1.4 billion to the state of Texas — the largest privacy-related settlement ever obtained by a state attorney general in US history.

  • Texas CUBI Act prohibits capturing biometric identifiers — including facial geometry derived from photos — without informed consent.
  • Meta's tagging features analyzed facial geometry from millions of tagged photos without disclosing the data capture or obtaining consent.
  • Texas Attorney General filed suit against Meta in 2022.
  • Settlement: $1.4 billion, July 2024. Largest state AG privacy settlement in US history.
  • Meta did not admit wrongdoing.
Texas Attorney General — $1.4 billion Meta settlement, July 2024

5. Data protection law violation (EU)

Regulatory fine

Unlawful EU-US data transfers — record €1.2 billion GDPR fine

2020–2023

Ireland's Data Protection Commission, acting on a binding decision of the European Data Protection Board, fined Meta €1.2 billion in May 2023 — the largest GDPR fine ever imposed. Meta had transferred EU users' personal data to US servers without providing the privacy protections required under EU law. Meta was ordered to suspend the transfers and to bring its processing into compliance, including deleting the unlawfully stored data.

  • EU General Data Protection Regulation (GDPR) requires that personal data transferred outside the EU receive equivalent protections.
  • Meta transferred EU users' data to US servers under Standard Contractual Clauses that the Irish Data Protection Commission (DPC) found inadequate after the Court of Justice of the EU's Schrems II judgment (2020) invalidated the EU–US Privacy Shield.
  • The DPC's final decision (May 2023), directed by a binding EDPB dispute-resolution decision, imposed the fine and ordered Meta to suspend transfers within five months and bring its processing into compliance within six months.
  • €1.2 billion fine: largest GDPR penalty in the regulation's history, surpassing Amazon's €746M (2021) and WhatsApp's €225M (2021).
European Data Protection Board — Meta GDPR ruling, May 2023

6. Civil rights law violation (Fair Housing Act)

Consent decree

Built housing ad tools that excluded protected classes by race, religion, disability — DOJ consent decree

2019–2022

Meta's advertising platform allowed landlords and real estate advertisers to exclude potential renters and buyers from seeing housing ads based on race, religion, national origin, sex, disability, and familial status — all protected categories under the Fair Housing Act. The Department of Justice filed suit in June 2022. Meta entered a consent decree requiring it to overhaul its ad targeting system.

  • Meta's ad tools included demographic targeting parameters — including race proxy attributes and zip codes — that enabled advertisers to discriminate in housing ads.
  • This capability was built into the product. Advertisers could filter audiences by characteristics that served as racial, religious, or disability proxies.
  • 2019: Facebook settled with the ACLU, the National Fair Housing Alliance (NFHA), and other civil rights groups over discrimination in housing, employment, and credit ads; HUD separately charged the company with Fair Housing Act violations that March.
  • June 2022: DOJ filed suit, its first case challenging algorithmic discrimination under the Fair Housing Act, after finding discriminatory practices had continued despite the 2019 settlement.
  • Consent decree: Meta required to stop using its "Special Ad Audiences" tool and to build a new system (the Variance Reduction System) that reduces discriminatory delivery of housing, employment, and credit ads.
DOJ press release — Meta housing discrimination lawsuit, June 21, 2022

Editorial

Zuckerberg is not an engineer who got rich building something useful. He is the architect of a global psychological modification system optimized for advertising revenue. Every time that system caused visible harm — Myanmar, teen suicide, election interference, privacy stripping — his company had internal documentation of the harm and chose not to stop it.

The FTC's $5 billion fine — still the largest in FTC history — amounts to less than three months of Meta's profit. The €1.2 billion EU fine is a rounding error in his net worth. The $1.4 billion Texas settlement was paid without admission. The Myanmar genocide lawsuit is ongoing, and no individual has been held criminally liable for Facebook's role in enabling it.

Six documented charges. Zero criminal prosecutions. Net worth ~$213 billion. The surveillance infrastructure is still running.