Samuel Kwan

Facebook's criminal liability

Updated: Oct 20, 2023

Facebook whistleblower Frances Haugen testified before the Senate Commerce Subcommittee on Consumer Protection, Product Safety, and Data Security on Oct. 5, 2021. Haugen accused her former employer of making deliberate choices that harmed children, public safety, and democracy. The hearing received a great deal of coverage, with some lawmakers and media outlets calling it Facebook's "Big Tobacco moment": a whistleblower's testimony drawing enough attention to an industry's harmful effects to force real change. However, this was not the first time Facebook had been in the spotlight for the wrong reasons.


In March 2018, the Federal Trade Commission (FTC) announced it was investigating Facebook for possibly violating a 2011 settlement agreement with the FTC when Facebook improperly released users’ private data to Cambridge Analytica, a firm tied to Donald Trump’s presidential campaign. A year later, in March 2019, The New York Times reported that numerous U.S. states and the European Union were investigating whether Facebook failed to protect users’ privacy.


The FTC can choose from a panoply of federal statutes to charge companies for breaking the law, chiefly the Federal Trade Commission Act (FTCA), which deals with unfair competition, and the Clayton Act, which addresses mergers and monopolies. However, neither act carries criminal penalties. There is also the issue of Section 230 of the Communications Decency Act of 1996, which essentially shields social media companies like Facebook from liability for what their users post. There have been bipartisan calls to amend or abolish Section 230 so that social media sites would have a greater incentive to police online conduct. But neither the FTCA nor the Clayton Act, and certainly not Section 230, is enough to deter Facebook and others from prioritizing profit over user safety.


Haugen testified that Facebook purposely did not act to stop the spread of hate speech and online misinformation because taking down posts and links could cost the company profit. Perhaps if Section 230 were changed, for example to remove Facebook's immunity from civil liability or to impose criminal liability for certain harmful content, the company might find motivation to prioritize public safety over profits. There may already be other ways to hold Facebook accountable. For instance, if Facebook failed to accurately report how it shared user data under the 2011 settlement agreement, that misstatement could lead to a charge of filing false information with the FTC.


Relevant criminal statutes already exist that could be used to charge Facebook for hosting hateful content. A creative prosecutor might cite Facebook's potential criminal liability under 18 U.S.C. § 373, solicitation to commit a crime of violence. Section 373 states that it is not a defense that the person solicited was "irresponsible." The statute's affirmative defense is renunciation of criminal intent, demonstrated by preventing the crime from happening. It could be argued, then, that by not removing posts that threaten others or call for violence, Facebook made no effort to renounce the solicitation of violence occurring on its platform. On Jan. 6, 2021, the day of the Capitol insurrection, one Facebook employee even posted, "haven't we had enough time to figure out how to manage discourse without enabling violence?"


Depending on how law enforcement interprets Facebook's role in enabling violence based on discrimination, the company could perhaps also face charges for criminal civil rights violations under 18 U.S.C. Chapter 13. There is a larger body of existing law, beyond civil antitrust statutes and weak communications acts, that could potentially be used to make Facebook answer for the harm it has caused to society. More needs to be done to hold Facebook accountable, since, according to Haugen, the company's leadership was aware of the harm Facebook was causing but chose to do nothing to prevent it.
