The Justice Against Malicious Algorithms Act Removes Liability Shield When a Platform Knowingly or Recklessly Promotes Harmful Content
Press Release 10/14/21
WASHINGTON, D.C. – Energy and Commerce Committee Chairman Frank Pallone, Jr. (D-NJ), Communications and Technology Subcommittee Chairman Mike Doyle (D-PA), Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL), and Health Subcommittee Chair Anna Eshoo (D-CA) announced new legislation today to reform Section 230 of the Communications Decency Act, which shields websites and online platforms from being held liable for third-party content.
The legislation, titled the Justice Against Malicious Algorithms Act, would amend Section 230 to remove absolute immunity in certain instances. Specifically, the bill would lift the Section 230 liability shield when an online platform knowingly or recklessly uses an algorithm or other technology to recommend content that materially contributes to physical or severe emotional injury. The bill will be introduced in the House tomorrow.
“Social media platforms like Facebook continue to actively amplify content that endangers our families, promotes conspiracy theories, and incites extremism to generate more clicks and ad dollars. These platforms are not passive bystanders – they are knowingly choosing profits over people, and our country is paying the price,” said Pallone. “The time for self-regulation is over, and this bill holds them accountable. Designing personalized algorithms that promote extremism, disinformation, and harmful content is a conscious choice, and platforms should have to answer for it.”
“We finally have proof that some social media platforms pursue profit at the expense of the public good, so it’s time to change their incentives, and that’s exactly what the Justice Against Malicious Algorithms Act would do,” Doyle said. “Under this bill, Section 230 would no longer fully protect social media platforms from all responsibility for the harm they do to our society. It’s my hope that by making it possible to hold social media platforms accountable for the harm they cause, we can help optimize the internet’s impact on our society.”
“The era of self-regulation is ending, and apologies and promises are no longer acceptable: today I join my colleagues to protect American consumers from companies that consistently put profits over people,” said Schakowsky. “Technology companies like Facebook say their platforms give every user a voice, but they amplify some voices over others. The Justice Against Malicious Algorithms Act holds these companies accountable for the severe harm they cause by spreading dangerous information. Today we make clear that people are more important than profits.”
“As Facebook whistleblower Frances Haugen has proven through testimony and documents, Facebook is knowingly amplifying harmful content and abusing the immunity of Section 230 well beyond congressional intent,” said Eshoo. “The Justice Against Malicious Algorithms Act ensures courts can hold platforms accountable when they knowingly or recklessly recommend content that materially contributes to harm. This approach builds on my bill, the Protecting Americans from Dangerous Algorithms Act, and I’m proud to partner with my colleagues on this important legislation.”
Online platforms use a user’s personal history to recommend or prioritize content. The Justice Against Malicious Algorithms Act would remove Section 230 immunity if an online platform knowingly or recklessly uses an algorithm to recommend content to a user based on that personal information, and if that recommendation materially contributes to physical or severe emotional injury.
The bill targets malicious algorithms but does not apply to search features or algorithms that do not rely on personalization. It would also not apply to internet infrastructure such as web hosting or data storage and transfer, or to small online platforms with fewer than five million unique monthly visitors or users.
The legislation is the result of years of hearings and oversight efforts, including an October 2019 hearing on Section 230 and content moderation, a June 2020 hearing on the rise of online disinformation and extremism, and a March 2021 hearing with the CEOs of Facebook, Twitter, and Google.