Inside Cybersecurity

Senators delve into complexities of Section 230 to address disinformation, other user-generated content issues

By Charlie Mitchell / July 29, 2020

A Senate Commerce subcommittee heard cautionary notes on reforming the law that provides liability protection for websites hosting user-generated content -- an issue that burst into the cybersecurity discussion amid Russian manipulation of social media to influence U.S. elections -- along with calls for the creation of a panel patterned after the Cyberspace Solarium Commission to examine so-called Section 230 issues.

Bipartisan alliances have emerged both in favor of and in opposition to possible Section 230 reform. Some congressional Republicans concerned about political bias in the moderation of content have joined hands with Democrats interested in identifying and rooting out disinformation spawned by Russia and other hostile foreign actors.

Sens. John Thune (R-SD) and Brian Schatz (D-HI), chair and ranking member of the communications and internet subcommittee, respectively, in June introduced the Platform Accountability and Consumer Transparency (PACT) Act to reform Section 230 of the Communications Decency Act. It “would strengthen transparency in the process online platforms use to moderate content and hold those companies accountable for content that violates their own policies or is illegal.”

Thune at the hearing claimed web platforms were sustaining a “fiction of being a neutral platform for all ideas,” and that “broad” Section 230 application poses problems for federal agencies and law enforcement.

Schatz cited many benefits of Section 230 but said “it’s OK to amend a law so it continues to work well.” The PACT Act, he said, would place new accountability on online platforms and “offers common-sense changes.”

But an actual legislative solution will be difficult, witnesses at the subcommittee’s Tuesday hearing noted.

Jeff Kosseff of the U.S. Naval Academy’s cyber science department testified that it’s difficult to see how the nation’s most prominent tech companies would have emerged without Section 230, and that creation of a “Solarium-like commission” might be the best way to examine the potential impacts of changing the law.

Elizabeth Banker of the Internet Association likewise cautioned that “not all the problems with user-generated content can be solved” by reforming Section 230.

“I’d love to see a ‘Section 230 Commission,’” Cyber Readiness Institute executive director Kiersten Todt said in a recent interview with Inside Cybersecurity. Todt, who also served as executive director of the 2016 Commission on Enhancing National Cybersecurity, said “the issues we’ve seen with Twitter and other platforms show the need for high-level work. We can make this effective by sharply focusing on Section 230, social media and tech platforms.”

Todt said, “If ever there was an example of social media as critical infrastructure, it’s Twitter. The ease to hack into Twitter is unacceptable.”

The Tuesday hearing also featured testimony by Fordham University law professor Olivier Sylvain, who agreed Section 230 protection “has allowed an array of innovative applications for user generated content to proliferate.” However, Sylvain said:

[T]he current legal protection under Section 230 has also cultivated in application developers a cool, above-it-all indifference to (1) public law norms and (2) the immediate lived harms that so much of the content and data that they distribute causes. Dangerously misleading public health related information, disinformation about elections, nonconsensual pornography, and discriminatory advertising, all of which may be illegal in any given circumstance, proliferate still because online intermediaries do not have to bear the responsibility for designing systems that carefully distribute them. The question for you is whether there is something legislation can do to cultivate and engender a demonstrable sense of social responsibility.

A Section 230 author offers views

Former Rep. Christopher Cox (R-CA), an author of Section 230 in the 1996 telecom act, testified that a “well-crafted statute could provide some good here,” including “clear guidance to platforms.”

But Cox said, “As Congress considers whether to amend Section 230, therefore, it is important to keep in mind the many aspects of the modern internet we take for granted, and that are dependent upon Section 230’s protections. Compromising those protections risks a wide array of unintended consequences.”

He said, “In the 21st century, Section 230’s protection of website operators from liability for content created by their users operates as an essential buttress of free expression.”

He said the current law provides tools for combating abuses and doesn’t impede criminal law enforcement at the federal or state level. Cox did call for “deputizing” state attorneys general to enforce relevant federal laws and “parallel” state laws.

Section 230 contains “an objective test” for moderating content, Cox said, without which “the potential for three-to-seven years of litigation” would exist “for every single piece of content.”

Cox expressed support for some of the aims of the Thune-Schatz bill, but raised concerns about the specifics related to transparency, liability and enforcement. Cox is now counsel at Morgan, Lewis & Bockius; he was not testifying on behalf of clients or the firm.

Sen. Amy Klobuchar (D-MN) stressed that “we don’t have the protections in place” to confront disinformation and other fallout from the migration of “billions and billions of dollars” from TV and radio, which have regulations, “to websites, which don’t.”

Klobuchar asked Cox whether he would have written in protections back in 1996 “if you knew then about targeting of elections,” but Cox pointed to elements in the law under which a platform “deliberately distributing illegal content can be prosecuted.”

Klobuchar and fellow committee Democrat Sen. Richard Blumenthal of Connecticut argued that current protections are inadequate. Blumenthal said “the broad message to industry is the time for reform is now.”

Commerce petitions FCC on Section 230

Meanwhile, the Commerce Department’s National Telecommunications and Information Administration on Monday petitioned the FCC “to make clear when online platforms can claim section 230 protections if they restrict access to content in a manner not specifically outlined under the Act.”

Commerce Secretary Wilbur Ross said in a statement: “Many Americans rely on online platforms to stay informed and connected, sharing their thoughts and ideas on issues important to them, which can oftentimes lead to free and open debate around public policies and upcoming elections. It has long been the policy of the United States to foster a robust marketplace of ideas on the Internet and the free flow of information around the world. President Trump is committed to protecting the rights of all Americans to express their views and not face unjustified restrictions or selective censorship from a handful of powerful companies.”

The petition seeks clarity on:

  • Whether, and to what degree, Section 230 of the Communications Decency Act provides protection for social media’s content moderation decisions
  • The conditions under which content moderation and editorial decisions by social media companies shape content to such a degree that section 230 no longer protects them
  • Social media’s disclosure obligations with respect to their content moderation practices

“Times have changed, and the liability rules appropriate in 1996 may no longer further Congress’s purpose that section 230 further a ‘true diversity of political discourse,’” NTIA said. “A handful of large social media platforms delivering varied types of content over high-speed internet have replaced the sprawling world of dial-up Internet Service Providers (ISPs) and countless bulletin boards hosting static postings. … Thus, the fundamental assumptions driving early section 230 interpretation are antiquated and lack force, thus necessitating a recalibration of section 230 protections to accommodate modern platforms and technologies.” -- Charlie Mitchell (cmitchell@iwpnews.com)