Unlike the light bulb, the motor vehicle, or the telephone, the internet has no single "inventor." Instead, it has evolved over time. It got its start in the United States about 50 years ago as a government project during the Cold War, used as a means of sharing data and communicating. Along with the advantages it brought came concern over how it would shape the future. Some viewed the internet like television, fearing that sexual images would come blasting out of screens at unwitting children. Others considered it more like a telephone: a conduit, agnostic to the content passing through it. As we know it today, it is neither. Put simply, the internet is a user-generated mirror that reflects our best and worst: our family photos, pet videos, unsolicited opinions, and stories of success and setback. It shows us who we really are and what we really think. Those thoughts sometimes take a wrong turn, making the internet host to a flood of misinformation, hate speech, conspiracy theories, nefarious foreign meddling, and terrorist propaganda.
For the internet we have today, we can thank Section 230 of the Communications Decency Act of 1996, which protects social media giants such as Facebook, Twitter, and YouTube from legal liability for material posted on their sites by third parties.
Section 230 is a provision of the United States Communications Decency Act that generally shields website platforms from liability for third-party content. At its core, Section 230(c)(1) grants immunity to providers and users of an "interactive computer service" who publish information provided by third-party users:
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
The Communications Decency Act was the United States' first notable attempt to regulate pornographic material on the internet. The brainchild of Sen. James Exon, the law was introduced with the aim of keeping 'filth' off the internet. Its overreaching nature raised serious questions: Did it violate the First and Fifth Amendments by being overly broad? Was it vague in defining the types of internet communications it criminalized?
The Supreme Court subsequently held that the Act violated the First Amendment because its regulations amounted to a content-based blanket restriction on free speech. The Act failed to clearly define "indecent" communications, limit its restrictions to particular times or individuals, provide supportive statements from an authority on the unique nature of internet communications, or conclusively demonstrate that the transmission of "offensive" material is devoid of any social value. The Court also ruled that, because the First Amendment distinguishes between "indecent" and "obscene" sexual expression and protects only the former, the Act could be saved from facial overbreadth challenges only by dropping the words "or indecent" from its text. The indecency provisions were struck down, and what remained was the provision that allowed filth and other truly damaging content to metastasize on the internet. That provision has since been strengthened by years of court decisions upholding the liability shield, even in lawsuits brought by people whose careers, businesses, and lives were ruined by what someone posted about them on the web.
The 1995 lawsuit Stratton Oakmont, Inc. v. Prodigy Services Co. raised deep concern about protecting the internet and its economic potential. It was this case, and the events surrounding it, that prompted then-Rep. Ron Wyden, Democrat of Oregon, and Rep. Chris Cox, Republican of California, to push for the inclusion of Section 230 in the CDA.
In October 1994, an unidentified user of Prodigy's Money Talk bulletin board created a post claiming that Stratton Oakmont, a Long Island securities investment banking firm, and its president, Danny Porush, had committed criminal and fraudulent acts in connection with the initial public offering of stock of Solomon-Page, Ltd. Stratton Oakmont then sued both Prodigy and the unidentified poster for defamation.
After a series of arguments, the court ruled that Prodigy was liable as the publisher of the content created by its users because it exercised editorial control over the messages on its bulletin boards in three ways:
by posting Content Guidelines for users;
by enforcing those guidelines with "Board Leaders"; and
by utilizing screening software designed to remove offensive language.
The outcome of the case was central to the passage of the Communications Decency Act of 1996, which aimed to allow internet service providers to avoid liability for user content on their services while still giving them the means to remove illegal content.
Without it, platforms would have faced a dilemma: if they did anything to moderate user content, they would be held liable for that content; if they did nothing, who knew what unchecked horrors would be unleashed.
What lies ahead for social media reform?
Section 230 was enacted at a time when fewer than 8% of Americans used the internet. The law's brevity, a product of its time, has always left the door open to wide interpretation. Today, it is disliked on both sides of the aisle.
Democrats argue that Section 230 allows platforms to get away with too much, particularly with regard to misinformation that threatens public health and democracy; Republicans argue that the platforms censor user content to Republicans' political disadvantage.
Former President Donald Trump even tried to pressure Congress into repealing Section 230 entirely by threatening to veto the unrelated annual defense spending bill.
With criticism coming from both sides, it is likely that lawmakers will reform Section 230 in the near future. Democrats and Republicans have already proposed more than 20 reforms, ranging from piecemeal changes to complete repeal. There is also a risk that some of these reforms could prove harmful.
Meanwhile, social media giants Google and Facebook have themselves suggested that platforms should face liability only if they fail to follow best practices for removing damaging material from their services.
Follow LexTalk World for more news and updates from International Legal Industry.