Section 230 – The Libertarian Republic

Conservatives Should Take Lessons From the Left on Protecting Free Speech

Conservative politicians rage over Big Tech’s use of its monopoly powers and its reliance on Section 230 of the Communications Decency Act (“Section 230”) to censor speech on social media. Their rhetoric is full of sound and fury but achieves nothing. Moreover, with significant help from Big Tech, Democrats gained control of Congress and the White House. As a result, Democrats are unlikely to change Section 230 legislatively. To break Big Tech’s stranglehold on free speech on social media, those seeking reform should look to the Left’s brilliant strategy of regulating climate change without passing any new law.

Section 230 grants Big Tech immunity from civil suit when regulating the Internet. Being free of liability has allowed Big Tech to grow in size and wealth while achieving regulatory control over social media and political speech it deems objectionable. Big Tech even banned a sitting President from Facebook and Twitter. In addition, Google and Apple blocked Parler’s App from their stores, and Amazon Web Services suspended Parler’s access to its cloud network, thereby shutting it down until it could find new servers.

As conservative politicians rage at Big Tech, Governor DeSantis (R-FL) proposed, and Florida enacted, a law prohibiting Big Tech from de-platforming political candidates, with fines on violators as high as $250,000 a day. Florida also allows citizens to sue companies that violate the law. However, such penalties will not change Big Tech’s behavior. A few days ago, Google, with a market capitalization over $1 trillion, settled an antitrust case with French authorities for $270 million, a fraction of a percent of its worth. These individual attempts to control the power of Big Tech will have limited success against the five largest tech giants, which are worth a combined $5.2 trillion. For comparison, the state of Florida’s budget is $101.5 billion, and the 2021 combined budgets of all fifty states total $2.1 trillion.

To seriously challenge Big Tech, conservatives will need to develop and implement a coordinated and aggressive strategy that includes legislators, governors, attorneys general, interest groups, legal centers, and citizens who want freedom of speech in the public square.

This kind of strategy has worked before. The Left developed, coordinated, and implemented it to impose climate change regulation on the U.S. without Congress ever passing a climate change law. The strategy’s components:

  1. Using federal and state Administrative Procedure Acts, file petitions with agencies to initiate rulemaking that changes regulatory policy.
  2. If the petitions are denied, a coalition of states, cities, and non-profit organizations will appeal the denial to the courts.
  3. Use states as lead petitioners in court challenges to secure special standing recognized by the U.S. Supreme Court.
  4. Organize states with similar policy views into regional working units, similar to Compacts, designed to regulate activities of regional concern. For example, nine states in the Northeast and Mid-Atlantic, and three on the West Coast, formed regional compacts to address climate concerns.
  5. Encourage state Attorneys General and coalitions to sue on innovative legal theories. One example is Big Tech’s potential liability for flagging posts as “misinformation” that were later found credible, forcing it to retract the label, as with Facebook’s recent retraction of its ban on posts about a possible coronavirus leak from a Chinese lab. Under Section 230(f)(3), when Facebook labels a third party’s content “misinformation,” it acts as a speaker on its own platform by producing, in whole or in part, new content with the third party. Since Section 230 protects platforms, not speakers, Facebook’s speech no longer has immunity from civil liability, and the facts supporting its new content may be subject to civil discovery.
  6. Environmental groups brought hundreds of National Environmental Policy Act (NEPA) cases to deny permits to oil and gas operations. NEPA lawsuits could also be brought to block permits for Big Tech’s massive, energy-consuming data centers.
  7. Environmental organizations and state attorneys general have brought over one thousand lawsuits against the government and private parties in the U.S. to impose climate regulation. A class action suit against Big Tech by those whose comments on the lab leak were flagged as “misinformation” may be equally appropriate.

If conservatives are serious about reforming Section 230, they have all the tools for success: 27 Republican governors, 25 Republican Attorneys General, and certainly numerous groups that believe in free speech. Recently, 25 red states independently opted out of accepting extra federal unemployment benefits. These actions demonstrate a significant base of potential support.

The key to success rests in conservatives’ willingness to plan, organize, and implement as the Left has. So far, conservatives have found greater ego satisfaction on conservative cable channels than in the difficult work of organizing, implementing, and accomplishing something, such as preserving the free speech they constantly tell Americans they want to protect.

 

Image: Josh Hawley on YouTube

Regulation, Moderation, and Social Media Decentralization

“Do you remember the internet in ’96?” a silent television display asks in Facebook’s quintessential Klavika font during an ad break. The sound of a dial-up connection shrieks out of the television and captures the attention of casual viewers who have turned elsewhere, or to social media, during the intermission.

At a blistering 2021 speed, the screen shifts from archaic interfaces to modern emojis without giving the viewer so much as a second to focus before clearing the display for text reading “It’s been 25 years since comprehensive internet regulations were passed. It’s time for an update.” The spot is part of Facebook’s pro-regulatory advertising push entitled “Born in ’96,” itself part of the larger “It’s Time” campaign.

As advertisements go, this one is remarkably effective, if a little overbearing. Would you expect anything less from the king of every corner in the advertising market?

The ecosystem created for the internet in ‘96 by the Communications Decency Act (CDA) has clearly not impeded Facebook’s success. Founded in 2004, Facebook grew up in a post-CDA world and has played a leading role in establishing that world’s bounds. Nevertheless, the company’s anxiety over ambiguities in the ancient digital legislation is understandable.

Rather than letting Facebook’s executives design the social media market of the future, what if there were free competition? Not the kind of competition that Twitter and even Parler provide Facebook, but rather a decentralized competition that challenges the structure the Silicon Valley giants are built on. In other words, how about a polycentric organization of competition that makes the CDA obsolete and breaks up vertical monopolies on user-generated content and usage data? Thanks to an unexpected source, that competition may not be far away.

To Moderate or Not to Moderate is NOT the Question

Chances are, if you are made uneasy by a social media giant lobbying to change the rules that govern it and its competitors, that unease comes from a general distrust of Facebook itself. Facebook may or may not have lost your trust after Russia used the platform to target Americans with divisive advertisements during the 2016 election, or after CEO Mark Zuckerberg was summoned to Congress to testify in 2018 about the site’s alleged internal content moderation bias against conservatives. Even without negative associations, however, new regulations on established markets create barriers to entry and disincentivize competition. In this case, new regulations would mandate that social media companies practice internal content moderation—otherwise known simply as moderation—something that has strained Facebook’s abilities until recently.

At the root of Facebook’s legal issues is the CDA. Although originally intended to determine what content was suitable for television, the CDA became one of the most foundational regulations for the burgeoning internet. Insofar as the internet is concerned, the CDA mandates that a site may not publish certain indecent, and often independently criminal, content. It also delegates the enforcement of these rules to a regulatory agency, the Federal Communications Commission (FCC), instead of leaving the justice system to sort out victims and perpetrators. Notably, the CDA also creates a distinction between “publishers,” standard websites that curate or create content, and “platforms” like social media sites that allow anyone to post and merely aggregate and serve content to consumers.

Distorting the justice system by inserting executive agencies between victim and perpetrator creates a topsy-turvy system. As it stands, proving the facilitatory guilt on the part of a social media company is far easier than proving that any crime outside of the scope of the CDA had been committed in the first place.

Under the CDA, there are two systems of online content production. Publishers are obligated to internally moderate content such that it remains within the bounds of what the law considers acceptable speech. Platforms, on the other hand, are not held to this standard and are, by the nature of the distinction, barred from behaving as publishers. Although this clause, better known simply as “section 230,” has been touted by many as the saving grace of the CDA from the perspective of free speech, it is also the wedge that causes Facebook to take flak from both the left and the right.

Facebook has been scrutinized for not moderating content strictly enough in 2016 and for being too politically restrictive ever since. New regulations would certainly clear up Facebook’s role, especially if Facebook’s on-staff legal team had a say in the verbiage of any proposed bill, which it likely would. Either way, moderation of user-generated media, and therefore free speech online, will be centralized under a federal agency or distributed among a few massive companies that themselves have nothing to do with content production.

The free market provided several alternatives to Facebook and Twitter, but few gained traction in the face of such established competitors and steep regulatory obligations. One of these start-ups, Parler, managed to gain a healthy following when President Donald Trump was controversially removed from almost every other online platform following the storming of the Capitol on Jan 6. After Parler gained millions of users overnight, its web-hosting provider, Amazon Web Services, decided to sever ties with the company over its moderation policy, effectively moderating the entire website off the internet by refusing to do business with it.

Many were attracted to Parler’s moderation policies, or lack thereof, and, had it not been shut down, the site would have posed a competitive threat to Facebook for a portion of its disaffected user base. Although it provided a place for truly unfettered conversation, Parler segmented that conversation and would never be a comfortable place for the majority of social media users, who prefer that some community standards beyond the legal bare minimum be enforced.

Besides, having a second, slightly edgier public square just outside the first is not a substitute for effective public discourse. The French Third Estate’s self-separation from the Estates-General did not, after all, create a healthier political dialogue for the French people at the beginning of the French Revolution.

Parler was shuttered for two months while the site’s founders procured alternative web hosting services. Although currently functional, Parler’s existence is not a long-term competitive solution to the problem of legally obligatory moderation because it frames moderation, in and of itself, as a bad thing. The same could be said for President Donald Trump’s new media outlet if it is ever opened up to public contribution.

Moderation, when done offline, is a daily practice for most. Whether by choosing the members of your inner circle or choosing to only have two slices of pizza, people self-moderate their lives all the time. Centrally planned moderation, however, is called prohibition and often causes more harm than good.

Enter the Decentralized Social Media (DSM) model, a polycentric model of online interaction recently proposed on Medium by Ross Ulbricht, the currently imprisoned founder of Silk Road, an infamous illicit online marketplace that jump-started the popularity of Bitcoin in the early 2010s.

The Innovation of Decentralized Social Media

Moderation of something as big as social media is incredibly difficult and would take a massive amount of manpower if done entirely manually. Among Facebook’s most closely guarded secrets are the algorithms the company has developed for ad targeting and for facilitating moderation. The CDA both disallows this moderation and requires it, depending on which side of Section 230 a site falls on.

To oversimplify, Ulbricht’s DSM model would remove those automatic and manual moderation tools from under the hood of a social media platform’s servers and place the same processes on the user’s own device, under the control of separate companies that stand to profit from providing moderation and aggregation services at the discretion of the device owner. Users could access any or all of the web’s available social media content feeds at once, be fed only content within their own acceptable parameters, and retain ownership of their user data. All of this would be done through the technology that underpins Bitcoin: the cryptographic blockchain.
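
To make the client-side idea concrete, here is a minimal sketch of what such an arrangement could look like in code. Everything in it is hypothetical: Ulbricht’s proposal does not define data types or an API, so the names (Post, family_friendly, aggregate) and the policy logic are invented purely for illustration.

```python
# Hypothetical sketch of client-side aggregation and moderation.
# The user's device pulls raw posts from any number of feeds and runs them
# through filters the user has chosen; the platforms themselves never decide
# what the user sees.
from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class Post:
    author: str
    text: str

# A "moderation provider" is modeled here as a simple rule the user subscribes
# to; it returns True if a post is acceptable under that provider's policy.
ModerationFilter = Callable[[Post], bool]

def family_friendly(post: Post) -> bool:
    # Stand-in policy: reject posts containing obviously unwanted words.
    banned = {"spam", "scam"}
    return not any(word in post.text.lower() for word in banned)

def aggregate(feeds: Iterable[Iterable[Post]],
              filters: Iterable[ModerationFilter]) -> List[Post]:
    """Merge every feed the user follows and keep only posts that pass every
    filter the user has opted into. Moderation runs on the user's device."""
    filters = list(filters)
    merged = [post for feed in feeds for post in feed]
    return [post for post in merged if all(f(post) for f in filters)]

if __name__ == "__main__":
    feed_a = [Post("alice", "Great article on Section 230"),
              Post("bot42", "Click this scam link now")]
    feed_b = [Post("bob", "Anyone tried a decentralized client yet?")]
    for post in aggregate([feed_a, feed_b], [family_friendly]):
        print(f"{post.author}: {post.text}")
```

Swapping family_friendly for a different filter, or stacking several, is the market mechanism the article describes: the user, not the platform, chooses whose rules apply.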

In practice, the seemingly small distinction between where these algorithms are processed and who owns those functions resolves several of the questions raised and created by the CDA without having to create or pass any new laws.

Where the CDA consolidates and centralizes the responsibility for moderation under the content aggregator’s purview, start-up companies following Ulbricht’s model would compete to moderate and aggregate both user content and advertisements from social media platforms, thereby creating a market where there previously was only mandate.

Users would simply open an app in which content from across the web’s social media platforms is gathered. They could pick and choose which moderation and aggregation providers to reward with a portion of the advertising revenue their engagement generates, and they could freely switch between providers.

Rather than having aggregation and moderation be centralized by legal obligation to a few social media companies, levels of moderation, advertisement service, and content prioritization would all be separate overlapping markets which independently compete to provide superior service and control for negative externalities.

Algorithmic moderation is a powerful tool that, along with manual moderation, can create comfortable digital environments. If moderation and aggregation were divorced from the social media platform and made a competitive marketplace, users would be free to use whatever network or combination of networks they preferred and have the content they are served moderated however they see fit. If users were in control of the moderation they facilitate and are subject to, the arrangement would be considerably more consensual. Social media companies under a decentralized model would not be held responsible for users misusing their digital infrastructure, as content regulation would be the responsibility of the client-side moderation algorithm and the companies that compete to provide those services most effectively. If a user were ever dissatisfied with the moderation they were provided, they would have market remedies.

Should someone use social media as a means to harass or threaten another, there would be no intermediate party at fault, freeing the judicial system to bring justice to guilty and affected parties alike.

Side-effects may include

Besides sidestepping the CDA’s ineffectual regulations, a DSM would protect the privacy of users by encrypting the user-generated data used by moderation algorithms and keeping a function of that unique data as the user’s encryption key or proof of identity.
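
The article does not spell out how that key or proof of identity would be produced, so the following is only a loose, assumed illustration of the general idea: hash the data the user already holds, together with a secret only the user knows, to get an identifier the user can recreate but a platform cannot. A real system would use proper key derivation and public-key cryptography rather than a bare hash.

```python
# Illustrative only: derive a stable identifier from locally held user data.
# The DSM proposal does not specify this scheme; function and field names are
# invented for the example.
import hashlib
import json

def derive_identity(user_data: dict, passphrase: str) -> str:
    """Hash the user's own profile data together with a secret passphrase,
    producing an identifier only the user can recreate."""
    payload = json.dumps(user_data, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload + passphrase.encode("utf-8")).hexdigest()

if __name__ == "__main__":
    profile = {"handle": "example_user", "interests": ["markets", "privacy"]}
    print(derive_identity(profile, "correct horse battery staple"))
```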

The value of the advertising market and the size of Facebook’s share of it are both due to the incredible amount of data that Google, Facebook, and other companies collect on every person who uses their services. This information is the company’s to sell, use to target ads, or train algorithms with. Under a DSM model, that information would be yours to sell and distribute among service providers.

This information is worth far more than the emotional value of privacy. Companies like Facebook make much of their money, in one way or another, from the accuracy and scope of their user data collection. If that information were to become yours by using a DSM, so too would the money it generates.

As it stands, Facebook and Google data-mine users in exchange for a service. Were they to adapt to a DSM model, companies like these would need to shift to more traditional models in which payment is offered directly for a service rendered. Ulbricht’s model would allow liquid payment to flow between service providers, such as web hosts or advertisers, and users, so that the app continuously lets users control how much of their data they share and how much usability they pay for by receiving ads.
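
As a back-of-the-envelope sketch of that flow, the snippet below splits the ad revenue a user’s engagement generates between the user and the providers the user has chosen. The split percentages and provider names are invented for illustration; nothing in the DSM proposal fixes these numbers.

```python
# Hypothetical revenue split between a user and their chosen providers.
from typing import Dict

def split_ad_revenue(revenue: float, provider_weights: Dict[str, float],
                     user_share: float = 0.4) -> Dict[str, float]:
    """Give the user a fixed cut of the ad revenue, then divide the remainder
    among the user's chosen service providers according to their weights."""
    payouts = {"user": revenue * user_share}
    remaining = revenue - payouts["user"]
    total_weight = sum(provider_weights.values()) or 1.0
    for name, weight in provider_weights.items():
        payouts[name] = remaining * weight / total_weight
    return payouts

if __name__ == "__main__":
    # $10 of ad revenue: $4 to the user, the rest split 1:2 between providers.
    print(split_ad_revenue(10.00, {"moderation_co": 1.0, "web_host": 2.0}))
```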

Innovation always trumps regulation

Rather than offering a prescription for what ails the social media marketplace, Ulbricht’s paper is a prediction from a prison cell. The unstoppable march of innovation is sure to further segment the digital marketplace for social software into intricately specialized niches. The distributed social media model is merely a description of how those businesses and technologies would need to operate.

Because Ulbricht was not granted the clemency from the Trump administration that he so hoped for, the infamous programmer will not be the one to found the moderation or content aggregation start-ups that he describes. Public figures such as Jordan Peterson, Dave Rubin, and Tim Pool have all claimed to be creating platforms that in some way aggregate social media, beginning the process of decentralizing—or polycentrizing—the social media market. It remains to be seen if these or any start-ups will truly realize Ulbricht’s ideas, but if the CDA is not soon updated, it will likely be circumvented.

Just as 3D printers have shown several gun laws to be archaic, if not entirely obsolete, the best way to counter a bad set of laws or regulations is to create a technology or idea that renders it pointless. Ulbricht may not be the one to lead the charge, but his simple Medium post certainly opened a door.

Gavin Hanson (born in ’96) is the Editor-in-Chief of Catalyst

Published with permission from Catalyst. Read the original article here.

Opinion: Big Tech is a State Actor and Has Constitutional Obligations

Readers of the political press are familiar with Big Tech’s actions to censor the social media speech of former President Trump and several Republican Congressmen, and to purge thousands of conservative social media accounts. Since these actions were taken by private parties against private parties, it is generally assumed the Constitution does not apply and that Big Tech, with congressional immunity from suit, can regulate the Internet activities of private parties as it wishes.

When Big Tech uses the powers authorized by section 230 of the Communications Decency Act (CDA) to restrict access to materials on the Internet it considers “objectionable,” it is acting for the state (“State Action”). As a state actor, Big Tech must provide the same constitutional protections as government provides.

In a prior article, I argued section 230 was an unconstitutional delegation of authority by Congress to private parties. The seminal case supporting this position is  Carter v. Carter Coal, a 1936 U.S. Supreme Court case invalidating the delegation of government power to private coal producers to regulate other coal producers. The court characterized such action as “Legislative delegation in its most obnoxious form.” The holding has not been challenged for 85 years.

Unfortunately, Congress continues to ignore its unlawful delegation while Big Tech continues to regulate speech in the social marketplace as if the delegation were valid. Due to the significant impact on free speech, this controversy should be quickly resolved. There are three possible outcomes: Congress re-writes the statute; the courts declare section 230 constitutional or unconstitutional; or courts provide due process rights for objectionable speakers deprived of free speech by state actors. The first two options are years in the future. Affording due process can be immediate.

When are actions by private parties State Action?

There are two situations in which the actions of private parties are deemed State Action: (1) there is a close relationship between the actions of the private party and what government seeks to have accomplished; or (2) the private party performs a traditional government function.

 Constitutional protections are mandated when private parties are state actors

While State Action is a factual matter, the Supreme Court, in Skinner v. Railway Labor Executives’ Assn. (Labor Assn.), ruled on a situation similar to the actions of Big Tech. In Skinner, the government authorized, but did not compel, private railroads to drug test employees as part of accident investigations. Railroads voluntarily conducted the tests. The Labor Association sought to enjoin the railroads from conducting drug tests, claiming unlawful searches in violation of the Fourth Amendment. The Supreme Court held that while the railroads’ program was a private initiative, the tests, encouraged by the government, could not be viewed as private action outside the reach of constitutional protections, i.e., they were state action.

As in Skinner, section 230 of the CDA did not compel Big Tech to restrict materials it deemed objectionable. Moreover, as in Skinner, the government’s grant of section 230 immunity and the power to restrict materials produced a close relationship between Big Tech and government that encouraged Big Tech to actively implement the government’s goals, i.e., state action.

Another case, Marsh v. Alabama, involved a company-owned town that operated like any other town, except that it prohibited the distribution of certain religious literature. The U.S. Supreme Court held that when private parties exercise powers traditionally reserved to the state, they perform a public function and are thus bound to respect constitutional rights, the same as government.

The private parties owning the company town in Marsh, like the private parties operating the Internet, both regulated speech. When Big Tech controls speech in the public square, it exercises state regulatory power. And, as in Marsh, it must respect the constitutional rights of those in the square.

Courts have the power to immediately protect objectionable free speech

The actions of Big Tech are State Actions reviewable by courts that can balance the property interests of private parties against the free speech and due process rights of objectionable speakers.

Determining the process due a litigant depends on the situation. If only property rights are involved and other administrative processes are available to protect those rights, a hearing is generally not required before the deprivation occurs. However, when fundamental liberties, e.g., speech, are involved, courts must provide hearings before the deprivation of rights occurs.

While litigants cannot seek monetary damages due to Big Tech’s immunity from civil liability, they can seek a hearing for injunctive relief and discovery of why their free speech is being denied, before losing their right to speak in the public square.

 

This article was originally published in The Hill.

The Libertarian Republic publishes a wide variety of viewpoints. The opinions of the authors do not necessarily reflect those of the owner or editor.

Opinion: Section 230 is an Unconstitutional Delegation of Power to Big Tech

In the frenzied days after Democrats won control of Congress and the presidency and rioters invaded the Capitol, Big Tech, relying on section 230 of the Communications Decency Act for immunity from civil suit, launched a surprise attack on web content it deemed objectionable. Twitter permanently banned President Trump’s account, wiping out his contact with 88 million followers, and banned thousands of conservative social media accounts. Google and Apple blocked Parler’s app from their stores, and Amazon Web Services (“AWS”) denied Parler access to its cloud network. Parler was shut down. A swath of conservatives lost the ability to speak on the Internet, the nation’s new public square, the place where political ideas are exchanged and commerce flows.

Two questions must be answered:

  1. Can private parties controlling the public square, deprive citizens of their right to free speech?
  2. Can Congress empower private parties to regulate competitors?

Congress spectacularly muddled section 230 and the U.S. Supreme Court has not addressed it. Fortunately, decades-old Supreme Court cases involving the tech giants of yesteryear, i.e., coal companies, railroads, and company towns, provide guidance on the limits of big tech’s power to regulate the public square.

What does section 230 do?

Section 230 has two primary provisions. The first exempts internet providers from civil liability for publishing information provided by another content provider. The second exempts Big Tech from liability when it takes voluntary, good faith actions to restrict objectionable materials or provides the technical means to restrict them.

Private parties cannot deprive unpopular citizens of constitutional rights when governing the public square

By granting Big Tech immunity from civil liability when restricting material from the Internet it deems objectionable, Congress encouraged and indirectly authorized private parties to regulate speech. Congress has no constitutional power to authorize private parties to deprive even unpopular citizens of their constitutional rights. Moreover, when private parties control the new public square, they function as a government and must provide constitutional rights for all.

These principles are set out in Marsh v. Alabama (1946). In Marsh, a privately owned town made it illegal for persons to distribute religious literature on its sidewalks. Since the town functioned like any other community with speech and commerce, citizens in the town had the same rights as if they were in a municipal town. When private parties wield great power over the public’s use of town services, the powers of the private parties are circumscribed by the statutory and constitutional rights of those using the town. Private property rights are not sufficient to justify restricting fundamental liberties.

Since the First Amendment severely limits governments’ power to regulate political speech, the government cannot grant private parties, functioning as a government, more power than it has. If Congress desires to impose speech limitations on the Internet, it must do so directly, by government regulation that protects the constitutional rights of citizens.

Congress cannot grant private parties the right to regulate competitors

By refusing to sell Parler’s app, and by denying Parler access to cloud storage, Google, Apple, and AWS, private parties relying on a congressional grant of civil immunity, took, in essence, regulatory actions to put another private company out of business. Congress has no constitutional authority to authorize or foster conduct by private parties that allows them to regulate other businesses. This has been the law since the U.S. Supreme Court decided Carter v. Carter Coal (1936).

In Carter, Congress delegated to coal producers and miners the power to impose standards on other producers and miners.  Carter held a private entity “…may not be entrusted with the power to regulate the business of another, and especially a competitor. Any statute which attempts to confer such power undertakes an intolerable and unconstitutional interference with personal liberty and private property. The delegation is so clearly arbitrary, and … a denial of…due process…”

By granting Big Tech immunity from liability for restricting materials it deems objectionable, Congress is sanctioning the regulation of private parties by other private parties, an action it has no constitutional authority to authorize. Regulating competition is the responsibility of the government.

The principles in Carter were upheld by the D.C. Circuit as recently as 2013 in Association of American Railroads v. U.S. DOT (reversed on other grounds).

Section 230 immunity from suit encourages Big Tech to assume the regulatory functions of government by regulating the rights of other businesses to speak and compete in the public square. The Constitution does not give Congress or private parties this power.

What Is Section 230 and Why Does Trump Want to Repeal It?

In 2020, many of us have become accustomed to terms and concepts we never thought we’d be discussing: “social distancing,” mask requirements, and Zoom parties all come to mind.

We can add Section 230 to that list, an obscure provision of the Communications Decency Act of 1996 that was previously unknown to most.

Section 230 is a frequent target of President Trump’s ire, and as such it can now frequently be found trending on Twitter, being debated in Congress, and featured in primetime media coverage. All in all, dozens of bills to repeal or modify Section 230 have been introduced in 2020.

TechDirt journalist Mike Masnick writes, “If you were in a coma for the past 12 months, just came out of it, and had to figure out what had happened in the last year or so solely based on new bills introduced in Congress, you would likely come to the conclusion that Section 230 was the world’s greatest priority and the biggest, most pressing issue in the entire freaking universe.”

But while it is a recurring topic of discussion, it seems the incessant chatter has only left Americans more confused. This explainer is here to break down the code and the debate swirling around it.

So what’s the truth about Section 230? What does it actually say and what are its implications? Fortunately, the original author of the bill, Senator Ron Wyden, is still around and on record when it comes to the current dispute.

“Republican Congressman Chris Cox and I wrote Section 230 in 1996 to give up-and-coming tech companies a sword and a shield, and to foster free speech and innovation online. Essentially, 230 says that users, not the website that hosts their content, are the ones responsible for what they post, whether on Facebook or in the comments section of a news article. That’s what I call the shield.”

“But it also gave companies a sword so that they can take down offensive content, lies and slime — the stuff that may be protected by the First Amendment but that most people do not want to experience online. And so they are free to take down white supremacist content or flag tweets that glorify violence (as Twitter did with President Trump’s recent tweet) without fear of being sued for bias or even of having their site shut down. Section 230 gives the executive branch no leeway to do either.”

It can seem complicated, but it’s actually fairly straightforward. Section 230 simply says that only internet users are responsible for what they write, not the private companies whose websites host the commenters. Secondly, it affirms what the First Amendment already implies—that private companies don’t have to host speech that violates their values.

Section 230 was written early on in the internet age, long before social media companies even existed (although much of this debate has focused on those platforms). Within the bill, the authors explicitly say the law is “to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services.”

And it has been successful. The government got out of the way and the internet expanded rapidly. Private companies invested millions to build their online enterprises, encouraged by provisions like Section 230 that secured their rights against unjust legal charges that would have otherwise put those investments in severe jeopardy.

Online companies want and need internet users to interact with their content and share feedback on their platforms. That goes for publishers (like Vox.com and us here at FEE.org), platforms (like Twitter and YouTube), and everything in between. But they shouldn’t be held liable because someone writes something untrue on their pages, nor should they have to host content that they find offensive.

Ronald Reagan once said, “We must reject the idea that every time a law’s broken, society is guilty rather than the lawbreaker. It is time to restore the American precept that each individual is accountable for his actions.”

Individuals should be held accountable when they break the law or violate the rights of others. But it would be morally wrong to hold society at large, or even parts of society like private businesses, responsible for the action of an autonomous individual. In fact, this course of action would let the party actually responsible for harm off the hook while punishing a third party who did nothing wrong.

Shoshana Weissmann, the head of Digital Media and Fellow at the R Street Institute, recently wrote a punchy (and hilarious) article illustrating this concept—tying Section 230’s protections to Jeffrey Toobin’s Zoom “reveal” earlier this year. For those who’ve forgotten, Toobin accidentally exposed himself on a work Zoom call. As Weissmann points out, without Section 230, Zoom itself would have been liable for his lewd content rather than Toobin being held responsible.

Thankfully, we have Section 230 which creates a just and sensible legal apparatus for the internet and conduct on it. Without this protection, it is highly unlikely that the internet would have taken off and grown to its current state, much less produced the social media websites, online news outlets, and other user-reviewed services (like Yelp) we all now enjoy.

Section 230 became a hot topic in the fall of 2019 when President Donald Trump drafted an executive order requiring the Federal Communications Commission to develop rules that would limit its protections. Ultimately, that order never went through, as even the mention of it was met with confusion and alarm by regulators, legal experts, and First Amendment advocates.

The storm died down until May of this year when Twitter found itself in Trump’s crosshairs after slapping one of his tweets with a violence warning. This feud reignited Trump’s fury and determination to do away with Section 230.

Since then, Trump and his allies have regularly called for the repeal of Section 230. Trump believes that social media companies are unfair to him and his agenda, and his response to that is to use the government to force the private companies to act in a way he deems appropriate. He also believes that doing away with Section 230 would block social media companies from “censoring” information on their websites.

There has, of course, been pushback against all this. Many conservatives and libertarians have pointed out that Trump and his supporters fundamentally misunderstand the legal code and its implications. Supporters of Section 230 say it upholds the right to free speech in the age of the internet, and that it protects the free market as well.

Meanwhile, others like Republican Senator Roger Wicker have called for modifications to the law that would leave the liability shield in place, but that would force companies to host content that may violate their values.

Social media companies, which have incurred the bulk of the derision in this debate, are left between a rock and a hard place. Democratic leaders want them to censor more and guard against “fake news,” while some Republicans want to take away their right to any content moderation.

True defenders of free speech, limited government, and the free market are largely being drowned out by the tidal wave of politicians and their supporters pushing for big government responses to a societal issue they dislike.

While opponents of Section 230 think that its removal would force companies to host their content and not “censor” information the company does not like, it would, in fact, have the opposite effect. If companies were liable for content posted on their pages by third parties, they would instead have to censor vigorously.

We’ve already seen a preview of what this would look like with the passage of the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA). Signed into law in April of 2018, FOSTA carved out an exception to Section 230 that essentially said websites would be held responsible for content promoting or facilitating sex trafficking or prostitution.

Internet companies reacted quickly, even those whose primary purpose had nothing to do with sex work. Craigslist removed its personals section altogether. Reddit and Google also took down parts of their websites. Notably, these actions were not taken because these sections of their websites promoted prostitution, but rather because policing them against the possibility that someone else might advertise illegal services was an impossible task.

It is almost inevitable that further eroding Section 230 would have similar impacts throughout the internet. Consider, for example, a company like Twitter. If it could potentially be sued for the millions of user posts on its platform, it would have to start censoring many more of them, or even running them through a pre-approval process. This would likely slow down the flow of information on these channels as the companies would be forced to sort through and approve content. Ultimately, these actions would result in all of us having less of a public square, fewer information streams, and a less rich internet experience.

Especially concerning is the impact these actions would have on smaller companies and start-ups, many of which cannot afford to lose liability protections. Ironically, those who seek to harm Facebook or Twitter by repealing this law would actually end up entrenching their power even more by putting their competitors out of business.

Take Parler for example. It is a growing, popular competitor of Twitter’s that many conservatives are flocking to. Should Section 230 be repealed, this new company would almost certainly be put out of business tomorrow as it does not yet have the revenue to withstand litigation. Twitter, on the other hand, would have the resources to survive and adapt.

“If Section 230 were to be repealed, or even watered down, this next generation of platform will likely be thwarted by liability threats. “Big tech” firms have the resources to comply with new mandates and regulations, so erecting this barrier to entry to nascent firms will artificially lock currently dominant firms in their lead positions.”

-An open letter to Congress from a coalition of conservative and libertarian organizations, including Americans for Prosperity, the Competitive Enterprise Institute, FreedomWorks, and more

Some bills seek to modify Section 230 instead of repealing it. There are too many to name in one article, so we’ll focus on the worst and the most prominent: Senator Josh Hawley’s “Ending Support for Internet Censorship Act.”

This legislation would remove liability protections for companies with more than 30 million US users, 300 million global users, or $500 million in annual revenue. The bill also says that these large companies can apply for immunity from the bill if they go through a process that allows the FTC to screen their protocols and attest that their algorithms and content removal policies do not discriminate on the basis of political views.
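
For a rough sense of the bill’s scope, the cited thresholds can be expressed as a simple check. The figures below just mirror the numbers quoted in this article; the bill text itself contains more nuance.

```python
# Rough illustration of the thresholds described above (not the bill text).
def loses_immunity(us_users: int, global_users: int, annual_revenue: float) -> bool:
    """Return True if a company exceeds any cited threshold and would therefore
    lose Section 230 protection unless it obtained FTC certification."""
    return (us_users > 30_000_000
            or global_users > 300_000_000
            or annual_revenue > 500_000_000)

if __name__ == "__main__":
    # Exceeds the US-user threshold, so the check returns True.
    print(loses_immunity(us_users=40_000_000, global_users=100_000_000,
                         annual_revenue=200_000_000))
```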

So Hawley wants to fight “censorship” with – wait for it – actual government censorship of private companies.

Real censorship almost always involves the government, because without this tool of force, it is unlikely information could be totally suppressed. While people like to call social media content moderation “censorship” it really isn’t, not in the true sense of the word. Those who have their posts removed from one platform can easily go post them elsewhere. But what Hawley wants to do, which is use the government to censor the protocols of private companies, actually does constitute censorship as it would force them to allow the government to dictate what speech they would (or would not) host on their websites.

The notion that it would ever be wise to give the government this kind of power is quite jarring to encounter in America. It’s easy to see how this system would quickly eviscerate our fundamental rights to free speech by allowing the government to determine what belongs in the public square of discourse.

And, it’s important to remember that Biden appointees will soon be running these departments. This is an important reminder that the government bureaucrats who decide what counts as “neutral” will not be picked by your team forever. It would be prudent to stop giving the government more power that will only one day be used against you when your “team” is no longer in charge.

What’s next? Will they call to nationalize these platforms? This approach is antithetical to the ideals of limited government, free markets, and free speech.

“This bill forces platforms to make an impossible choice: either host reprehensible, but First Amendment protected speech, or lose legal protections that allow them to moderate illegal content like human trafficking and violent extremism,” said Michael Beckerman, president and CEO of the Internet Association. “That shouldn’t be a tradeoff.”

While many seem to think that Section 230 makes a distinction between ideological publishers and neutral platforms, and that companies who act as publishers do not enjoy its protections, this isn’t true. Section 230 applies to all internet companies and makes no such distinction between publishers and platforms.

Subsection (c) of Section 230 specifically addresses this point and speaks to the protection of companies that block and screen offensive material. It states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” It goes on to say that when it comes to matters of civil liability, “no provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”

Publishers can be sued for defamatory language online, just as they can be sued for it in print. So can Twitter or Facebook, if they issue a statement or a post. But that isn’t a relevant scenario to Section 230, which again, merely maintains websites are not liable for content you may choose to write on their pages.

Removing content they find offensive is well within their First Amendment rights, and within their Section 230 rights. It doesn’t change their status as a company or their protections under the law.

Many advocates for repealing Section 230 have hung their cases on the “publisher vs. platform” argument in an attempt to mislead their followers. But the good news is, Section 230 is relatively short. You can literally read it in less than five minutes for yourself and see that the publisher vs. platform discussion is a non-issue.

There are also those who claim that Section 230 is a special protection or an exemption for social media companies. This argument also fails to hold water.

One of the few, legitimate functions of government is to uphold the rights of individuals; when that is done businesses have a secure and just climate to operate within. That is exactly what Section 230 did. When the internet came about, it opened up an entirely new marketplace and one that needed such rights affirmed in order for people to invest in it.

Section 230 merely applied the same types of laws we see in the tangible world to the online marketplace. Would Burger King be liable if you came in and shouted obscenities at their customers? Should they be forced to host you on their premises and allow your attack on their clients to continue? Of course not. The same rules should apply to an internet company, and thanks to Section 230 they do.

Furthermore, without this provision to protect an online free market, the courts would likely be bogged down with frivolous lawsuits, which would cost taxpayers dearly. Even sorting through and throwing out such suits is an expensive and time-consuming process.

On this issue, those who believe in limited government and free markets need to put their principles over short-term political expediency. Individuals, whether acting alone or jointly through a business, have the right to free speech, meaning the government has no right to tell them what they can or cannot say. While we may disagree with their choices to remove some users or throttle access to certain content (and I do), it would be a violation of their fundamental rights to force them to host speech they disagree with.

This argument is akin to one that caught the attention of many conservatives years ago: The Christian baker, Jack Phillips, who famously refused to bake a custom cake for a same-sex wedding citing his free speech rights. Just as the baker had a First Amendment right to not endorse a message that violated his beliefs, so too do the owners of social media companies. If we dislike the ways in which they run their platforms, the proper solution is for us to create or fund their competitors—not use big government as a weapon to tread on them.

This is the beauty of the free market. We don’t need the federal government to get involved in this picture outside of creating a fair legal apparatus in which companies can flourish. With Section 230 they got this right, and consumers now enjoy a wide range of options online thanks to its provisions.

If users are unhappy with Twitter or Facebook, they can take their business elsewhere and vote with their feet. If enough users do that, Twitter and Facebook will willingly change their policies to attract users back, or they will cease to exist.

Some have noted that the network effect makes it difficult for social media competitors to attract new customers, referring to the fact that for some products users find more enjoyment in them when a large number of their peers partake in the experience. But MySpace used to have the network effect advantage, and it still lost out to upstart competitors. And the recent (and impressive) success of Parler shows that there is still room for competition in this picture.

As always, free people are far better equipped to solve this problem than the government.

Hannah Cox is a libertarian-conservative writer, commentator, and activist. She’s a Newsmax Insider and a Contributor to The Washington Examiner.

This article was originally published on FEE.org. Read the original article.

Why We Must Keep Section 230 and Pay the High Cost of Free Speech Online

The next several years may be among the most important for the future of free speech since the United States’ founding. The parameters of the United States’ centuries-old debate on the First Amendment are undergoing rapid and irreversible change driven by an explosion in information and communication technology.

Before the widespread adoption of social media, speech might have been free but being heard was often expensive. This technological constraint created gatekeepers in the mainstream media whose decisions were sometimes unjust but at other times brought stability and thoughtfulness to public discourse.

The economics of the internet has changed this landscape in ways we’ve only begun to understand. The cost of broadcasting one’s ideas around the world has fallen to near zero. The reality of network effects tends to concentrate people on a few social media platforms and search engines. In terms of the famous example meant to describe harmful speech, we’ve all been given megaphones and tickets to a crowded theater, and any of us can yell fire at any time.

These largely open internet platforms would look very different without Section 230 of the 1996 Communications Decency Act, which grants web platforms immunity from litigation for things their users say. The drumbeat from across the political spectrum to repeal Section 230 and silence the voices of bad actors is growing. While those concerns are justified, we need to fight even harder to protect speech online and its legal underpinnings, and that effort must come from all of us rather than being made on our behalf.

Nastier Than We Want Them to Be

In many ways Americans’ opinions on free speech have remained far closer to the classical liberal ideal than their opinions on free commerce. Rather than searching in vain for rules to get rid of speech we don’t like, the consensus has usually, though not always, come down on protecting such speech rather than regulating it. The debate has left in its wake a long line of often-cringeworthy antiheroes around whom the public has rallied but might not want to have over for brunch.

I was 12 or 13 when a friend brought over a cassette copied from his older brother that contained the music of one of the final free-speech antiheroes of the age before the internet — the Miami rap group 2 Live Crew. The group’s message was hedonism pushed to such a limit that it would likely raise eyebrows even if it were released in less innocent times today. The arrests, trial, and successful appeal that ensued wound up producing the iconic “Parental Advisory – Explicit Lyrics” sticker still used today.

I’m not sure one could create anything more riveting to a group of suburban junior-high-aged boys. For my progressive parents the issue was cut-and-dried — a line had to be drawn to stop censors and moral crusaders such as the high-profile proselytizing parents’ group led by Tipper Gore. Even at that age I remember wondering if the issue would be so cut-and-dried had my parents actually heard the album — my friends and I made sure they didn’t.

The Miami police department certainly wasn’t amused, and in June 1990 three group members were arrested after performing songs from the album in question at a local club. They were acquitted at trial, and the district court ruling that had declared the album obscene was later overturned on appeal. Puritanical crusaders aside, most people supported their right to free expression. Consider the alternative — government regulation of song lyrics would constitute a first step down an all-too-visible slippery slope.

Nearly 30 years later, the controversy surrounding 2 Live Crew seems almost quaint. First, the potential damage from the offending speech was fundamentally limited; I'm afraid any problems I might have today can't be blamed on rap lyrics having warped my mind. But rather than individual antiheroes, the information age gives us swarms of trolls, bots, and extremists who create echo chambers, seek out and radicalize the most vulnerable, and can create false information capable of swaying elections.

It doesn’t stop there. In the early 1990s speech may have been “free,” but broadcasting it to large numbers of people was quite expensive. In truth the antiheroes were successful artists with demand for their work and a record label willing to invest millions of dollars in recording and distributing it. Right or wrong, people were entertained, and the group’s leader and producer Luther Campbell is credited among the pioneers of a musical sound influential to this day. Nasty as they may have been, 2 Live Crew were let in by the gatekeepers. Then the gates collapsed.

Snowballing Trolls

We’re still trying to wrap our heads around the novel economics of internet platforms, developments due in large part to Section 230 of the 1996 Communications Decency Act. The brief passage speaks best for itself:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

Platforms such as social media, search engines, and even services like Uber, somewhat arbitrarily dubbed the "sharing economy," are game-changing in allowing users to independently post and find information and engage with each other at extremely low cost. That's only the case, however, if the company owning the platform is allowed to act as a facilitator rather than a central planner. In an age when media companies made explicit choices about what they broadcast, civil and criminal liability, though still debatable, made more sense. Online, the time and caution required to act as a gatekeeper and forestall litigation would likely have been prohibitive.

Any follower of Adam Smith should be excited by an information economy driven from the bottom up rather than dictated by media elites from on high. As with human freedom of any kind, though, the path to truly new ideas is littered with potential wrong turns that look far more problematic than some bawdy rappers in a Miami nightclub. But now, more than ever, consider the alternatives.

Out of Many, Many

Two articles by Reason’s Robby Soave nearly a year apart suggest exactly why government shouldn’t take it upon itself to save us from our complicated new world online. Last March, Repubican Senator Josh Hawley railed against Section 230 as he chastised big tech for censoring conservative voices. A few weeks ago, Elizabeth Warren was shaking her head at Facebook for what it included — things she characterized as “disinformation.” While partially correct, both senators’ statements are confused and nakedly partisan, holding Facebook to the fire for failing to meet opposite obligations.

Why not simply repeal Section 230 and get out of the way, allowing internet platforms disciplined by the threat of litigation to act as gatekeepers? In one sense, that puts us right back where we were, with media corporations anointing a larger number of chosen ones. But anyone who has seen class-action litigation up close knows it would also lead to overzealous restrictions on platform speech. The threat of trials in front of juries unsympathetic to big companies, the aggressive pursuit of cases and wins by the plaintiffs' bar, and the temptation of settlement as an easy and relatively cheap way out for risk-averse corporations make for something almost as sanitized and elitist as the not-so-good old days for which people mistakenly pine.

The real way forward is illuminated, somewhat appropriately, by one of the most maddeningly incorrect readings of the Section 230 controversy I’ve encountered. Writing for the Brookings Institution, former FCC head Tom Wheeler just wants us to be united:

Social media undermines what the Founding Fathers were focusing on when they wrote “We the People” and established the motto “E Pluribus Unum” (out of many, one). The concept of “We” and the formation of a “Unum” is essential for democracy to work. Humans are inherently tribal. Democracy requires us to overcome that tribalism — to find our Unum — in the pursuit of a greater good. In contrast, the business plans of the dominant digital companies are built on dividing us into tribes in order to sell targeted access to each tribe.

Wheeler trots out the cliche "out of many, one," but he is really saying, "out of many, me." In his telling, social media prevents the adults in the room from standing guard over what we can hear, and though their goals may differ, Wheeler and Senators Hawley and Warren all agree that the imperative is to get speech on internet platforms much more in line with what they would like it to be.

That’s both bad on its face and impossible in practice. We tend to think of dangerous speech online being planned by nefarious elements (see: Russion election meddling). What I see more at least with my own eyes are hundreds or thousands of smaller bad acts or neglectful behavior snowballing into disinformation and distrust.

Keeping Section 230 and protecting speech on the internet is not a passive act. It requires the many, rather than the one, to engage. Note the difference between engagement, tolerance, and censorship. Calls for "civility" in online dialogue are appropriate but may miss the point. Without gatekeepers, we certainly must do better not to descend into ugliness in the digital realm. But that's not enough — we reap the rewards of a room full of microphones when we're curious about those who disagree with us. The flip side is doing more to reject the snowballing elements that create the problem. They are free to speak as they please, but we are free to push back, especially against the nasty, extreme voices people might mistake for our own side.

Many will think I’m optimistic to a fault about the capacity of individuals to behave online. The result won’t be perfect, and for a while it probably won’t be good. But this technology is new, and we do learn from ourselves and each other. One can’t pretend the internet hasn’t made the dark side of free speech more dangerous. But rather than a censor, it’s time for a gut check. If you don’t think everyone should have a microphone in today’s age of low entry barriers, you have to say who does and doesn’t. That’s above my pay grade. Let’s use our microphones wisely.

Max Gulker

Max Gulker is an economist and writer who joined AIER in 2015. His research focuses on two main areas: policy and technology. On the policy side, Gulker looks at how issues like poverty and access to education can be addressed with voluntary, decentralized approaches that don’t interfere with free markets. On technology, Gulker is interested in emerging fields like blockchain and cryptocurrencies, competitive issues raised by tech giants such as Facebook and Google, and the sharing economy. Gulker frequently appears at conferences, on podcasts, and on television. Gulker holds a PhD in economics from Stanford University and a BA in economics from the University of Michigan. Prior to AIER, Max spent time in the private sector, consulting with large technology and financial firms on antitrust and other litigation. Follow @maxgAIER.

Republished with permission from the American Institute for Economic Research.
