This week saw a dramatic turn in our nation’s desperate efforts to clean up the increasingly poisonous online sewers that we call social media.
First, the backstory. If you publish a newspaper or newsletter and you publish “illegal” content—encourage crimes or homicide, offer to sell drugs, promote child porn, advocate overthrowing a government—you can go to jail. If you publish things that defame or lie about persons or corporations, you can be sued into bankruptcy.
If you own a bookstore or newsstand and distribute books, magazines, and newspapers and offer for sale illegal content—child or snuff porn, stolen copyrighted material, instructions for making illegal drugs or weapons—you can also go to jail. And if you sell materials that openly defame individuals or corporations, you can be sued into bankruptcy.
In the first category, you’d be a publisher. In the second, you’d be a distributor.
But what is social media? Particularly those types of social media that use an algorithm to push user-produced content out to people who haven’t explicitly asked for it?
Twenty-eight years ago, social media sites like CompuServe and AOL were regulated as if they were publications, with the occasional secondary oversight as if they were distributors. They had an obligation to make sure that illegal or defamatory content wasn’t published on their sites, or, if it was, to remove it within a reasonable time period.
The internet back then was a relatively safe and peaceful place. I know, as I ran a large part of one of the largest social media sites that existed at the time.
But then things got weird.
Back in 1996, some geniuses in Congress thought, “Hey, let’s do away with the entire concept of the publisher or distributor having responsibility for what happens in their place.”
Seriously. Selling drugs, trading in guns and ammunition, human trafficking, planning terrorist attacks, overthrowing governments, sparking genocides, promoting open lies and naked defamation. All good. No problem.
No matter what happens on a social media site, Congress said, its owners and managers bear no responsibility whatsoever for what’s published or distributed on and from the site. None. They’re untouchable. They take the profits but never have to worry about being answerable for the damage their site is doing.
Sounds crazy, right?
But that’s exactly what Congress did with Section 230 of the Telecommunications Act of 1996 in what they thought, at the time, was a good-faith effort to help the brand-new internet grow in a way they hoped would eventually become an important and useful social good.
It sure hasn’t worked out that way. And, as noted, it wasn’t always this way in the years before 1996.
Back when the internet started, but before hypertext markup language, or HTML, was invented, there were really only two big “houses” on the internet: CompuServe and AOL. My old friend and business partner Nigel Peacock and I ran “forums” on CompuServe starting around 1979, right up until 1996.
We ran the IBM PC Forum, the Macintosh Forum, and about two dozen other “social media forums” where people interested in ADHD, UFOs, the Kennedy assassination, international trade, spirituality, and a bunch of other topics could log in and discuss.
CompuServe paid us well, because we had to make sure nothing criminal happened in any of the forums we ran. We even had to carry our own liability insurance. And we split the revenue with the 20 or so people who worked with us.
We kept those places open and safe, as did hundreds of other “Sysops,” or Systems Operators, who ran other forums on CompuServe and AOL. After all, these were CompuServe’s and AOL’s “publications” or “bookstores,” and the companies were paying us to make sure nothing illegal happened inside them.
Until 1996, that is. That year, after Section 230 became law, CompuServe decided they no longer needed to pay Sysop moderators to keep their forums clean and crime-free, so they quit paying us. Most of us left.
The result of Section 230 of the Telecommunications Act of 1996 is obvious today. The attack on our Capitol was largely planned on social media and the internet more broadly, where you can also buy ghost guns, other people’s credit card numbers, drugs, and illegal porn.
Social media sites now run algorithms that choose specific content from users to push out to other users to keep them “engaged” so they’ll have maximum exposure to advertisements, which have made the owners of the sites into billionaires.
In 1997, in the case Zeran v. America Online, the Fourth Circuit Court of Appeals ruled that Section 230 is written so tightly that even when an online service knowingly allows lawbreaking, it can’t be held accountable.
More recently, last month in Moody v. NetChoice, the Supreme Court ruled that social media companies are also protected, like newspaper publishers are, by the First Amendment, writing:
“[A] platform’s algorithm that reflects ‘editorial judgments’ about ‘compiling the third-party speech it wants in the way it wants’ is the platform’s own ‘expressive product’ and is therefore protected by the First Amendment.”
Mark Zuckerberg, who owns one of those “publications,” has become one of the richest men on the planet because he doesn’t have to pay for content moderation. Twitter made a few billionaires, too, before Elon Musk turned it into a right-wing disinformation machine.
Nonetheless, Section 230 lives on. I wrote a book that covers it, The Hidden History of Big Brother: How the Death of Privacy and the Rise of Surveillance Threaten Us and Our Democracy.
So did Josh Hawley, the Republican senator from Missouri who hopes to be the next Trumpy president, and his book’s take is pretty much the same as mine: Section 230 is extremely problematic, at the very least.
Which brings us to this week’s big news. For the first time, a federal appeals court (the Third Circuit, seated in Philadelphia) has ruled that because Section 230 largely deals with social media sites as “publishers,” it doesn’t protect them as “distributors” (like bookstores).
In this case, a 10-year-old girl received a TikTok video—pushed out to her by that company’s algorithm—for a thing called the “blackout challenge,” where people see how long they can cut off their own breathing or blood supply before blacking out. Little Nylah Anderson tried the challenge and accidentally asphyxiated herself.
Her mother sued in Pennsylvania for negligence and wrongful death, using state product liability laws as the basis for her suit. From there it went to federal court, where Anderson v. ByteDance ended up before the Third Circuit.
Two Trump-appointed judges and one Obama appointee ruled unanimously that ByteDance, which owns TikTok, isn’t “publishing” a social media site but—because their algorithm “curates” content and sends its choices, unsolicited, to users on the site—is actually “distributing” content.
In other words, social media sites are bookstores, not newspapers. And online “bookstores,” they ruled, are not protected by Section 230.
The case is far from settled; from here it’ll go to the Supreme Court, and its fate there is hard to predict given the court’s embrace of the First Amendment argument in previous cases.
And the social media companies, raking in billions in profits, have so far stymied all efforts to do away with Section 230 or regulate their behavior by taking advantage of the Supreme Court’s legalization of political bribery: Silicon Valley is one of the larger players in the lobbying and campaign contributions game.
Nonetheless, there’s now a very real possibility, even absent congressional action, that social media companies may end up back where AOL and CompuServe were before 1996, having to hire thousands of content moderators to protect themselves from both criminal and civil action.
Europe is moving in this direction too, with the arrest in France last week of the Russian founder of Telegram, a social media channel where human trafficking and other criminal activity were both common and known to the Systems Operators.
If Zuckerberg and his peers have to start hiring people to do what Nigel and I did for CompuServe years ago, it may reduce their income from tens of billions to mere billions. They probably won’t even notice the difference.
And society will be the better forit, our political landscape will stabilize, and fewer children will die.
That’s a trade-off that’s well worth making. But why wait for the Supreme Court (which may well not agree with the Third Circuit)? Congress can also repeal or substantially rewrite Section 230.
It’s time to consign Section 230 to the trash heap of history and begin to regulate these toxic behemoths and hold them accountable to the same simple standards to which magazines, newspapers, newsletters like this one, and bookstores must adhere.