Online Platforms Have Become Chaos Machines. Can We Rein Them In?


MIT Sloan’s Sinan Aral, author of The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health—and How We Must Adapt, speaks with HBR about what the world needs to do to contain the chaotic consequences of online platforms.

In January, a Reddit message board called r/WallStreetBets sent the stock of a beleaguered video-game retailer, GameStop, skyrocketing, costing some Wall Street hedge funds billions of dollars. Wild as the story of retail traders rocking markets may be, we’ve seen this kind of thing happen before: The saga is just the latest chaotic consequence of people’s ability to communicate and coordinate, en masse, on online platforms. Its recent precedents vary dramatically, from the relatively benign (thousands of teens coordinating on TikTok to inflate attendance expectations for a Trump rally, for example) to the malignant (insurrectionists using the platforms Gab and Parler to plan and execute their attack on the U.S. Capitol). It’s a story that we’ll likely see repeated in new ways, with increasing regularity, from here on out.

The question now is: How do we — as citizens, corporations, and governments — responsibly wield the burgeoning power of online platforms and reckon with their real-life consequences?

To try to answer this question, I spoke with Sinan Aral, director of the MIT Initiative on the Digital Economy and author of The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health—and How We Must Adapt, about how this latest story fits into the emerging trend of unwieldy and deeply consequential activity on online platforms.

What would it mean to regain control over online platforms?

We’ve seen a lot of the peril that can come from social media — especially recently. But we’ve also seen a tremendous amount of promise. So, I like to think of the question as, “How are we going to achieve the promise of social media and avoid the perils?”

We have four levers at our disposal, in my opinion: money, code, norms, and laws.

By money I mean the platforms’ business models, which provide the incentives for user behavior, advertiser behavior, and investor behavior.

Code is the design of the platforms and the algorithms that underlie them, whether it’s newsfeed algorithms or “people you may know” algorithms and so on.

Norms are the real-world social behaviors around the platforms. How do people actually use this stuff?

And then laws, which encompass everything from antitrust to Section 230 to privacy legislation to the modernization of SEC guidelines.

And I do think that we can steer this technology towards good and away from bad. But in order to do that, we really need to dig into the science of social media and pull those levers accordingly.

In your book, which was published last September, you predicted an event like the attack on the Capitol on January 6th. You also described the likelihood of economic disruption not unlike what we saw last week with the GameStop story. Having thought about the disruptive potential of social media for some time, what are your impressions of the events of January 2021?

I think we’re witnessing, in real time, society grappling with the emergence of social media as a very powerful force. Experts who have been studying this stuff have been warning for months, if not years, that these types of disturbances could happen as a result of online platforms. And this isn’t the first time that we’ve seen social media activity manifest in a real-world threat to democracies or disruption of an economy. But the scale, publicity, and extremity of these recent events feels new in the U.S.

What disruptive potential of online platforms are you most concerned about in the near future?

Well, the third element of the trifecta in my book’s subtitle is public health. I think that’s relevant to the current moment of the pandemic because vaccine hesitancy is fueled by misinformation. And we’re already starting to see some of that now. We saw anti-vaccine protesters, spurred by social media misinformation, shut down the vaccination site at Dodger Stadium a few days ago — and this is just a replay of what happened with measles in 2018 and 2019. I’m sincerely hoping that we don’t see more of these kinds of events come to pass.

This month, we’ve seen platforms begin to self-regulate much more aggressively. Twitter, Facebook, and other platforms banned former President Trump. Discord briefly banned the WallStreetBets server, and the trading platform Robinhood restricted trades on GameStop, AMC, and BlackBerry. What do you make of these attempts?

The events are certainly different in important ways, but both raise a fundamental question about content moderation. We’ve seen historically that platforms begin with a mostly hands-off policy. But in the past year or so, we’re starting to see public pressure mounting — and the specter of regulation rearing its head. We’ve got the antitrust case against Facebook. We’ve got a lot of talk about reform or repeal of Section 230 of the Communications Decency Act. And in the face of that kind of pressure, as well as more and more evidence that platforms are having an effect on real-world behaviors in ways that society doesn’t want to see, these companies are starting to make more aggressive moves to moderate content, to ban accounts that violated their policies, and to draw the lines.

How and where do they draw the lines?

That’s the big question. It ultimately comes down to the difference between free speech and harmful speech, whether it’s around politics or financial information. There are some obvious breaches — we don’t want coordinated attempts to kidnap and kill the governor of Michigan, or livestreams of mass murders on these platforms, for example. Those kinds of things are clearly harmful and easier to moderate. But when we’re talking about kinds of communications that are technically legal, but potentially harmful, those are much trickier to categorize and moderate if you’re a platform.

So, who should make those calls? Should it come down to platforms self-policing or government regulation?

Well, it’s important to note that President Biden’s nominee for Secretary of Commerce, Gina Raimondo, said that she would favor the Commerce Department’s National Telecommunications and Information Administration leading a reform of Section 230. But in my opinion, agencies drawing the lines between harmful speech and free speech is not a great idea, because they’re run by political appointees.

The right place to do that is in Congress — the most representative, deliberative body we have — and in the courts, in case law.

What we saw happen on Reddit last week — the coordinated effort of an online community to boost the value of certain stocks — doesn’t only raise questions about what kind of discourse is allowed on platforms. It also raises important questions about the future of financial rules and regulations. What kind of response do you think this is going to trigger?

I expect the SEC will conduct an investigation of these market moves that we saw this week. We’ve seen the SEC investigate Lidingo Holdings and DreamTeam for creating market-manipulative social media posts back in 2014. The same type of investigation could certainly happen here.

We don’t yet know enough about the GameStop situation. For example, it’s not clear the David vs. Goliath story is exactly right. Who exactly was in this crowd? Were they tied to financial institutions with a stake? There were institutional investors like BlackRock and private individuals like Ryan Cohen, the former Chewy.com CEO, who made a lot of money. That’s not illegal, but new paths to economic instability have been revealed. For example, if we thought Russia found it productive to disrupt our democracy through social media manipulation, what do you think they are thinking, after GameStop, about the prospect of disrupting our economy with a similar strategy? We need to know more.

What outcomes of this story will matter most?

Well, I think we saw the GameStop thing already expand very rapidly beyond one stock — to AMC and then to BlackBerry — to become sort of a social movement, right? So, whether this continues to happen or not depends on a couple of things. Number one, who gets burned financially? Number two, what kind of regulatory response do we see from the SEC? Those outcomes will dictate what happens next.

What do you think is an underrated takeaway of the GameStop story that you think people should pay attention to?

I think it’s important for people to realize that when it comes to markets, social media doesn’t operate in isolation. Of course, social media is a crowdsourcing mechanism where lots of people can coordinate their behavior, or spread misinformation, or decide to buy and sell stocks, etc. But it’s coupled to very sophisticated systems that are analyzing the sentiment on platforms and linking that sentiment to automated trading algorithms, as well as recommendations to institutional investors to buy or sell stocks. And so there’s a feedback loop. Institutional investors have plugged their sensing systems into the crowd. This ends up complicating the story; it’s not two systems at odds with each other, it’s actually one big system getting tangled.
