
  • Something I think about a lot is Tide Pods.

  • Someone just made a joke one day, that Tide Pods look delicious.

  • And so people started, you know, just making posts about this on social media.

  • And it was just transparently hilarious.

  • But the platform's incentives are such that if you actually did eat a Tide Pod,

  • you'd get a million views.

  • And what had seemed super funny, all of a sudden was a public health hazard.

  • Platforms, for I think very understandable reasons,

  • have a terrible time figuring out when the joke stops being funny.

  • "Twitter, for the very first time, has fact-checked President Trump...."

  • "...on voter misinformation."

  • "That's the line Facebook, Twitter, and others seem to have drawn."

  • "Trump continues to falsely insist that the election was stolen."

  • "Twitter has put up a flag more than a hundred times since Election Day."

  • "Trump urging his followers: Be there, will be wild."

  • "Fight for Trump! Fight for Trump!"

  • "Twitter permanently suspending the president's personal account...."

  • "...due to the risk of further incitement of violence."

  • "Facebook."

  • "YouTube."

  • "Pinterest."

  • "Shopify and Paypal."

  • And so the question is, what is the right moment for the platform to intervene?

  • I think we're in a period of rethinking what misinformation is.

  • I think the past few years, we had the thought that misinformation was individual bad posts,

  • and maybe some individual actors that needed to be disciplined.

  • But, if we could just prune that garden,

  • the rest of our information ecosystem would be okay.

  • Alex Jones is sort of the classic example.

  • "Apple, Facebook, Spotify, and YouTube..."

  • "...have now removed content associated with Jones and InfoWars."

  • He does not have nearly the influence over American life today.

  • But that kind of whack-a-mole approach

  • is just not giving us the information ecosystem that we want.

  • This idea that the election has been stolen, which we know to be false,

  • is being repeated ad nauseam all across the Internet,

  • in private chats, in private messages, as well as in public.

  • This is becoming the big lie.

  • It's larger than any one user.

  • It's larger than a thousand users.

  • It's going to require a much more serious and difficult approach

  • than simply removing one account,

  • no matter how prominent that account might be.

  • 147 members of Congress voted to overturn the results of the election

  • after the Capitol had been attacked.

  • Are these platforms ready to deplatform 147 sitting members of Congress?

  • Removing Trump was the easy part.

  • He incited an attack on his own government.

  • That is not a close call.

  • The hard call is, you're about to have maybe 70 million Americans,

  • or some huge percentage of that, talking,

  • including in online spaces, about an election being stolen that was not stolen.

  • And that is going to have a lot of really dangerous consequences.

  • I don't think these platforms will succeed

  • if they can only be defined by what they will not allow.

  • It's this: what are they replacing it with?

  • There needs to be a positive, constructive counterbalance

  • to all of the misinformation and conspiracy theories.

  • What can they do to build a better media ecosystem?

  • Because if we don't have a shared sense of reality,

  • I truly do not believe we are going to have a liberal democracy in America very much longer.
