
Back in the 1980s, actually, I gave my first talk at TED, and I brought some of the very, very first public demonstrations of virtual reality ever to the TED stage. And at that time, we knew that we were facing a knife-edge future where the technology we needed, the technology we loved, could also be our undoing. We knew that if we thought of our technology as a means to ever more power, if it was just a power trip, we'd eventually destroy ourselves. That's what happens when you're on a power trip and nothing else.

So the idealism of digital culture back then was all about starting with that recognition of the possible darkness and trying to imagine a way to transcend it with beauty and creativity. I always used to end my early TED Talks with a rather horrifying line, which is, "We have a challenge. We have to create a culture around technology that is so beautiful, so meaningful, so deep, so endlessly creative, so filled with infinite potential that it draws us away from committing mass suicide."

So we talked about extinction as being one and the same as the need to create an alluring, infinitely creative future. And I still believe that that alternative of creativity as an alternative to death is very real and true, maybe the most true thing there is.

In the case of virtual reality -- well, the way I used to talk about it is that it would be something like what happened when people discovered language. With language came new adventures, new depth, new meaning, new ways to connect, new ways to coordinate, new ways to imagine, new ways to raise children, and I imagined, with virtual reality, we'd have this new thing that would be like a conversation but also like waking-state intentional dreaming. We called it post-symbolic communication, because it would be like just directly making the thing you experienced instead of indirectly making symbols to refer to things.

It was a beautiful vision, and it's one I still believe in, and yet, haunting that beautiful vision was the dark side of how it could also turn out.

And I suppose I could mention one of the very earliest computer scientists, whose name was Norbert Wiener, and he wrote a book back in the '50s, from before I was even born, called "The Human Use of Human Beings." And in the book, he described the potential to create a computer system that would be gathering data from people and providing feedback to those people in real time in order to put them kind of partially, statistically, in a Skinner box, in a behaviorist system. And he has this amazing line where he says, one could imagine, as a thought experiment -- and I'm paraphrasing, this isn't a quote -- one could imagine a global computer system where everybody has devices on them all the time, and the devices are giving them feedback based on what they did, and the whole population is subject to a degree of behavior modification. And such a society would be insane, could not survive, could not face its problems. And then he says, but this is only a thought experiment, and such a future is technologically infeasible.

(Laughter)

And yet, of course, it's what we have created, and it's what we must undo if we are to survive. So --

(Applause)

I believe that we made a very particular mistake, and it happened early on, and by understanding the mistake we made, we can undo it. It happened in the '90s, going into the turn of the century, and here's what happened.

Early digital culture, and indeed, digital culture to this day, had a sense of, I would say, lefty, socialist mission about it, that unlike other things that have been done, like the invention of books, everything on the internet must be purely public, must be available for free, because if even one person cannot afford it, then that would create this terrible inequity. Now of course, there's other ways to deal with that. If books cost money, you can have public libraries. And so forth. But we were thinking, no, no, no, this is an exception. This must be pure public commons, that's what we want. And so that spirit lives on. You can experience it in designs like the Wikipedia, for instance, many others.

But at the same time, we also believed, with equal fervor, in this other thing that was completely incompatible, which is we loved our tech entrepreneurs. We loved Steve Jobs; we loved this Nietzschean myth of the techie who could dent the universe. Right? And that mythical power still has a hold on us, as well.

So you have these two different passions, for making everything free and for the almost supernatural power of the tech entrepreneur. How do you celebrate entrepreneurship when everything's free? Well, there was only one solution back then, which was the advertising model. And so therefore, Google was born free, with ads, Facebook was born free, with ads. Now in the beginning, it was cute, like with the very earliest Google.

(Laughter)

The ads really were kind of ads. They would be, like, your local dentist or something.

But there's this thing called Moore's law that makes computers more and more efficient and cheaper. Their algorithms get better. We actually have universities where people study them, and they get better and better. And the customers and other entities who use these systems just got more and more experienced and got cleverer and cleverer. And what started out as advertising really can't be called advertising anymore. It turned into behavior modification, just as Norbert Wiener had worried it might. And so I can't call these things social networks anymore. I call them behavior modification empires.

(Applause)

And I refuse to vilify the individuals. I have dear friends at these companies, sold a company to Google, even though I think it's one of these empires. I don't think this is a matter of bad people who've done a bad thing. I think this is a matter of a globally tragic, astoundingly ridiculous mistake, rather than a wave of evil. Let me give you just another layer of detail into how this particular mistake functions.

So with behaviorism, you give the creature, whether it's a rat or a dog or a person, little treats and sometimes little punishments as feedback to what they do. So if you have an animal in a cage, it might be candy and electric shocks. But if you have a smartphone, it's not those things, it's symbolic punishment and reward. Pavlov, one of the early behaviorists, demonstrated the famous principle. You could train a dog to salivate just with the bell, just with the symbol. So on social networks, social punishment and social reward function as the punishment and reward.

And we all know the feeling of these things. You get this little thrill -- "Somebody liked my stuff and it's being repeated." Or the punishment: "Oh my God, they don't like me, maybe somebody else is more popular, oh my God." So you have those two very common feelings, and they're doled out in such a way that you get caught in this loop. As has been publicly acknowledged by many of the founders of the system, everybody knew this is what was going on.

But here's the thing: traditionally, in the academic study of the methods of behaviorism, there have been comparisons of positive and negative stimuli. In this setting, a commercial setting, there's a new kind of difference that has kind of evaded the academic world for a while, and that difference is this: regardless of whether positive stimuli are more effective than negative ones in different circumstances, the negative ones are cheaper. They're the bargain stimuli. So what I mean by that is it's much easier to lose trust than to build trust. It takes a long time to build love. It takes a short time to ruin love.

Now the customers of these behavior modification empires are on a very fast loop. They're almost like high-frequency traders. They're getting feedback from their spends, or whatever their activities are if they're not spending, and they see what's working, and then they do more of that. And so they're getting the quick feedback, which means they're responding more to the negative emotions, because those are the ones that rise faster, right?

And so therefore, even well-intentioned players who think all they're doing is advertising toothpaste end up advancing the cause of the negative people, the negative emotions, the cranks, the paranoids, the cynics, the nihilists. Those are the ones who get amplified by the system. And you can't pay one of these companies to make the world suddenly nice and improve democracy nearly as easily as you can pay to ruin those things. And so this is the dilemma we've gotten ourselves into.
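The dynamic described here, a buyer on a fast loop ends up favoring whichever stimulus responds fastest, can be sketched as a toy simulation. Everything in it (the decay rates, the window lengths, the function names) is an illustrative assumption of mine, not anything from the talk; the only point it demonstrates is that shortening the measurement window flips the ranking toward the fast-spiking "negative" stimulus even when the slow "positive" one is equally effective over a long horizon.

```python
# Toy model: two kinds of content with the same total engagement,
# delivered on different timescales.
def engagement(kind, steps):
    if kind == "negative":
        # Fast spike: most of the response arrives immediately,
        # decaying geometrically (total approaches 2.0).
        return sum(1.0 * (0.5 ** t) for t in range(steps))
    else:
        # Slow build: the same 2.0 total, spread evenly over 20 steps.
        return sum(2.0 / 20 for _ in range(min(steps, 20)))

# A "high-frequency" buyer measures only a short window before reallocating.
short_window = 2
fast_scores = {k: engagement(k, short_window) for k in ("positive", "negative")}

# A patient buyer would wait for the full response to arrive.
long_window = 20
slow_scores = {k: engagement(k, long_window) for k in ("positive", "negative")}

print("short window:", fast_scores)  # negative dominates (1.5 vs 0.2)
print("long window:", slow_scores)   # roughly a tie (both near 2.0)
```

A greedy allocator reading only the short-window numbers would pour its budget into the negative stimulus, which is the asymmetry the talk attributes to quick feedback.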

The alternative is to turn back the clock, with great difficulty, and remake that decision. Remaking it would mean two things. It would mean first that many people, those who could afford to, would actually pay for these things. You'd pay for search, you'd pay for social networking. How would you pay? Maybe with a subscription fee, maybe with micro-payments as you use them. There's a lot of options.

If some of you are recoiling, and you're thinking, "Oh my God, I would never pay for these things. How could you ever get anyone to pay?" I want to remind you of something that just happened. Around this same time that companies like Google and Facebook were formulating their free idea, a lot of cyber culture also believed that in the future, television and movies would be created in the same way, kind of like the Wikipedia. But then, companies like Netflix, Amazon, HBO, said, "Actually, you know, subscribe. We'll give you great TV." And it worked! We now are in this period called "peak TV," right? So sometimes when you pay for stuff, things get better. We can imagine a hypothetical --

(Applause)

We can imagine a hypothetical world of "peak social media." What would that be like? It would mean when you get on, you can get really useful, authoritative medical advice instead of cranks. It could mean when you want to get factual information, there's not a bunch of weird, paranoid conspiracy theories. We can imagine this wonderful other possibility. Ah. I dream of it. I believe it's possible. I'm certain it's possible. And I'm certain that the companies, the Googles and the Facebooks, would actually do better in this world. I don't believe we need to punish Silicon Valley. We just need to remake the decision.

Of the big tech companies, it's really only two that depend on behavior modification and spying as their business plan. It's Google and Facebook.

(Laughter)

And I love you guys. Really, I do. Like, the people are fantastic. I want to point out, if I may, if you look at Google, they can propagate cost centers endlessly with all of these companies, but they cannot propagate profit centers. They cannot diversify, because they're hooked. They're hooked on this model, just like their own users. They're in the same trap as their users, and you can't run a big corporation that way. So this is ultimately totally to the benefit of the shareholders and other stakeholders of these companies. It's a win-win solution. It'll just take some time to figure it out. A lot of details to work out,