
  • [MUSIC PLAYING]

  • WILSON WHITE: Good afternoon, everyone, especially

  • for those of you who are here in California.

  • My name is Wilson White, and I'm on the public policy

  • and government relations team here in California.

  • We have an exciting talk for you today as part of our Talks

  • at Google series, as well as a series of conversations

  • we're having around AI ethics and technology ethics more

  • generally.

  • So today, I'm honored to have Professor Yuval Noah

  • Harari with us.

  • Yuval is an Israeli historian and a professor at the Hebrew

  • University of Jerusalem.

  • He is a dynamic speaker, thinker, and now

  • an international bestselling author.

  • He's the author of three books.

  • We're going to talk about each of those books today.

  • The first book, published in 2014, was "Sapiens," which explored

  • some of our history as humans.

  • His second book in 2016 had an interesting take on our future

  • as humans.

  • It was "Homo Deus."

  • And then he recently published a new book,

  • "21 Lessons for the 21st Century,"

  • which attempts to grapple with some of the issues,

  • the pressing issues that we are facing today.

  • So we'll talk about some of the themes in each of those books

  • as we go through our conversation.

  • But collectively, his writings explore very big concepts

  • like free will and consciousness and intelligence.

  • So we'll have a lot to explore with Yuval today.

  • So with that, please join me in welcoming Professor Yuval

  • to Google.

  • [APPLAUSE]

  • YUVAL NOAH HARARI: Hello.

  • WILSON WHITE: Thank you, Professor, for joining us.

  • Before getting started, I have to say

  • that when the announcement went out

  • across Google about this talk, I got several emails

  • from many Googlers around the world who told me

  • that they had either read or are currently reading

  • one or multiple of your books.

  • So if you are contemplating a fourth book,

  • maybe on the afterlife, no spoilers

  • during this conversation.

  • I want to start with maybe some of the themes in both

  • your current book, "21 Lessons," as well

  • as "Homo Deus," because I'm the father of two young kids.

  • I have two daughters, a five-year-old

  • and a three-year-old.

  • And the future that you paint in "Homo Deus" is interesting.

  • So I'd like to ask you, what should I

  • be teaching my daughters?

  • YUVAL NOAH HARARI: That nobody knows

  • what the world will look like in 2050,

  • except that it will be very different from today.

  • So the most important things to emphasize in education

  • are things like emotional intelligence

  • and mental stability, because the one thing

  • that they will need for sure is the ability

  • to reinvent themselves repeatedly

  • throughout their lives.

  • It's really the first time in history

  • that we don't really know what particular skills to teach

  • young people, because we just don't

  • know in what kind of world they will be living.

  • But we do know they will have to reinvent themselves.

  • And especially if you think about something like the job

  • market, maybe the greatest problem they will face

  • will be psychological.

  • Because at least beyond a certain age,

  • it's very, very difficult for people to reinvent themselves.

  • So we kind of need to build identities.

  • I mean, traditionally people built

  • identities like stone houses with very deep foundations,

  • now it makes more sense to build identities like tents that you

  • can fold and move elsewhere.

  • Because we don't know where you will have to move,

  • but you will have to move.

  • WILSON WHITE: You will have to move.

  • So I may have to go back to school now

  • to learn these things so that I can teach the next generation

  • of humans here.

  • In "21 Lessons for the 21st Century,"

  • you tackle several themes that we at Google,

  • a company on the leading edge of technology

  • and of how technology is deployed in society,

  • also wrestle with.

  • Tell me a bit about your thoughts

  • on why democracy is in crisis.

  • That's a theme in the current book,

  • and I want to explore that a bit.

  • Why you think liberal democracy as we knew

  • it is currently in crisis.

  • YUVAL NOAH HARARI: Well, the entire liberal democratic

  • system is built on philosophical ideas we've inherited

  • from the 18th century, especially the idea

  • of free will, which underlies the basic mottos

  • of the liberal worldview, like the voter knows best,

  • the customer is always right, beauty

  • is in the eye of the beholder, follow your heart,

  • do what feels good.

  • All these liberal mottos are

  • the foundation of our political and economic system.

  • They assume that the ultimate authority is the free choices

  • of individuals.

  • I mean, there are, of course, all kinds of limitations

  • and boundary cases and so forth, but when

  • push comes to shove, for instance,

  • in the economic field, then corporations

  • will tend to retreat behind this last line of defense

  • that this is what the customers want.

  • The customer is always right.

  • If the customers want it, it can't be wrong.

  • Who are you to tell the customers that they are wrong?

  • Now of course, there are many exceptions,

  • but this is the basics of the free market.

  • This is the first and last thing you learn.

  • The customer is always right.

  • So the ultimate authority in the economic field

  • is the desires of the customers.

  • And this is really based on a philosophical and metaphysical

  • view about free will, that the desires of the customer, they

  • emanate, they represent the free will of human beings,

  • which is the highest authority in the universe.

  • And therefore, we must abide by them.

  • And it's the same in the political field

  • with the voter knows best.

  • And this was OK for the last two or three centuries.

  • Because even though free will was always a myth and not

  • a scientific reality--

  • I mean, science knows of only two kinds

  • of processes in nature.

  • It knows about deterministic processes

  • and it knows about random processes.

  • And their combination results in probabilistic processes.

  • But randomness and probability, they are not freedom.

  • They mean that I can't predict your actions

  • with 100% accuracy, because there is randomness.

  • But a random robot is not free.

  • If you connect a robot, say, to uranium, a piece of uranium,

  • and the decisions of the robot are determined

  • by the random disintegration of uranium

  • atoms, then you will never be able to predict exactly

  • what this robot will do.

  • But this is not freedom.

  • This is just randomness.

  • Now this was always true from a scientific perspective.

  • Humans, certainly they have a will.

  • They make decisions.

  • They make choices.

  • But they are not free to choose the will.

  • The choices are not independent.

  • They depend on a million factors,

  • genetic and hormonal and social and cultural and so forth,

  • which we don't choose.

  • Up till now in history, humans

  • were so complicated that from a practical perspective,

  • it still made sense to believe in free will,

  • because nobody could understand you better

  • than you understand yourself.

  • You had this inner realm of desires and thoughts

  • and feelings, and you had privileged access

  • to this inner realm.

  • WILSON WHITE: Yeah, but that hasn't changed today, right?

  • Like, that--

  • YUVAL NOAH HARARI: It has changed.

  • There is no longer--

  • the privileged access now belongs to corporations like Google.

  • They can have access to things happening ultimately

  • inside my body and brain, which I don't know about.

  • There is somebody out there-- and not just one.

  • All kinds of corporations and governments that, maybe not

  • today, but in five years, 10 years, 20 years,

  • will have privileged access to what's happening inside me.

  • More privileged than my access.

  • They could understand what is happening in my brain