
  • When do you think the singularity is coming?

  • Uh, Ray Kurzweil says 2045.

  • Uh, I'll stick with that.

  • That's 26 years.

  • Yeah, yeah, yeah, it could be sooner.

  • Could be later.

  • Could be 2030.

  • Could be 2070.

  • I don't think it's 2200.

  • I don't think it's 2020.

  • It won't be 2030.

  • It's too soon.

  • I don't know, man.

  • Okay, I guess when it starts to happen, it happens real fast.

  • Well, that's sort of the point.

  • Suppose, as a thought experiment, that by 2025 Google DeepMind, or my own team at SingularityNET and OpenCog, like, suppose by 2025 one of us manages to create an AI that is essentially as smart as a human being, right?

  • So it can hold a conversation like a human being.

  • And it can prove math theorems like a human mathematician, and it can analyze a biology data set.

  • Right.

  • So say we get there, and then you take your robots and you put it in the robot body.

  • I mean, that's a separate problem.

  • It could operate many robot bodies all at once.

  • Right?

  • But unlike Sophia right now, it can really understand what it's seeing and what it's doing, fully at the level that a human can.

  • Suppose you get there.

  • Like, how far are you from a true singularity?

  • Because this AI can then copy itself.

  • It can make a million copies of itself.

  • Yeah.

  • Yeah, right.

  • Because once it's as smart as a human...

  • Okay, let's teach it computer science.

  • I mean, we can send it to school.

  • What can it learn?

  • Oh, it can learn.

  • It can learn programming, it can learn hardware engineering, and then you can copy a million of those.

  • Right?

  • So of that million, maybe half of them are working on improving their own intelligence. But they're not working in isolation.

  • Unlike humans, they can share thoughts directly, because they're all just running in the compute cloud.

  • Right?

  • So it seems very plausible.

  • Within a few years of getting to that human level, the AI is going to invent all manner of amazing new things, and that doesn't mean it will be instantaneous.

  • I mean, doing lab research still takes time.

  • Building new hardware still takes time, but of course, it can be working on how to speed that up, right?

  • Like making manufacturing and laboratory research more fully automated.

  • So yeah, and then it could take us out of the process as well.

  • Possibly. It could. That depends on the value system that the AGI has, right?

  • And I mean, this is why it's important to, you know, give it values of kindness, compassion, tolerance, inclusiveness, love for all sentient beings.

  • We want to get these positive values into the AI as much as we can, and we think we can program that into the AI.

  • I think you teach it. You program it with the ability to learn that, but then would it unlearn that if it doesn't suit it? That's what we don't know.

  • I mean, it's very subtle, because human values are a moving target, right?

  • Like the values of suburban New Jersey in 1975, when I was in elementary school, are not the same as the values in suburban New Jersey in the US right now. I mean, back then, gay marriage was illegal and virulently opposed by almost everybody, right? And racism was very rampant.

  • So, I mean, human values are evolving all the time. By the values of medieval Europe, you and I, and probably almost everyone listening to us, deserve to burn in hell forever, right?

  • Yeah.

  • So, I mean, you know, if you gave the AI the exact human values from 2019, then by 2050 it's gonna be horrible.

  • It would be like having a dictator with the values of 1950s America or something.

  • Right?

  • So you want an AI that will have evolving values, in a way that somehow respects the ongoing evolution of human values, and that hopefully still respects it and doesn't just make up its own?

  • Yeah, yeah, which is very subtle, right?

  • So some of my friends who are vegans and animal rights activists are like, well, what if the AI treats us the way we treat less intelligent animals?

  • And you think about it like this:

  • We care about extinction of species, though, I mean, not as much as we should, but in general, we don't want to drive entire species of subhuman animals extinct.

  • But we don't care much about one cow, sheep, tiger, or wolf more or less, right?

  • We care about like the genome.

  • So if you take that analogy, the AI would like to keep humans around.

  • I mean, we're the creators, we're the ancestors.

  • We have our own unique aesthetic value, but by that analogy it may not give a crap about one human more or less, any more than we care about one wolf or pig more or less.

  • But we didn't keep the Neanderthals around.

  • But we weren't as reflectively intelligent then as we are now.

  • And I think that there is an argument that as human cultures advanced more and more toward abundance, away from scarcity, there's more caring about the environment.

  • There's more compassion toward nonhuman animals.

  • I mean, there was a lot in Stone Age society, and it sort of went down in the Industrial Revolution, and now it's certainly going up again.

  • But I guess the point is, we want AIs not only to be superintelligent, we want them to be supercompassionate. Like, we want them to be more compassionate to us than we are to each other, because we're killing each other.

  • Humans are killing each other for no good reason all the time.

  • So we want them to be morally better than us, in some sense, right?

  • Well, you know, on the whole, we're not setting the best example in terms of the way human society is being regulated right now.

  • Nor in the way we treat nonhuman life forms on the planet, nor in the applications that we're deploying.


SINGULARITY IN 2045: Creating Robots That Can Experience Life Like A Human | Dr. Ben Goertzel

林宜悉 posted on 2020/09/08