
  • We are built out of very small stuff,

  • and we are embedded in a very large cosmos,

  • and the fact is that we are not very good at understanding reality

  • at either of those scales,

  • and that's because our brains

  • haven't evolved to understand the world at those scales.

  • Instead, we're trapped on this very thin slice of perception

  • right in the middle.

  • But it gets strange, because even at that slice of reality that we call home,

  • we're not seeing most of the action that's going on.

  • So take the colors of our world.

  • These are light waves, electromagnetic radiation that bounces off objects

  • and hits specialized receptors in the back of our eyes.

  • But we're not seeing all the waves out there.

  • In fact, what we see

  • is less than a 10-trillionth of what's out there.

  • So you have radio waves and microwaves

  • and X-rays and gamma rays passing through your body right now

  • and you're completely unaware of it,

  • because you don't come with the proper biological receptors

  • for picking it up.

  • There are thousands of cell phone conversations

  • passing through you right now,

  • and you're utterly blind to it.

  • Now, it's not that these things are inherently unseeable.

  • Snakes include some infrared in their reality,

  • and honeybees include ultraviolet in their view of the world,

  • and of course we build machines in the dashboards of our cars

  • to pick up on signals in the radio frequency range,

  • and we build machines in hospitals to pick up on the X-ray range.

  • But you can't sense any of those by yourself,

  • at least not yet,

  • because you don't come equipped with the proper sensors.

  • Now, what this means is that our experience of reality

  • is constrained by our biology,

  • and that goes against the common sense notion

  • that our eyes and our ears and our fingertips

  • are just picking up the objective reality that's out there.

  • Instead, our brains are sampling just a little bit of the world.

  • Now, across the animal kingdom,

  • different animals pick up on different parts of reality.

  • So in the blind and deaf world of the tick,

  • the important signals are temperature and butyric acid;

  • in the world of the black ghost knifefish,

  • its sensory world is lavishly colored by electrical fields;

  • and for the echolocating bat,

  • its reality is constructed out of air compression waves.

  • That's the slice of their ecosystem that they can pick up on,

  • and we have a word for this in science.

  • It's called the umwelt,

  • which is the German word for the surrounding world.

  • Now, presumably, every animal assumes

  • that its umwelt is the entire objective reality out there,

  • because why would you ever stop to imagine

  • that there's something beyond what we can sense?

  • Instead, what we all do is we accept reality

  • as it's presented to us.

  • Let's do a consciousness-raiser on this.

  • Imagine that you are a bloodhound dog.

  • Your whole world is about smelling.

  • You've got a long snout that has 200 million scent receptors in it,

  • and you have wet nostrils that attract and trap scent molecules,

  • and your nostrils even have slits so you can take big nosefuls of air.

  • Everything is about smell for you.

  • So one day, you stop in your tracks with a revelation.

  • You look at your human owner and you think,

  • "What is it like to have the pitiful, impoverished nose of a human?

  • (Laughter)

  • What is it like when you take a feeble little noseful of air?

  • How can you not know that there's a cat 100 yards away,

  • or that your neighbor was on this very spot six hours ago?"

  • (Laughter)

  • So because we're humans,

  • we've never experienced that world of smell,

  • so we don't miss it,

  • because we are firmly settled into our umwelt.

  • But the question is, do we have to be stuck there?

  • So as a neuroscientist, I'm interested in the way that technology

  • might expand our umwelt,

  • and how that's going to change the experience of being human.

  • So we already know that we can marry our technology to our biology,

  • because there are hundreds of thousands of people walking around

  • with artificial hearing and artificial vision.

  • So the way this works is, you take a microphone and you digitize the signal,

  • and you put an electrode strip directly into the inner ear.

  • Or, with the retinal implant, you take a camera

  • and you digitize the signal, and then you plug an electrode grid

  • directly into the optic nerve.
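
To make that pipeline concrete, here is a minimal sketch of the filterbank idea behind cochlear-implant-style processing: digitize a short audio frame, split it into frequency bands, and drive one electrode per band. The electrode count, frequency range, and normalization below are illustrative assumptions, not the design of any real implant.

```python
import numpy as np

def audio_to_electrode_levels(samples, sample_rate=16000, n_electrodes=16):
    """Map one digitized audio frame onto per-electrode stimulation levels.

    A toy version of a cochlear-implant filterbank: each electrode is
    driven by the energy in one frequency band. All parameters are
    illustrative.
    """
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    # One logarithmically spaced band per electrode, spanning speech frequencies.
    edges = np.logspace(np.log10(100), np.log10(8000), n_electrodes + 1)
    levels = np.array([
        spectrum[(freqs >= edges[i]) & (freqs < edges[i + 1])].sum()
        for i in range(n_electrodes)
    ])
    # Normalize to 0..1 stimulation intensities for the electrode strip.
    return levels / (levels.max() + 1e-9)
```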

  • And as recently as 15 years ago,

  • there were a lot of scientists who thought these technologies wouldn't work.

  • Why? It's because these technologies speak the language of Silicon Valley,

  • and it's not exactly the same dialect as our natural biological sense organs.

  • But the fact is that it works;

  • the brain figures out how to use the signals just fine.

  • Now, how do we understand that?

  • Well, here's the big secret:

  • Your brain is not hearing or seeing any of this.

  • Your brain is locked in a vault of silence and darkness inside your skull.

  • All it ever sees are electrochemical signals

  • that come in along different data cables,

  • and this is all it has to work with, and nothing more.

  • Now, amazingly,

  • the brain is really good at taking in these signals

  • and extracting patterns and assigning meaning,

  • so that it takes this inner cosmos and puts together a story

  • of this, your subjective world.

  • But here's the key point:

  • Your brain doesn't know, and it doesn't care,

  • where it gets the data from.

  • Whatever information comes in, it just figures out what to do with it.

  • And this is a very efficient kind of machine.

  • It's essentially a general purpose computing device,

  • and it just takes in everything

  • and figures out what it's going to do with it,

  • and that, I think, frees up Mother Nature

  • to tinker around with different sorts of input channels.

  • So I call this the P.H. model of evolution,

  • and I don't want to get too technical here,

  • but P.H. stands for Potato Head,

  • and I use this name to emphasize that all these sensors

  • that we know and love, like our eyes and our ears and our fingertips,

  • these are merely peripheral plug-and-play devices:

  • You stick them in, and you're good to go.

  • The brain figures out what to do with the data that comes in.

  • And when you look across the animal kingdom,

  • you find lots of peripheral devices.

  • So snakes have heat pits with which to detect infrared,

  • and the ghost knifefish has electroreceptors,

  • and the star-nosed mole has this appendage

  • with 22 fingers on it

  • with which it feels around and constructs a 3D model of the world,

  • and many birds have magnetite so they can orient

  • to the magnetic field of the planet.

  • So what this means is that nature doesn't have to continually

  • redesign the brain.

  • Instead, with the principles of brain operation established,

  • all nature has to worry about is designing new peripherals.

  • Okay. So what this means is this:

  • The lesson that surfaces

  • is that there's nothing really special or fundamental

  • about the biology that we come to the table with.

  • It's just what we have inherited

  • from a complex road of evolution.

  • But it's not what we have to stick with,

  • and our best proof of principle of this

  • comes from what's called sensory substitution.

  • And that refers to feeding information into the brain

  • via unusual sensory channels,

  • and the brain just figures out what to do with it.

  • Now, that might sound speculative,

  • but the first paper demonstrating this was published in the journal Nature in 1969.

  • So a scientist named Paul Bach-y-Rita

  • put blind people in a modified dental chair,

  • and he set up a video feed,

  • and he put something in front of the camera,

  • and then you would feel that

  • poked into your back with a grid of solenoids.

  • So if you wiggle a coffee cup in front of the camera,

  • you're feeling that in your back,

  • and amazingly, blind people got pretty good

  • at being able to determine what was in front of the camera

  • just by feeling it in the small of their back.
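
A minimal sketch of that camera-to-skin mapping, assuming simple block averaging over a square solenoid grid (the grid size here is an illustrative guess, not a description of Bach-y-Rita's actual hardware):

```python
import numpy as np

def frame_to_solenoid_grid(frame, grid_rows=20, grid_cols=20):
    """Downsample a grayscale camera frame (2-D array, values 0..255)
    into one tactile intensity per solenoid by averaging pixel blocks."""
    h, w = frame.shape
    grid = np.zeros((grid_rows, grid_cols))
    for r in range(grid_rows):
        for c in range(grid_cols):
            block = frame[r * h // grid_rows:(r + 1) * h // grid_rows,
                          c * w // grid_cols:(c + 1) * w // grid_cols]
            grid[r, c] = block.mean()
    return grid / 255.0  # 0 = no poke, 1 = full poke strength
```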

  • Now, there have been many modern incarnations of this.

  • The sonic glasses take a video feed right in front of you

  • and turn that into a sonic landscape,

  • so as things move around, and get closer and farther,

  • it sounds like "Bzz, bzz, bzz."

  • It sounds like a cacophony,

  • but after several weeks, blind people start getting pretty good

  • at understanding what's in front of them

  • just based on what they're hearing.
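
One plausible vision-to-sound mapping, similar to that used by systems such as The vOICe: sweep the image from left to right over about a second, with pixel height mapped to pitch and brightness mapped to loudness. The sweep time and frequency range below are illustrative choices, not the specification of any particular sonic glasses.

```python
import numpy as np

def frame_to_soundscape(frame, duration=1.0, sample_rate=22050,
                        f_low=200.0, f_high=4000.0):
    """Turn a grayscale frame (2-D array, 0..255) into an audio sweep:
    columns play left to right, row height sets pitch, brightness sets
    loudness."""
    rows, cols = frame.shape
    n = int(duration * sample_rate)
    t = np.arange(n) / sample_rate
    col_idx = np.arange(n) * cols // n   # which column is sounding now
    freqs = np.logspace(np.log10(f_high), np.log10(f_low), rows)  # top row = high pitch
    audio = np.zeros(n)
    for r in range(rows):
        amp = frame[r, col_idx] / 255.0  # brightness -> loudness
        audio += amp * np.sin(2 * np.pi * freqs[r] * t)
    return audio / rows  # keep the amplitude in a sane range
```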

  • And it doesn't have to be through the ears:

  • this system uses an electrotactile grid on the forehead,

  • so whatever's in front of the video feed, you're feeling it on your forehead.

  • Why the forehead? Because you're not using it for much else.

  • The most modern incarnation is called the BrainPort,

  • and this is a little electrogrid that sits on your tongue,

  • and the video feed gets turned into these little electrotactile signals,

  • and blind people get so good at using this that they can throw a ball into a basket,

  • or they can navigate complex obstacle courses.

  • They can come to see through their tongue.

  • Now, that sounds completely insane, right?

  • But remember, all vision ever is

  • is electrochemical signals coursing around in your brain.

  • Your brain doesn't know where the signals come from.

  • It just figures out what to do with them.

  • So my interest in my lab is sensory substitution for the deaf,

  • and this is a project I've undertaken

  • with a graduate student in my lab, Scott Novich,

  • who is spearheading this for his thesis.

  • And here is what we wanted to do:

  • we wanted to make it so that sound from the world gets converted

  • in some way so that a deaf person can understand what is being said.

  • And given the power and ubiquity of portable computing,

  • we wanted to make sure that this would run on cell phones and tablets,

  • and also we wanted to make this a wearable,

  • something that you could wear under your clothing.

  • So here's the concept.

  • So as I'm speaking, my sound is getting captured by the tablet,

  • and then it's getting mapped onto a vest that's covered in vibratory motors,

  • just like the motors in your cell phone.

  • So as I'm speaking,

  • the sound is getting translated to a pattern of vibration on the vest.

  • Now, this is not just conceptual:

  • this tablet is transmitting over Bluetooth, and I'm wearing the vest right now.

  • So as I'm speaking -- (Applause) --

  • the sound is getting translated into dynamic patterns of vibration.

  • I'm feeling the sonic world around me.
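
As a hedged sketch of what such a pipeline could look like, reusing the band-energy idea from the implant sketch above: grab a short audio frame, reduce it to one vibration intensity per motor, and ship the bytes to the vest. `read_audio_frame` and `send_to_vest` are hypothetical stand-ins for the tablet's microphone and Bluetooth interfaces, and the motor count and band edges are guesses, not the lab's actual design.

```python
import numpy as np

N_MOTORS = 24        # illustrative; the real vest's motor layout isn't specified
FRAME_LEN = 2048     # samples per update
SAMPLE_RATE = 44100

def frame_to_motor_levels(samples):
    """One vibration intensity (0..1) per motor, from band energies."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
    edges = np.logspace(np.log10(50), np.log10(8000), N_MOTORS + 1)
    levels = np.array([
        spectrum[(freqs >= edges[i]) & (freqs < edges[i + 1])].sum()
        for i in range(N_MOTORS)
    ])
    return levels / (levels.max() + 1e-9)

def run(read_audio_frame, send_to_vest):
    """Main loop: capture audio, convert, transmit to the vest motors."""
    while True:
        samples = read_audio_frame(FRAME_LEN)  # 1-D float array from the mic
        motor_bytes = (frame_to_motor_levels(samples) * 255).astype(np.uint8)
        send_to_vest(bytes(motor_bytes))       # one byte per motor over the link
```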

  • So, we've been testing this with deaf people now,

  • and it turns out that after just a little bit of time,

  • people can start feeling, they can start understanding

  • the language of the vest.

  • So this is Jonathan. He's 37 years old. He has a master's degree.

  • He was born profoundly deaf,

  • which means that there's a part of his umwelt that's unavailable to him.

  • So we had Jonathan train with the vest for four days, two hours a day,

  • and here he is on the fifth day.

  • Scott Novich: You.

  • David Eagleman: So Scott says a word, Jonathan feels it on the vest,

  • and he writes it on the board.

  • SN: Where. Where.

  • DE: Jonathan is able to translate this complicated pattern of vibrations

  • into an understanding of what's being said.

  • SN: Touch. Touch.

  • DE: Now, he's not doing this --

  • (Applause) --

  • Jonathan is not doing this consciously, because the patterns are too complicated,

  • but his brain is starting to unlock the pattern that allows it to figure out

  • what the data mean,

  • and our expectation is that, after wearing this for about three months,

  • he will have a direct perceptual experience of hearing

  • in the same way that when a blind person passes a finger over braille,

  • the meaning comes directly off the page without any conscious intervention at all.

  • Now, this technology has the potential to be a game-changer,

  • because the only other solution for deafness is a cochlear implant,

  • and that requires an invasive surgery.

  • And this can be built at a fortieth of the cost of a cochlear implant,

  • which opens up this technology globally, even for the poorest countries.

  • Now, we've been very encouraged by our results with sensory substitution,

  • but what we've been thinking a lot about is sensory addition.

  • How could we use a technology like this to add a completely new kind of sense,

  • to expand the human umwelt?

  • For example, could we feed real-time data from the Internet

  • directly into somebody's brain,

  • and can they develop a direct perceptual experience?

  • So here's an experiment we're doing in the lab.

  • A subject is feeling a real-time streaming feed of data from the Net

  • for five seconds.

  • Then, two buttons appear, and he has to make a choice.