(Laughter)
That's SpotMini.
He'll be back in a little while.
I --
(Applause)
I love building robots.
And my long-term goal is to build robots
that can do what people and animals do.
And there are three things in particular
that we're interested in.
One is balance and dynamic mobility,
the second one is mobile manipulation,
and the third one is mobile perception.
So, dynamic mobility and balance --
I'm going to do a demo for you.
I'm standing here, balancing.
I can see you're not very impressed. OK, how about now?
(Laughter)
How about now?
(Applause)
Those simple capabilities mean that people can go almost anywhere on earth,
on any kind of terrain.
We want to capture that for robots.
What about manipulation?
I'm holding this clicker in my hand;
I'm not even looking at it,
and I can manipulate it without any problem.
But even more important,
I can move my body while I hold the manipulator, the clicker,
and stabilize and coordinate my body,
and I can even walk around.
And that means I can move around in the world
and expand the range of my arms and my hands
and really be able to handle almost anything.
So that's mobile manipulation.
And all of you can do this.
Third is perception.
I'm looking at a room with over 1,000 people in it,
and my amazing visual system can see every one of you --
you're all stable in space,
even when I move my head,
even when I move around.
That kind of mobile perception is really important for robots
that are going to move and act
out in the world.
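A minimal sketch of what "stable in space" means for a robot: a landmark seen through a moving camera keeps one set of world coordinates once the camera's pose is folded in. The frames, poses and numbers below are illustrative assumptions, not anything from the talk.

```python
import numpy as np

def rot_z(yaw):
    """Rotation about the vertical axis by `yaw` radians."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0., 0., 1.0]])

def world_to_camera(p_world, R_wc, t_wc):
    """Express a fixed world point in the camera frame (R_wc, t_wc = camera pose in world)."""
    return R_wc.T @ (np.asarray(p_world) - np.asarray(t_wc))

def camera_to_world(p_cam, R_wc, t_wc):
    """Map a camera-frame observation back into the fixed world frame."""
    return R_wc @ np.asarray(p_cam) + np.asarray(t_wc)

landmark = np.array([3.0, 1.0, 1.2])           # a fixed point in the room

for yaw in (0.0, np.deg2rad(40)):               # two different head orientations
    R, t = rot_z(yaw), np.array([0.0, 0.0, 1.5])
    seen = world_to_camera(landmark, R, t)      # camera coordinates change with the head...
    back = camera_to_world(seen, R, t)          # ...but the world coordinates do not
    print(np.round(seen, 3), np.round(back, 3))
```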
I'm going to give you a little status report
on where we are in developing robots toward these ends.
The first three robots are all dynamically stabilized robots.
This one goes back a little over 10 years --
"BigDog."
It's got a gyroscope that helps stabilize it.
It's got sensors and a control computer.
Here's a Cheetah robot that's running with a galloping gait,
where it recycles its energy,
it bounces on the ground,
and it's computing all the time
in order to keep itself stabilized and propelled.
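A toy version of that "computing all the time": read the body's tilt, compute a correcting torque, apply it, repeat, hundreds of times a second. This is a generic PD loop on an inverted-pendulum model with made-up gains and masses, not the Cheetah's actual controller.

```python
import numpy as np

# Toy balance loop: sense the tilt, compute a correcting torque, apply it, repeat.
g, length, mass, dt = 9.81, 1.0, 10.0, 0.002    # assumed plant parameters
kp, kd = 400.0, 60.0                             # assumed PD feedback gains

theta, theta_dot = np.deg2rad(5.0), 0.0          # start tilted 5 degrees

for step in range(2000):                         # 4 seconds of 500 Hz control
    torque = -kp * theta - kd * theta_dot        # feedback from the "gyro" reading
    theta_ddot = (g / length) * np.sin(theta) + torque / (mass * length**2)
    theta_dot += theta_ddot * dt
    theta += theta_dot * dt

print(f"final tilt: {np.degrees(theta):.4f} degrees")   # driven back toward zero
```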
And here's a bigger robot
that's got such good locomotion using its legs,
that it can go in deep snow.
This is about 10 inches deep,
and it doesn't really have any trouble.
This is Spot, a new generation of robot --
just slightly older than the one that came out onstage.
And we've been asking the question --
you've all heard about drone delivery:
Can we deliver packages to your houses with drones?
Well, what about plain old legged-robot delivery?
(Laughter)
So we've been taking our robot to our employees' homes
to see whether we could get in --
(Laughter)
the various access ways.
And believe me, in the Boston area,
there's every manner of stairway twist and turn.
So it's a real challenge.
But we're doing very well, about 70 percent of the way.
And here's mobile manipulation,
where we've put an arm on the robot,
and it's finding its way through the door.
Now, one of the important things about making autonomous robots
is to make them not do just exactly what you say,
but make them deal with the uncertainty of what happens in the real world.
So we have Steve there, one of the engineers,
giving the robot a hard time.
(Laughter)
And the programming still tolerates all that disturbance --
the robot does what it's supposed to.
Here's another example, where Eric is tugging on the robot
as it goes up the stairs.
And believe me,
getting it to do what it's supposed to do in those circumstances
is a real challenge,
but the result is something that's going to generalize
and make robots much more autonomous than they would be otherwise.
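One way to see the difference between "do exactly what you say" and dealing with real-world uncertainty is to compare replaying a precomputed plan with closing the loop on where the robot actually is when an unexpected tug arrives. The one-dimensional model and all numbers below are illustrative only.

```python
import numpy as np

# A point "robot" driven by velocity commands toward a target; at step 150
# something unexpected (a tug) shoves it backwards by 0.4.
dt, steps, target = 0.01, 400, 1.0

def run(closed_loop):
    x = 0.0
    plan = np.full(steps, target / (steps * dt))   # precomputed velocity profile
    for k in range(steps):
        u = 2.0 * (target - x) if closed_loop else plan[k]   # feedback vs. replay
        x += u * dt
        if k == 150:
            x -= 0.4                               # the tug
    return x

print("open loop  :", round(run(False), 3))        # ends well short of the target
print("closed loop:", round(run(True), 3))         # recovers and reaches it
```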
This is Atlas, a humanoid robot.
It's a third-generation humanoid that we've been building.
I'll tell you a little bit about the hardware design later.
And we've been asking:
How close to human levels of performance and speed could we get
in an ordinary task,
like moving boxes around on a conveyor?
We're getting up to about two-thirds of the speed
at which a human operates, on average.
And this robot is using both hands, it's using its body,
it's stepping,
so it's really an example of dynamic stability,
mobile manipulation
and mobile perception.
Here --
(Laughter)
We actually have two Atlases.
(Laughter)
Now, everything doesn't go exactly the way it's supposed to.
(Laughter)
And here's our latest robot, called "Handle."
Handle is interesting, because it's sort of half like an animal,
and it's half something else
with these leg-like things and wheels.
It's got its arms on in kind of a funny way,
but it really does some remarkable things.
It can carry 100 pounds.
It's probably going to lift more than that,
but so far we've done 100.
It's got some pretty good rough-terrain capability,
even though it has wheels.
And Handle loves to put on a show.
(Laughter)
(Applause)
I'm going to give you a little bit of robot religion.
A lot of people think that a robot is a machine where there's a computer
that's telling it what to do,
and the computer is listening through its sensors.
But that's really only half of the story.
The real story is that the computer is on one side,
making suggestions to the robot,
and on the other side are the physics of the world.
And that physics involves gravity, friction, bouncing into things.
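A rough sketch of those two sides in code: each tick the computer picks a force, and a crude physics model with gravity, ground contact and friction decides what the body actually does. Nothing here is the real dynamics of any of these robots; it is just the shape of the loop, with made-up constants.

```python
# Computer side: a constant forward push.  Physics side: gravity, the ground,
# friction.  The outcome is the product of both halves.
g, dt = 9.81, 0.002
mass, mu, restitution = 20.0, 0.3, 0.4    # assumed mass, friction, bounciness

x, z, vx, vz = 0.0, 0.5, 0.0, 0.0         # start half a metre off the ground

for _ in range(1000):                      # 2 seconds at 500 Hz
    fx = 100.0                             # computer side: commanded forward force
    ax, az = fx / mass, -g                 # physics side: gravity always applies
    if z <= 0.0:                           # ...and so does the ground
        az = 0.0
        if vz < 0.0:
            vz = -restitution * vz         # bounce rather than pass through
        if vx != 0.0:
            ax -= mu * g * (1 if vx > 0 else -1)   # sliding friction opposes motion
    vx += ax * dt
    vz += az * dt
    x += vx * dt
    z = max(z + vz * dt, 0.0)

print(f"after 2 s: x = {x:.2f} m, z = {z:.2f} m")
```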
In order to have a successful robot,
my religion is that you have to do a holistic design,
where you're designing the software, the hardware and the behavior
all at one time,
and all these parts really intermesh and cooperate with each other.
And when you get the perfect design, you get a real harmony
between all those parts interacting with each other.
So it's half software and half hardware,
plus the behavior.
We've done some work lately on the hardware, where we tried to go --
the picture on the left is a conventional design,
where you have parts that are all bolted together,
conductors, tubes, connectors.
And on the right is a more integrated thing;
it's supposed to look like an anatomy drawing.
Using the miracle of 3-D printing,
we're starting to build parts of robots
that look a lot more like the anatomy of an animal.
So that's an upper-leg part that has hydraulic pathways --
actuators, filters --
all embedded, all printed as one piece,
and the whole structure is developed
with a knowledge of what the loads and behavior are going to be,
which is available from data recorded from robots
and simulations and things like that.
So it's a data-driven hardware design.
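As a sketch of that data-driven step, you might take the logged loads on a joint, find a near-worst-case value, and size the printed part against it with a margin. The synthetic log data and the safety factor below are stand-ins for illustration, not Boston Dynamics numbers.

```python
import numpy as np

# Stand-in for load histories recorded from robots and simulations (newtons).
rng = np.random.default_rng(0)
logged_knee_force = rng.gamma(shape=3.0, scale=400.0, size=100_000)

design_load = np.percentile(logged_knee_force, 99.9)   # near-worst case seen in the data
safety_factor = 1.5                                    # assumed engineering margin
required_strength = design_load * safety_factor

print(f"99.9th percentile load: {design_load:.0f} N")
print(f"part must withstand   : {required_strength:.0f} N")
```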
And using processes like that,
not only the upper leg but some other things,
we've gotten our robots to go from big, behemoth, bulky, slow, bad robots --
that one on the right, weighing almost 400 pounds --
down to the one in the middle which was just in the video,
weighs about 190 pounds,
just a little bit more than me,
and we have a new one,
which is working but I'm not going to show it to you yet,