Subtitles

  • Welcome to Tesla AI Day 2022.

  • I do want to set some expectations with respect to our Optimus robot.

  • As you know, last year it was just a person in a robot suit.

  • But we've come a long way, and compared to that, I think it's going to be very impressive.

  • So should we bring out the bot?

  • Before we do that, we have one little bonus tip for the day.

  • This is actually the first time we've tried this robot without any backup support: no cranes, no mechanical mechanisms, no cables, nothing.

  • You ready?

  • Let's go.

  • Think about something.

  • So this is essentially the same Full Self-Driving computer that runs in your Tesla cars, by the way.

  • This is literally the first time the robot has operated without a tether, on stage tonight.

  • So the robot can actually do a lot more than we just showed you.

  • We just didn't want it to fall on its face.

  • So we'll show you some videos now of the robot doing a bunch of other things.

  • Yeah, we wanted to show a little bit more of what we've done over the past few months with the bot, beyond just walking around and dancing on stage.

  • Just humble beginnings, but you can see the Autopilot neural networks running as-is, just retrained for the bot directly on that new platform.

  • That's my watering can. When you see a rendered view, that's the world the robot sees.

  • So it's very clearly identifying objects: this is the object it should pick up, picking it up.

  • We use the same process as we did for Autopilot to collect data and train neural networks that we then deploy on the robot.

  • That's an example that illustrates the upper body a little bit more.

  • What you saw was what we call Bumble C. That's our sort of rough development robot, using semi-off-the-shelf actuators.

  • But we've actually gone a step further than that already.

  • The team's done an incredible job.

  • And we actually have an Optimus with fully Tesla-designed actuators, battery pack, control system, everything.

  • It wasn't quite ready to walk, but I think it will walk in a few weeks. We wanted to show you the robot, something that's actually fairly close to what will go into production, and show you all the things it can do.

  • So let's bring it out.

  • It has the degrees of freedom that we expect in Optimus production unit one, which is the ability to move all the fingers independently, and for the thumb to have two degrees of freedom.

  • So it has opposable thumbs on both the left and right hands.

  • So it's able to operate tools and do useful things.

  • Our goal is to make a useful humanoid robot as quickly as possible.

  • Optimus is designed to be an extremely capable robot, but made in very high volume, probably ultimately millions of units.

  • And it is expected to cost much less than a car.

  • So I would say probably less than $20,000.

  • The potential for Optimus is, I think, appreciated by very few people.

  • Hey, as usual, Tesla demos are coming in hot.

  • So that robot that came out and did the little routine for you guys: we had that built within six months.

  • We've been working on software integration and hardware upgrades over the months since then.

  • But in parallel, we've also been designing the next generation, this one over here.

  • Obviously there's a lot that's changed since last year, but there's a few things that are still the same.

  • You'll notice we still have this really detailed focus on the true human form.

  • So on the screen here, you'll see in orange our actuators, which we'll get to in a little bit, and in blue our electrical system.

  • So in the middle of our torso, actually it is the torso, we have our battery pack.

  • This is sized at 2.3 kilowatt hours, which is perfect for about a full day's worth of work.

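As a quick back-of-the-envelope check on that sizing claim (a sketch only; the talk doesn't define a "full day's work", so an eight-hour shift is assumed here):

```python
# Back-of-the-envelope check of the 2.3 kWh pack sizing.
# Assumption (not from the talk): a "full day's work" is an 8-hour
# shift, and the pack can be drained from full to empty.
PACK_ENERGY_KWH = 2.3
SHIFT_HOURS = 8

avg_power_w = PACK_ENERGY_KWH * 1000 / SHIFT_HOURS
print(f"Average sustainable draw: {avg_power_w:.0f} W")  # ~288 W
```

Under those assumptions the robot has a bit under 300 W to spend on average across compute and actuation; the actual draw would swing well above and below that between idling and heavy work.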
  • What's really unique about this battery pack is it has all of the battery electronics integrated into a single PCB within the pack.

  • So going on to, sort of, our brain: it's not in the head, but it's pretty close.

  • Also in our torso, we have our central computer.

  • So it's going to do everything that a human brain does: processing vision data, making split-second decisions based on multiple sensory inputs, and also communications.

  • So to support communications, it's equipped with wireless connectivity as well as audio support.

  • And then it also has hardware-level security features, which are important to protect both the robot and the people around the robot.

  • So now that we have our sort of core, we're going to need some limbs on this guy.

  • And we'd love to show you a little bit about our actuators and our fully functional hands as well.

  • So there are many similarities between a car and the robot when it comes to power train design.

  • The things that matter most here are energy, mass, and cost.

  • In this particular case, you see a car with two drive units, and the drive units are used to accelerate the car, to achieve a 0-to-60-mph time or to drive a city drive cycle, while the robot has 28 actuators.

  • And it's not obvious what the tasks are at the actuator level.

  • So we have tasks that are higher level, like walking or climbing stairs or carrying a heavy object, which need to be translated into joint specs. The rotary actuator in particular has a mechanical clutch and an angular contact ball bearing integrated on the high-speed side, a cross roller bearing on the low-speed side, and the gear train is a strain wave gear.

  • There are three integrated sensors here and a bespoke permanent magnet machine.

  • So our actuator is able to lift a half ton, 9-foot concert grand piano.

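To make that "tasks translated into joint specs" step concrete, here is a hypothetical sketch of turning one task-level requirement into a torque spec for a single rotary actuator. Every number, including the 50:1 strain wave reduction, is an illustrative assumption, not a Tesla figure:

```python
# Hypothetical translation of a task-level requirement ("hold a 10 kg
# object at arm's length") into a shoulder-joint torque spec.
G = 9.81            # gravity, m/s^2
ARM_LENGTH_M = 0.6  # shoulder-to-hand distance (assumed)
ARM_MASS_KG = 4.0   # arm mass, acting at roughly half the arm length
LOAD_MASS_KG = 10.0

# Static torque about the shoulder with the arm horizontal (worst case).
joint_torque_nm = G * (LOAD_MASS_KG * ARM_LENGTH_M
                       + ARM_MASS_KG * ARM_LENGTH_M / 2)
print(f"Joint torque spec: {joint_torque_nm:.0f} N*m")  # ~71 N*m

# A strain wave gear multiplies motor torque by its reduction ratio,
# which is what lets a small motor meet a large joint spec.
GEAR_RATIO = 50.0  # assumed ratio
motor_torque_nm = joint_torque_nm / GEAR_RATIO
print(f"Motor-side torque: {motor_torque_nm:.2f} N*m")  # ~1.4 N*m
```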
  • Our fingers are driven by metallic tendons that are both flexible and strong.

  • We have the ability to complete wide aperture power grasps while also being optimized for precision gripping of small, thin and delicate objects.

  • Some basic stats about our hands: they have six actuators and 11 degrees of freedom.

  • It has an in-hand controller, which drives the fingers and receives sensor feedback.

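The talk doesn't describe the controller's internals. As a minimal sketch of what "drives the fingers and receives sensor feedback" could mean as a closed loop, with a plain proportional controller standing in for whatever actually runs in the hand:

```python
# Hypothetical in-hand control loop: drive one finger's tendon actuator
# toward a commanded position using the sensed position as feedback.
KP = 5.0   # proportional gain (assumed)
DT = 0.01  # 100 Hz control loop (assumed)

def finger_step(target_pos: float, sensed_pos: float) -> float:
    """One control tick: return the actuator velocity command."""
    return KP * (target_pos - sensed_pos)

sensed = 0.0  # normalized finger position, 0 = open, 1 = closed
for _ in range(200):
    sensed += finger_step(target_pos=1.0, sensed_pos=sensed) * DT
print(f"Finger position after 2 s: {sensed:.3f}")  # converges toward 1.0
```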
  • We ported directly from Autopilot to the bot's situation: it's exactly the same occupancy network, which we'll talk about in a little more detail later with the Autopilot team, that is now running on the bot here in this video. The only thing that really changed is the training data that we had to recollect.

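The presentation names the occupancy network but not its interfaces; as an illustrative assumption, downstream code might consume its output as a voxel grid of occupancy probabilities around the robot, along these lines:

```python
import numpy as np

# Assumed output form: a voxel grid of occupancy probabilities.
VOXEL_SIZE_M = 0.1
GRID_SHAPE = (80, 80, 40)  # 8 m x 8 m x 4 m volume around the bot
occupancy = np.random.rand(*GRID_SHAPE)  # stand-in for the network output

def is_occupied(xyz_m, origin_m=(-4.0, -4.0, 0.0), threshold=0.5):
    """Check whether a world-frame point falls in an occupied voxel."""
    idx = tuple(int((c - o) / VOXEL_SIZE_M) for c, o in zip(xyz_m, origin_m))
    if any(i < 0 or i >= s for i, s in zip(idx, GRID_SHAPE)):
        return False  # outside the mapped volume
    return occupancy[idx] > threshold

print(is_occupied((1.0, 0.5, 0.8)))
```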
  • We're also trying to find ways to improve those occupancy networks using work done on neural radiance fields, to get really great volumetric renders of the bot's environment: for example, here, machinery that the bot might have to interact with.

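For reference, the core of a neural radiance field is a differentiable volume-rendering step. This standalone sketch shows that accumulation, with made-up densities and colors standing in for a trained network's output:

```python
import numpy as np

# Volume rendering along one camera ray, NeRF-style.
n_samples = 64
delta = 0.05                               # spacing between samples (m)
density = np.random.rand(n_samples) * 2.0  # sigma at each sample (made up)
color = np.random.rand(n_samples, 3)       # RGB at each sample (made up)

alpha = 1.0 - np.exp(-density * delta)     # per-sample opacity
# Transmittance: how much light survives everything in front of a sample.
transmittance = np.cumprod(np.concatenate(([1.0], 1.0 - alpha[:-1])))
weights = transmittance * alpha
pixel = (weights[:, None] * color).sum(axis=0)  # rendered pixel color
print("Rendered RGB:", pixel)
```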
  • So we've been training more neural networks to identify high-frequency features, key points within the bot's camera streams, and to track them across frames over time as the bot navigates through its environment.

  • And we're using those points to get a better estimate of the bot's pose and trajectory within its environment as it's walking.

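The talk doesn't say which tracker is used. As an assumed stand-in, classic Lucas-Kanade optical flow illustrates the "detect keypoints, track them across frames" step, with synthetic frames replacing real camera streams:

```python
import cv2
import numpy as np

# Two synthetic textured frames; the second is shifted 3 px to the
# right to simulate camera motion.
rng = np.random.default_rng(0)
frame0 = (rng.random((240, 320)) * 255).astype(np.uint8)
frame1 = np.roll(frame0, shift=3, axis=1)

# Detect keypoints in the first frame, then track them into the second.
pts0 = cv2.goodFeaturesToTrack(frame0, maxCorners=100,
                               qualityLevel=0.01, minDistance=7)
pts1, status, _err = cv2.calcOpticalFlowPyrLK(frame0, frame1, pts0, None)

# Average apparent motion of successfully tracked points; feeding many
# such correspondences into a pose solver yields the ego-motion estimate.
flow = (pts1 - pts0)[status.flatten() == 1].reshape(-1, 2)
print("Mean pixel motion:", flow.mean(axis=0))  # roughly [3, 0]
```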
  • And this is a video of the motion control code running in the Optimus simulator, showing the evolution of the robot's walk over time.

  • So as you can see, we started quite slowly in April and started accelerating as we unlocked more joints and deployed more advanced techniques, like arms balancing, over the past few months.

  • We wanted to manipulate objects while looking as natural as possible, and also get there quickly.

  • So what we've done is we've broken this process down into two steps.

  • The first is generating a library of natural motion references, or we can call them demonstrations, and the second is adapting these motion references online to the current real-world situation.

  • So let's say we have a human demonstration of picking up an object.

  • We can get a motion capture of that demonstration, which is visualized right here as a bunch of keyframes representing the locations of the hands, the elbows, and the torso.

  • We can map that to the robot using inverse kinematics.

  • And if we collect a lot of these, now we have a library that we can work with.

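As a toy illustration of that mapping step, here is inverse kinematics for a planar two-link arm; the real mapping covers many more joints, and the link lengths and keyframes below are made up:

```python
import math

L1, L2 = 0.3, 0.3  # upper-arm and forearm lengths in meters (assumed)

def two_link_ik(x, y):
    """Shoulder and elbow angles (radians) placing the hand at (x, y)."""
    d2 = x * x + y * y
    cos_elbow = (d2 - L1**2 - L2**2) / (2 * L1 * L2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

# A "demonstration" is a sequence of hand keyframes; mapping each one
# through IK yields a joint-space reference trajectory for the library.
hand_keyframes = [(0.45, 0.10), (0.40, 0.25), (0.30, 0.35)]
for x, y in hand_keyframes:
    s, e = two_link_ik(x, y)
    print(f"shoulder={math.degrees(s):6.1f} deg, "
          f"elbow={math.degrees(e):6.1f} deg")
```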
  • But a single demonstration is not generalizable to the variation in the real world.

  • For instance, this would only work for a box in a very particular location.

  • So what we've also done is run these reference trajectories through a trajectory optimization program, which solves for where the hand should be and how the robot should balance when it needs to adapt the motion to the real world.

  • So for instance, if the box is in this location, then our optimizer will create this trajectory instead.

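The optimizer itself isn't shown; this toy sketch (made-up costs and weights, with scipy in place of whatever Tesla actually uses) captures the idea of bending a demonstrated hand path so it still ends wherever the box actually is:

```python
import numpy as np
from scipy.optimize import minimize

T = 10
reference = np.linspace([0.0, 0.0], [0.5, 0.3], T)  # demonstrated hand path
new_box = np.array([0.6, 0.1])                      # the box moved

def cost(flat):
    path = flat.reshape(T, 2)
    smoothness = np.sum(np.diff(path, axis=0) ** 2)   # keep the motion smooth
    tracking = np.sum((path - reference) ** 2)        # stay near the demo
    goal = 100.0 * np.sum((path[-1] - new_box) ** 2)  # end at the new box
    return smoothness + 0.1 * tracking + goal

result = minimize(cost, reference.ravel(), method="L-BFGS-B")
adapted = result.x.reshape(T, 2)
print("Adapted endpoint:", adapted[-1])  # pulled to the new box location
```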
  • I think the first thing within the next few weeks is to get Optimus at least at par with Bumble C, the other bot prototype you saw earlier, and probably beyond.

  • We're also going to start focusing on a real use case at one of our factories, to make this project a reality and change the entire economy.

  • All of this was done in barely six or eight months.

  • Thank you very much.
