Flexible automation that fits: podcast transcript
Dale Tutt: Is there a distinct advantage with the humanoid robots that we haven’t talked about already?
Rahul Garg: The one big piece is the flexibility that they bring, right, similar to a human, to do many different things, and the dexterity that they bring in doing those things. The other interesting aspect that they bring to the table is doing things that humans could not. For example, turning their head around by 180 degrees and seeing what’s in the back. Today, the degrees of freedom that a humanoid has are relatively fewer compared to a human, and that’s primarily because of weight and cost constraints and the intelligence required behind it. But as the compute power capabilities improve, and as the cost and the weight of these things come down, they will bring in a lot more in terms of their ability to do things well beyond what a human can do. And this goes back to the original context we were talking about. Why these humanoids are very exciting is because they can fit into a human environment as is today. Think about how that whole process can be further enhanced by not being constrained by what a human can do today.
Many assembly processes in an airplane have been designed to support a human doing a certain function. Maybe the way the routing of the cable is done, or the harness is done, and how that arm will go in to lock a certain piece and put in a clip, is designed around how a human arm would move. Now imagine if you don’t have to worry about the constraints of a human arm, or its size, or its length, and you can have a humanoid that has a lot more dexterity and flexibility. You may change the whole manufacturing process in a much more efficient way, right, versus being constrained by what a human could possibly do. Getting an alternative machine to come in and replace that process is one of the biggest value propositions, right?
And the ability for that humanoid, right, to interact with human tools, legacy systems, right, and pretty much fit into that existing environment, drop it in, in a simple way, is the biggest value add. The other big value add, the much bigger value add, is that you train a humanoid once, or a robot once, and you don’t have to worry about it for the rest of its life. You need to increase your capacity? You’ve got that one trained one time, you add five more of those, and they get trained with the flashing of software. You don’t need to worry about it for the rest of its life. It’ll keep learning, and it’ll keep learning more. And then you replicate that five, 10, 100,000 times. That’s the biggest value proposition out there that is going to get uncovered as these things start getting more adoption in the field.
Dale Tutt: But it certainly changes the equation for the learning curve. As you were describing that, I was having visions of the Matrix movie, where they’re like, can you program me to learn how to fly a helicopter? And they sit there and download the program, and they start flying the helicopter.
Rahul Garg: That’s it: the labor shortages, the economic value proposition, the flexibility to handle changeover, changing workloads, changing workflows, and working in an automated environment, working in an already set-up facility where there are already robots. One of the companies that we’ve been talking to has built these humanoids, or robots, actually, that’s what they’re calling them, to unload trucks. You would think unloading a truck is a very simple process, but it is not, because you have various shapes and sizes of boxes and different ways they are packed; it requires some level of understanding and dexterity to move all that around. Using AI and using all of the flexibility that we’ve been talking about, they are now unloading in a more efficient way, laying the boxes out and setting them up for the next step of the process in a more efficient way. There are no errors in that process. The box is always label up or label down, however you want it. It’s already pre-sorted for how it needs to be used. And it’s making it very easy for them to have 10 of these loading and unloading 24 hours a day, without having to worry about some of the typical workforce constraint issues that may be going on in a logistics facility, for example, right? Pretty amazing things are already beginning to happen.
Dale Tutt: You’ve touched on so many great points there. And I think that you’re actually able to put the AI into the electronics now, so that you can actually do the training and you can do the advanced sensing, the optical sensors. This is why its time has now come: really being able to bring these technologies together in a way that’s meaningful for people. And you talked a little bit about generative AI there and its role. There’s also a lot of discussion around physical AI and how it embodies the physical geometry. So could you define physical AI for those people who maybe are not familiar with it? How is it helping in the development of humanoids?
Rahul Garg: Physical AI, the way I look at it, is having the ability, first of all, to handle different modalities. When we talk about generative AI in a traditional context, it’s more around using text, maybe some pictures, and trying to build some new capabilities based on that. In the context of physical AI, now you’re bringing in vision, you’re bringing in perception, you’re bringing in a lot more action, you’re bringing in a lot more reasoning, and being able to evaluate how the action that you’re looking for the humanoid to perform will interact with the environment, and how it will react to the physical action that it’s trying to do. Let me make it a little bit simpler. You have a humanoid arm and you’re trying to get that arm to move straight ahead. Now, to get it to move straight ahead, certain commands need to be given to it. The arm starts hanging straight down, by your side, and now you’re trying to raise it and lift it straight ahead. But you need to have the vision sensing to say, there is a wall in front of me and I need to be able to avoid that wall.
To avoid that wall, I cannot extend my arm all the way straight. I need to bend it at the elbow a little bit, because I’m going to hit that wall otherwise. To me, that’s all part of physical AI, where you’re interacting more with the real environment and being able to make decisions and take actions based on that real physical world that you’re interacting in. That whole computer vision, the sensor fusion, the motion planning, they all become a lot more important in that context. And being able to react based on the sensor feedback that you’re getting is becoming very critical. This is one of the things we spoke about earlier, how the development of sensors has improved dramatically. For example, a very simple case: sensors could not detect whether you’re standing in front of a mirror or it’s just a straight pass-through. Now the vision sensors have the ability to recognize that it’s a mirror, so the robot doesn’t just walk into it. These are some great capabilities that have come about, which have obviously helped the development and the rise of these things.
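The elbow-bend example above can be sketched in a few lines of geometry. This is a minimal illustration only, not anything from the episode: a hypothetical planar two-link arm whose elbow bends just enough that its forward reach stays short of a wall reported by a depth sensor. The function name, link lengths, safety margin, and sensor readings are all made-up values for the sketch.

```python
import math

def elbow_bend_for_obstacle(upper_arm, forearm, wall_distance, margin=0.05):
    """Return the elbow bend angle (radians) so a planar two-link arm,
    raised straight ahead, stops short of a wall.

    The upper arm stays horizontal; only the elbow bends, so the
    horizontal reach is: upper_arm + forearm * cos(bend).
    All lengths/distances are in meters.
    """
    max_reach = wall_distance - margin
    if max_reach >= upper_arm + forearm:
        return 0.0  # wall is out of reach: keep the arm fully extended
    # Horizontal space left for the forearm, clamped to what acos allows.
    remaining = max(max_reach - upper_arm, -forearm)
    remaining = min(remaining, forearm)
    return math.acos(remaining / forearm)

# Re-plan whenever the perceived wall distance changes (simulated readings).
for wall in (1.2, 0.8, 0.6):
    bend = elbow_bend_for_obstacle(0.35, 0.35, wall)
    reach = 0.35 + 0.35 * math.cos(bend)
    print(f"wall at {wall:.2f} m -> elbow bend {math.degrees(bend):5.1f} deg, "
          f"reach {reach:.2f} m")
```

A real controller would fold this kind of constraint into a motion planner fed by fused sensor data, but the core idea is the same: the commanded pose is recomputed from the latest perception of the environment rather than fixed in advance.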
Dale Tutt: I’m trying to think about what the trajectory of humanoid robots looks like. You’ve talked about the manufacturing ramp-ups, where they’re going to deliver 120,000 to 150,000 this year versus what we’ve done in the past. And then you’ve talked about the example of how much it’s advanced, even in the last month or two since CES. Are we on a hockey-stick trajectory? Where do you see this going in the future with the development and advancement of humanoid robots, in terms of numbers and capabilities?
Rahul Garg: I think in the next five years, we are certainly going to move from the hype to scalable commercial deployment. That’s going to be the one big thing, right? Deployment will be happening. There are numbers all the way up to 2 to 3 million of these in the next couple of years. Having said that, it’s not that everyone is going to go all out and start rolling these out. So there are various experts who are trying to figure out what’s going to happen, not just from a technology perspective. There’s a whole human aspect of this as well, and the environmental aspect of it, and the socioeconomic aspect of these things. There’s a lot more consideration to it than just the technology piece. Because right now, it’s a new technology, and it’s got that flavor-of-the-month aspect to it. But certainly, we are going to see more rapid adoption. One of the other reasons why we are going to see more rapid adoption is the cost coming down. We are going to see the cost come down to that utopian number of $20,000, $30,000. We’re not there as yet.
From a design and manufacturing perspective, they’re somewhere in the $150,000 to $200,000 range, but that cost will come down. And especially many of the companies in China are focusing on that in a very aggressive way, right? In fact, one of the companies, called Unitree, is selling these for like $5,999, right? It’s going to come down. But then that version cannot do a whole lot other than walking along with you. I think cost coming down will be a big factor. The ability to do a lot of things with the hands, that is still a little bit of a bottleneck. There’s not enough in terms of how the fingers and the hands can move. It’s quite amazing what our hands, as humans, are capable of, the manipulation that they can do. That still needs some more engineering work, and there’s a lot of R&D pouring into that. There are actually companies that are focusing on just that piece itself. So there are still some of these things, but certainly in manufacturing and warehousing, we’re going to see much larger-scale adoption, especially where there are repeatable tasks and high labor shortages.
The ROIs are quite easily visible. Companies like BMW, Mercedes, Amazon, they have all publicly claimed it, right? They’re already starting to use them in their production shops. So now it’s just a question of scaling it. Another big area, actually, by the way, that people are looking at is healthcare and elder care. You know, I saw an example of a company that has designed these humanoids to provide motion to a person who is completely bedridden. Now, imagine that person, who’s been in that position for a long, long time, being able to be strapped to a humanoid and move around their home. Those are some very interesting cases that are coming along as well.


