Thought Leadership

Advanced robotics and AI in industrial manufacturing - Episode 3 transcript

Chris Pennington: Thanks for joining us today, Max. It’s been great to have you share your knowledge and experience with our listeners.

Now, designing and building an automated factory using simulation is one thing, but making sure that those robots can work in harmony with each other is another. Next, I’d like to introduce an expert who can enlighten us on the processes that go into making robots move. Hello, Etienne, and welcome to the podcast. Could you go into a bit more depth about your role and experience first, please?


Etienne Ferre:
Hi, Chris. My group comes from a company named Kineo that was founded in 2001, together with a robotics research lab in France. Siemens acquired our company in 2012, and since then we have continued the same business: developing software components for industry in general and PLM in particular.

A software component, as you know, is a library with a C++ interface, so it’s ready for other developers to integrate our algorithms into their applications. Today we have three main libraries: the Kineo path planner, for robots and for any type of articulated system; the collision detector, KCD; and the flexible cable simulator. Those libraries are distributed to external customers outside of Siemens, but they are also used by other Siemens software like NX or Process Simulate. In Process Simulate, the automatic path planner, the collision detection, and the simulation of the flexible cables are all provided by Kineo.


Rahul Garg:
Thanks, Etienne, that’s a great introduction to some of the things you are doing. One of the things you mentioned is that you provide your applications as foundation libraries that are used inside Siemens Digital Industries Software applications, but also by other companies. Could you give us a sense of how other robot manufacturers also use some of these libraries?


Etienne Ferre:
Our libraries are mainly integrated into software that will use a robot; it’s not directly the robot vendors that use our path planning libraries and so on, but more the companies in the integration phase. When you go to the shop floor, you have to integrate a robot, pilot the robot, and sometimes program the robot automatically. Integrators can use our software in order to have a complete system that will automatically program the robot. In that sense, they could use any type of robot from any vendor.


Rahul Garg:
You know, Etienne, one of the things we spoke about earlier is that for manufacturing systems and automation systems to work in perfect harmony and achieve the highest possible throughput, you need the automation system, the robot, and the human all working together. Obviously, that has challenges. Could you explain to us how Kineo helps with that?


Etienne Ferre:
Well, when we speak about path planning, we often distinguish between a structured environment and an unstructured environment. A structured environment is what we have used for decades: everything is known in advance, so we know the position of the robot and the obstacles, we can program the robot in the digital factory, in Process Simulate for instance, and the robot will always execute the same movement in reality.

That’s why, on the shop floor, the robotic cell is very often in a cage, to ensure that the environment remains structured and safe. By contrast, an unstructured environment is not known in advance. Obstacles are sometimes not present in the simulation, or the parts are not always the same, and sometimes a human can enter the workspace of the robot.

Typically, we don’t know in advance what the motion of the human will be. So if we have a robot that can share the same space as the human, this is what we call a collaborative robot. Sometimes they share the same task, but very often they share the same workspace. In that case, the question of safety is crucial in the literal sense, because the interaction between the robot and the human can sometimes be very dangerous, especially if the robot is heavy. Robot safety is a very complicated topic because it involves many aspects.

Just like with autonomous cars, we don’t claim that we provide a complete system solving everything. But if we look at the part of the system that must quickly compute a trajectory avoiding the human, we can do that. When a human worker enters the robot workspace, Kineo can automatically check that the trajectory is still safe for the human, and if not, we can recompute it very quickly to avoid the human. This is what we call reactive path planning.
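The check-then-replan loop Etienne describes can be pictured with a toy monitor, assuming the trajectory is a list of 2D waypoints and the human worker is a point obstacle with an assumed safety margin. The names and the margin value here are illustrative, not Kineo’s API:

```python
import math

SAFETY_MARGIN = 0.5  # assumed clearance, in metres (illustrative value)

def trajectory_safe(waypoints, human_pos, margin=SAFETY_MARGIN):
    """The current plan is safe if every waypoint keeps its distance
    from the human worker's position."""
    return all(math.dist(p, human_pos) > margin for p in waypoints)

def control_cycle(waypoints, human_pos, replan):
    """One cycle of reactive path planning: keep the trajectory if it is
    still safe for the human, otherwise call the planner again."""
    if trajectory_safe(waypoints, human_pos):
        return waypoints
    return replan(waypoints, human_pos)
```

In a real system this cycle would run at sensor rate, with the human’s position coming from the vision system and `replan` being the fast path planner.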


Rahul Garg:
Could you give an example of where something like this is used in the industry?


Etienne Ferre:
Well, this is still research. It’s not really integrated in day-to-day industry, but we have had some research projects in the past in Europe. We wanted to have a camera that would watch the motion of the human and, in real time, allow the human to disturb the motion of the robot, with the robot adapting its trajectory. It’s still a prototype, and I don’t know if it has been integrated into real industry yet.


Rahul Garg:
But what about in an automotive assembly line with a lift assist? How is that managed?


Etienne Ferre:
There are lift-assist systems that carry, for instance, the load of the seat to help the human assemble the seat inside the car. In that case, the motion is produced by the human, who guides the seat into the car. With Kineo, you can check in advance that the motion will be possible with respect to collisions, that there is enough space to bring the seat inside the car. We check that a motion exists, and that motion is then executed by the human on the shop floor.


Rahul Garg:
That makes the job of the operator a lot easier. With the lift assist, they don’t have to carry the heavy objects, and they can be sure there will not be any collision along the carrying path as well.

Etienne Ferre: Yes, and thanks to the digital twin, we can check that the motion will be possible on the shop floor.

Rahul Garg: When you talk about motions and robots and lift assists, there are obviously a lot of cables that also come in the way of robots. Is that something that Kineo helps with as well?


Etienne Ferre: 
Yes. During the development of path planning many years ago, we often had requests from customers to also take into account the motion of the cables, because they are attached to the robot. If you look at a real robotic cell, there are plenty of cables everywhere around the robot, and those cables can disturb the motion of the robot.

This is why we have developed a real-time flexible cable simulator. Thanks to that simulator, the motion of the cable can be validated on the digital twin. Inside Process Simulate, we can check that the cable will be long enough, or that the maximum tension is respected. Thanks to that library, we can simulate and predict what will happen on the shop floor. We can simulate the gravity, the contact with obstacles, contact with other cables, and also how the cable is attached to the robot, because there are very complex systems to attach a cable to a robot, which we call a dress pack. Thanks to Kineo, we can simulate everything that happens to the cables inside the factory.


Rahul Garg:
I would imagine that can become very valuable. You don’t want a cable joint breaking somewhere in the middle of your operation, so being able to evaluate every little element, down to the cables, through the digital twin becomes a very important capability for someone leveraging robots.

Etienne, just one more question from my end. When we were talking to Max earlier, who’s using a lot of robots in the Erlangen factory, he spoke about multiple robots being used at the same time and trying to make sure that there are no collisions between two or more robots. Is that something that Kineo can help with and analyze?


Etienne Ferre:
Yes. This is an ongoing project that we have today, because for one robot it works very well, and we have many customers using our algorithm in their systems. Today we are working on multiple-robot motion planning. As Max said, in industrial robotic cells now, in the automotive industry for example, for spot welding, you can have six to eight robots working together in the same workspace.

In the industry, we synchronize the motions with signal exchanges. Thanks to Kineo, you can automatically put the right signal at the right time to reduce the cycle time of the global process. This is what we call interlock solving. It is done automatically by the Kineo algorithm, and it is something available in Process Simulate.
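As a toy illustration of what interlock solving optimizes (this is not Kineo’s algorithm), imagine two robot programs whose occupancy of a shared welding zone is known as time intervals. The solver’s job is, in effect, to find the smallest delay signal that keeps the two robots out of the zone at the same time:

```python
def interlock_delay(a_busy, b_busy):
    """Smallest integer delay (in abstract time units) to apply to robot B
    so that none of its shared-zone intervals overlaps robot A's.
    Intervals are half-open (start, end) pairs."""
    delay = 0
    while any(b_start + delay < a_end and a_start < b_end + delay
              for a_start, a_end in a_busy
              for b_start, b_end in b_busy):
        delay += 1
    return delay
```

A real interlock solver works on full robot programs and inserts wait/set-signal instructions, but the goal is the same: serialize only the conflicting portions of the motions so that overall cycle time stays minimal.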


Rahul Garg:
I would imagine that as the adoption of robots increases, the ability to have multiple robots working in the same cell will become more critical. Being able to ensure there are no collisions in that process is obviously going to be of tremendous value.


Etienne Ferre: 
Yeah. This is something that is done manually today by the engineer. In that process, the engineer has to put the right signals in place, and it’s very tedious; it takes a lot of time to tune the robotic cell in order to avoid crashes between the robots. Thanks to Kineo, in a matter of seconds you can automatically put the right signal at the right place.


Rahul Garg:
That can be very helpful for the programmers, making sure that they can get this done a lot more efficiently.

You spoke about multiple robots and how we do the path planning. How do you go about trying to determine the best path for a robot to follow? Is there some inspiration you’re getting from somewhere, or what are the algorithms you have been using to figure that out?


Etienne Ferre:
To help people understand how our algorithm works, I often say that our path planning algorithms try to mimic the way an electric arc finds its path. When there is lightning during a storm, if you film the creation of the lightning in very slow motion, you can see that there is a tree growing, with several branches attracted by the ground. As soon as a branch touches the ground, there is a path, and the electric arc is created.

Our algorithm works in a similar way. When we search for a robot trajectory, we start from the starting position of the robot, and we shoot random positions in the space of the degrees of freedom of the robot. If a position is collision-free, we add it to the tree as a new branch, and the tree grows. As soon as one position in the tree can be linked directly to the ending position, we know that there is a path, and we can return it. This family of algorithms is called the probabilistic roadmap approach, because we build a roadmap at runtime.
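The tree-growing idea Etienne describes can be sketched in a few dozen lines. This is a minimal hand-rolled illustration (not Kineo’s code), planning in a 10×10 plane with circular obstacles given as (center, radius) pairs:

```python
import math
import random

def collision_free(p, obstacles):
    """A point is valid if it lies outside every circular obstacle."""
    return all(math.dist(p, center) > radius for center, radius in obstacles)

def segment_free(a, b, obstacles, steps=20):
    """Approximate a straight-segment check by sampling points along it."""
    return all(
        collision_free((a[0] + (b[0] - a[0]) * i / steps,
                        a[1] + (b[1] - a[1]) * i / steps), obstacles)
        for i in range(steps + 1)
    )

def plan(start, goal, obstacles, iters=4000, step=0.5, seed=0):
    """Grow a tree of collision-free branches from `start`; as soon as a
    branch can be linked directly to `goal`, return the path."""
    rng = random.Random(seed)
    parent = {start: None}  # each node remembers the branch it grew from
    for _ in range(iters):
        # Shoot a random position in the (here 2D) configuration space.
        sample = (rng.uniform(0.0, 10.0), rng.uniform(0.0, 10.0))
        nearest = min(parent, key=lambda n: math.dist(n, sample))
        d = math.dist(nearest, sample)
        if d == 0.0:
            continue
        # Grow a branch of bounded length toward the sample.
        f = min(1.0, step / d)
        new = (nearest[0] + (sample[0] - nearest[0]) * f,
               nearest[1] + (sample[1] - nearest[1]) * f)
        if segment_free(nearest, new, obstacles):
            parent[new] = nearest
            # A branch reaches the goal directly: a path exists.
            if segment_free(new, goal, obstacles):
                path, node = [goal], new
                while node is not None:
                    path.append(node)
                    node = parent[node]
                return path[::-1]  # start -> ... -> goal
    return None  # no path found within the iteration budget
```

Strictly speaking this sketch grows a single rapidly-exploring random tree; the probabilistic roadmap family Etienne names builds a graph of random samples in the same spirit, keeping only collision-free configurations and connections.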
 
Rahul Garg:
Yeah, that’s good. Nature has its own ways of inspiring us, and this is probably one of the best examples.

Chris Pennington:
We’ve seen a lot of advancements in how robots are trained to handle new tasks and processes. Which technologies do you think will be the most promising to help robots become more autonomous and more mobile?


Etienne Ferre: 
In the autonomous robotics literature we have the concept of perception, decision, and action. Perception is to recognize and to understand the real world. Where are the objects? What type of object is it? A part to grasp or an obstacle? For this we use vision technology and object recognition, and there have been many improvements in those topics thanks to deep learning techniques.

Decision is to decide the list of actions required to perform a task. For example, to place a part in a box, the robot must move to the grasping position, grasp the part, move to a position inside the box, and release the part. This process is maybe the most difficult to solve in the industry, because for simple pick and place it is very easy, but for complex industrial tasks we still rely on human intelligence, since there is much complex knowledge involved and many business constraints that are very difficult to tackle automatically. Maybe now that we have AI techniques with LLMs, we may have some promising systems, but it’s not solved today.

The last process in the concept of perception, decision, and action is the action, the motion. This is when the robot decides to go to one position: it must compute a valid motion to go there, and it can do that with our library, which can compute several types of motion for almost any type of articulated system.


Chris Pennington: 
Is your team developing any autonomous capabilities around new algorithms? In particular, anything around integrating AI?


Etienne Ferre:
We stay at the level of trajectory planning and trajectory optimization, and for this we use the probabilistic roadmap approach. It’s a family of algorithms that we use, and it works well. As for integrating AI, it depends on what you are calling AI. Artificial intelligence can mean a complex algorithm; in that case, yes, we are integrating AI.

But if you are talking about LLMs or deep learning techniques, no, not yet, because to me it’s maybe not the right algorithmic tool for path planning. LLMs or deep learning cannot solve every problem, and sometimes other techniques are simpler and work better.


Rahul Garg:
One of the areas where I believe AI will become more valuable is the front end of the whole process, which is the perception: trying to determine what the environment is. Vision is where a lot of AI capabilities will become more valuable.


Etienne Ferre:
Vision, and also interaction with the human worker, in order to have a conversation between the human worker and the system, to know what to do, where to go, and so on. But inside the core algorithm, I don’t think that AI is the right tool.


Rahul Garg:
What you’re saying is, from a tool perspective, AI will become valuable in the field of robotics: generally speaking, more valuable on the front end, in the perception piece, number one. Number two, it will become more valuable in the decision-making piece. But the action piece is where we need the finite algorithms, which are more deterministic in their approach.

I think that’s a very good way to understand the value of emerging technologies and how they will get leveraged. And even that is based on what you have been doing over the last few years. It looks like some great capabilities are already in place, all the way from evaluating where a connector and a cable will interfere with the robot, to evaluating paths for multiple robots, to evaluating how a human and a robot would work together.
Being able to get all of that working in a digital way before you actually bring it into the real world becomes very valuable.

Chris Pennington: Etienne, thanks for supporting this discussion. It’s fascinating to get a glimpse into the future of industrial robotics.


This article first appeared on the Siemens Digital Industries Software blog at https://blogs.sw.siemens.com/thought-leadership/advanced-robotics-and-ai-in-industrial-manufacturing-episode-3-transcript/