Advanced robotics and AI in industrial manufacturing – Episode 5 transcript
Chris Pennington: We’ll now be speaking with Alex Greenberg, our digital twin expert. Hi, Alex, and welcome to the show. Could you go into a bit more depth about your role and experience at Siemens?
Alex Greenberg: Yeah, definitely. Hi Chris. Thanks for having me. My name is Alex Greenberg and I’m a director in the Digital Industries Software digital manufacturing division. My focus is on advanced robotics simulation software called Process Simulate, which is aimed at the manufacturing domain: simulation of robotic stations and lines that assist in building products across various industries like automotive, aerospace, electronics, appliances, and many others. I’m excited to be here. Thanks for having me.
Chris Pennington: Excellent. The Process Simulate application sounds a really interesting area for us to talk about. Perhaps you could give us a bit more information on exactly what that is and how it helps your customers.
Alex Greenberg: Yes, certainly. Process Simulate, as I mentioned, is our main simulation solution for automated processes in the manufacturing domain. It’s basically a digital solution for authoring and validating production processes, not only automated ones but also manual.
Our main topic today is robotics and automation within this 3D environment. If I had to summarize it in one sentence: you can optimize your manufacturing process so that the output of this solution is validated and the robot program is ready to download to the robot on the shop floor, where you can say with high confidence that it will work right the first time you run the real robot.
And this is the main value proposition of Process Simulate: you build the digital twin of your manufacturing line or station, introduce the resources (the robots, the tools, the parts you want to assemble), plan your process, simulate it, optimize it, then engage in what we call offline programming, which means creating the actual robot program within the 3D environment for the specific robot you are using.
There are many robot vendors in the world, and once the program is validated you basically download it to the robot risk-free: collisions avoided, the risk of things going wrong reduced, and so on and so forth.
Rahul Garg: That actually gives a good sense of what the solution can do. Two questions come to my mind based on what you said there. The first is: as technologies improve, do you see a situation where offline programming becomes less necessary, and robots begin to use more of their own capabilities and learn in real time? That’s one question.
And the second question is: as robots work, how do we integrate humans? And how do you even simulate the human factor during the factory optimization process?
Alex Greenberg: Thanks, Rahul. These are very good questions. Let me break it down. First of all, I would start with what I would call the classic part of robotic simulation. We have the advanced robots that are smarter, more intelligent, and can perceive and understand things in the manufacturing environment and decide and act upon them. But we also have the good old classic robots that you need to explicitly program. What is common to both of them is that there are some classic challenges pertaining to both groups.
How do you select the right robot? How do you fit it into the working environment? Pertaining to your question about human labor: how does the robot fit within a station where there are humans as well? Do you need fencing? Do you need safety measures? Can the robot reach? Can the robot access everything without collisions? Can the robot do the job with the associated payload it has to lift, and at the same time, can it operate alongside a human in the environment with the necessary safety precautions?
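To make these selection checks concrete, here is a minimal sketch of filtering candidate robots by payload, reach, and human-safety requirements. The robot data, field names, and thresholds are invented for illustration; this is not Process Simulate’s actual API.

```python
from dataclasses import dataclass

@dataclass
class Robot:
    name: str
    reach_mm: float       # maximum reach of the arm
    payload_kg: float     # rated payload
    collaborative: bool   # safe to operate near humans without fencing

def feasible(robot: Robot, part_mass_kg: float, farthest_point_mm: float,
             humans_nearby: bool) -> bool:
    """Return True if the robot passes the basic selection checks."""
    if robot.payload_kg < part_mass_kg:
        return False                      # cannot lift the part
    if robot.reach_mm < farthest_point_mm:
        return False                      # cannot reach the work point
    if humans_nearby and not robot.collaborative:
        return False                      # would need fencing / extra safety
    return True

# Hypothetical candidate robots for a station shared with a human operator
candidates = [
    Robot("compact-6axis", reach_mm=900,  payload_kg=6,   collaborative=True),
    Robot("heavy-6axis",   reach_mm=2800, payload_kg=210, collaborative=False),
]
picks = [r.name for r in candidates
         if feasible(r, part_mass_kg=4, farthest_point_mm=850, humans_nearby=True)]
```

In a real simulation tool, each of these one-line checks is of course a full 3D analysis (reachability, collision, safety zones), but the filtering logic is the same idea.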
These challenges pertain to both groups: to the classic industrial robots but also to the more intelligent ones. This is the first pillar. The second pillar is, as you mentioned, that we have a lot of advanced robots with more perception capabilities, more logic and reasoning, understanding the environment and being able to execute tasks based on the intelligence already embedded in the robot.
Of course, this changes the simulation paradigm. Instead of explicitly programming robots, the simulation engine would probably engage in scenario management or scenario assessment: build a concept, introduce the smart robots into the environment, and test the various scenarios to understand the optimal design for utilizing such intelligent robots, making the best decisions based on aspects like the required cycle time, the required level of performance and success, and of course cost considerations.
All this will come together, but it will become less of an explicit programming experience and more about validating scenarios and alternatives in order to reach the optimal solution for the task at hand.
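The scenario-assessment idea can be sketched as a simple scoring of alternatives against the criteria mentioned above (cycle time, success rate, cost). The scenarios, weights, and numbers below are made up for illustration; a real assessment would come from running the simulations themselves.

```python
def score(scenario: dict, w_time: float = 1.0, w_cost: float = 0.001,
          w_success: float = 50.0) -> float:
    """Lower is better: trade off cycle time, equipment cost,
    and (rewarded) success rate."""
    return (w_time * scenario["cycle_time_s"]
            + w_cost * scenario["cost_usd"]
            - w_success * scenario["success_rate"])

# Hypothetical alternatives produced by simulating two station concepts
scenarios = [
    {"name": "one smart robot",  "cycle_time_s": 42.0, "cost_usd": 80000,
     "success_rate": 0.97},
    {"name": "two basic robots", "cycle_time_s": 35.0, "cost_usd": 120000,
     "success_rate": 0.99},
]
best = min(scenarios, key=score)
```

The point is only the shape of the workflow: simulate each alternative, extract comparable metrics, and let an objective function (with weights reflecting your priorities) pick the design.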
Rahul Garg: That’ll be quite exciting in terms of the evolution of robots as well: scenarios becoming the way to work with robots, versus preprogramming them for every task they need to do.
And keeping that in mind, one of the areas where we see a lot of progress, at least a lot of research and a lot of new startup companies, is the whole area of humanoids: a humanoid trying to do certain tasks, in the context of a scenario perhaps, and being able to perform many tasks that a human would normally be doing. How do you see that evolving, and how do you see it coming into the industrial production environment?
Alex Greenberg: Yes. Again, a very good aspect. We are all aware of the news coming out every day lately about humanoid robots. I think the attempt here is to create a general-purpose robot that could quickly be deployed for any task you want, simply by demonstration. There is a lot of activity around teleoperating a robot to teach it and then letting it do the task. Of course, you can then think about simulation directions. Imagine that you can build a digital copy of this humanoid and then operate it through some kind of VR experience with motion-capture technology. You are immersed in a digital environment, wearing a headset and some motion sensors, and then you can teleoperate this robot virtually instead of needing the physical equipment.
This is something that could really shorten the process of training those robots for various tasks. And imagine that you are an operator. You mentioned previously, Rahul, that humans are around those robots, whether classic or humanoid. There is an aspect of introducing the shop-floor operators to this kind of environment. Imagine if you can immerse them in a virtual reality environment. They can really sense how this would look in reality and understand how the robot will operate around them. This would reduce the anxiety around implementing such a technology and basically train the operators to work around the robots using the digital twin.
There are multiple value streams here where simulation, especially immersive simulation using VR and AR technology, can help: training these robots, creating synthetic datasets to train these robots, but also getting the humans accustomed to such machines operating around them. Understanding how they would operate within an environment where the task is split between the robot and the human is a really exciting domain. Of course, the humanoids are only starting to enter this; we are in the first steps. We see some signs of companies trying to implement them for manufacturing tasks, and there is a road ahead of them until they can robustly bring these robots into the manufacturing domain.
Rahul Garg: That’s definitely a very exciting era, if you will, in terms of how automation could change with humanoids coming into the picture. And you’re right, we are perhaps in the very early stages of getting that going. Very exciting to see what can happen there.
Alex Greenberg: Yeah, I fully agree with you. That’s a very exciting domain. In my view, the humanoids’ main value proposition is being general purpose. These robots aim to be very flexible, very adaptive, cost-effective, and able to take over human tasks which are repetitive and dull, lifting a little of this workload from the actual humans. It will be exciting to see where it goes.
Chris Pennington: Interesting. I think it’s going to be quite some time before we see humanoids around the shop floor, but in the meantime I wanted to bring the conversation back a little to an earlier session we had, where we were discussing how the Siemens factory in Erlangen is using simulation technology and the digital twin to its advantage.
And Alex, I wanted to ask you: could you maybe give us some more use cases for the comprehensive digital twin in simulation?
Alex Greenberg: Yes, certainly. First of all, Erlangen is a very good example of how extensively they use our simulation product, which is called Process Simulate; I believe Max mentioned the name of the product. But if we look at other domains, Erlangen is a very nice representative of an electronics factory.
But you can also look at other industrial domains like automotive, where a tool like Process Simulate is on the critical path. I assume most of you have seen videos of huge robotic lines with hundreds of robots building the cars we drive every day. Now, without simulation, without offline programming and virtual commissioning, you would not be able to build such lines effectively.
In fact, 100% of these lines are first simulated, including all aspects of the assembly process, both robotic and manual, and only when there is full validation through simulation are those programs taken from the virtual environment to the actual one. Basically, you can say that without simulation these products couldn’t be built, especially in the automotive industry.
Now if you look at the entire automotive supply chain as well: all the automotive suppliers and component builders use the same tools as part of their collaboration with the original equipment manufacturers. If you jump to aerospace and defense, it’s less about mass production, but again the processes, especially automated ones, say drilling and riveting of the airframes, are also fully simulated before you actually deploy them on the shop floor, because the cost of damage or of wrong execution of the process is very high in this domain.
You really need to validate everything before you actually go to the shop floor and execute the automation you planned.
Electronics we already touched on with Erlangen. In consumer goods, we have a lot of customers that build washing machines and refrigerators, and again, when they introduce both automated and manual labor, they have their own challenges in each domain. In robotics, we already discussed the reachability of the robots, the cycle times, the validation of the robot program, and virtual commissioning.
And when we talk about human simulation, you obviously need to understand what kind of ergonomic risks you have with the human you place in that station: being able to reach all the working positions, whether they would potentially have ergonomic issues like strain on their back, neck, or elbow, and how you can fix those. Once you see a problem in simulation, it’s very easy to fix; the cost of error correction is very small because you haven’t built or installed anything yet, so it’s very easy to correct, say, the position of a drawer for a manual operator. If that error cascades to the shop floor, people might develop ergonomic issues and then you need to fix the physical equipment; fixing such an error later in the process, on the shop floor, is of course much more expensive.
I hope, Chris, this answers your question. The value is very diverse; each industry domain has its own challenges, but the versatility of the simulation tool allows customers to tackle those challenges efficiently.
Chris Pennington: Yeah, that answers the question perfectly. Thanks, Alex. Another thing I wanted to talk about, which again is a topic that’s been raised previously, is artificial intelligence.
Now, when we look at artificial intelligence at Siemens, we talk specifically about industrial AI. We know it’s going to play an important role; we know it plays an important role today.
But I wanted to just explore a little bit with you, Alex, around how we define industrial AI and what specific capabilities you’d pull out of industrial AI that would support that comprehensive digital twin that we’ve talked about.
Alex Greenberg: If we talk about AI in our domain, again we are talking about simulation for digital assembly and digital manufacturing, so obviously we follow the trend in the market. We mentioned humanoids before; humanoids are entirely driven by AI, by AI models that drive and control them. Obviously, we need to adjust to these kinds of trends. Now, even before the humanoids, we had robots that can see and sense: they have cameras, they have force-torque sensors, and they are able to see what is in front of them and use AI models to digest what they see.
For example, if they want to pick objects from an unstructured bin, just parts that are dropped in there: today we have technologies that are able to look at this bin with a camera and tell the robot what they see, distinguish between the objects, and send the robot to pick the right objects and put them into the manufacturing line.
This, by the way, simplifies a lot of the feeding systems into the line, where you don’t need structured part feeding; you can just put the parts in a box and the robot will be able to handle them.
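The bin-picking flow described above can be sketched as detect-then-plan: a perception step proposes parts with confidence scores, and a planner keeps only the confident detections and orders the picks. This is a hypothetical illustration, not any specific vendor’s vision stack; the detector here is a random stand-in for a real AI model.

```python
import random

def detect_parts(num_parts: int, rng: random.Random):
    """Stand-in for an AI vision model looking into an unstructured bin.
    Returns (part_id, confidence, x_mm, y_mm) detections; a real model
    would infer these from a camera image."""
    return [(i, rng.uniform(0.4, 1.0), rng.uniform(0, 400), rng.uniform(0, 300))
            for i in range(num_parts)]

def plan_picks(detections, min_confidence: float = 0.7):
    """Keep only confident detections and pick the most confident first,
    as a real cell might, to empty the bin reliably."""
    confident = [d for d in detections if d[1] >= min_confidence]
    return sorted(confident, key=lambda d: d[1], reverse=True)

rng = random.Random(42)          # fixed seed so the sketch is reproducible
picks = plan_picks(detect_parts(8, rng))
```

In simulation, the value Alex describes is exactly around this loop: checking that the chosen robot and gripper can actually reach and execute each planned pick.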
This is a very important recent trend in our industry. By the way, if you recall the discussion with Jens Meckel, he talked about Pick AI. This is something that Factory Automation developed, and of course it’s very important to be able to simulate such a scenario.
For example, when you put a robot to pick parts from a bin in an unstructured way, you need to make sure, again, as I mentioned previously, that you select the right robot and the right gripper, one that is able to reach all the parts within the bin and empty it effectively. This is where simulation acts to support the actual AI-driven, vision-based application. Another topic I would mention is synthetic data.
Of course, when you have robots equipped with vision, and perception through vision, you need to train them. Classically, you would have to take a lot of pictures of the bin, label them, build the right model, and then feed it into your vision pipeline, and this requires a lot of expertise.
Now imagine that in the digital twin you can create such datasets automatically, and they are automatically labeled. All the manual labor to create those datasets becomes a synthetic task that the software tool does for you automatically. This significantly reduces the time to achieve efficient training of the vision algorithm, without needing any physical parts or a physical bin; you can do all of that digitally.
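The auto-labeling idea can be sketched as follows: since the digital twin places every part itself, the ground-truth labels are known by construction. The bin dimensions, part size, and data layout below are invented for illustration only.

```python
import random

def synthesize_scene(rng: random.Random, num_parts: int,
                     bin_w: float = 400, bin_h: float = 300, part: float = 40):
    """Drop `num_parts` square parts at random positions in a virtual bin.
    Because the twin places each part itself, the bounding-box label of
    every part comes for free: no manual labeling needed."""
    labels = []
    for _ in range(num_parts):
        x = rng.uniform(0, bin_w - part)
        y = rng.uniform(0, bin_h - part)
        labels.append({"class": "part", "bbox": (x, y, x + part, y + part)})
    return {"labels": labels}

def make_dataset(num_scenes: int, parts_per_scene: int, seed: int = 0):
    """Generate an auto-labeled synthetic dataset of bin scenes."""
    rng = random.Random(seed)
    return [synthesize_scene(rng, parts_per_scene) for _ in range(num_scenes)]

dataset = make_dataset(num_scenes=100, parts_per_scene=5)
```

A real pipeline would also render each scene to an image with domain randomization (lighting, textures, camera pose), but the key point survives in the sketch: the labels are a by-product of generation, not a manual step.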
I hope this came through clearly: the digital environment helps you in many aspects of implementing AI-based automation on a shop floor.
Chris Pennington: This is really helpful, Alex. Appreciate that explanation. Now, of course I know many companies really struggle to understand how they can take advantage of AI and how to deploy it. It’s good to hear those examples from you of practical uses of AI.
But I also know Rahul, you’ve had quite a lot of experience recently with AI and I’d like to just pull you back into the conversation to see if you’ve got any opinions and comments on how AI has been used within industry.
Rahul Garg: Before we go into the AI topic, on some of the things Alex mentioned: when you look at a robot and the capabilities it provides, in the past a lot of robots were used for automating manual, repetitive tasks. But now, with the advent of AI and some of the examples Alex was highlighting, you are moving the adoption and usage of robots into a more flexible environment as well. And typically, when you are working in a factory environment, you have a lot of brownfield work going on, and brownfield implies that something is already running and you’re trying to do something more, something different.
That’s where robots with AI capabilities will become even more valuable: they will be able to adapt more easily, in a more flexible way, to the needs of that environment and how they will have to function.
Just hearing Alex speak about some of the ways our customers are looking to do this, flexibility and automation become the key drivers of how robots get more extensively used, with AI as a very key enabler of both. And some of the examples you mentioned, Alex, around synthetic data and learning, I think those become valuable ways to do that as well.
Chris Pennington: Yeah, I think we’re going to see AI kind of going into the background and almost being something transparent that’s doing a lot of work without really being visible. It’s going to be interesting to see how things move forwards.


