Thought Leadership

The importance of AI and data in the manufacturing process podcast – Transcript

Artificial intelligence and data go hand in hand, whether that’s data used to train AI models or AI models used to analyze and retrieve data. In the industrial world, leveraging shop floor data to fine-tune AI systems makes data intelligence an equally valuable asset. Similarly, AI itself can provide a form of data intelligence through the lens of an Industrial Copilot capable of aggregating vast quantities of data into easy-to-access, easy-to-understand formats. Bringing these elements together will be vital in realizing the future of data-driven manufacturing.

Learn more in a recent podcast with Ralf Wagner, available here, or keep reading for a transcript of that conversation.

Spencer Acain:

Hello and welcome to the AI Spectrum Podcast. I’m your host, Spencer Acain. In this series, we explore a wide range of AI topics from all across Siemens and how they’re applied to different technologies.

Today, I’m joined once more by Ralf Wagner, Senior Vice President of Data-Driven Manufacturing at Siemens. In previous episodes, we’ve looked at the major applications of Insights Hub that drive the data-driven manufacturing process the manufacturing industry as a whole is moving toward. We’ve also examined, to a degree, the applications of AI within that process.

And from what you’ve been saying before and the four major areas you mentioned, it sounds like a lot of the models you’re using are pre-trained. You literally have a section called Out of the Box, so these models must be coming ready to go. But are you offering the ability to further train and refine these models? Would there even be much benefit in fine-tuning models for a lot of these applications for particular customers or use cases? And is it something you could do with or without an AI expert? Because, as you mentioned, a lot of manufacturing companies aren’t staffing data scientists who could train these models manually, as it were.

Ralf Wagner:

That is actually a very important point for us: that we tailor our solutions and build them toward the target user persona, the person actually sitting on the other side of the screen using our software. And this is something we stumbled upon as well. A few years back, when you talked about AI and ML, it was a weeks- or months-long process to collect and gather the data, cleanse the data, then train the model, manage the model, monitor model drift, and analyze all of this. We saw that this is a major hurdle to getting AI really into the hands of the OT people, the manufacturing colleagues, and getting value out of it.

That’s why we approached it in the four quadrants. In quadrants one and two, where we have the Out of the Box solutions and the tooling, we try to keep the AI in the background; you could even say we work with configurable AI. The models there, which you could call pre-trained, are already optimized out of the box for certain specific use cases, because there is no one-size-fits-all AI model.

For these certain use cases, the models come pre-trained. Then you collect maybe two to four to six weeks of data, throw it into that model, and configure it in a wizard-like approach in seven steps. A domain or process engineer can easily do this in a matter of a few hours to get the model up and running and executing, and the model monitors itself: it gives itself a quality index and detects deviation. The process expert can actually work with that model after a very short period of time; you don’t need to go through months of model training, validation, optimization, and the like.

We overcome this process from the past with these pre-built Out of the Box solutions. At the same time, with the third quadrant, we still offer all the flexibility a data scientist might want, with all the challenges this brings, to address very, very specific use cases within Insights Hub and the data they’re collecting from the shop floor. We want to be open in all directions, and we are very transparent and clear about where the strengths and benefits of one approach or the other actually lie.

Spencer Acain:

I see. It sounds like you’ve really thought about this and have found a way to offer the tools your customers need to implement and leverage AI and their own data in the way most convenient to them, without necessarily having to become experts in the field of AI and data science themselves.

And I’d like to change tack a little here. We’ve talked a lot, I think, about these types of AI solutions that are very transparent to the user: they exist in the tool to be helpful, but you’re not necessarily directly interacting with them. But what about the more direct interaction that you’ve talked about, which of course is the production Copilot? What are some of the benefits that brings compared to the more transparent offerings we’ve been talking about up till now?

Ralf Wagner:

The production Copilot is the newest kid on the block, and since we released it at the end of last year, it has actually opened a complete new world of use cases for how to interact and work with your manufacturing data, to get to what we all started with: faster decision-making and more insights out of data, in order to continuously improve your production according to your KPIs and targets. This opens a complete new field, you could even say a new ball game, when it comes to AI, accessibility, and working with data.

What we have now started to develop with the Insights Hub production Copilot is actually a set of use cases from the very beginning, and then we take it from there to the next level for Hannover Fair. What we released today covers basically four areas. First of all, we allow our customers to just drag and drop the maintenance documents from the equipment on their shop floor into the Insights Hub data lake.

From every supplier of production lines, machines, and equipment of all kinds, you typically get maintenance and user manuals as PDFs, thousands and thousands of pages for what you actually see as equipment on the shop floor. You can just drag and drop those documents into Insights Hub, and only minutes after you have dropped them, the production Copilot is ready to answer all kinds of questions related to your equipment. You could ask a question like, “What’s the part number of the pneumatic cylinder in machine XYZ?” and you get the part number if you want to have the spare part ordered immediately. Instead of going through all the documents, looking for the right one, finding the right page, and then finding the right part number, the Copilot answers your question in a matter of seconds.

Or you can then ask, “What’s the maintenance procedure for that pneumatic cylinder, by the way?” And it gives you the details and lists what you need to do for a regular maintenance procedure on that part. In all cases, you get a reference with a little link down in the chatbot window back to the original document, to exactly the right paragraph where that information originated, so you can always check if you want. That’s the first use case. And in the early access we ran with a few key customers before we released the production Copilot, they said, “Wow, it sounds like a simple use case, but it has a major impact on finding the right answers to any questions related to the production equipment.”
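The mechanism Ralf describes, indexing manual paragraphs and answering a question together with a citation back to the source document and page, can be illustrated with a minimal sketch. Everything here is hypothetical: the document names, part numbers, and the naive keyword-overlap scoring stand in for whatever retrieval the actual product uses.

```python
# Toy sketch of a document Q&A pattern: index manual paragraphs,
# retrieve the best match for a question, and return the answer text
# together with a reference to its source. Illustrative only.
from dataclasses import dataclass

@dataclass
class Paragraph:
    doc: str      # source document name (hypothetical)
    page: int     # page in the PDF
    text: str     # paragraph text

# A tiny stand-in for the document "data lake".
CORPUS = [
    Paragraph("machine_xyz_manual.pdf", 412,
              "Pneumatic cylinder, part number PC-4711, bore 32 mm."),
    Paragraph("machine_xyz_manual.pdf", 415,
              "Maintenance: lubricate the pneumatic cylinder every 500 h."),
    Paragraph("compressor_manual.pdf", 88,
              "F-60 comm error: check fieldbus cable between A and B."),
]

def answer(question: str) -> tuple[str, str]:
    """Return (best-matching paragraph, citation) by keyword overlap."""
    q_words = set(question.lower().split())
    best = max(CORPUS,
               key=lambda p: len(q_words & set(p.text.lower().split())))
    return best.text, f"{best.doc}, p. {best.page}"

text, ref = answer("What is the part number of the pneumatic cylinder?")
print(text)  # paragraph containing the part number
print(ref)   # citation back to the original document and page
```

The key design point mirrored from the conversation is that the answer is never returned without its provenance, so the user "could always check" the original paragraph.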

The second use case we have already enabled is in Insights Hub Monitor, which is the basic condition monitoring application. It shows all the time series data you collect, but also all the events coming from the machine that you want to collect. In many cases, the events don’t even have self-descriptive names. If an event comes from a compressor, for example, and it says, “F-60 comm error, A set B,” you have no clue what to do. Maybe you’re an expert and saw that yesterday, and have looked at it 10 times already, but this is probably not very realistic. Instead, you just right-click on that error code in the event description and say, “Ask the Copilot.” And it gives you a complete description of what that error code means and how you typically deal with it, as well as the reference to the original document.

This is more event-driven, and it extends the value that is already in Insights Hub: with a couple of mouse clicks you get the right source and error description for what is actually happening when you see such an event message coming from the equipment. Again, it’s getting to answers faster, and therefore better decision-making. That was the second use case we enabled. The third one relates to your time series data coming from the equipment, as well as all the events. You can define a time window over the last few weeks and ask, “Was there any pattern in the battery coating process?” And you get an answer back that actually explains, “Yes, there is a strange ratio derived from the coating bar gap and something else, which you would not have seen with your eyes. But if that trend continues over time, it will have an impact on your quality.”
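The kind of pattern the Copilot surfaces here, a ratio derived from two process signals drifting slowly in a way the eye misses in the raw curves, can be sketched in a few lines. The signals, window size, and 10% threshold below are invented for illustration, not taken from the product.

```python
# Illustrative drift check: derive a ratio from two process signals and
# flag windows where its mean drifts beyond a threshold from the baseline.
def ratio_drift(signal_a, signal_b, window=5, threshold=0.1):
    """Flag each sliding window whose mean of a/b deviates from the
    first window's mean by more than `threshold` (relative)."""
    ratios = [a / b for a, b in zip(signal_a, signal_b)]
    means = [sum(ratios[i:i + window]) / window
             for i in range(len(ratios) - window + 1)]
    baseline = means[0]
    return [abs(m - baseline) / baseline > threshold for m in means]

# Hypothetical coating-gap signal with a slow drift at the end,
# against a flat reference signal.
gap = [1.00, 1.01, 0.99, 1.00, 1.01, 1.05, 1.10, 1.15, 1.20, 1.25]
ref = [1.00] * 10
flags = ratio_drift(gap, ref)
print(flags)  # only the final window exceeds the 10% drift threshold
```

This also maps onto the follow-up Ralf describes: once such a ratio is identified, the same condition can be registered as an alert rule so the system raises an alarm the next time the drift appears.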

This is something a process engineer can then specifically ask about; for them, it is unbelievably helpful to have this dialogue with the data. They can then set an alert so that if something like that ratio happens again, it triggers an alarm and informs people before anything goes wrong. This is asking the data, having a dialogue with your data coming from the equipment. And the last use case: because we have that close relationship between IoT and MES, as I described previously, you can also ask questions like, “How was the last shift doing?” And you get the answer, “Last shift we were actually 12% below target, and these were the three main events and the reasons why we were below the target for the last shift.”

Then you can ask, “Is there any impact on certain orders or customers?” and you get a list: “Yes, these three work orders have been significantly delayed because of that 12% under target.” Then you look at those work orders and ask which customers are linked to them. And you see, “These three are actually for two customers. One is uncritical, but the other is actually a high-priority customer. And wow, if I don’t do anything special, that order will be delayed.” Then you go back to the planning tool and say, “We need to make sure that these two work orders which were delayed in the last shift are expedited and sped up, so that we don’t get any penalties for late delivery.”

These are things we can simply ask in that dialogue mode of the production Copilot to get these additional insights, faster decision-making, and actionable know-how from your data. This is where we are and what has been released. Right now, we are working toward Hannover Fair, which is actually happening at the end of March this year, to extend those capabilities significantly, because we are developing an agent-based system in the background. You can tap into any other data source to allow the Copilot to take advantage of even more contextualized information and give even better, more precise answers than in the four use cases I was just describing. And we are putting this into the hands of our customers and users; the working title is Copilot Studio, where you can extend the capabilities we bring out of the box toward your own IT and OT environment to get even more out of a Copilot and chatbot interface.

And this also comes with automation: every morning when the shift starts, a shift report is automatically generated and stored as a PDF, and the handover checklist is pre-filled because all this information is already in the system. All this automation, planning, and these tasks, which can also be taken over by the Copilot, will make life much easier, more transparent, and more efficient for manufacturing companies using this technology going forward. I’m pretty excited about what we released just at the end of the year and what is right around the corner, to get this into the hands of our customers.

Spencer Acain:

Wow. It really sounds like you’ve got a ton of these really incredible AI offerings either available now or coming down the pipe very soon, and it’s going to almost redefine the way people interact with all this manufacturing and MES machine data. It’s been available for a long time, but maybe not leveraged to the fullest extent. And I think that’s a great place to wrap up this episode. Once again, I have been your host, Spencer Acain, joined by Ralf Wagner on the AI Spectrum Podcast. Tune in again next time as we continue exploring the exciting world of AI.


Siemens Digital Industries Software helps organizations of all sizes digitally transform using software, hardware and services from the Siemens Xcelerator business platform. Siemens’ software and the comprehensive digital twin enable companies to optimize their design, engineering and manufacturing processes to turn today’s ideas into the sustainable products of the future. From chips to entire systems, from product to process, across all industries. Siemens Digital Industries Software – Accelerating transformation.

Spencer Acain


This article first appeared on the Siemens Digital Industries Software blog at https://blogs.sw.siemens.com/thought-leadership/2025/06/06/the-importance-of-ai-and-data-in-the-manufacturing-process-podcast-transcript/