Capturing manufacturing data and processes with AI podcast – Transcript

Artificial intelligence is having a major impact on the way products are designed and manufactured. Applying AI to manufacturing can bring a new layer of speed and adaptability to an industry that has traditionally been slow to adopt new trends. AI offers the ability to understand, package, and transport information between systems and users more efficiently and intuitively than ever before, enabling new approaches and innovations across the breadth of a factory's operations.
Listen to the full discussion between Dr. Justin Hodges and Boris Scharinger here, or keep reading for a transcript of that conversation.
Spencer Acain:
Hello, and welcome to the AI Spectrum podcast. I’m your host, Spencer Acain. In this series, we talk to experts from all across Siemens about a wide range of topics and technologies, and how AI is being applied to them. Today, I’m joined once more by Dr. Justin Hodges and Boris Scharinger, as they continue their discussion on the applications of artificial intelligence across the product design and manufacturing process.
So, we’ve kind of exhaustively talked about, well, I shouldn’t say exhaustively, but we’ve exhaustively scratched the surface of how AI is being applied to the product design process. And I think at this point, I’d like to take it to the next step and open the floor further to you, Boris, to explore the ways that AI is being used to manufacture these products. So, once you’ve applied your AI technology to help design your latest product, how is AI now helping you build it?
Boris Scharinger:
Yeah, so many different use cases, whether it’s on the equipment side, asset management, operational equipment efficiency through the usual use cases, predictive maintenance, for example. Of course, that helps, that makes a shop floor or a process plant more efficient, and maybe also more resilient against unplanned downtimes, right? If we look at predictive maintenance. But then we have the space of robotics. And we all know, and we all see in the media, how huge the advancements are, almost on a weekly basis, in robotics, whether it’s on the gripping side, palletizing, de-palletizing, tasks that used to be too complex for machines to execute autonomously. But that is changing a lot. What changes slowly in the space of robotics, by the way, is really the cobot thing, right? The true collaboration of human workers and robots on the shop floor next to each other, and all the different safety topics that we need to address in that space. Oh, we have quite a journey still in front of us. And from my personal perspective, solving or cracking that nut is almost as complex as autonomous driving.
Apart from robotics, where AI really makes a difference today and will keep making a difference, the art form and discipline of machine vision, quality inspection by machine vision, including adjacent processes, counting, sorting, and whatnot, all of that has made major advancements. And just to give you one example from our own portfolio, if I may, we have a machine vision inspection solution in our portfolio, and it takes anyone 45 minutes to set it up. And you don’t need to have a background in AI or in machine vision. And why? Because AI is not only doing the image analytics, AI is also supporting you with a wizard that shows you how to set up the solution.
So the AI uses the camera to look at how you adjust the lights, for example, so that lighting conditions become optimal, and the wizard will guide you to set it up in a way that works. So AI is a means to make that technology accessible, particularly to non-experts, which is a huge deal today. We still see AI on the shop floor being tailored towards being set up and used by AI experts, and this is definitely not something that we can afford going forward. So yeah, these are use case examples, disciplines, where AI really makes a difference in manufacturing and operations.
Dr. Justin Hodges:
There’s the word operations. That’s definitely how I think about it. And maybe it’s a little distinct from manufacturing, but it’s certainly not the designer, the person sitting in front of a giant HPC doing calculations before the product is even manufactured. But yeah, I think the people doing the operations have a lot to gain. Whether it’s inspecting broken parts or doing installation, it’s obviously very good to have consolidated knowledge communicated to you upfront while you’re doing that. But it also takes me back to my days doing experimental measurements: if I’m operating a mixing vessel in a facility for chemical processing or something, I don’t have the wealth of information that I do when I run a simulation. In a simulation, it’s a continuum. You can pull anything you want. You can look at the behavior in some small region, then go to the exact opposite part of the equipment in the simulation and inspect something else. You can write simple mathematical expressions to pull out things that give you more information.
But when you measure things in an operational capacity, like when you’re on the shop floor, I mean, you have five numbers, maybe. You have a few probes that measure temperature, or speed, or pressure. It’s really, really limited. And one of my favorite cases I saw was from DTU, where they took the simulation data, so someone ran simulations on some equipment, in that case a wind turbine blade, and they took the simulation results. And I keep using the word consolidate because it’s really embedding that knowledge into an AI model. So you can imagine all of these huge, dense fields of simulation data, now represented by more or less a transfer function; the AI model kind of acts like that. You provide an input, it gives you the output, which is very lightweight and can be deployed on the shop floor, or in this case on a wind turbine blade in operation.
And when the user or the operations person saw this wind turbine blade deflecting, kind of bouncing, shaking, I guess, if you want to call it that, normally you may have a couple of strain gauge sensors, right? So you have discrete points where you measure displacement, and you can visualize that on some sort of computer, iPad, or something on site. But with this sort of computer vision and this embedding of the simulation knowledge in machine learning models, they were able, through their glasses, to see the full field of displacement and stress, and all of these things. And it’s like being given the information available to the person in design who has unlimited access to answers and design information. But now, rather than just having a few discrete sensors, you have more of that on hand. So I’m a huge fan of the operations persona. I think there’ll be a ton to gain there with machine learning.
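To make the "simulation knowledge packaged into a lightweight model" idea more concrete, here is a minimal sketch in Python. The data, the model choice (a small multi-layer perceptron from scikit-learn), and the sensor/field dimensions are all illustrative assumptions; this is not a reproduction of the DTU wind turbine work mentioned above, only an example of training a surrogate that maps a few sensor readings to a full simulated field.

```python
# Minimal sketch: train a lightweight surrogate on simulation data so that a few
# "real" sensor readings can be mapped back to a full-field estimate in operation.
# All data here is synthetic; dimensions and model choice are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend simulation database: for each simulated load case we have
#  - a handful of "virtual sensor" readings (what strain gauges would see)
#  - the full displacement field over many surface points (what only the simulation gives)
n_cases, n_sensors, n_field_points = 500, 4, 200
sensor_readings = rng.normal(size=(n_cases, n_sensors))
mixing = rng.normal(size=(n_sensors, n_field_points))  # stand-in for the physics
full_field = sensor_readings @ mixing + 0.01 * rng.normal(size=(n_cases, n_field_points))

X_train, X_test, y_train, y_test = train_test_split(
    sensor_readings, full_field, test_size=0.2, random_state=0
)

# Train the surrogate: sparse sensor inputs -> dense field output.
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
surrogate.fit(X_train, y_train)

# In operation, a few live strain-gauge readings are enough to reconstruct an
# estimate of the full displacement field for visualization.
live_readings = X_test[:1]
estimated_field = surrogate.predict(live_readings)
print("R^2 on held-out simulation cases:", surrogate.score(X_test, y_test))
print("Estimated field shape for one live reading:", estimated_field.shape)
```

The trained model is small enough to ship to an edge device or AR application, which is the "transfer function" role Justin describes.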
Boris Scharinger:
That’s an interesting way to look at it. So, what you’re saying, in my own words, is basically that core information, not data, is transported, packaged in a machine learning model that is exchanged between two parties, right? That reminds me of another interesting use case. We don’t see it implemented often yet, but I can see it on the horizon, where in a supply chain, in supplier-customer relationships, information about the materials supplied is exchanged in the form of machine learning models. So just imagine that you have a supplier of raw material, say cotton, feeding into a certain fabrication process, and on the side of the customer you have machines that are processing that cotton. And those machines need to be calibrated to the specific attributes and features of a batch of cotton. Because cotton is a raw material, it comes from nature, it varies. And in food and beverage, for example, you have these types of relationships all over the place.
And now a supplier can provide a certain type of machine learning model with the delivery of a batch of the raw material, helping the customer calibrate their machines a lot faster, not by trial and error, but a lot faster to the specifics of that batch, again reducing waste, reducing the waste of precious resources and energy and whatnot. And I think at a certain point in time we will see an era starting where it is not raw data points that are exchanged along a supply chain, but key information encapsulated in machine learning models. Kind of similar to the wind turbine example that you just used, just in…
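As a hedged sketch of what "shipping a model with the batch" might look like: the supplier trains a model on their own processing history relating material attributes and machine settings to an outcome, and the customer uses that model to pick settings for the delivered batch instead of trial and error. Every attribute name, the model type, and the calibration search below are illustrative assumptions, not a description of any existing product or exchange format.

```python
# Illustrative sketch: a supplier-trained model travels with a raw-material batch
# and the customer queries it to calibrate their machine. All names and numbers
# are hypothetical.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)

# --- Supplier side: train on historical processing data ---
n = 1000
fiber_length = rng.uniform(20, 40, n)    # hypothetical cotton attribute, mm
moisture = rng.uniform(5, 12, n)         # %
machine_speed = rng.uniform(0.5, 2.0, n)
machine_tension = rng.uniform(10, 50, n)
# Synthetic "defect rate": lowest when settings roughly match the fiber properties
defect_rate = (
    (machine_speed - fiber_length / 30) ** 2
    + 0.002 * (machine_tension - 3 * moisture) ** 2
    + 0.05 * rng.normal(size=n)
)
X = np.column_stack([fiber_length, moisture, machine_speed, machine_tension])
batch_model = GradientBoostingRegressor().fit(X, defect_rate)
# The supplier would ship `batch_model` (serialized) together with the batch.

# --- Customer side: calibrate the machine for this specific batch ---
batch_attributes = np.array([33.0, 8.5])   # measured once for the delivered batch
speeds = np.linspace(0.5, 2.0, 40)
tensions = np.linspace(10, 50, 40)
grid = np.array([[*batch_attributes, s, t] for s in speeds for t in tensions])
best = grid[np.argmin(batch_model.predict(grid))]
print(f"Suggested settings for this batch: speed={best[2]:.2f}, tension={best[3]:.1f}")
```

The point is that the model carries the calibration-relevant information about the batch without the supplier having to hand over their raw process data.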
Dr. Justin Hodges:
Yeah. I think the cotton example triggers a memory and a relationship to our test group, who have been doing AI in their offering for a while now. And yeah, quality assurance, whether it’s an experiment you’re setting up or a material you’re processing. Humans are not very well designed to do this sort of thing. If I take 20 sensor measurements and I show the user their sensor measurements over time as a signal, as a signal processing exercise, I mean, it’s going to be impossible to look at 20 different sensor outputs and try to understand the relationships between them and identify if there’s a problem. But machine learning is very good at that, and at identifying if maybe you did a poor calibration, so that you don’t waste the material or waste an experiment that you’ve set up before going on to acquire data. I think it’s very good for quality assurance as well.
And, I mean, other examples would be if your sensor or a component starts behaving anomalously, so it’s now behaving unusually, and you can get that feedback to the operator so that they can shut it down and fix it, or adapt something else so that there’s no harm done to the equipment or the material, or whatever it is that they’re testing. So yeah, I get tunnel vision sometimes on the simulation side because that’s my home turf, but the testing side is, as well, very, very fruitful for AI. And I think that shows up in the operator persona, that shows up in manufacturing, on the shop floor and things like that. I would agree, yeah, there’s a lot there.
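A minimal sketch of the multi-sensor anomaly detection Justin describes: rather than asking a person to eyeball 20 signals at once, fit an unsupervised model on normal operation and flag readings that fall outside that pattern. The sensor data, the choice of IsolationForest, and the contamination setting here are synthetic assumptions for illustration only.

```python
# Sketch: unsupervised anomaly detection across 20 correlated sensor channels.
# Synthetic data; model and parameters are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(2)
n_sensors = 20

# Normal operation: 20 correlated sensor channels sampled over time.
base = rng.normal(size=(5000, 3))
mixing = rng.normal(size=(3, n_sensors))
normal_data = base @ mixing + 0.1 * rng.normal(size=(5000, n_sensors))

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_data)

# Live readings: mostly normal, plus one where a single channel drifts, the kind
# of subtle cross-channel shift that is hard to spot by eye.
live = base[:5] @ mixing + 0.1 * rng.normal(size=(5, n_sensors))
live[2, 7] += 4.0  # simulated drift on sensor 7
flags = detector.predict(live)  # +1 = normal, -1 = anomalous
for i, flag in enumerate(flags):
    print(f"reading {i}: {'ANOMALY - check calibration/equipment' if flag == -1 else 'ok'}")
```

In practice this kind of flag would feed the operator alert Justin mentions, so the equipment can be stopped or adjusted before material or a test setup is wasted.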
Spencer Acain:
Well, that sounds like a great note to end our discussion for today on. So thank you once again, Justin and Boris, for joining me here today. Once again, I have been your host, Spencer Acain on the AI Spectrum Podcast. Tune in again next time as we continue exploring the exciting world of AI.
Siemens Digital Industries Software helps organizations of all sizes digitally transform using software, hardware and services from the Siemens Xcelerator business platform. Siemens’ software and the comprehensive digital twin enable companies to optimize their design, engineering and manufacturing processes to turn today’s ideas into the sustainable products of the future. From chips to entire systems, from product to process, across all industries. Siemens Digital Industries Software – Accelerating transformation.