Thought Leadership

Industrial machinery and AI – Episode 7 transcript


Chris Pennington: 
With all the software development now being done with AI, how can an enterprise company maintain some consistency in governance?

Subba Rao: Yeah, you cannot allow bespoke AI-generated code to float around. Governance, security, and controls are very critical, and that is an embedded feature within Mendix. Also, the IT teams within a manufacturing organization will definitely keep a governance structure in place for all these applications — one that gives users the flexibility to build applications and leverage these new technologies within certain bounds, but at the same time goes through the complete software deployment cycle: validating those applications for the industrial use cases and deploying them across different manufacturing facilities or across the organization.

And one of the key things is that governance in the AI context needs to span the complete lifecycle. When I say the complete lifecycle, I mean from the initial use case — solving your business problem — through to how you deploy the application with AI underneath, with data consistency. What I mean by data consistency is that the data has to be sanitized, and data that is spread across different systems needs to be brought into your AI models with good consistency checks. "Garbage in, garbage out" is the saying that goes in the industry.

And if you don’t have the data in the right shape, the results that come out of any AI application will not be usable. This is where controls and governance also play a key role in making sure the output meets the business needs — really consistent information and data that solves the real business problems.
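The "garbage in, garbage out" idea Subba describes can be sketched as a simple validation gate: records pulled from different systems are checked before they ever reach an AI model. This is a minimal illustration, not a Mendix feature; the field names and range rules are invented assumptions.

```python
# Illustrative data-consistency check before feeding readings to a model.
# All field names and thresholds here are assumptions for the sketch.
from dataclasses import dataclass

@dataclass
class SensorReading:
    machine_id: str
    temperature_c: float
    timestamp: str  # ISO 8601

def sanitize(readings):
    """Split readings into clean records and rejected ones (garbage in)."""
    clean, rejected = [], []
    for r in readings:
        if not r.machine_id:                           # missing identifier
            rejected.append((r, "missing machine_id"))
        elif not (-50.0 <= r.temperature_c <= 500.0):  # implausible value
            rejected.append((r, "temperature out of range"))
        else:
            clean.append(r)
    return clean, rejected

raw = [
    SensorReading("press-01", 72.4, "2024-05-01T08:00:00Z"),
    SensorReading("", 68.0, "2024-05-01T08:00:05Z"),           # no ID
    SensorReading("press-02", 9999.0, "2024-05-01T08:00:10Z"), # sensor fault
]
clean, rejected = sanitize(raw)
print(len(clean), len(rejected))  # 1 2
```

In practice these rules would live in the governed deployment pipeline rather than in application code, so every app inherits the same consistency checks.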


Rahul Garg:
I think it’s a very good point, Subba. In fact, in the last few months there have been messages that perhaps AI is not delivering all the benefits one was expecting at the enterprise level, especially around AI for software development. I think one of the core reasons could be not having a well-defined deployment strategy and a well-defined governance strategy.

And not having a good understanding of the maturity of the software’s capabilities. That’s why they fail — or why the proofs of concept that customers so often take on without the right boundaries in place fail. I think by putting some of these boundaries in place, you increase your likelihood of success tremendously.


Subba Rao: 
Absolutely. Jumping on a technology and doing a POC is attractive, but making sure all the other elements are there to scale across the enterprise is very critical. Otherwise you end up in those reports saying that 90% of AI projects are not reaping the benefits. That’s probably down to not having this broader vision. It’s about keeping the controls in the right place and checking what enterprises actually need in order to scale.


Rahul Garg:
You know, with all the growth that’s happening in software and everything else we’re talking about — and especially in the context of one of our core competencies, the digital twin — can you share some thoughts on how Mendix can help lay the foundation for the industrial metaverse?


Subba Rao: 
Oh, that’s another great evolving topic, I should say. The industrial metaverse and bringing the digital twin to actual practical use are two elements that are really critical. For all the visualization that happens in the industrial metaverse, one first builds the scene, right?

What do I mean by building the scene? You have the factory; the factory has different equipment assets, and if you as an organization are thinking about a metaverse, you need that scene to be built. This is one area where Mendix is definitely helping — we are bringing solutions to build that scene, which includes your bill-of-equipment data from your factory in the manufacturing context, and also all the scene information, which might be USD files. In this case it’s with NVIDIA Omniverse. These are the elements that help you build the scene.

Once the scene is there — in this case, the scene for the factory — the factory has multiple lines, each line has multiple machines, and the machines have different sub-equipment, and all of those need to be modeled and visualized. That’s part of the scene. The next step is how you connect that scene definition — all the equipment in the scene — to the actual real-world data coming from your machines. That’s the connectivity and data aspect, which is where the data fabric plays a key role in providing the real information to the digital world, into the scene. Building the scene and making the connectivity to the right data sources is one element where Mendix plays a role; then the scene needs to be used in a broader context, as a user experience — say, for a production supervisor.

Take an example. As a plant manager or production supervisor, you have your metaverse for your factory where the scene is set, but your context is that you want to drill down into a specific piece of equipment’s metrics or KPIs or downtimes, or even some additional quality metrics. Those are all different experiences that you need, based on this specific scene that you can navigate. Those are the two elements you’ll see, and underneath it all, the data fabric is the core element that brings the real-world information into the digital world and vice versa.
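The two elements Subba walks through — a scene hierarchy of factory, lines, and machines, and binding each node to real-world data — can be sketched in a few lines. This is a toy model, not the Mendix or Omniverse API; all class and equipment names are illustrative assumptions.

```python
# Toy sketch: (1) build a factory -> line -> machine scene hierarchy,
# (2) bind live telemetry to matching equipment nodes, data-fabric style.
# Names and structure are assumptions for illustration only.
class SceneNode:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.live_data = None  # filled in when the scene is bound

def build_scene():
    return SceneNode("factory", [
        SceneNode("line-1", [SceneNode("cnc-mill-01"), SceneNode("robot-arm-01")]),
        SceneNode("line-2", [SceneNode("press-02")]),
    ])

def bind_live_data(node, telemetry):
    """Attach real-world readings to each matching equipment node."""
    if node.name in telemetry:
        node.live_data = telemetry[node.name]
    for child in node.children:
        bind_live_data(child, telemetry)

telemetry = {"cnc-mill-01": {"spindle_rpm": 4200}, "press-02": {"cycle_time_s": 3.1}}
scene = build_scene()
bind_live_data(scene, telemetry)
mill = scene.children[0].children[0]
print(mill.live_data)  # {'spindle_rpm': 4200}
```

A plant manager’s drill-down view would then just walk this bound tree: navigate the scene, and the KPIs for each machine are already attached to its node.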


Rahul Garg:
Yeah, yeah, I think this is going to be a very fast-evolving field as well. And with the value of the industrial metaverse, being able to simulate the different functions in a high-fidelity visualization format is going to become even easier.

And as it becomes easier, its adoption will increase and as its adoption increases, people will continue to get more and more value. And I think you mentioned very briefly this whole context of data fabric. Could you elaborate on that a little bit more?


Subba Rao: 
The way I see data fabric is this: in any organization you have engineering data that is in the PLM systems or the other engineering systems they’re using. There’s a lot of manufacturing data that might be in MES systems, quality systems, SCADA systems, and other OT systems. There’s a lot of simulation data, depending on what simulation tools the organization is using. And obviously the business data is in the ERP systems. All these data sets have their own domain nomenclature.

Call it semantics. The data fabric is how you create the ontology or knowledge graph — putting the relationships across all the different semantics that might exist in a specific domain into an organization-specific, complete knowledge graph.

That really puts the map of the organization’s domains and elements together. And this is where our recent acquisition of Altair comes in: Altair has RapidMiner, and RapidMiner Graph Studio is an advanced tool which helps organizations build this knowledge graph ontology for all the data sources, all the different data and their domain semantics.

And once you have the knowledge graph, then leveraging those relationships, building an application, and even harnessing the power of AI becomes much easier and more useful for the users solving their business problems — and for realizing the true digital thread.
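The idea of relating PLM, MES, and ERP semantics can be made concrete with a toy triple store: facts from different systems become edges, and a cross-domain question becomes a traversal. This is not RapidMiner Graph Studio — every entity and predicate name below is an invented assumption.

```python
# Toy knowledge graph: subject-predicate-object triples spanning domains
# (PLM design data, MES production data, ERP records). All names invented.
triples = [
    ("pump-housing", "designed_in",     "PLM"),
    ("pump-housing", "produced_on",     "line-1"),
    ("line-1",       "reported_defect", "DR-1042"),
    ("DR-1042",      "billed_in",       "ERP"),
]

def related(subject, predicate):
    """All objects linked from `subject` by `predicate`."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Follow the thread from an engineering part to its defect reports:
line = related("pump-housing", "produced_on")[0]
defects = related(line, "reported_defect")
print(defects)  # ['DR-1042']
```

An AI application sitting on top of such a graph no longer has to know which silo holds which fact; it just follows relationships, which is the ease-of-use point being made here.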


Rahul Garg:
Yeah, I think that’s a great point. Another way I think of it sometimes is that knowledge graphs give you the ability to combine all the relevant data sources that could be sitting in various organizational silos — all the way from a team meeting’s notes to some ERP or PLM financial system — and bring all of that together in the right context. For example, say you’re trying to track down a challenge with some product and figure out how to solve it from a service perspective, and you want to see what was discussed about that product in some other meeting at some other time. A knowledge graph can help you bring all that relevant information together, right? It becomes a little bit easier for you to dig into the right information.


Subba Rao:
Absolutely, yep. And that’s where I was extrapolating it further into the industrial context: the more we get into engineering technologies and IT and OT, the higher the data complexity, and having a knowledge graph and ontology is definitely a step organizations should start looking at in order to reap the benefits of these AI advancements.


Rahul Garg:
That’s great. That’s fantastic.

Chris Pennington:
That was an enlightening discussion. Now, before we close out, do either of you have closing thoughts you want to leave our listeners with?


Subba Rao:  
Yeah. One key thing I always emphasize, as AI and all these advancements go on and you consider how to embrace them in the organization: obviously, technology should not be used for the sake of technology — it should solve a problem. And to do that, the underlying aspects that need to be worked out depend on the organization’s maturity.

For me, two things are very critical that organizations should start looking at. The first is data consistency — how you achieve it, and the data strategy you need to put in place. The next is having the human in the loop. Automation is good, but automation with a human in the loop is very critical in the industrial world.


Rahul Garg:
I think that’s a good point, Subba and Chris. From my point of view, to put some more color on what Subba said: for a company starting on this journey of leveraging AI in the industrial context, there’s a lot of technology available, but think in the context of use cases. If you can identify the right use cases, do an assessment of their feasibility and of the value that you can possibly get from them.

Then you can get to the solution definition — what’s possible from a solution perspective — and be able to operationalize it. If you as a company were to create a road map of potential use cases, what I would call a map of all potential use cases built by doing a site assessment, you could then leverage AI, low-code applications, and Mendix once you have that better defined. The likelihood of success becomes a lot higher, and you have a very strong road map. Doing that work up front will really reap longer-term benefits on your journey in this direction.


This article first appeared on the Siemens Digital Industries Software blog at https://blogs.sw.siemens.com/thought-leadership/industrial-machinery-and-ai-episode-7-transcript/