Thought Leadership

The application of Model-Based Systems Engineering – ep. 4 Transcript

By Nick Finberg

In this fourth episode of the Model-Based Systems Engineering (MBSE) podcast series transcript, I am joined again by Tim Kinman, Vice President of Trending Solutions and Global Program Lead for Systems Digitalization at Siemens Digital Industries Software. We are also talking with Matt Bromley, Vice President, Product Strategy and Technology, and Mark Malinowski, MBSE Solutions Director – Siemens EDA.

The experts today discuss the impact that complexity is having on the electronics design process. They will help us understand how digital twins and digital threads assist in managing collaboration by enhancing traceability and observability. Lastly, they will provide insight into the challenges faced when decomposing down to silicon and then recomposing the system.

Read the transcript (below) or listen to the audio podcast.

Read the transcript:

Nick Finberg, Writer – Global Marketing at Siemens

Nick Finberg: Hello, and welcome back to Model-Based Matters. I’m your host, Nick Finberg. And with me, as always, is Tim Kinman. We have had several discussions around Model-Based Systems Engineering, starting with an overview between just the two of us. We have also done a deep dive on product architecture, and our last episode discussed how teams can work together on a single system architecture. But today, we’re also joined by Mark Malinowski and Matt Bromley from the EDA space to talk about connecting development vertically, rather than the horizontal connections we’ve covered for connected engineering. Welcome to the podcast, Matt and Mark. Before we dive into the meat of MBSE for the EDA space, could you give the audience a little bit of background on yourselves? What role does systems modeling play in the work that you do today?

Matt Bromley, Vice President, Product Strategy and Technology

Matt Bromley: Sure. Thanks, Nick. I’m Matt Bromley. I work on the product strategy and technology roadmaps within Siemens EDA, in the board systems space – and by that we mean connected electronics in a single box. It would be good to clarify early on that EDA is a multi-domain space in its own right. So, when we think of EDA and Siemens EDA, we think of a PCB domain, an IC domain, and an FPGA domain. We will talk a lot today about the interaction between those multiple domains within EDA.

Nick Finberg: Awesome. Thank you. Mark, can you tell me a bit about yourself?

Mark Malinowski: Sure. Mark Malinowski. I have spent most of my electrical engineering career in some form of either designing or managing the product lines of complex, multi-domain systems, primarily in the electronics and integrated circuit debug and test space. Later, moving into design and design flows, I realized that our customers also faced many of the same challenges. So, over the past five years I began to focus on MBSE for the electronics space. At Siemens EDA, I work across the domains Matt mentioned that make up the EDA space, commonizing architecture decomposition and continuous verification.

Mark Malinowski, MBSE Solutions Director – Siemens EDA

Nick Finberg: Yes. Across all of our episodes, the system architecture has been a consistent thread – creating it as a single product definition and using it to guide development across different groups. And that goes even further in EDA, because you have so many domains within the box that everyone else kind of puts you in. But today, we’re going to talk about decomposition. How are we going to get into the verification of all these processes? What does the EDA space look like to anyone who doesn’t know it? And how does MBSE fit into that?

Tim Kinman: Before we dive in, I would like to hear from Mark and Matt about how the industry is changing. As we move forward on this whole area of the importance of MBSE, part of the reasoning for this discussion about moving vertically is that behavior which typically sat at a higher level – the vehicle level and system level – is being pushed further down the EDA stack into the system-on-chip. So, I think a quick conversation about how the industry is dealing with this transition might be helpful and kind of sets the stage. Then, we’ll dive into how you would address those types of problems.

Matt Bromley: Tim, there’s a great analogy around complexity we can bring here. Mark and I had this chat earlier. If you look at the changes in complexity in electronics over maybe the last 30-50 years, they’re vast. We’ve been working in the microelectronics industry, pushing microprocessors, for about 50 years. That’s nearly half as long as we’ve been formally pushing gas-pumping machines, which started in about the 1920s. Microelectronics began in the 1970s with that first microprocessor, the Intel 4004, with about 2,300 transistors. A modern microprocessor might have 13 billion transistors on it. So that’s a raw increase in complexity of about five and a half million-fold. Yet that gas-pumping machine, really from a functional description and verification perspective, hasn’t changed much – it just pumps gas into your car, maybe a bit faster, while collecting a little bit of the vapor, but fundamentally it hasn’t changed that much. And so, one of the things we really have to address in electronics is this explosion of complexity that continues. The microprocessors have a link into the embedded software that runs on those microprocessors and the complexity of the systems built around the electronics. And that explosion in complexity is driving our need to have a more formal decomposition methodology, both for architectural decomposition and requirements decomposition, so we can understand what we’re building and verify it.
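
As a quick back-of-the-envelope check on the figures Matt quotes, the arithmetic can be sketched as follows (the transistor counts are the approximate values mentioned above, not exact device specifications):

```python
# Rough complexity ratio between the Intel 4004 (~2,300 transistors, 1971)
# and a modern microprocessor (~13 billion transistors), as quoted above.
transistors_4004 = 2_300
transistors_modern = 13_000_000_000

ratio = transistors_modern / transistors_4004
print(f"Raw increase in transistor count: ~{ratio / 1e6:.1f} million-fold")
# Prints ~5.7 million-fold, in line with the "about five and a half
# million-fold" figure mentioned in the conversation.
```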

Tim Kinman: The whole thing is about how customers continue to shift left; you know, how do I know earlier? How do I make decisions earlier? How do I understand the implications of my decisions earlier? The evolution of complexity also seems to be an evolution of performance and expectation. Historically, we accomplished this at an operational level through the human, then moved into automation through software and the application. Still, because of complexity and performance, it gets moved further down vertically through the EDA chain. So it seems like that’s the other element going on – we’re putting more system responsibility all the way down to the board and integrated circuit level than we ever did before.

Mark Malinowski: I think that’s one of the driving forces right now in the challenges our customers are facing, Tim. Think about this square centimeter of silicon with so many billions of transistors – multiple CPUs, GPUs, DSPs, and devices that handle different application-specific parts of the system operation. This setup is causing more system behavior to be determined by what code executes, and when, in the system’s interaction with the environmental conditions. So, in most of our discussions with our customers, in a candid moment of ‘what keeps you up at night?’, it’s the ability to identify what the risk is in the corner cases of all that complexity, controlling a system moving through society and aerospace. Whether we’ve got the corner cases covered well – that is the challenge.

Tim Kinman: The initial conversation is the connection at the system level and system architecture, and the need to have that common context available across all the domains. Moreover, it is the ability to understand the system context, decompose it to the functions relevant to the process, and then verify it in the system context – that’s what you guys are trying to do in this system-to-silicon approach.

Matt Bromley: That’s spot on, Tim. We want to be able to enable decomposition from high-level system requirements into the electronic domains. The broader system covers more than electronics, but we’re focused on the electronics domains to optimize the architecture to meet the needs. That’s the architectural decomposition phase: what goes on the IC versus the PCB from a functional breakdown perspective. Then we need to close that loop to verify those requirements, and that needs to happen in a more automated fashion than we do today. There are a lot of gaps in that process filled in by humans passing Visio diagrams and Excel documents around and signing things off manually. To manage that more programmatically through that complexity, we need to be able to connect both the digital thread of the design and the digital thread of the verification.
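
To make that connected design-and-verification thread a little more concrete, here is a minimal, hypothetical sketch of how decomposed requirements, their domain allocations, and verification evidence might be linked programmatically instead of through Visio diagrams and spreadsheets. All class names, fields, and IDs below are illustrative assumptions, not a Siemens EDA API:

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class VerificationResult:
    method: str    # e.g. "simulation", "emulation", "lab test"
    passed: bool
    evidence: str  # reference to the report or artifact

@dataclass
class Requirement:
    req_id: str
    text: str
    domain: str    # e.g. "system", "IC", "PCB", "FPGA"
    children: list[Requirement] = field(default_factory=list)
    results: list[VerificationResult] = field(default_factory=list)

    def verified(self) -> bool:
        # A decomposed requirement recomposes from its children; a leaf
        # requirement needs at least one passing piece of evidence.
        if self.children:
            return all(child.verified() for child in self.children)
        return any(result.passed for result in self.results)

# A hypothetical system requirement decomposed across two electronic domains.
sys_req = Requirement("SYS-042", "Process a sensor frame within 10 ms", "system")
sys_req.children = [
    Requirement("IC-007", "SoC completes inference in <= 6 ms", "IC"),
    Requirement("PCB-013", "Sensor-to-SoC link latency <= 2 ms", "PCB"),
]
sys_req.children[0].results.append(
    VerificationResult("simulation", True, "run-2318/report.html"))

print(sys_req.verified())  # False until the PCB requirement also has passing evidence
```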

Mark Malinowski: As we move down from the enclosure level of electronics, we have some box in a car or a plane that performs a specific function, like controlling the engine performance, the entertainment system, or how the assisted driving algorithm is going to execute. Inside each of those enclosures is some printed circuit board assembly connecting to this integrated circuit, this piece of silicon, executing code. And what we find is there’s a different level of application specificity that we are set up to handle when we’re designing that enclosure and delivering it to a specific car or plane versus developing a particular piece of silicon that goes inside that enclosure to perform a particular function. We have much less application specificity in the design of silicon. We can do it, but there must be a huge economic driver because it’s costly. Usually, the approach is to take a piece of silicon that already exists and see if it can perform well enough to do the job that enclosure needs to do. So that spectrum of application specificity is a big challenge in decomposing down to the silicon and recomposing system verification.

Nick Finberg: That is very concrete. But then, once you dive inside of that, it’s a little bit more up to the manufacturer of that box to say, “Okay, I want this piece of silicon here.” So I have back stock of something that will work but wouldn’t be as streamlined as possible, yet the cost savings of using this older part and maybe modifying it a little bit would be better. Is that what you’re saying?

Mark Malinowski: And cycle time. 

Nick Finberg: And cycle time. Okay. So, with that change, do you create verification gates in the concept phase, asking, what do these different parts need to do within our box?

Mark Malinowski: It’s some combination of that and some other things as well. Right now, it varies customer to customer, segment to segment – we have to study what’s going on to know what our customers’ as-is states are. But suffice it to say, there’s a wide range of capabilities in terms of how system-level requirements decompose into the different layers of electronics and to what degree they’re verifiable in a way that informs system behavior.

Matt Bromley: Also, Nick, to add to your question, what we do see is that a lot of customers want to be able to sketch out an early architecture to meet a program requirement they’re bidding on – one that leverages reuse of some technology they already have – and they want to get an early understanding of whether their proposed architecture is going to meet those requirements. So, there’s often modeling behind every element of reuse that has already been verified. You have this ability to verify certain parts of the architecture and the requirements that go along with them, and you only have to scope out the newer pieces of the solution space for that program. Customers want to do this early in the design cycle to get an early costing bid and see whether they can use that to win a program.

Tim Kinman: That’s the same approach we see with many of our customers, whether you’re doing an entire vehicle, a subsystem, or even a subsystem decomposition. Whatever your product is, that path of having a model-based product definition drives your cost, feasibility, and delivery decisions through the concept phase. Some of this is reuse of innovation. But, still, the whole model-based approach is driving precisely that – no matter the level in the product or the value chain, whether a supplier, the integrator, or the OEM, everybody is dealing with that same challenge, just at various scales.

Matt Bromley: I think the vision is being able to connect the levels of hierarchy from an abstraction standpoint, with that architectural decomposition being continuous, and being able to take out any portion, work on it independently with a good definition of what the requirements are, and do the subsequent implementation and verification – it solidifies that solution across multiple domains. I can then, as a supplier, have a contract I need to deliver against. I know what those requirements are and I can provide verification data back – that is part of what the industry is driving towards, whether it’s an airplane or a flight controller.

Nick Finberg: All right. This topic is a different world of engineering for me, as so much is happening at a small scale. Could you explain a bit of what it looks like as you drill down and refine your systems architecture? What does it look like to start building an ASIC or an SoC from these requirements? Or is that a little bit too far in the weeds?

Mark Malinowski: I think one current example we have is in aerospace and defense. We might be working with a customer supplying the subsystem that controls the dropping of weapon systems from underwing mounts. So, we see the commonality of the mount across the different types of weapons being a source of requirements, and we see interaction with the pilot as a source of requirements. We also get some architectural representation, but not to the level that the supplier of this subsystem needs to build the subsystem. So, we see this as part of the air gap. This Model-Based Systems Engineering work product from the customer immediately gets interpreted by a human and entirely reproduced in the type of architecture representations that this team knows how to work with. Fundamentally, this scenario stems from the system-level architecture being represented in tools the subsystem engineering team is not familiar or fluent with, or tools that don’t have the features to represent the architecture adequately – creating a gap.

If you take that down to the IC level, it depends on who the customer is. If the customer asks for a specific microelectronics product, they may have well-refined requirements that meet industry standards. However, the requirements must still be refined by knowledgeable microelectronics designers to verify that the devices are working to their expectations – not merely the source-level requirements provided by the customer, but with a focus on the areas that may cause that piece of microelectronics to malfunction. So all that subject matter expertise comes from the designers and is expressed in a tool or work product that does not connect back to the customer’s.

Nick Finberg: And does that extend down to the manufacturers, say, one of the silicon fabs – the translation, I guess you could say?

Mark Malinowski: I think that’s a great question, but that part is unique to the manufacturing of silicon microelectronics, because it’s just a unique world of nano-fabrication processes and chemistries, dissimilar from everything else we do in the electronics domain. But I think the question is pertinent to the other layers as well. Generally, the answer is no – we don’t have that consistent flow of either the architecture information or the work products. Thus, recomposing the system verification is very challenging.

Matt Bromley: It’s a bit of a tangent, but when you start looking at manufacturing, there is a growing need – from a trust and traceability standpoint – to verify that what was manufactured is what was designed, throughout the product definition, and that does have a link back into verification. But that’s a bit of a tangent from the architectural breakdown side, into more of how you guarantee that what you built is what you wanted to design, and then verify that throughout the supply chain.

Tim Kinman: I think if we walk this back even further – let’s go back to the complexity – because, as we described earlier, it gets back to “Have I adequately described what I need the subsystem to do? Have I described it effectively?” And as the complexity is driven further vertically to the board and chip level, the need to have that shift-left description and verification is even more critical. This is where I think the connection to Model-Based Systems Engineering becomes even more relevant. It is not only the key to decomposing that descriptive element from the system to the subsystem and component, but it also captures the interfaces across those, so I’m able to simulate early in my decision process before I get to fabrication, manufacturing, and the physical element. So, the ability to connect the system definition – the descriptive elements, the system requirements, parameters, and architecture interfaces – and simulate that in a model-based way is part and parcel of how people are dealing with the complexity. Furthermore, being able to connect that vertically and deliver it in a familiar context down through your engineering chain is really what makes it the most powerful.

Matt Bromley: Tim, take maybe a more purely electronics example – somebody mentioned ECUs for cars earlier, but if you were to look at something like the cell phone, and the complexity in a consumer electronics device like that – again, there are some reasonably simple requirements that might say, “This new version of the cell phone has to be able to play 4K video for three hours on a battery charge.” It sounds like a very simple requirement, but that’s an incredibly complex multi-domain challenge. I mean, you’ve got GPUs that, as you start running video at a high frame rate, heat up and impact the battery and the power consumption, further impacting the thermal characteristics within the device and how you architect the cooling. And doing that as individual domains means that those pieces only come together when you’ve got a prototype. If it doesn’t fit together at that point, it is already a considerable expense, rather than having verified the proposed architecture earlier in the design cycle.
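
As a purely illustrative sketch of how such a requirement might be budgeted across domains early in the design cycle – the battery capacity, margin, and per-component allocations below are assumed numbers for the sake of example, not figures from the conversation – the first-pass arithmetic could look like this:

```python
# Hypothetical early budget for: "play 4K video for three hours on a battery charge".
battery_capacity_wh = 17.0   # assumed: roughly a 4,400 mAh cell at 3.85 V
required_runtime_h = 3.0
usable_fraction = 0.9        # assumed margin for ageing and conversion losses

average_power_budget_w = battery_capacity_wh * usable_fraction / required_runtime_h
print(f"Average power budget: {average_power_budget_w:.2f} W")  # ~5.10 W

# Assumed allocation of that budget across components/domains (illustrative only);
# thermal limits would constrain these numbers further in a real analysis.
allocation_w = {
    "display at 4K brightness": 2.2,
    "SoC/GPU video decode": 1.6,
    "RF and connectivity": 0.6,
    "audio, sensors, misc.": 0.4,
}
total_allocated_w = sum(allocation_w.values())
status = "within" if total_allocated_w <= average_power_budget_w else "over"
print(f"Allocated: {total_allocated_w:.2f} W ({status} budget)")
```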

Nick Finberg: Yes, playing it out, you can look at where the problems are instead of just throwing darts at a dartboard and trying to figure out what works – this black box is responsible for this one task, and it’s not doing the task anymore.

Matt Bromley: I think we’re pretty good as an industry in the individual domains – we know how to build a circuit board to meet requirements, and we know how to build an IC to meet requirements. But we’re now doing that in a more complex system where there’s a lot of interactivity between those domains, and that complexity challenge is no longer linear. Complexity has always increased in the individual domains. When we look at PCB as an individual domain, with clock speeds and bus speeds escalating, we’re pretty good at managing that – the single-domain complexity increases somewhat linearly. But when you bring this multi-domain problem together, the complexity increase is no longer linear; it becomes exponential.

Nick Finberg: All right. So, you must digitalize and make sure everything is accountable. And I know you guys are working toward making it automatable as well? Unfortunately, we are out of time for this episode, and we look forward to discussing more of this topic and other information on our next call. Thank you for joining us.


Siemens Digital Industries Software is driving transformation to enable a digital enterprise where engineering, manufacturing and electronics design meet tomorrow.

Xcelerator, the comprehensive and integrated portfolio of software and services from Siemens Digital Industries Software, helps companies of all sizes create and leverage a comprehensive digital twin that provides organizations with new insights, opportunities and levels of automation to drive innovation.

For more information on Siemens Digital Industries Software products and services, visit siemens.com/software or follow us on LinkedIn, Twitter, Facebook and Instagram. Siemens Digital Industries Software – Where today meets tomorrow.


This article first appeared on the Siemens Digital Industries Software blog at https://blogs.sw.siemens.com/thought-leadership/2021/11/29/the-application-of-model-based-systems-engineering-ep-4-transcript/