
A New Manufacturing Playbook: Digital Thread, Data Integrity and AI

Summary

Big idea: Complexity in modern manufacturing isn’t a burden to manage — it’s a competitive advantage to seize. 

Why it matters: As AI reshapes product development, companies that lack clean, integrated data foundations will find their AI initiatives stalled before they start — and those that invest now are already pulling ahead. 

Real-world examples: Three customers illustrate how Siemens’ comprehensive digital twin and connected software stack are turning complexity into results: 

  1. GM uses Teamcenter Configurator to validate hundreds of thousands of vehicle build combinations — covering all product variations based on defined rules, replacing costly custom tools with out-of-the-box software that handles between 100,000 and one million nightly production orders. 
  2. Rolls-Royce built a full digital thread connecting engineering, manufacturing and operational data across Teamcenter, Opcenter, Insights Hub, Simcenter and Mendix — enabling a production AI copilot that identifies quality deviations in turbine blade manufacturing, suggests fixes and auto-generates maintenance tickets. 
  3. BAE Systems used Mendix to build 40 enterprise applications — including part approval and task management — cutting development timelines from years to as little as four weeks and saving £40 million on a single application. 

What’s changed with digital twins 

Heraclitus’ metaphor — no man steps into the same river twice — applies to technology as it does to life in general. After nearly 40 years in industrial software, we’ve seen companies either endure change or embrace it to gain a competitive advantage. 

Product development is only getting more complex. New features, regulations, sustainability, supply chain issues, and AI continually reshape design, manufacturing, and maintenance. The challenge is not to avoid complexity but to leverage it. 

A few years ago, we introduced Siemens Xcelerator, an open digital business platform designed to help customers move faster through the manufacturing and engineering gauntlet. At its core is the comprehensive digital twin. 

Digital twins have existed for years. What’s transformative now is their ability to integrate manufacturing, design, electrical, software, and automation into a synchronized model. The closer the virtual version mirrors reality, the more confidently you can decide amid rising complexity. 

This philosophy drove our $10.3 billion acquisition of Altair Engineering, completed in March 2025 — Siemens’ largest ever. Altair uniquely integrates nonlinear structural analysis and advanced electromagnetic simulation for electrified products, setting it apart from competitors. Its integrated data science and AI workflows via RapidMiner further strengthen its value. Most distinctively, Altair’s high-performance computing solutions dynamically balance workloads across GPU and CPU infrastructure, both on-premises and in the cloud, enabling efficient scaling as simulation and AI demands accelerate. 

But technology alone doesn’t solve current challenges. We have also focused on what we call ‘adaptive’ solutions — software that scales up and down without data loss or disruption. Our Designcenter suite enables engineers to start with entry-level 3D modeling and seamlessly migrate to the full power of NX, our flagship CAD/CAE system. This happens without any data translation. Whether you’re working in the cloud, on a desktop, or moving between versions, the data flows consistently. This might sound mundane, but for engineers who’ve lost weeks to data translation errors, it’s liberating. 

Moving from custom BOM

We’ve long provided both design and manufacturing BOMs in Teamcenter. Recently, customers asked for an enterprise BOM: a ready-made system for all materials and variants, supporting forecasts for capacity and scheduling. 

Many custom BOM systems still rely on decades-old IBM mainframes with hard-to-replace customizations. According to ABI Research, only 5% of industrial data is used because it’s fragmented and siloed. Legacy BOMs are major obstacles for AI. 

Customers must modernize. We’ve invested in new Teamcenter features, replacing legacy enterprise BOMs with out-of-the-box solutions. Teamcenter BOM performance is now 20 times faster for part organization and production planning. 

Automotive OEMs handle hundreds of thousands of vehicle configurations and need instant BOM resolution. They process up to a million orders nightly, solving for 40 weeks of production. 

GM once used 10-15 control models to validate builds. Now, Teamcenter Configurator covers all variations, validating only buildable vehicles and avoiding impossible combinations. This replaces costly custom tools with standard Teamcenter features. 

This complexity isn’t limited to the automotive industry. Leading phone manufacturers face similar challenges. Maintaining separate BOMs for each variant creates change management headaches. Having a single massive BOM leads to million-line confusion. Teamcenter’s variant-option approach addresses this smoothly with standard tools. 
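To make the variant-option idea concrete, here is a minimal sketch of rule-based configuration in Python. The option names and exclusion rules are invented for illustration; they are not GM's actual model or Teamcenter Configurator's real rule engine, which handles hundreds of options and far richer constraints:

```python
from itertools import product

# Hypothetical vehicle options; a real configurator manages hundreds.
options = {
    "engine": ["ev", "v6"],
    "trim": ["base", "sport"],
    "towing": ["none", "heavy"],
}

# Exclusion rules: combinations that are not buildable.
rules = [
    lambda c: not (c["engine"] == "ev" and c["towing"] == "heavy"),
    lambda c: not (c["trim"] == "base" and c["towing"] == "heavy"),
]

def buildable_configs():
    """Enumerate only the combinations that satisfy every rule."""
    keys = list(options)
    for values in product(*options.values()):
        config = dict(zip(keys, values))
        if all(rule(config) for rule in rules):
            yield config

valid = list(buildable_configs())
print(len(valid))  # 5 of the 8 raw combinations survive the rules
```

The same pattern — one option model plus declarative exclusion rules, rather than one BOM per variant or one million-line BOM — is what keeps change management tractable as option counts grow.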

What about AI as a solution to this growing complexity? That question was central in our decision to acquire life sciences R&D company Dotmatics last year. 

Data management and AI

You’ve seen headlines about the AI bubble and hype cycle. It’s hard to say exactly where we are in that cycle, but a core truth about AI comes from USA Today’s Chris Gallagher, in a piece sponsored by Ataccama and Dotmatics: 

Without robust data management practices to ensure data integrity, accuracy, and availability, AI’s capabilities will remain stunted. Data quality and governance play a pivotal role in unlocking AI’s true potential by providing the essential infrastructure for its success. In essence, the future of AI is inextricably linked to advancements in data management, making it an indispensable enabler of the AI revolution.

In 2024, we released an AI copilot for Teamcenter, the most proven and scalable product data management system in the world. Teamcenter is precisely where you go to ensure data integrity, accuracy, and availability. Now we’re applying AI to that lifecycle intelligence. For example, ask Teamcenter Copilot, “What are the top five warranty hotspots?” It uses RapidMiner to analyze data, project issue severity, and pinpoint the top risk. When you request a replacement, AI locates the part, creates a work set, tracks change impacts, and starts a change request. It also analyzes costs and risks and recommends a replacement. The BOM updates automatically. Everything is tracked. Stakeholders see the changes.  
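The aggregation behind a question like “top five warranty hotspots” can be sketched in a few lines. The claims records and field names below are hypothetical — the actual Teamcenter Copilot and RapidMiner pipelines analyze far richer lifecycle data — but the ranking step looks roughly like this:

```python
from collections import Counter

# Hypothetical warranty-claims records; real data would come from
# Teamcenter and connected field/quality systems.
claims = [
    {"part": "door latch"}, {"part": "door latch"}, {"part": "infotainment"},
    {"part": "door latch"}, {"part": "wiring harness"}, {"part": "infotainment"},
]

def top_hotspots(records, n=5):
    """Rank parts by warranty-claim count — the aggregation behind a
    'top five warranty hotspots' question."""
    return Counter(r["part"] for r in records).most_common(n)

print(top_hotspots(claims)[0])  # ('door latch', 3)
```

The value of doing this inside the PDM system rather than in a spreadsheet is that the answer stays linked to the parts, change requests, and BOM it describes.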

This automated process ensures speed and accuracy. Teamcenter has always been where engineers handle change, configuration, and digital mock-ups. Now it’s an AI foundation. Using it just for CAD check-in/check-out wastes its potential. Companies that overlook these features risk falling behind as AI transforms product development. 

Dotmatics

When we looked at the pharmaceutical and life sciences industries, we saw something familiar. It was reminiscent of PLM 25 years ago — disparate applications with no APIs and no enterprise-level data integration. This opportunity is part of what spurred our $5.1 billion acquisition of Dotmatics, a leader in life sciences R&D software.  

This isn’t entirely new territory for Siemens. We’ve long designed medical devices with our tools. Our manufacturing execution software runs pharmaceutical production facilities globally, and our process automation systems operate many pharma factories worldwide. In Dotmatics, we saw an opportunity to create enterprise-level data management that ensures data integrity, accuracy, and availability. This is the foundation AI requires to accelerate drug discovery and life sciences R&D.  

Digital twins have existed for years. What’s transformative now is their ability to integrate manufacturing, design, electrical, software, and automation into a synchronized model.

We’re starting with pharma, but the biochemistry capabilities extend to oil and gas, personal care products, agriculture, and beyond. As Dotmatics integrates into our portfolio, we’re applying the same data backbone philosophy that transformed discrete manufacturing. Now, we’re bringing it to the molecular sciences. 

Across industry domains, our approach is to help customers move through complexity faster — critical as AI evolves. Realizing this requires data integrity, accuracy, and availability, which have been central to our strategy and the foundation of our investments. 

Next, we’ll explore how these principles are applied across specific industries. This ranges from photorealistic digital collaboration that brings non-technical stakeholders into engineering conversations to comprehensive digital threads that connect design through quality inspection. We will also look at data centers as integrated products, from chip to power grid. The technology is maturing faster than most realize. The companies positioned to capitalize on it are those investing in robust foundations today. 

Democratizing engineering intelligence

One persistent challenge in product development is that engineering data remains locked in formats only specialists can interpret. CAD models, simulation results, and manufacturing instructions are rich in insights but inaccessible to teams such as marketing, sales, service, and executives. These groups need to make critical product decisions. 

Last year, we released Teamcenter Digital Reality Viewer — a BOM-driven, photorealistic visualization tool embedded directly in Teamcenter. It’s cloud-based and uses real-time ray tracing powered by Nvidia GPUs. It also leverages our partnership with Nvidia Omniverse. Instead of relying on local hardware, which creates visualization inequality across the organization, our ‘rendering-as-a-service’ architecture is hosted on the secure Siemens Xcelerator cloud. This provides a consistent, powerful visualization for all users regardless of their devices. 

Here’s what’s changed: you can now chat with the system in natural language. Say, “Show me a cross-section and slide through the vehicle. Show it in a realistic environment and let me walk around it. Add the latest version with white trim so I can compare them side by side. Change to a sunset view gradually as I move.” 

The amount of manual setup work this used to require was staggering. Specialists spent hours or days configuring visualization environments. We now do it conversationally, which democratizes access to engineering intelligence. When a CFO asks, “Why are we making this design change?” they can see it, understand it, and make informed tradeoffs. That’s transformative for organizational alignment. 

Rolls-Royce

Rolls-Royce has been a Siemens customer for many years, using our software to manufacture the sophisticated engines they design. Their challenge was typical of heritage industrial companies: document-based systems, some over 50 years old, generating, transforming, and consuming data in silos not easily accessible and certainly not integrated. 

With Siemens Xcelerator, they’re now implementing a comprehensive digital thread. They collect engine test and performance data from engineering, compare it to the design intent (“as-simulated”) and the actual production (“as-built”), identify deviations, and feed those insights back into product development. They’re using Insights Hub, our industrial IoT platform, as a functional data store to unify manufacturing and engineering data. 

The results are compelling: Rolls-Royce is using Teamcenter, Opcenter (our manufacturing execution system), Insights Hub, Simcenter (simulation), and Mendix (low-code development) in a single digital thread that connects engineering, functional, and manufacturing data. Performance engineers use this integrated data foundation to make continuous product improvements, turning operational reality into design intelligence. 

But here’s where it gets interesting. Once you’ve built that foundation, you can extend it with AI. They’ve implemented a production copilot that integrates data from PLCs, sensors, and edge devices to make sense of error messages and telemetry. When a quality deviation occurs — say, in turbine blade production — you can ask the copilot to summarize the status. It provides a full overview, identifies errors, and suggests actions based on maintenance records. You can ask it to guide you through the proposed fix, and it offers step-by-step processes with direct links to maintenance manuals. It can even create maintenance tickets automatically with all relevant details. 

This uses RapidMiner with Mendix and embedded AI on top of Insights Hub to create a customized overall equipment effectiveness application. It’s the “adapt and extend” philosophy in action, demonstrating why low-code platforms have become strategic infrastructure. 

Mendix as integration glue

Mendix is coming up in these examples for good reason. For nine consecutive years, Mendix has been named a leader in the Gartner Magic Quadrant for Enterprise Low-Code Application Platforms. 

But awards matter less than the outcomes. General Atomics has built over 20 Mendix applications for employee information tracking, purchasing requests, inventory management, and field service. They’ve replaced approximately 90 legacy systems with Mendix. Bolzoni, a manufacturer of lift truck attachments and material-handling equipment, uses Mendix as integration glue between SAP, Salesforce, and Teamcenter — upgrading legacy systems for equipment quality checks, customer service reporting, and product delivery. Their new technical documentation portal was developed in one month, whereas the prior system took two years. 

When a CFO asks, ‘Why are we making this design change?’ they can see it, understand it, and make informed tradeoffs.

BAE Systems has created 40 applications, including parts approval and task management, that integrate data from SAP, Teamcenter, and project management tools. Applications that previously took years now require as little as four weeks. They saved £40 million on one application alone. 

Why does this matter strategically? Because we’ve embedded AI capabilities directly into Mendix. Specifically, we’ve created drag-and-drop AI functionality that lets customers build AI agents and AI-enabled applications at the right abstraction level. It’s the ideal combination: low-code’s speed and accessibility with AI’s transformative potential, all while maintaining the data integrity and governance that enterprises require. 

Finally, let’s wrap up by discussing how we’re enabling the AI industry itself through chip-to-grid thinking and solutions. 

Productizing the data center

There’s been tremendous attention over the past year to the explosive growth of data centers driven by AI. What gets less attention is that the physics matter here. Modern AI accelerator chips are vastly more powerful than previous generations, and they generate enormous heat. So we started thinking: why not treat the data center itself as a product? Why not apply comprehensive digital twin thinking from chip to power grid? 

We start by modeling heat transfer within the chip itself. That drives server-level simulation to determine cooling requirements, which are increasingly moving toward water cooling. We then zoom out to the rack level and, finally, to full-room computational fluid dynamics to simulate airflow, thermal distribution, and energy efficiency. 
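Underneath that chip-to-room chain is ordinary heat-transfer math. As a minimal sketch — with illustrative numbers, not values from any real accelerator or from Simcenter — a steady-state lumped thermal-resistance model shows why dense racks are moving to water:

```python
# Steady-state lumped model: T_junction = T_coolant_in + P * (R_jc + R_ca),
# where R_jc is junction-to-case and R_ca is case-to-coolant resistance (°C/W).
# All values are illustrative, not from any real accelerator datasheet.
def junction_temp_c(power_w, coolant_in_c, r_junction_case, r_case_coolant):
    """Junction temperature for one chip, in degrees Celsius."""
    return coolant_in_c + power_w * (r_junction_case + r_case_coolant)

# A hypothetical 700 W accelerator at 35 °C inlet:
t_air = junction_temp_c(700, 35, 0.02, 0.13)     # air-like cooling path
t_liquid = junction_temp_c(700, 35, 0.02, 0.03)  # liquid cooling path
print(round(t_air), round(t_liquid))  # 140 vs. 70 — water cuts the rise in half
```

Full-room computational fluid dynamics replaces these single numbers with resolved airflow and thermal fields, but the driving quantities — power, resistance, coolant temperature — are the same.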

We use Simcenter for mechatronic system simulation of power and electrical systems. But our capabilities extend down to the integrated circuit level through Tessent, our leading verification and validation tool for ICs. We can embed IP directly into chips that monitor their health in real time. We can predict impending chip failures based on embedded intelligence — critical when a failed chip can bring down an entire data center. 

As Barron’s reported last fall, Nvidia is using digital twins to design optimal data centers, creating virtual versions of physical designs to determine what works best before construction begins. What they’re describing is exactly this chip-to-grid systems thinking — treating the entire facility as an integrated, simulatable product. 

There’s another dimension to this: the shift-left principle. Consider the Sony headset we helped develop. Initially conceived for office environments, we realized it could also work in factories — but that introduced new requirements. Volume handling needed to increase substantially to account for noise. Also, electromagnetic interference became a concern in industrial settings, and you don’t want to discover EMI problems late in the process, because once chip fabrication starts, changes become prohibitively expensive. 

So, we’re integrating software and hardware design in a shift-left process to find issues early. This matters enormously, as illustrated by one last automotive example. According to National Highway Traffic Safety Administration data, there have been as many automotive recalls since 2019 as in the entire 138 years before that — much of it software and electronics related. By verifying chips within the vehicle’s digital twin, we can not only prevent failures but also model electronic aging. If we predict an impending chip failure, we can scale back its function — reducing voltage and frequency — so that the vehicle doesn’t get stranded but can get home safely. 
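The graceful-degradation idea above can be sketched with the standard rule of thumb that dynamic power scales roughly with capacitance times voltage squared times frequency. The numbers and scaling factor below are illustrative assumptions, not measurements from any real chip or vehicle system:

```python
# Graceful degradation sketch: when a failure is predicted, scale back
# voltage and frequency so the chip runs cooler instead of dying.
# Rule of thumb: dynamic power P ≈ C_eff * V^2 * f (illustrative model).
def dynamic_power(c_eff: float, volts: float, freq_hz: float) -> float:
    return c_eff * volts ** 2 * freq_hz

def derate(volts: float, freq_hz: float, factor: float = 0.8):
    """Reduce voltage and frequency together by the same factor."""
    return volts * factor, freq_hz * factor

p_full = dynamic_power(1e-9, 1.0, 2.0e9)       # hypothetical full-speed point
v, f = derate(1.0, 2.0e9)                      # 0.8 V, 1.6 GHz
p_derated = dynamic_power(1e-9, v, f)
print(round(p_derated / p_full, 3))  # 0.512 — a 20% derate roughly halves power
```

The cubic payoff — a 20% reduction in voltage and frequency yields roughly a 49% reduction in dynamic power — is what makes “limp home” modes viable: the chip loses some performance but sheds far more heat and stress.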

Unilever

Process industries face a different level of complexity. To meet changing consumer demands, companies like Unilever need to move to smaller batch sizes, reproduce recipes across hundreds of global plants quickly, and manufacture products closer to the markets they serve — particularly important for sustainability. 

Unilever operates approximately 300 factories and is driving significant improvements in flexibility. Our largest consumer packaged goods customers make 100,000 product design changes annually. The result? One million manual interventions to manage that complexity. Consider the potential for error, delay, and cost. 

Our solution is what we call “recipe transformation” — an innovation pipeline between R&D and manufacturing facilities. We use Xcelerator Edge devices hosting ProductionLink, one of our solutions that synchronizes factory assets to a centralized bill of equipment. Recipe transformation automates the change process, calculating the impact of each modification across the entire manufacturing base. We push individually adjusted recipes to each facility, then map modifications to local automation systems. 
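At its simplest, pushing individually adjusted recipes means scaling one master recipe to each plant’s local batch size and equipment. The recipe, plant names, and batch sizes below are hypothetical — real recipe transformation also maps each modification to local automation systems — but the core adjustment step looks like this:

```python
# Hypothetical sketch of "recipe transformation": one master recipe,
# adjusted per plant for local batch size before being pushed out.
MASTER = {
    "batch_kg": 1000,
    "ingredients_kg": {"base": 700, "fragrance": 30, "water": 270},
}

# Hypothetical plants and their local batch sizes in kg.
PLANTS = {"mumbai": 250, "rotterdam": 1000, "sao_paulo": 500}

def adjust_for_plant(master: dict, local_batch_kg: float) -> dict:
    """Scale every ingredient quantity to the plant's batch size."""
    scale = local_batch_kg / master["batch_kg"]
    return {name: qty * scale for name, qty in master["ingredients_kg"].items()}

local_recipes = {p: adjust_for_plant(MASTER, kg) for p, kg in PLANTS.items()}
print(local_recipes["mumbai"]["fragrance"])  # 7.5 kg for the 250 kg batch
```

Automating this across hundreds of plants and thousands of annual changes is what collapses the weeks-to-minutes timeline described below.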

Why not apply comprehensive digital twin thinking from chip to power grid? 

What previously took weeks or months now happens in minutes. We call this adaptive manufacturing. That is, the facility adapts to the recipe rather than the other way around. This allows companies to produce locally for customers, which is transformative for sustainability. When you build mega-factories, ship all ingredients in, produce, and then ship products back out, roughly 98% of your carbon footprint sits in supply chain logistics. Producing locally for the customer fundamentally changes that equation. 

We’re starting with consumer-packaged goods and recipe management, but we believe this adaptive manufacturing paradigm applies to discrete manufacturing as well. 

Persistence 

For a metaphor from the natural world of how technological change takes root and then takes off, consider the experience of cultivating bamboo. When you first plant the seed and start watering it, nothing seems to happen during the first year. You keep watering through the second year — still nothing. At this point, you’re convinced it’s dead. 

But if you persist into the fifth year, something remarkable happens. What was occurring the entire time was root system development — invisible underground work creating the foundation for explosive growth. In the fifth year, the bamboo can grow 90 feet in just six weeks. 

We’ve been investing in the comprehensive digital twin for a long time. We’ve invested enormously in Teamcenter to ensure it can capitalize on what AI enables. We’ve been working on AI itself for years. Like that bamboo plant, this is going to accelerate dramatically — and we’re working hard to ensure our customers aren’t left behind. And we’re using Siemens Xcelerator to help you create a competitive advantage for your companies — not despite complexity, but because of it.

Editor’s note: This post is adapted from highlights shared at Realize LIVE 2025 (North America and Asia) and other Siemens presentations. Watch the full session here.

Natalie Navales
Senior Executive Communications Manager

Natalie Navales is part of the Executive & Culture Communications team at Siemens Digital Industries Software, where she helps shape storytelling across leadership, employee engagement, and brand initiatives. As the co-chair of Inclusion and Belonging, she champions inclusive culture and well-being across the organization. Natalie loves to highlight the diverse voices and journeys of Siemens employees, bringing authentic stories to life through thoughtful interviews and creative presentation. With a background in design and communications, she blends strategic messaging with a human touch—whether supporting keynote development, crafting leadership bios, or amplifying internal culture through digital platforms.


Geoffrey Koch
writer

Geoff Koch is a writer on the DI SW executive and culture communications team in Portland, Oregon, which means working on everything from occasional book chapters to short social posts. He also edits and repurposes DI SW video content for social media. He's worked in tech and higher ed comms for two decades and sometimes dabbles in journalism and poetry, too.



This article first appeared on the Siemens Digital Industries Software blog at https://blogs.sw.siemens.com/news/a-new-manufacturing-playbook-digital-thread-data-integrity-and-ai/