{"id":67728,"date":"2025-07-29T05:22:00","date_gmt":"2025-07-29T09:22:00","guid":{"rendered":"https:\/\/blogs.sw.siemens.com\/simcenter\/?p=67728"},"modified":"2026-03-26T06:47:09","modified_gmt":"2026-03-26T10:47:09","slug":"siemens-antemotion-join-forces","status":"publish","type":"post","link":"https:\/\/blogs.sw.siemens.com\/simcenter\/siemens-antemotion-join-forces\/","title":{"rendered":"Unlocking high\u2011fidelity radar simulation: Siemens and AnteMotion join forces"},"content":{"rendered":"\n<p>Siemens Digital Industries Software and <a href=\"https:\/\/www.antemotion.com\/\" data-type=\"link\" data-id=\"https:\/\/www.antemotion.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">AnteMotion<\/a> have formed a strategic partnership to redefine sensor simulation for ADAS and autonomous vehicle development. This collaboration enables the integration of Siemens\u2019 high-fidelity, physics-based sensor models\u2014originating from the <a href=\"https:\/\/plm.sw.siemens.com\/en-US\/simcenter\/autonomous-vehicle-solutions\/prescan\/\" target=\"_blank\" rel=\"noreferrer noopener\">Simcenter Prescan<sup>TM<\/sup><\/a> environment\u2014directly into AnteMotion\u2019s real-time, Unreal Engine-based rendering platform <a href=\"https:\/\/www.antemotion.com\/#portfolio\" target=\"_blank\" rel=\"noreferrer noopener\">Midgard<sup>TM<\/sup><\/a>.<br>In practical terms, this means Siemens\u2019 trusted physics-based sensor simulations, widely used by OEMs and tier 1 suppliers, can now operate in Unreal Engine 5, bringing photorealistic real-time rendering together with world-class sensor modeling. The first product of this collaboration is the Simcenter Prescan physics-based radar model, now fully integrated with Midgard. 
This partnership unlocks new levels of realism, speed, and scalability \u2013 giving ADAS\/AV developers a powerful competitive edge.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Bringing Siemens\u2019 sensor models to Midgard<\/h2>\n\n\n\n<p>Simcenter Prescan has long been a go-to platform for virtual prototyping of ADAS and autonomous driving systems, celebrated for its accurate lidar, camera, and radar models. However, these models were previously confined to the Simcenter Prescan ecosystem.<br>That changes now.<br>Midgard, AnteMotion\u2019s open and extensible simulation framework, enables the seamless integration of third-party tools, making it possible to bring Siemens\u2019 validated radar models into a real-time, Unreal Engine-based environment. This allows developers to create a photorealistic world and interact with advanced sensor simulations. Because Midgard is fully connected to AnteMotion\u2019s automated content creation pipeline with <a href=\"https:\/\/www.antemotion.com\/#portfolio\" target=\"_blank\" rel=\"noreferrer noopener\">Maze<sup>TM<\/sup><\/a> and <a href=\"https:\/\/www.antemotion.com\/#portfolio\" target=\"_blank\" rel=\"noreferrer noopener\">Procedural Worlds<sup>TM<\/sup><\/a>, developers can generate realistic test environments within minutes using HD or SD map data.<\/p>\n\n\n\n<p>This partnership combines the best of both worlds: Siemens\u2019 rigorous sensor physics and AnteMotion\u2019s photorealistic simulation. This integration helps automakers save costs by moving faster with more confidence.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Radar model perfected through 10+ years of innovation<\/h2>\n\n\n\n<p>At the center of this announcement is the Simcenter Prescan physics-based radar model, a sensor simulation unlike any other. This model wasn\u2019t built overnight \u2013 it\u2019s the product of over a decade of development by Siemens, refined through extensive research and industry collaboration. 
In fact, Siemens validated this radar simulator in close partnership with leading radar experts: tests with major chipmakers (NXP Semiconductors and Analog Devices) and a Japanese tier-1 automotive supplier helped ensure that the simulated radar echoes match real-world measurements with remarkable fidelity. Through close collaboration with tier-1 and tier-2 manufacturers, Siemens delivers a simulation that precisely replicates radar sensor behavior &#8211; from signal noise to complex antenna patterns &#8211; and real-world environmental conditions (Fig. 1).<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"604\" height=\"256\" src=\"https:\/\/blogs.sw.siemens.com\/wp-content\/uploads\/sites\/6\/2025\/07\/Fig1_radar-sensor-behavior.png\" alt=\"\" class=\"wp-image-67910\" srcset=\"https:\/\/blogs.sw.siemens.com\/wp-content\/uploads\/sites\/6\/2025\/07\/Fig1_radar-sensor-behavior.png 604w, https:\/\/blogs.sw.siemens.com\/wp-content\/uploads\/sites\/6\/2025\/07\/Fig1_radar-sensor-behavior-600x254.png 600w\" sizes=\"auto, (max-width: 604px) 100vw, 604px\" \/><figcaption class=\"wp-element-caption\">Figure 1: Physics-based radar simulation in Simcenter Prescan accurately replicates core sensor behavior, including signal noise, antenna characteristics, and raw signal outputs.<\/figcaption><\/figure><\/div>\n\n\n<p>What makes this radar simulation truly unique is its depth of physical modeling of the scene and device. Instead of a simplistic &#8216;ideal sensor&#8217; or pre-baked object detections, the Simcenter Prescan radar simulation reproduces the electromagnetic wave phenomena from the ground up. It can generate raw analog radar signals, range-Doppler maps, or even the ADC (analog-to-digital converter) data as if coming directly off a radar\u2019s RF front-end. 
This allows engineers to test and develop radar signal processing algorithms on fully synthetic data that behaves like data from a real sensor. The simulation accounts for effects like multi-path reflections, Doppler shifts from moving objects, micro-Doppler effects (such as rotating wheels and human gait signatures), and even the subtle effects of various materials and surfaces. This advanced modeling delivers real-time performance across numerous industry-standard scenarios, including NCAP tests, as successfully demonstrated in Hardware-in-the-Loop (HiL) environments.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Why high-quality synthetic radar data matters<\/h2>\n\n\n\n<p>With the automotive industry shifting toward AI-driven perception, the need for realistic raw sensor data has never been more urgent. Here\u2019s why:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>4D imaging radar:<\/strong> The latest automotive radars are no longer limited to detecting range and speed for a few objects \u2013 they are \u201cimaging\u201d radars that provide high-resolution point clouds (distance, azimuth angle, elevation angle and velocity). A 4D radar can distinguish not just where an object is on the horizontal plane, but also its height above the ground, by using 2D antenna arrays. This richness of data (often comparable to lower-resolution LiDAR in detail) greatly enhances perception. To develop algorithms on such data (for tracking or object classification), engineers need simulated radar feeds that are just as detailed, including tens of thousands of reflection points with accurate signal amplitudes and Doppler shifts. Only a physics-based approach \u2013 like Siemens\u2019 model \u2013 can credibly generate this level of detail.<\/li>\n\n\n\n<li><strong>Raw sensor data fusion and end-to-end driving algorithms:<\/strong> ADAS and AV systems are increasingly moving toward fusing raw data from multiple sensors (camera pixels, radar returns, LiDAR points, etc.) 
using AI, rather than relying on each sensor\u2019s built-in object detection. In traditional systems, a radar might internally detect a car and send a high-level message: \u201cobject at 50m ahead.\u201d In modern raw-fusion approaches, the radar would send the whole point cloud, or even the ADC samples, to a central computer, which fuses it with camera images and other data, giving the AI a far more detailed, nuanced view of the environment. This shift goes hand-in-hand with the rise of end-to-end driving algorithms \u2013 deep learning models that take raw multi-sensor input and directly output driving decisions or motion plans. Unlike modular pipelines, where perception and planning are separate, end-to-end systems learn to interpret the scene and make driving choices in one network. Simcenter Prescan\u2019s radar model was designed with this raw-data paradigm in mind, which makes it ideal for developing these next-gen fusion algorithms and end-to-end AI systems. Engineers can feed these high-fidelity simulated radar reflections into their fusion pipelines or neural networks exactly as they would real sensor data. This level of realism is key for building confidence in AI models. It means that an autonomous driving algorithm can be trained or tested on Simcenter Prescan\u2019s simulated radar data and encounter the same kind of noise, resolution limits, and reflection characteristics it would in the wild. Radar, in particular, plays a crucial role in poor weather or long-range scenarios, so having a physics-accurate radar model gives the fused AI system reliable radar cues to learn from. 
In short, Simcenter Prescan provides the raw camera, LiDAR, and radar data streams needed for advanced sensor fusion under one virtual roof, allowing developers to iterate on AI perception and driving logic without waiting for physical testing.<\/li>\n\n\n\n<li><strong>Software-defined vehicles and centralized compute architectures:<\/strong> The car of the near future is often described as a \u201csoftware-defined vehicle,\u201d meaning its capabilities are heavily determined by software, and it features a centralized computing architecture rather than dozens of separate electronic control units. Many automakers are shifting to centralized high-performance zonal or domain controllers that handle data from multiple sensors in one place. This architecture goes hand-in-hand with the raw data fusion mentioned above. For simulation, this implies that we must generate sensor outputs in their native form (e.g. raw radar reflections, full camera pixel frames) using physics-based sensor models to virtually test the central perception software.<\/li>\n<\/ul>\n\n\n\n<p>The common theme is that as vehicles get smarter, the development and testing challenges get more complex. You need synthetic data that mirrors reality as closely as possible. A radar simulation must emulate real physics (multipath, speckle noise, varying reflectivity) to properly train and stress-test radar signal processing or sensor fusion AI. The better the simulation fidelity, the more reliably algorithms trained or validated in virtual environments will behave on the road. It\u2019s not enough anymore to simulate \u201cthere is a car over here\u201d; you need to simulate the actual radar returns from that car\u2019s metal body, the rotating wheels, the road below it, the guardrail nearby, and so on (Fig. 2). 
The collaboration between Siemens and AnteMotion directly addresses this need.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"604\" height=\"216\" src=\"https:\/\/blogs.sw.siemens.com\/wp-content\/uploads\/sites\/6\/2025\/07\/Fig2_Generating-sensor-outputs.png\" alt=\"\" class=\"wp-image-67913\" srcset=\"https:\/\/blogs.sw.siemens.com\/wp-content\/uploads\/sites\/6\/2025\/07\/Fig2_Generating-sensor-outputs.png 604w, https:\/\/blogs.sw.siemens.com\/wp-content\/uploads\/sites\/6\/2025\/07\/Fig2_Generating-sensor-outputs-600x215.png 600w\" sizes=\"auto, (max-width: 604px) 100vw, 604px\" \/><figcaption class=\"wp-element-caption\">Figure 2: Feature-rich, physics-based sensor models generate native-format outputs \u2014 such as radar reflections and camera pixels \u2014 to emulate real-world behavior with high fidelity.<\/figcaption><\/figure><\/div>\n\n\n<p>By bringing Siemens\u2019 high-fidelity, physics-based sensor models into an open, high-performance simulation engine, developers have a powerful new tool to generate realistic synthetic sensor data at scale.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Enabling cross-platform access to top-tier models<\/h2>\n\n\n\n<p>One of the most exciting aspects of the Siemens-AnteMotion collaboration is what it signifies for the simulation industry at large: a move toward cross-platform availability of top-tier models. In the past, simulation environments tended to be self-contained \u2013 if you wanted the best camera model, or the best radar model, you often had to buy into a specific platform that offered it. This collaboration breaks that silo by demonstrating that even a highly complex sensor model from a proprietary platform can be packaged and deployed in a completely different simulation ecosystem with excellent results. This confirms the industry trend: the breaking down of walls between simulation tools. 
Just as standardization efforts in ADAS (for example, ASAM OpenX standards) aim to make scenarios and traffic models portable, similar initiatives target sensor model interoperability. One prominent example is ASAM OSI (Open Simulation Interface), a standard that essentially acts as an open sensor interface \u2013 defining a common way for sensor models to plug into various simulators. By adhering to standards like OSI, a camera or radar model developed on one platform can communicate seamlessly with another simulation environment via a shared format.<br>This push for open interfaces aligns perfectly with the Siemens-AnteMotion collaboration\u2019s goals. It suggests that sensor models might become just as portable as scenarios or maps in the near future. The vision is that the very best models\u2014be it for sensors, vehicles, or environments\u2014aren\u2019t confined to one system, but can be shared and plugged into multiple platforms to maximize their impact. <\/p>\n\n\n\n<p>Siemens and AnteMotion are taking a bold first step toward that vision by bringing a flagship Siemens radar model into Midgard\u2019s Unreal-based simulator, effectively putting this cross-platform concept into practice.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Try it first \u2013 early access available<\/h2>\n\n\n\n<p>The integration of Siemens\u2019 physics-based radar into AnteMotion\u2019s Midgard is just the beginning. Both companies plan to integrate more physics-based sensor types (such as lidar and camera models) as they continue to collaborate.<br>Do you want to be among the first to take advantage of this new capability?<br>Siemens and AnteMotion are inviting early adopters and partners to join the Early Access Program. Collaborate with us to accelerate your sensor development with unmatched simulation realism, provide feedback, and shape our roadmap. 
Reach out to <a href=\"mailto:gwen.vanvugt@siemens.com\" data-type=\"mailto\" data-id=\"mailto:gwen.vanvugt@siemens.com\">Gwen van Vugt<\/a>, Head of Simcenter Autonomy at Siemens Digital Industries Software. He can provide more information about pilot programs, licensing, and how to be part of the early adopter community that drives this innovation forward.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">About AnteMotion<\/h2>\n\n\n\n<p>AnteMotion is a technology company specializing in advanced simulation solutions for Autonomous Vehicle and ADAS development. Its flagship platform, Midgard, is an open and modular simulation environment built on Unreal Engine, designed to deliver high-performance, real-time, and photorealistic simulations. Midgard enables seamless integration of third-party tools, sensors, and control stacks, empowering developers and researchers to validate complex driving scenarios with precision and flexibility. Learn more at <a href=\"http:\/\/www.antemotion.com\" data-type=\"link\" data-id=\"www.antemotion.com\" target=\"_blank\" rel=\"noreferrer noopener\">www.antemotion.com<\/a>.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">About Siemens Digital Industries Software<\/h2>\n\n\n\n<p>Siemens Digital Industries Software helps organizations of all sizes digitally transform using software, hardware and services from the Siemens Xcelerator business platform. Its Simcenter\u2122 portfolio provides simulation and test solutions that enable engineers to innovate faster and with greater confidence. Simcenter Prescan is a leading simulation platform for the virtual development and validation of automotive sensors and ADAS\/AV systems, offering highly realistic, physics-based models for radar, lidar, and camera technologies. 
Learn more at <a href=\"https:\/\/plm.sw.siemens.com\/en-US\/simcenter\/\" target=\"_blank\" rel=\"noreferrer noopener\">www.siemens.com\/simcenter<\/a>.<\/p>\n\n\n\n<p><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Siemens and AnteMotion are reshaping the future of autonomous driving with a game-changing partnership. By merging Siemens\u2019 physics-accurate radar models with AnteMotion\u2019s real-time Unreal Engine platform, developers can now simulate ultra-realistic sensor data at scale, thereby accelerating AI training and cutting costs like never before.<\/p>\n","protected":false},"author":47820,"featured_media":67763,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"spanish_translation":"","french_translation":"","german_translation":"","italian_translation":"","polish_translation":"","japanese_translation":"","chinese_translation":"","footnotes":""},"categories":[1],"tags":[61513,63725,46191,1823],"industry":[89,132,133,134],"product":[511],"coauthors":[10256],"class_list":["post-67728","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news","tag-adas","tag-autonomous-driving","tag-prescan","tag-simcenter","industry-automotive-transportation","industry-automotive-oems","industry-automotive-suppliers","industry-trucks-buses-specialty-vehicles","product-simcenter-prescan"],"featured_image_url":"https:\/\/blogs.sw.siemens.com\/wp-content\/uploads\/sites\/6\/2025\/07\/GettyImages-183060490-scaled.jpg","_links":{"self":[{"href":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/posts\/67728","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/users\/47820"}],"replies":[{"embeddable":true,"hr
ef":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/comments?post=67728"}],"version-history":[{"count":5,"href":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/posts\/67728\/revisions"}],"predecessor-version":[{"id":68004,"href":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/posts\/67728\/revisions\/68004"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/media\/67763"}],"wp:attachment":[{"href":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/media?parent=67728"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/categories?post=67728"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/tags?post=67728"},{"taxonomy":"industry","embeddable":true,"href":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/industry?post=67728"},{"taxonomy":"product","embeddable":true,"href":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/product?post=67728"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/blogs.sw.siemens.com\/simcenter\/wp-json\/wp\/v2\/coauthors?post=67728"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}