
Immersive Engineering: Revolutionizing the future of product design with Siemens and Sony

What does the next generation of product design look like in a world where the digital and physical merge? What role does immersive technology play in enhancing design processes? 

In this episode of the Next Generation Design podcast, host Greg Arnot speaks with Ben Widdowson, Head of Marketing for Immersive Engineering for Siemens NX, and Henry Kondo, who leads Sony’s XR and 3D display business. They explore the exciting collaboration between Siemens and Sony, unveiling new developments in immersive engineering and the industrial metaverse.

Ben and Henry share insights into the evolution of design processes—moving from traditional methods to cutting-edge immersive environments where engineers can interact with 3D models in real time. Learn how Siemens NX Immersive Designer and Sony’s XR head-mounted display are breaking new ground in engineering workflows, enabling faster decision-making, and enhancing the digital twin experience.

Discover the potential of these immersive tools to reshape industries, reduce prototyping costs, and provide a more intuitive, human-scale approach to product design. Plus, hear how this collaboration is expected to inspire the next generation of engineers by making the design process more interactive and exciting.

What you’ll learn in this episode of the Next Generation Design podcast:

  • How visualization in product design software has evolved over time (2:30)
  • What the industrial metaverse and immersive engineering are, and why they matter in modern design (4:20)
  • How Sony and Siemens launched the NX Immersive Designer and XR headset collaboration (7:10)
  • What VR, AR, and MR are, and how Sony’s XR is different (11:35)
  • The benefits of working with Sony’s XR head-mounted display and Siemens NX (17:00)
  • What impact the Sony-Siemens partnership will have on engineering workflows (21:10)
  • How Oracle Red Bull Racing will leverage NX Immersive Designer and the Sony XR HMD (30:30)
  • How the market rates Sony’s XR headset against Apple and Meta head-mounted displays (32:00)

Engineering is changing drastically with XR technology.

Henry Kondo, Sony

Listen to the Next Generation Design podcast wherever you listen to podcasts.

Connect with Greg Arnot

LinkedIn

X

Connect with Henry Kondo

LinkedIn

Connect with Ben Widdowson

LinkedIn

Learn more about the Sony XR head-mounted display

Read the summary or watch the video version of this episode: What is Siemens Immersive Engineering? | Siemens & Sony partnership

Listen to or read the next episode of the Next Generation Design podcast: From Concept to Creation: The Importance of Openness in Siemens Software, releasing Tuesday, November 19, 2024

Listen to or read the previous episode of the Next Generation Design podcast: A Student’s Journey in CAD and 3D Modeling


Podcast transcript for Immersive Engineering: Revolutionizing the future of product design with Siemens and Sony

BEN WIDDOWSON:

Siemens has talked for a long time about combining the real and the digital worlds and enabling you to create the most comprehensive digital twin. NX is often the starting point for creating these virtual objects. Traditionally, you interacted with your 3D design on the 2D monitor in front of you, and that is how most people consumed 3D data until it was prototyped or made. With the new Sony XR HMD, you’re now able to experience that product or process much earlier in the production process. You can put the headset on, walk around the object, pick it up, pull it apart, and interact with it. You can even run simulations so you get a real sense of how the product will look, feel, and work before making any prototypes.

GREG ARNOT:

You’re listening to another episode of the Next Generation Design podcast. I’m your host, Greg Arnot.

On today’s episode, I’m joined by Ben Widdowson, Head of Marketing for Immersive Engineering for NX at Siemens, and Henry Kondo, Deputy General Manager for XR Business Development at Sony.

Before we begin, let’s say hello to today’s guests. 

Let’s start with you, Ben. Can you tell our listeners a bit more about yourself and your role at Siemens?

BEN WIDDOWSON:

Hi, Greg. Great to be here. I am the Head of Marketing for Immersive Engineering here at Siemens. I work within the NX product marketing team. We’ve been leading the collaboration with Sony, including Henry here, for a number of years now, building up to the launch of NX Immersive Designer and the Sony XR head-mounted display at the end of this year.

GREG ARNOT:
Henry, welcome to the show. Would you like to introduce yourself and tell us what you do at Sony?

HENRY KONDO:

My name is Henry Kondo from Sony Corporation. I’m based in Tokyo, Japan. I lead Sony’s XR, or head-mounted display, business, as well as the 3D display business. I oversee the business, the technologies, and the marketing for XR, and I’m also leading the collaboration between Sony and Siemens.

GREG ARNOT:

Thank you both.

Now, I’d like to dive right into today’s subject. At Siemens, we believe that the next generation of design will be immersive. From your perspective, what has visualization in product design software looked like over the years, and how has it evolved?

BEN WIDDOWSON:
In the 3D design world, we often use tools called CAD, or Computer-Aided Design, which evolved from layout drawings traditionally done with pen and paper. Moving from pen and paper to computers was a big step: first to 2D layouts on screen, and then to 3D, though these were very simplistic forms. Over the years, with the help of increased computational power, we’ve been able to improve the accuracy, quality, and realism of the visualization. There has always been a bit of a trade-off here: you could either get a very nice-looking picture that took a long time to render, or you could use the techniques used in games, where you get a nice picture in real time but at reduced quality, because it isn’t ray-traced. Over the years, however, that trade-off has largely disappeared. Using technologies like RTX, we’re now able to render things in real time very accurately, even in virtual reality and immersive technology, so you can see these things in real time as if you were standing in the room with them.

HENRY KONDO:

Thank you, Ben. At Sony, we are very focused on visualization technologies, especially in the XR and virtual reality world. Compared to previous workflows and design processes, the new design process has become more virtual. People are using visualization technology in the virtual, 3D world, where they can understand objects more realistically, at real size and with a real sense of how they feel.

GREG ARNOT:
We’ve heard so much about the industrial metaverse lately. 

Can you explain for our listeners what, exactly, that is and how it’s shaping the world and technology?

BEN WIDDOWSON:

We see the industrial metaverse as a way to bring together the most comprehensive digital twin of your product or production process with real-time data from the actual product or process itself. So, not only can you simulate the product with real-world physics and very accurate visualizations, but you can also start to bring in things like factory production statistics or sales statistics. You can start to do things like control your production facility or interact with your digital twin in real time. The other big element we see in the industrial metaverse is collaboration: being able to see the same product or process at the same time as other people, even if they aren’t co-located with you. We think this will really help change the way designers work and how products are made in the future.

HENRY KONDO:

From Sony’s side, the most important thing in the industrial metaverse is how we can convey the reality of the real world into the metaverse. For example, our very accurate display technologies can bring real-world objects into the metaverse, and through interaction, including with our unique controllers, we can convey the feeling of actually touching those objects. 

This year, we announced our new XR headset with Siemens, which is very important in the context of the industrial metaverse. As I mentioned, how you reproduce real-world objects on the display is very important. We have unique 4K microOLED technology, which gives a really precise display. In terms of interaction, we have a pointing controller and also a ring-shaped controller. These two controllers make interacting in VR very intuitive.

GREG ARNOT:
Henry, what can you tell me about the future of engineering? 

For you and for Sony, what does that look like?

HENRY KONDO:
I think engineering is changing drastically with XR technology. Originally, people would design objects on a screen and then create and produce the products after that. But now, engineering is changing: it’s real-time, and it connects people in the virtual world. This reduces the time needed for engineering, and more people are connected to the world of engineering. 

GREG ARNOT:
The announcement of the Sony headset by Siemens and Sony took many by surprise.

How did the Siemens and Sony collaboration come about, and why?

BEN WIDDOWSON:
Sony has been a customer of Siemens for a number of years now, using our product engineering tools like NX within their engineering divisions. I believe at Realize LIVE earlier this year, we announced that Sony saw a 30% increase in productivity after moving to NX from a competing CAD tool. A number of teams were then thinking, “Okay, what’s the next step in productivity?” Sony’s teams came up with the idea for this new, very high-end mixed reality workflow using a new headset they were planning to develop. They approached Siemens to see if we’d be interested in collaborating as one of their software partners. That was maybe three years ago, and now here we are today, post-CES, due to launch later this year. We’re very excited about this expanded partnership with Sony, and about the fact that they’re also a very, very good customer of ours.


HENRY KONDO:
This collaboration started almost three years ago. At the beginning, Sony was a customer of Siemens. Our engineering division and our designers were using Siemens NX software to produce Sony products like TVs, cameras, and audio equipment. These important Sony products were produced using Siemens NX software, and that’s why we started collaborating with Siemens. We also believe that combining our unique technology with Siemens’ software technologies is very important: Sony has quite unique expertise in hardware, and Siemens has extensive expertise in software and software platforms. That combination of software and hardware becomes a powerful mix.

GREG ARNOT:
What were the business challenges that started the development of NX Immersive Designer and the Sony XR head-mounted display?

BEN WIDDOWSON:

Siemens has had virtual reality capabilities within NX for many years now, but one of the limitations we’ve always found is that any headset connected to any tool is generally limited to viewing data. You can open an immersive view, walk around the data, and move parts around, but you’re always very limited in doing any meaningful engineering work within that environment. For us and our customers, this was the key issue we wanted to change with this collaboration with Sony. As we collaborated on the headset and the software, the biggest thing we wanted to ensure was that the hardware was tightly integrated with the software. This would unlock new workflows and allow engineers to actually start creating, editing, and manipulating geometry within the immersive environment. That’s the underlying ethos or philosophy behind Immersive Designer, and it’s what we’re really excited to get into the hands of our customers. We think this will be the biggest change people will see compared to their existing VR experiences.

HENRY KONDO: 

From Sony’s side, this XR business is also a big challenge for us. Originally, Sony did its hardware business by itself: Sony created the TV, Sony created the Walkman, and Sony created the camera, and we sold these products ourselves. That’s how Sony traditionally operated. But for this new XR challenge, we produce the hardware, but we don’t have the software, and we don’t have a customer base in the industrial metaverse. So, the collaboration between Siemens and Sony is also a business challenge for us, and that’s why we need close communication with Siemens. For example, at Siemens’ Realize LIVE events, Sony joined the meetings and collaborated with Siemens. This is just the beginning. We started this year, but we will continue this collaboration and aim to grow our business. It’s a challenge, but I believe there is big potential in this new business.

GREG ARNOT:
Henry, would you explain for our listeners what XR is and how it’s different from AR, VR, and MR?

HENRY KONDO:

This world began with virtual reality. VR covers the real world completely with virtual images. Next is AR, or augmented reality. AR is like the glasses-type products, where the glasses let you see the real world 100% and just add information digitally on top. Then came mixed reality, or MR. MR sits between VR and AR: the real world is captured by cameras, and those images are then mixed into the virtual world. It’s not 100% virtual, and it’s not 100% real; it’s a mix of both. Because of this historical progression, from VR to AR to MR, we combine everything under the term XR. 

GREG ARNOT:
I’d now like to talk more specifically about the technology of the Sony XR head-mounted display and NX Immersive Designer.

First, what makes the hardware unique? What makes it different from other VR headsets on the market?

HENRY KONDO:

Our XR head-mounted display, which we just announced this January at CES, has several unique features compared to other existing head-mounted displays. There are three main characteristics that set our product apart from other companies’ products. First, our 4K microOLED displays. It’s 4K per eye, so when we combine the two displays, we get 8K in total. That makes it a very accurate display, with a very precise resolution of 55 PPD, which means pixels per degree: in one degree of a person’s vision, there are 55 pixels. This high-density display is the biggest value of our head-mounted display. Second, we have two unique controllers. One is a pointing controller, a triangular, pen-type controller that allows very accurate operations for CAD designers and mechanical designers. The other is a ring-shaped controller, which offers more intuitive control. So, we have precise and intuitive controllers, and that mix is the uniqueness of our [14:12 inaudible]. Lastly, we have the connection between real and virtual. To make that connection seamless, we have flip-up and flip-down functionality, as well as video see-through camera technologies.  
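For readers who want to check that figure: pixels per degree is simply horizontal pixels divided by the horizontal field of view, assuming pixels are spread evenly across it. A minimal sketch in Python, where the 3840-pixel width and roughly 70-degree field of view are illustrative assumptions rather than official specifications:

    # Estimate a headset's pixels per degree (PPD) from its per-eye horizontal
    # resolution and horizontal field of view, assuming an even spread of
    # pixels across the FOV. Figures below are illustrative, not Sony specs.
    def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
        return horizontal_pixels / horizontal_fov_deg

    # A 4K-wide panel (~3840 px) over a ~70-degree horizontal field of view:
    print(round(pixels_per_degree(3840, 70.0)))  # -> 55, the figure quoted above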

GREG ARNOT:
From an engineering perspective, would you say this is physics-based?

HENRY KONDO:

Yes, it’s physics-based.

GREG ARNOT:
Ah, so being physics-based, it considers factors such as gravity, collision detection, and realistic motion.

Ben, how is this different from previous VR and AR visualization tools that were available in NX?

BEN WIDDOWSON:

Within NX, we’ve had virtual reality tools for a number of years now. We’ve supported many of the mainstream headsets, and we still do. The biggest challenge we’ve always faced is that they were limited to just viewing the data. That was the major change we wanted to achieve through this specific collaboration with Sony: we wanted to integrate the hardware directly into the software, allowing engineers to manipulate, edit, and create geometry in real time within NX. That’s the major difference. What Sony is bringing to the market with their hardware, as Kondo-san mentioned, is a real focus on enabling creators. Our customers are creators; they’re designing cars, planes, trains, boats, and buildings in NX every day. We believe the intuitive nature of the controllers Henry mentioned, the precision of the pointing controller and the intuitiveness of the ring controller, will enable customers to interact with their data in ways they haven’t before. They’ll be able to precisely make changes like blends, edits, and pulling faces on the geometry, but also virtually pick up objects from the NX screen and inspect them. If it’s a large object, they’ll be able to walk around the space and really get a sense of human scale. That’s something that’s very difficult to communicate in marketing videos, imagery, writing, or even stage presentations: the real sense of being immersed in the environment with the product. It’s something you can only experience when you try it. So, we’re very excited for customers to start using this technology as it rolls out towards the end of the year.

GREG ARNOT:
Can you expand a bit more on how it works with NX? 

How seamless is the connection, and what are the benefits? 


BEN WIDDOWSON:
Yeah, sure. I’ll start by explaining how customers can access it. We’ve tried to make it as easy as possible. Everything will be purchased directly from Siemens, both the hardware and the software. You’ll be able to buy a base seat of NX, and you’ll access the module called NX Immersive Designer through what we call value-based licensing. This is the ability to buy a set of tokens from Siemens, and you can then activate different software modules by switching them on and off without needing to purchase them separately. NX Immersive Designer is one of these modules, and it will be available later this year, along with the headset itself. In terms of how customers will actually use it, it’s plug-and-play. Once the headset is connected to your PC, you just hit a single button within NX, and it will launch NX Immersive Designer. You’ll instantly be in the immersive view. As Henry mentioned, you’ll be able to be in a completely virtual reality view—where the entire space is CGI and virtual, including the design objects you’re working on—or you can use the video see-through technology, which will allow you to see the real-world space around you and bring virtual objects into that space. So, imagine being in the office where you are now, Greg, and actually bringing in a virtual object of the design you’re making and being able to see it and spin it around in front of you. We believe this will be a real game-changer compared to just interacting on a screen. But all of this is going to be incredibly intuitive and very easy to use, and it’ll be directly connected to NX.

GREG ARNOT:
Can you expand a bit on value-based licensing for our listeners?

BEN WIDDOWSON:
For customers to access the immersive capabilities we’ve been developing with Sony, we wanted to make it as easy as possible. Customers who already have a seat of NX can also purchase something called Value-Based Licensing, which is our suite of capabilities that can be switched on and off very flexibly using tokens. NX Immersive Designer, which is the software module that leverages the Sony XR head-mounted display, is available through Value-Based Licensing. Any customers using Value-Based Licensing today will get immediate access to NX Immersive Designer and will only need to buy the headset—there’s no additional software purchase required. And if you’re not using Value-Based Licensing today, I would highly recommend looking into it if you’re interested in taking advantage of this new collaboration between Siemens and Sony.
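To make the token mechanics concrete, here is a small, purely hypothetical sketch of a token pool gating access to modules; the class, method names, and token costs are invented for illustration and are not Siemens’ actual licensing API:

    # Hypothetical sketch of value-based licensing: a shared token pool where
    # activating a module checks tokens out and deactivating it returns them.
    # All names and token costs here are invented for illustration only.
    class TokenPool:
        def __init__(self, tokens: int) -> None:
            self.available = tokens
            self.active: dict[str, int] = {}

        def activate(self, module: str, cost: int) -> bool:
            if module in self.active or cost > self.available:
                return False  # already active, or not enough tokens left
            self.available -= cost
            self.active[module] = cost
            return True

        def deactivate(self, module: str) -> None:
            # Returning a module's tokens frees them for other modules.
            self.available += self.active.pop(module, 0)

    pool = TokenPool(tokens=10)
    pool.activate("NX Immersive Designer", cost=4)  # switch the module on
    pool.deactivate("NX Immersive Designer")        # tokens go back to the pool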

GREG ARNOT:
I’ve been watching some of the publicity videos and I must admit, I’ve been fascinated by the hand controllers. 

I’m curious, how do these controllers work? And how does the ring come into play?

BEN WIDDOWSON:

Traditionally, designers operate a 3D CAD system like NX using a mouse or a space mouse along with a keyboard. It’s a combination of using keys, interacting with the mouse to spin the model around, and typing numbers into dialogs. We don’t expect this to change overnight; it’s a deeply ingrained habit for engineers. The Sony team is very aware of this and has been thoughtful about how engineers work today. The controllers are designed to be used alongside traditional methods like a mouse and keyboard. The ring, for example, lets you still operate your keyboard or mouse at the same time without needing to take the controllers on and off, giving you the instant ability to interact with your model in a more immersive way, whether that’s moving the object around in front of you, walking around it, or even manipulating parts of the geometry. These controllers are specifically designed for content creators and with existing work methods in mind. We think this really differentiates them from other HMDs on the market, which traditionally have a more general-purpose set of controllers that have to cater to all sorts of use cases like gaming, entertainment, and, of course, work. But these are specifically designed for content creators, which is very different and unique.

GREG ARNOT:
When it comes to combining the real and digital worlds, how easily can one switch between them with this Sony XR head-mounted display?

BEN WIDDOWSON:
Siemens has talked for a long time about combining the real and the digital worlds and enabling you to create the most comprehensive digital twin. NX is often the starting point for creating these virtual objects. Traditionally, you interacted with your 3D design on the 2D monitor in front of you, and that is how most people consumed 3D data until it was prototyped or made. With the new Sony XR HMD, you’re now able to experience that product or process much earlier in the production process. You can put the headset on, walk around the object, pick it up, pull it apart, and interact with it. You can even run simulations like Mechatronics Concept Designer, so you get a real sense of how the product will look, feel, and work before making any prototypes. One of the big advantages, of course, is that you can do a lot of exploration before making any physical products. Physical prototyping still has its place and is very important, but if you can make decisions earlier in the design process without needing prototypes, not only is the process a lot faster, but it also saves a huge amount of time, resources, and materials for the companies making these products. Reducing the number of prototypes and speeding up the process was one of the big drivers for Sony’s engineering team as well. Another critical point is that not everyone is a trained or experienced CAD user. CAD is traditionally a specialist skill set reserved for engineers, but engineers and designers often aren’t the only people involved in the decision-making process. Bringing other stakeholders, whether executives, manufacturing people, or even customers, into these design reviews in a more natural way means they don’t have to worry about the technical controls of the CAD system. They can just experience the product or process within the headset. This helps with decision-making: you get much more visual and visceral information and feedback from stakeholders earlier in the process, which we hope will lead to better products and better processes for everybody over time.

GREG ARNOT:
Henry, how do you see this technology and the Sony-Siemens partnership changing engineering workflows?

HENRY KONDO:
We designed our new products and software with Sony’s engineers, who are both designers and users of this technology. I believe our new solution, the head-mounted display plus NX, makes Sony’s engineers’ work more futuristic. We can reduce the time it takes to design new products, and we can perform more accurate design work in the virtual world. Before virtual engineering, we had to create physical mock-ups, like clay or plastic models, to test the product. But now, our engineers can check the product on screen at real size and intuitively. This shortens the design time, and a shorter time means more trials and more new products from our engineers.

BEN WIDDOWSON:

I’ll give you two more examples of customers we’ve given early-access previews to, one in aerospace and one in automotive, with some interesting and different use cases. Firstly, in aerospace, we have a customer called Natilus, based out of San Diego in the US. They’re making a new blended-wing-body aircraft, both for cargo and, in the future, for passenger use as well. These guys haven’t built their first plane yet; they’re in the process of getting the plane certified. But within their hangar space, where they’re starting to build aspects of the plane, they really wanted to know if the full-scale plane was going to fit through the doors and within the hangar itself. By viewing that digital twin in an immersive view using mixed reality, they could see the real hangar around them and place the full-scale virtual airplane inside it. They were able to quickly test whether the airplane would fit through the doors and also whether it would fit alongside all the machinery and tools in the hangar. That’s quite specific to their needs, but it’s an incredibly valuable use case because it could mean the difference between having to change premises or find somewhere else to build the plane. The other really interesting thing I saw when we first put the headset on the engineers at Natilus was that none of the discussion was about NX or the headset or the technology. Immediately, the engineers were saying, “Oh my God, I didn’t realize how big our plane was,” because they hadn’t built one yet. They were used to consuming the data on a 2D monitor in front of them. This was the first time they saw the full-sized object they’d been working on day in and day out at human scale. They were blown away by the size, because you just don’t get that from a 2D screen. Straightaway, they started diving into engineering decisions that they could have made differently. To me, that was really validating and fascinating.

Another use case comes from an automotive company called Briggs Automotive Company (BAC) here in the UK. Their engineering teams are split between two locations, one in the UK and one in Germany. One of their biggest challenges is that they often have to bring engineers from one site to the other for design reviews and to look at the physical assembly of the vehicle. The biggest use case for them is speeding up the decision-making process by bringing those stakeholders together virtually to collaborate, view a virtual object, and make design decisions. Also, because they make custom cars for each of their customers, they can bring the customer into the design process much earlier as well, so that customers can see exactly what they’re buying and how it will come together, and make minor design decisions alongside the engineers. These are two very different but interesting use cases we’ve seen from customers, and together with the examples from the Sony team, I think they show the real breadth of value that companies can derive from these technologies.

GREG ARNOT:
I’m thinking about the importance of collaboration across hardware and software, which tend to be distinctly different and sometimes competing domains. 

Tell me about the two-way collaboration between NX and Immersive. How does that work?

BEN WIDDOWSON:
We mentioned earlier that the biggest challenge with most immersive tools is that they often require you to export data and bring it into another software package. In the best case, it might open from within your CAD package, but it still locks out your ability to make changes to the design. The big difference between what we’re doing with Sony and their XR head-mounted display, combined with NX, is that it’s very deeply integrated into the software. What this means for the customer is that, while they’re working on their design, they can work on it interactively within NX in an immersive view, without being restricted by exporting data. They’re always working with the live digital twin, the live engineering data. We know from speaking to many companies, including Sony, at a number of events we’ve done throughout the year, from CES to Realize LIVE and Hannover Messe, that security is a huge consideration for these product companies. If they can avoid exporting their data out of trusted tools like NX and Teamcenter and keep everything within those secure environments, that is a huge value-add for them. So, alongside the live interaction with the CAD data, keeping everything secure within NX and Teamcenter was a huge part of the design considerations for this collaboration with Sony.

HENRY KONDO:

From Sony’s point of view, I’d like to add a comment about Immersive Designer. Yes, NX Immersive Designer is a Siemens product, but we built it in collaboration, as I mentioned. Our XR head-mounted display team has been tightly connected to Siemens’ software team and the Immersive Designer team for almost one or two years. For example, our engineers visited the Siemens office, sat with the team there, and worked together on the user experience, including the interaction and the user interface. Through this collaboration, we produced our solutions more iteratively.

GREG ARNOT:
Ben, for those of our listeners who haven’t seen the Consumer Electronics Show keynote, can you share how Oracle Red Bull Racing plans to leverage NX Immersive Designer and the Sony XR HMD? 

What was their experience like being introduced to this technology, and how do they see themselves benefiting?

BEN WIDDOWSON:

It’s a great question, and Oracle Red Bull Racing is a fascinating company with very unique challenges. As you know, Formula One is a sport that hosts races almost weekly, and the teams have to transport the entire show, including the cars, people, logistics, and everything else, from location to location. Week to week, each team, including Red Bull, makes many design changes to the vehicle itself and ships out new parts to enhance the car. Naturally, they have engineers both traveling with the vehicles to the races and back at their base in the UK. For those engineers, being able to collaboratively make design changes in an immersive way means they can significantly speed up the validation process and reduce the time from design to production. This is a huge advantage for Red Bull. I know they’ve been tracking, for many years, the number of design changes they’ve been able to make in a given season and correlating it with the number of points they’ve won in races. They really see rapid design changes as a key driver of their success, and we believe the combination of NX and Sony’s XR head-mounted display is going to further enhance that over the coming years.

GREG ARNOT:
And what have users been saying so far? What has the reaction been around the world?

HENRY KONDO:

From Sony’s side, regarding the hardware, we announced it at CES, and after that, we’ve done presentations at several events. We’ve received a lot of feedback from both customers and the media. The overall reaction to the hardware has been quite positive. A lot of the media, after trying our products and experiences, focused on the superior picture quality and the uniqueness of the controls. Their comments typically compare our product to other competitors, like Apple or Meta. They see Sony’s hardware as occupying a unique position compared to Apple’s and Meta’s head-mounted displays. 

BEN WIDDOWSON:

Since CES, we’ve attended a number of events throughout the year, including our own customer events like Realize LIVE. I think we’ve done about 1,500 demos for different customers at these events, and we’ve received an enormous amount of feedback, which has been overwhelmingly positive. A few things have really stood out to me. Number one, people really understand and appreciate the tight integration between the Sony XR head-mounted display and NX, especially the ability to interact with CAD data in real time. That’s a real game-changer, and people clearly see the value it brings, not only in terms of data security but also in being able to make design changes directly within the immersive environment. The second thing people have responded positively to is the controllers. They appreciate that these have been designed specifically for creators. While they might look different and take a few minutes to get used to, people quickly and intuitively grasp how they work and see how they are designed specifically for creators. This has been incredibly well received. And finally, as someone who doesn’t wear glasses (Greg, I know you’re a glasses wearer), Sony has made a great accommodation for people who do. The goggles slide in and out to accommodate glasses, so users don’t have to take their glasses off, which improves their experience of what they see in the display. This is something that sets Sony’s headset apart from other head-mounted displays on the market, and it’s been very well received by the glasses-wearing users we’ve spoken to over the past few months.

GREG ARNOT:
Before we go, I’d love to know: What are you most excited about with this launch?

HENRY KONDO:
The most interesting thing for me is that, in my career at Sony, I’ve launched many new products and done many announcement events. But this Siemens and Sony announcement was very unique for me, because it’s a totally different industry, a totally different customer base, and a totally different opportunity to sell our products. This is my first experience with this kind of launch, and I’m very excited about the Siemens and Sony collaboration announcement. Of course, we announced before product delivery: we’ll deliver the product by the end of this year, so we have not delivered yet. But the product announcement and the communication with customers have been a completely new experience for me. 

BEN WIDDOWSON:
Personally, this has been the coolest project I’ve ever worked on in my life. There’s nowhere else in the world where I could be doing this—working with an amazing brand like Sony and bringing such a unique product to market in an area that I have a lot of interest in. Equally, having spoken to so many customers and people at these events over the last 12 months, I just can’t wait for people to start using it and give us feedback on the benefits and to see them actually using it in production. That’s what I’m most excited about.

HENRY KONDO:
Doing this collaboration with Siemens, I love that I’ve become a Siemens person. We have very tight communication—every week, we have constant discussions. Our engineers and Siemens’ engineers, as I mentioned, sit down together and create one product. This collaboration has been a very fresh experience for me. 

GREG ARNOT:
We hear the term “AI” all the time, so of course we can’t go without asking: How might Immersive Engineering interact with the use of AI in engineering?

BEN WIDDOWSON:
That’s another huge topic: industrial AI. Speaking specifically about the design and engineering domain, which is where we live, Siemens NX has had AI capabilities for a number of years now. One that comes to mind is generative AI, where you can define a set of parameters for a product, project, or even a part, and have NX optimize the geometry for that part based on those inputs—whether it’s mechanical properties or physical shape and size. Companies have already used this extensively. I think tools like this are going to start to increase across our portfolio. Being able to experience those parts and designs immersively is another great benefit. We’ve also made big enhancements around the user interface, such as using AI to automatically predict what you’re going to select next. These kinds of workflow predictions using AI will be a huge benefit, especially when it comes to new interaction methods like immersive environments, where you don’t have all the buttons, keys, and a mouse in front of you. You’re going to want the computer to intuit what you’re planning to select or do next within that immersive environment. So, I’m very excited to see how these things develop over time.
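As a toy illustration of that last idea, predicting a user’s next command, even a simple frequency count over past command sequences can rank likely next actions; everything below is invented for illustration and is not how NX implements its prediction:

    # Toy next-command predictor: count how often each command follows another
    # in past sessions, then suggest the most frequent successor. A production
    # system would use a much richer model; this only sketches the idea.
    from collections import Counter, defaultdict

    history = ["sketch", "extrude", "blend", "sketch", "extrude", "chamfer",
               "sketch", "extrude", "blend"]

    successors: defaultdict = defaultdict(Counter)
    for current, nxt in zip(history, history[1:]):
        successors[current][nxt] += 1

    def suggest(command: str):
        ranked = successors[command].most_common(1)
        return ranked[0][0] if ranked else None

    print(suggest("extrude"))  # -> 'blend' (seen twice after 'extrude', vs. once for 'chamfer')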

GREG ARNOT:
Is there anything else you’d like to let our listeners know about NX Immersive Engineering?

BEN WIDDOWSON:
One other point that I think is really interesting to consider is how traditional interaction methods, like mouse and keyboard, are evolving. We really see this technology as a way to make engineering more fun and exciting: more interactive, more visceral, and more of an experience. We genuinely believe and hope that this will be a way to attract the next generation of engineers into our industry. We need talented people to create the products and services of the future, both at Sony and for the rest of our customers. We believe that this combination of Siemens’ immersive engineering tools and Sony’s XR head-mounted display technology is going to inspire that next generation of engineers. We’re also really excited for academic institutions and university students to get their hands on this and see what they can do with it.

GREG ARNOT:
And most importantly, can you remind our listeners when this is releasing and where they can find more information and get hold of it?

BEN WIDDOWSON:

We haven’t announced the final release date yet, but we’re targeting the end of the year. If you want more information, you can find it at siemens.com.

GREG ARNOT:
Well, we will certainly be checking that out when the time comes. And for our listeners, we will have a few links in the show notes for you to check it out.

Ben, Henry, thanks so much for joining us today. It’s been a pleasure speaking with you both and hearing your perspective on XR technology.

Thanks again to Ben and Henry, we hope to speak with you again soon.

Thanks also to our listeners for tuning in to today’s episode. Join us next time for more discussions about the latest in design innovation and software applications. If you haven’t already, please consider subscribing to our podcast on your favorite platform. And if you’re enjoying the content, please leave us a review.

I’m your host, Greg Arnot, and this has been Next Generation Design.


Next Generation Design Podcast

As product engineering tools continue to morph and expand at speeds human expertise may struggle to match, revolutionary design technologies that span industry borders will prove essential for companies looking to lead their markets in the future. What will the future of design technologies and machinery look like? What will your digitalization story be? Where engineering meets tomorrow.


Mollie Gladden
Product Marketing Coordinator

Mollie Gladden joined Siemens in June 2022 as a Marketing Coordinator on the Product Engineering Software (PES) go-to-market team. She is responsible for the NX Academic program, the Next Generation Design podcast and more.



This article first appeared on the Siemens Digital Industries Software blog at https://blogs.sw.siemens.com/podcasts/next-generation-design/siemens-sony-immersive-engineering-design/