
Simulating your way to a friendlier autonomous car

Humans and autonomous vehicles must be able to coexist and interact easily if future streets are to be as safe as, or ideally safer than, today's streets with human drivers. To accomplish this, engineering the human-machine interface is a key part of autonomous vehicle development.

A number of questions must be addressed. How do you keep passengers safe if they are conducting activities better suited to a living room or kitchen than to a car? Will seat belts be enough? Will passengers even want to wear them? Will we need airbags throughout the interior that, during an accident, turn the inside of a vehicle into something resembling an inflatable house?

Also, how will we communicate with autonomous vehicles as commands get more complex and activities within vehicles become more varied and complicated? Will we eventually be able to simulate humans’ cognitive behavior to fully test the human-machine interface, or will the ability to test and develop interfaces with humans in the loop be enough to validate an interface design?

One company working to answer these questions is TASS International, which uses its expertise to develop safer and more intelligent vehicles. I recently spoke with three experts from TASS – Gwen Van Vugt, Frank Litjens and Paul Van Straten – about autonomous vehicle safety. In this part of the discussion, we move into the development and testing of the interfaces humans will need to coexist with autonomous vehicles.

Edward Bernardon: The idea of human safety is important, but it depends on the human-machine interface. What are some of the challenges in getting autonomous cars to work more smoothly with humans so we can eventually trust them?

Gwen Van Vugt: There are so many open questions and unknowns still being worked out for system interfaces in autonomous vehicles.

For instance, how does one inform the driver of the state the vehicle is in? Or how can you be sure, when you hand control back to the driver, that they are aware of the traffic situation and capable of controlling the vehicle?

You either have a driver assistance system, where the driver is in control and you assist the driver in making decisions, or you go to the other end of the spectrum, where the driver is entirely out of the loop and the vehicle is doing the driving.

When the vehicle gets into a situation it can no longer handle – for example, it is snowing too hard – the car pulls over and puts itself in a safe state, then tells the driver that it is stopping there, rather than handing control back while still moving. The car is saying, “I’m stopping this vehicle. From here, you may restart it, but then you, the human driver, are in control.”
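
The handover logic Van Vugt describes is essentially a small state machine: the vehicle never returns control while moving, it first completes a maneuver to a safe stop, and only a deliberate restart by the human re-engages manual driving. The sketch below is our own illustration of that idea, not TASS code; the mode names and triggers are hypothetical.

```python
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()    # the vehicle is driving itself
    MINIMAL_RISK = auto()  # conditions exceeded; pulling over
    SAFE_STOP = auto()     # stopped safely; waiting for the human
    MANUAL = auto()        # the human driver is in control

class HandoverLogic:
    """Hypothetical sketch: control is never handed back while moving."""

    def __init__(self) -> None:
        self.mode = Mode.AUTONOMOUS

    def conditions_exceeded(self) -> None:
        # e.g., snowfall beyond what the sensors can handle
        if self.mode is Mode.AUTONOMOUS:
            self.mode = Mode.MINIMAL_RISK

    def vehicle_stopped(self) -> None:
        # safe state reached; the car announces that it has stopped
        if self.mode is Mode.MINIMAL_RISK:
            self.mode = Mode.SAFE_STOP

    def driver_restarts(self) -> None:
        # only an explicit restart by the human transfers control
        if self.mode is Mode.SAFE_STOP:
            self.mode = Mode.MANUAL
```

The key design point is the missing edge: there is no transition from AUTONOMOUS or MINIMAL_RISK directly to MANUAL, which encodes the rule that control changes hands only through a safe stop.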

Edward Bernardon: How do you think autonomous vehicle simulation might help develop a solution? TASS simulates not only the car, tires and sensors but also human motion. It seems like TASS tools would be perfect for this.

Gwen Van Vugt: The only thing missing is that our software tools do not model the cognitive behavior of the person. What we model is the physical behavior of the human body. We don’t have mental models of how people react to certain situations. I don’t know the status of mental modeling or how accurate it is, but, in principle, if you had good mental models, you could add them into this equation.

[Image: Autonomous vehicle simulation]

Edward Bernardon: TASS does vehicle-in-the-loop and driver-in-the-loop testing. Could that help engineers develop the interfaces of the future?

Gwen Van Vugt: TASS has driver-in-the-loop systems, where a real person drives in a virtual world, and those systems are often used to test the interaction between the human and the system. In this way, you can see the capability of an ADAS [advanced driver assistance system] and how it has to change depending on the behavior or the state of the human in the car.

So, what you’re seeing more of is a driver-monitoring system, with a camera or other sensors trained on the driver, so the car understands where the driver is looking and whether he’s actively driving or half asleep, relaxed, tense, sweating or running a fever.

That data is then used to set the driver assistance system. I think that even Level 2 systems, where the human is in control, can be optimized by taking the state of the driver into account. Driver monitoring of this kind is a trend we are currently seeing in the automotive world.
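
As a rough illustration of how monitoring data might feed back into an ADAS, consider the sketch below. The signal names, thresholds and parameters are hypothetical, not drawn from TASS products; the point is only that the driver's state can tune how early and how firmly the assistance system intervenes.

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    """Hypothetical outputs of a driver-monitoring camera/sensor suite."""
    eyes_on_road: bool
    drowsiness: float  # 0.0 = alert .. 1.0 = asleep
    stress: float      # 0.0 = relaxed .. 1.0 = tense

@dataclass
class AdasSettings:
    forward_warning_s: float  # time-to-collision threshold for warnings
    lane_keep_gain: float     # how strongly lane keeping corrects

def tune_adas(state: DriverState) -> AdasSettings:
    # Baseline for an attentive driver; all values are illustrative only.
    settings = AdasSettings(forward_warning_s=2.0, lane_keep_gain=0.5)
    if not state.eyes_on_road or state.drowsiness > 0.5:
        # Distracted or drowsy: warn earlier and correct more firmly.
        settings.forward_warning_s = 3.5
        settings.lane_keep_gain = 0.9
    elif state.stress > 0.7:
        # A tense, possibly overloaded driver: warn somewhat earlier.
        settings.forward_warning_s = 2.8
    return settings

# Example: a drowsy driver gets earlier warnings and stronger lane keeping.
print(tune_adas(DriverState(eyes_on_road=False, drowsiness=0.8, stress=0.2)))
```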

This concludes part three of my interview with the experts from TASS International about autonomous cars. Part four delves into the challenges of autonomous vehicle testing.

About the author
Edward Bernardon is vice president of strategic automotive initiatives for the Specialized Engineering Software business segment of Siemens PLM Software, a business unit of the Siemens Industry Automation Division. Bernardon joined the company when Siemens acquired Vistagy, Inc. in December 2011. During his 17-year tenure with Vistagy, Bernardon served as vice president of sales and, later, of business development for all specialized engineering software products. Prior to Vistagy, Bernardon directed the Automation and Design Technology Group at the Charles Stark Draper Laboratory, formerly the Massachusetts Institute of Technology (MIT) Instrumentation Laboratory, which developed new manufacturing processes, automated equipment and complementary design software tools. Bernardon received a B.S. in mechanical engineering from Purdue University, and later received an M.S. from the Massachusetts Institute of Technology and an MBA from Butler University. He also holds numerous patents in the areas of automated manufacturing systems, robotics and laser technologies.


This article first appeared on the Siemens Digital Industries Software blog at https://blogs.sw.siemens.com/thought-leadership/2018/09/24/simulating-your-way-to-a-friendlier-autonomous-car/