AI-driven factories of the future

A couple of years ago, I toured the Ford Rouge Complex in Michigan, where the F-150 trucks are assembled. From high above the factory floor, I stood on a skywalk, watching the process from various vantage points. An impressive serpentine system moved the trucks to each assembly station.

The visitor center at the entrance of the Rouge factory floor.

I was particularly fascinated by the sunroof station. A man sat on a stool watching a TV show until he was alerted that the next truck moving into his station needed a sunroof. He stood up, shot glue into the grooves that hold the glass, and pressed a button. A robot arm grabbed the glass, set it precisely into place on the roof, and employed a heat gun to seal it. The man inspected the job and sat back down to watch his TV show again. I watched this station long enough to see the rack of glass run empty; just then a man pulling a fully loaded rack of glass swapped it for the empty one and walked away. I watched another station nearby where a woman was loading speakers into a door, and then a station where another worker slapped some stickers inside the doors. I hope that these folks get to rotate stations!

A couple of thoughts ran through my head:

  • It was clear that the computer at each station knows what configuration and options have been ordered for each truck coming down the line. Each of those components is created off site, delivered to the factory, and then taken to each station as needed.
  • There were not many robots in action, but there certainly were a lot of humans doing work. The tour does not show all the stations; for example, the painting, welding, and installation of the power train are located out of view. Those aspects of assembly surely employ robots.

This factory kicked out an impressive one truck every 5 minutes or so (I watched the final drive-away station). While I expected more automation, I was fascinated by how the trucks came together, until I heard an interview from MIT Technology Review with Dr. Stefan Jockusch, who is vice president for strategy for Siemens Digital Industries Software. His vision of AI-driven factories of the future seems very different from what I saw at the Rouge.

At Siemens, we use the Chip-to-City concept to help people understand our vision. By showing how to design cars that get manufactured in smart factories and then drive through smart cities, we can demonstrate how a complex system is all connected from start to finish using the digital twin.

It all starts with world-class chip design and verification tool flows. These chips combine with physical sensors to form vehicle systems. These electronic systems are connected together with cabling that interfaces seamlessly with mechanical design tools for vehicle design and simulation. The models of these systems are used to drive the smart factory floor for manufacturing. Digital feedback at each step of the way allows for changes and improvement. Using products and services from Siemens, cities can build out and maintain digital systems that manage autonomous vehicle traffic and battery charging. Feedback from these systems can find its way all the way back through the digital twin to drive product changes.

Stefan’s focus on the smart factory in the Chip-to-City scenario leads him to describe a factory of the future that is not possible without artificial intelligence. It is the application of artificial intelligence across the digital twin that allows for exponential innovation. Learn more about this vision in the MIT Technology Review interview, available in podcast or transcript formats here.
