There’s no doubt that the current time in history is “The Digital Age,” an age driven by the ubiquitous digital computer chip. These chips are in everything from fridges to phones to airplanes, and without them the world as we know it simply wouldn’t exist. But digital chips with their 1’s and 0’s didn’t always reign supreme. Analog chips, which operate over continuous ranges of values, originally dominated computing. While analog computing might seem outdated by today’s standards, offering less precision and flexibility than digital chips, it is enjoying a resurgence at the cutting edge of AI thanks to a speed and energy efficiency in specialized tasks that its digital counterparts cannot match.
Modern AI operates, at a basic level, on the mathematical operation of matrix multiplication, with one operation required each time information travels from one artificial neuron to another. To deliver timely decisions, hundreds of thousands of neurons must simultaneously transmit data to thousands of other neurons; in other words, the process of an AI “thinking” is massively parallel and computationally intensive. In recent years, graphics processing units (GPUs), specially designed to handle fast parallel operations, have been redeployed for AI processing. Thanks to the ever-increasing power of these GPUs, as well as purpose-built AI accelerators, it is now possible to train ever larger neural networks, comprising hundreds of thousands of virtual neurons processed by tens of thousands of GPUs. While this is a boon for AI research and the power of AI algorithms, it is not without its downsides.
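To make the connection concrete, the sketch below shows a single neural-network layer as one matrix multiplication: every output neuron computes a weighted sum over every input neuron. The layer sizes and random values are illustrative assumptions, not figures from any particular network.

```python
import numpy as np

# Illustrative sketch: one neural-network layer is one matrix-vector
# multiplication. Sizes and values here are arbitrary assumptions.
rng = np.random.default_rng(0)

inputs = rng.standard_normal(512)          # activations from 512 source neurons
weights = rng.standard_normal((256, 512))  # one weight per neuron-to-neuron connection
bias = rng.standard_normal(256)

# Each of the 256 output neurons sums 512 weighted inputs:
# 256 * 512 multiply-accumulate operations, all independent of
# each other -- which is exactly what GPUs parallelize well.
outputs = np.maximum(weights @ inputs + bias, 0.0)  # ReLU activation

print(outputs.shape)  # (256,)
```

A deep network simply stacks many such layers, so inference and training time are dominated by how fast these multiply-accumulates can run in parallel.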
Today’s GPUs contain billions of transistors and consume several hundred watts of power under full load. Multiply this figure by the hundreds or thousands of GPUs required for training and the problems become enormous. Training a single algorithm consumes more power than several houses use in a year and takes up a vast amount of space. And once an algorithm is trained and can execute its task, it still requires at least one GPU’s worth of transistors, with all the power draw and heat generation that entails. This effectively prevents AI from being directly integrated into many edge devices, such as smart monitoring systems, cameras, or small robots, where the power consumption and heat generation would be impossible to manage in a compact form factor. Additionally, the increasingly complex architecture of GPUs can introduce internal bottlenecks that prevent the system from processing data as fast as it could, or needs to.
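A rough back-of-envelope calculation shows how the “several houses” comparison arises. All figures below (GPU count, wattage, training duration, household consumption) are illustrative assumptions, not measurements from any specific training run.

```python
# Back-of-envelope training-energy estimate; every number is an
# assumption chosen only to illustrate the order of magnitude.
gpus = 1000                      # GPUs in the training cluster
watts_per_gpu = 300              # draw per GPU under full load
training_days = 30               # length of the training run

# Total energy in kilowatt-hours: watts * hours / 1000.
kwh = gpus * watts_per_gpu * training_days * 24 / 1000

# Rough annual consumption of a single household, in kWh.
household_kwh_per_year = 10_000

print(kwh)                              # 216000.0
print(kwh / household_kwh_per_year)     # ~21.6 household-years
```

Even with conservative assumptions, one run lands in the tens of household-years of electricity, before counting cooling and networking overhead.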
Enter analog chips. Many of the shortcomings that saw analog fall behind digital are simply not a concern in the AI space: slight variance in results and a single-purpose focus matter little for neural networks. Analog’s ability to provide fast, power-efficient chips that excel at the single task of matrix multiplication gives it all the edge it needs to compete. A single specialized analog AI chip requires less than 10 watts, where a GPU would consume more than 100 watts for the same results. This opens the door to integrating AI directly into edge devices where power and heat are a major concern. For example, a camera on a production line might have a purpose-made analog chip integrated directly into it, allowing it to run a pre-trained part-recognition algorithm on the spot, rather than sending the voluminous visual data to a more powerful external system for processing and waiting for the response.
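One common analog approach, included here as an assumption for illustration rather than a description of any specific product, encodes the weight matrix as conductances in a resistive crossbar array. Ohm’s law gives each cell’s current and Kirchhoff’s current law sums each column, so the entire matrix-vector product happens in a single physical step. A rough numerical sketch of that idea:

```python
import numpy as np

# Hypothetical sketch of analog matrix multiplication in a resistive
# crossbar. Array size, conductance scale, and noise level are all
# illustrative assumptions.
rng = np.random.default_rng(1)

weights = rng.uniform(0.0, 1.0, size=(4, 8))  # target weight matrix
conductances = weights * 1e-6                 # weights mapped to siemens (assumed scale)
voltages = rng.uniform(0.0, 0.5, size=8)      # inputs encoded as row voltages

# Ohm's law per cell (I = G * V); Kirchhoff's current law sums the
# cell currents on each column wire. One "operation" in physics,
# no clocked multiply-accumulate loop needed.
column_currents = conductances @ voltages

# Analog readout is noisy; model a small read-noise term.
noisy_currents = column_currents + rng.normal(0.0, 1e-9, size=4)

# Digital reference, rescaled for comparison.
reference = weights @ voltages
print(np.allclose(column_currents / 1e-6, reference))  # True
```

The noise term is the “slight variance in results” mentioned above: tolerable for neural-network inference, which is robust to small perturbations, though unacceptable for exact digital arithmetic.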
While analog can’t fully replace digital in the AI space, especially when it comes to interacting with humans or ingesting information, it offers tantalizing possibilities. Many would assume analog has no place in the cutting-edge world of AI, but in the coming years we’ll most likely find that isn’t the case. The human brain has long been a model for AI, yet as a system it is neither completely analog nor completely digital but a mixture of both. Unifying the benefits of analog and digital chips is an important step toward AI that truly reflects human abilities, offering benefits not just at the cutting edge but to smart devices in homes and factories the world over.
Siemens Digital Industries Software is driving transformation to enable a digital enterprise where engineering, manufacturing and electronics design meet tomorrow. Xcelerator, the comprehensive and integrated portfolio of software and services from Siemens Digital Industries Software, helps companies of all sizes create and leverage a comprehensive digital twin that provides organizations with new insights, opportunities and levels of automation to drive innovation.
Siemens Digital Industries Software – Where today meets tomorrow.