Thought Leadership

Power and compilers at the NMI

I recently attended an event focused on power and embedded software hosted by the NMI in the UK, where I had been invited to make a presentation. My session was titled “Power Management in a Real Time Operating System”. If you would like a copy of the slides, please email me.

Of course, apart from presenting myself, I was interested in the other sessions, in particular one about compilers and power consumption …

The session that caught my eye was titled “The Impact of Compiler Options on Energy Consumption in Embedded Platforms”. This was a report on some university research work, which sought a correlation between the use of optimization options on a GNU compiler and the energy consumed during execution of the resulting code. The approach was to build a selection of [rather arbitrary] benchmarks using a large number of combinations of compiler options, then execute the code and measure power consumption. They also used a number of different target systems with different architectures [including multicore].

My interpretation of their conclusions is that there were no clear conclusions. I felt that the research was not well designed or thought through. The use of compiler options was not systematic – they even used one, the function of which was unclear to the researchers. To my mind, this was equivalent to randomly mixing chemicals in the hope of creating a wonder drug. Their use of multiple targets [in the first instance] just served to complicate matters.

I do, however, feel that this is an interesting area of research, but it needs a slightly different approach, largely focused on understanding compiler optimizations, hypothesizing on their effect on power and testing that hypothesis. In general, there are three kinds of optimization:

  1. Speed. Faster code is often larger, but is very likely to consume less energy to perform a specific task, as it completes sooner.
  2. Size. Compact code is usually slower, so is likely to consume more energy. However, there is likely to be the opportunity to reduce the memory size, which represents a power saving. [This possibility was not addressed in the research.]
  3. Lucky. Sometimes code is smaller and faster. Obviously this combines the benefits of (1) and (2).

Interesting research would be to find out whether fast code does offer the best overall energy consumption result. This could result in compilers being provided with a “compile for low power” option.

Other sessions at the event were interesting in various ways. There was a very inspiring session concerning a project to encourage young people to start writing software. Another session was presented by a guy who looked and sounded exactly like the comedian David Mitchell [who is well known in the UK, at least]. A session from Intel looked at the technology that they provide to enable software to optimize system power consumption. This included the random fact that the energy in a banana is equivalent to that held in two fully-charged laptop batteries. I do not believe that Intel are announcing banana-powered computing …

Colin Walls

I have over thirty years’ experience in the electronics industry, largely dedicated to embedded software. A frequent presenter at conferences and seminars and author of numerous technical articles and two books on embedded software, I am a member of the marketing team of the Mentor Graphics Embedded Systems Division, and am based in the UK. Away from work, I have a wide range of interests including photography and trying to point my two daughters in the right direction in life. Learn more about Colin, including his go-to karaoke song and the best parts of being British: http://go.mentor.com/3_acv


This article first appeared on the Siemens Digital Industries Software blog at https://blogs.sw.siemens.com/embedded-software/2012/11/19/power-at-the-nmi/