
When compilers do magic

What is a compiler? Ask an average engineer and you will get an answer something like: “A software tool that translates high-level language code into assembly language or machine code.” Although this definition is not incorrect, it is rather incomplete and out of date – so 1970s. A better way to think of a compiler is: “A software tool that translates an algorithm described in a high-level language into a functionally identical algorithm expressed in assembly language or machine code.” More words, yes, but a more precise definition.

The implications of this definition go beyond placating a pedant like me. It leads to a greater understanding of code generation – just how good a job a modern compiler can do – and of the effect on debugging the compiled code …

Some time ago, I put forward an argument that a modern compiler could [under specific circumstances] produce better code than a skilled human assembly language programmer. To illustrate my point, I will show another example of the same phenomenon. Consider this code:

#define SIZE 4
char buffer[SIZE];
int i;

for (i = 0; i < SIZE; i++)
   buffer[i] = 0;

This is very straightforward. One would expect a simple loop that iterates four times using the counter variable i. I tried this, generating code for a 32-bit device, and stepped through the result using a debugger. To my surprise, the assignment seemed to execute only once, not four times. Yet the array was cleared correctly. So, what was going on?

A quick look at the underlying assembly language clarified matters. The compiler had generated a single 32-bit clear instruction, which was considerably more efficient than a loop. The loop variable did not exist at all. I experimented and found that, for different values of SIZE, various combinations of 8-, 16- and 32-bit clear instructions were generated. Only when the array size exceeded something like 12 did the compiler start generating a recognizable loop, but even that was not a byte-by-byte clear. The operation was performed 32 bits at a time.
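In C terms, the transformation amounts to something like the sketch below. This is only an illustration of the idea, not the compiler’s actual output – the compiler works at the machine-code level, where the alignment and aliasing concerns raised by these casts do not arise:

#include <stdint.h>

char buffer[4];        /* SIZE == 4 */
char big_buffer[32];   /* a larger array, for illustration */

void clear_small(void)
{
   /* The four-iteration byte loop collapses into a single
      32-bit store; the loop variable never exists */
   *(uint32_t *)buffer = 0u;
}

void clear_large(void)
{
   /* Beyond a dozen or so bytes, a loop reappears, but it
      clears one 32-bit word per iteration, not one byte */
   uint32_t *p = (uint32_t *)big_buffer;
   unsigned int n;

   for (n = 0; n < sizeof(big_buffer) / sizeof(uint32_t); n++)
      p[n] = 0u;
}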

Of course, such optimized code is tricky to debug. Indeed, even today, some debuggers just do not allow debugging of fully optimized code. They give you an interesting choice: ship optimal code or debugged code. Realistically, I would recommend that initial debugging be performed with optimization wound down, to avoid such confusion and enable this kind of logic to be carefully verified. Later, verify the overall functionality of the code with aggressive optimization activated.
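If a particular piece of logic must be observed with optimization still enabled, one general C technique – an aside of mine, not something from the example above – is to declare the variable of interest volatile, which obliges the compiler to keep it and perform every access, making the loop visible to the debugger again:

#define SIZE 4
char buffer[SIZE];

void clear_buffer(void)
{
   /* volatile forces the compiler to materialize i and execute
      each iteration, so the loop can be single-stepped; remove
      the qualifier for the release build to restore the
      optimization */
   volatile int i;

   for (i = 0; i < SIZE; i++)
      buffer[i] = 0;
}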

I was very impressed by how smart the compiler was. I know that it is not magic, but it sure looks like it.

Colin Walls

I have over thirty years’ experience in the electronics industry, largely dedicated to embedded software. A frequent presenter at conferences and seminars and author of numerous technical articles and two books on embedded software, I am a member of the marketing team of the Mentor Graphics Embedded Systems Division, and am based in the UK. Away from work, I have a wide range of interests including photography and trying to point my two daughters in the right direction in life. Learn more about Colin, including his go-to karaoke song and the best parts of being British: http://go.mentor.com/3_acv


This article first appeared on the Siemens Digital Industries Software blog at https://blogs.sw.siemens.com/embedded-software/2010/07/05/when-compilers-do-magic/