Instruction set simulation

I enjoy hearing about other people’s lives and, hence, often read biographies. Just lately I have been reading about Bill Gates. Interestingly, he and I have many similarities in terms of interests and approach. However, he is, of course, a billionaire and I am not. In lots of ways the book is more about Microsoft and the PC industry and, as it covers a period of technology history that spans my adult life, I find it fascinating.

I have always felt that, to be successful in a specific activity, you can usually find someone else who is successful and look at what they do. It is not necessary to copy, but to learn. I wondered what I might learn from Microsoft, as they seem to be quite successful …

Microsoft’s first product, in the mid-1970s, was BASIC for some of the first microcomputers on the market. It would seem obvious that, to write code for a particular computer, you start out by getting such a machine. That may seem logical, but it can be problematic, as Microsoft found back then. The machines were in short supply – demand had far exceeded expectations and production was having trouble keeping up. They were also very expensive. When it launched, the Intel 8080 chip [like the 8008 before it] was priced at $360. That is about $4000 in 2020 money! It was rumored that Intel did not know what to charge and the $360 figure was a nod to the ubiquitous IBM System/360 mainframe.

The Microsoft developers had access to timeshare systems – DEC minicomputers. Their solution was to write a simulator for the 8080 so that they could execute code on the computers they had to hand. This had several benefits: it was cheap and easy to do; multiple programmers could use the simulator at the same time; and debugging was eased by the transparency of the “hardware”. Although execution speed was quite slow, the pros outweighed the cons, and this approach led to code that ran on real hardware first time.
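To make the idea concrete, here is a minimal sketch of the core of any instruction set simulator: a loop that fetches an opcode, decodes it and executes it against a software model of the registers and memory. This is purely illustrative C; the opcode names, encodings and tiny register set are my own assumptions for the sketch – loosely in the spirit of an 8080-style accumulator machine – not a reconstruction of Microsoft’s simulator or of the real 8080 instruction set.

#include <stdint.h>
#include <stdio.h>

/* Illustrative opcodes for the sketch -- not the real 8080 encodings. */
enum {
    OP_HALT = 0x00,  /* stop the simulation              */
    OP_LDA  = 0x01,  /* load the next byte into A        */
    OP_ADD  = 0x02,  /* add the next byte to A           */
    OP_OUT  = 0x03   /* print A (stands in for real I/O) */
};

typedef struct {
    uint8_t  a;            /* accumulator                     */
    uint16_t pc;           /* program counter                 */
    uint8_t  mem[0x10000]; /* 64 KB address space, 8080-style */
    int      halted;
} cpu_t;

/* The heart of any ISS: fetch, decode, execute. Every register and
   memory cell is just a C variable, so the "hardware" is completely
   transparent to the host debugger. */
static void step(cpu_t *cpu)
{
    uint8_t opcode = cpu->mem[cpu->pc++];

    switch (opcode) {
    case OP_HALT: cpu->halted = 1;                       break;
    case OP_LDA:  cpu->a  = cpu->mem[cpu->pc++];         break;
    case OP_ADD:  cpu->a += cpu->mem[cpu->pc++];         break;
    case OP_OUT:  printf("A = %u\n", (unsigned)cpu->a);  break;
    default:
        fprintf(stderr, "illegal opcode 0x%02X at 0x%04X\n",
                (unsigned)opcode, (unsigned)(cpu->pc - 1));
        cpu->halted = 1;
        break;
    }
}

int main(void)
{
    static cpu_t cpu;  /* static: zero-initialised, 64 KB kept off the stack */

    /* A hand-assembled target program: A = 2 + 3, print it, halt. */
    const uint8_t program[] = { OP_LDA, 2, OP_ADD, 3, OP_OUT, OP_HALT };

    for (unsigned i = 0; i < sizeof program; i++)
        cpu.mem[i] = program[i];

    while (!cpu.halted)
        step(&cpu);

    return 0;
}

Compiling and running this prints A = 5. The point is not the trivial instruction set but the structure: because the processor state is ordinary data, any number of engineers can run the simulator on their own workstations and inspect every register on every cycle.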

What can we, as embedded software developers, learn from this more than 40 years later? Well, quite a lot actually. First, software teams need access to new chips as soon as the use of the device is confirmed – a particularly acute challenge because, with very complex software, an early start is essential to meet deadlines – and instruction set simulation [ISS] can be the answer. Second, even when real hardware is available, an ISS can be valuable, giving more engineers an execution environment and offering greater transparency for debugging. The biggest difference that modern engineers will notice, compared with their counterparts four decades ago, is that ISS execution speeds on modern high-performance workstations are very respectable.

I strongly feel that everyone should study history. Even though the world of our ancestors was very different from the 21st century, and modern embedded software toolkits are very sophisticated, we still have much to learn about how to face challenges and solve problems.

Colin Walls

I have over thirty years’ experience in the electronics industry, largely dedicated to embedded software. A frequent presenter at conferences and seminars and author of numerous technical articles and two books on embedded software, I am a member of the marketing team of the Mentor Graphics Embedded Systems Division, and am based in the UK. Away from work, I have a wide range of interests including photography and trying to point my two daughters in the right direction in life. Learn more about Colin, including his go-to karaoke song and the best parts of being British: http://go.mentor.com/3_acv

This article first appeared on the Siemens Digital Industries Software blog at https://blogs.sw.siemens.com/embedded-software/2020/01/13/instruction-set-simulation/