August 28, 2015
For the past 70 years, digital computers have followed the principles of a model developed by John von Neumann in 1945. The model defines how a stored program, comprising a set of instructions, runs. Most of the world's computers, from the first stored-program machine – the 1948 Manchester Baby – to the latest generation of mobile devices, x86 desktops and servers, use this model. Programs, including the operating system, run as a series of machine instructions that are fetched from memory and executed one at a time. It is the model underlying programming languages such as C and Java and operating systems such as Unix and Windows.

The von Neumann model has served the industry well – but its limitations are starting to show. "Twenty years ago, someone did not develop a website to have millions of customers in parallel," says Patrick Di Loreto, research and development engineering lead at betting company William Hill.
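That one-step-at-a-time cycle can be made concrete with a toy interpreter. The sketch below, written in C, is illustrative only: the opcodes, the sample program and the single shared memory array are assumptions invented for this example, not taken from any real machine. What it shows is the essence of the von Neumann design – instructions and data share one memory, and a program counter fetches, decodes and executes them strictly in sequence.

    /* Minimal sketch of a von Neumann-style fetch-decode-execute
     * cycle. Opcodes and the toy program are hypothetical. */
    #include <stdio.h>

    enum { HALT, LOAD, ADD, STORE };   /* invented opcodes */

    int main(void) {
        /* One memory holds both the program and its data. */
        int mem[16] = {
            LOAD, 12,     /* acc = mem[12]   */
            ADD,  13,     /* acc += mem[13]  */
            STORE, 14,    /* mem[14] = acc   */
            HALT, 0,
            0, 0, 0, 0,
            5, 7, 0, 0    /* data: operands at addresses 12 and 13 */
        };
        int pc = 0, acc = 0, running = 1;

        while (running) {              /* one instruction per step */
            int op  = mem[pc++];       /* fetch the opcode */
            int arg = mem[pc++];       /* fetch the operand address */
            switch (op) {              /* decode and execute */
            case LOAD:  acc = mem[arg];   break;
            case ADD:   acc += mem[arg];  break;
            case STORE: mem[arg] = acc;   break;
            case HALT:  running = 0;      break;
            }
        }
        printf("mem[14] = %d\n", mem[14]);   /* prints 12 */
        return 0;
    }

Compiled and run, the sketch prints mem[14] = 12. The strictly sequential while loop is the point: each fetch waits for the previous instruction to finish, which is precisely the property that becomes a bottleneck when, as Di Loreto notes, millions of customers arrive in parallel.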