
By Roger Highfield

Back to the Future of Computing

The next generation of high-performance computers might see a return of the oldest form of all, analogue computing, according to a paper published today, co-authored by the Science Museum's Science Director, Roger Highfield.

In the Science Museum, there are wonderful examples of analogue computers, which carry out calculations by harnessing properties of the real world – whether voltages, distances, volumes of water or the rotation of gears. Though these machines sound like something out of a steampunk novel, there is growing interest in reviving analogue computers as conventional computing is pushed to its limits.

Today, with Peter Coveney of University College London, I highlight the growing interest in analogue computing in a paper in a special issue of the Royal Society journal Philosophical Transactions A, which focuses on the critical role of reproducibility in computational science.

In an accompanying Royal Society blog, I describe how analogue computing has a role to play in ensuring we can trust computers, given the increasing use of ‘digital twins’ to carry out virtual tests in engineering, computer models to simulate pandemics and shape public health policy, climate simulation, computer aided drug design, automated diagnosis, the virtual human project and more.

One famous example of this first generation of computer, and my favourite, was created by the economist Bill Phillips in 1949. It might sound crude but, by opening taps and valves and observing the resulting flows of water in the Phillips Economic Computer, trainee economists at the London School of Economics learned how to model the British economy.

The Phillips Economic Computer, known as MONIAC (Monetary National Income Analogue Computer), devised by Bill Phillips at the London School of Economics, 1949.

On the second floor of the Science Museum are two more analogue machines, both ‘differential analysers’ developed at the University of Manchester by the English mathematician and physicist Douglas Hartree (1897-1958). They have been studied in the doctoral research of Tom Ritchie of the University of Kent, written up for the Science Museum Group Journal.

The principles of these mechanical calculating machines were glimpsed long before, in the nineteenth century, by William Thomson, Lord Kelvin, and his brother James. Together, they designed the first general-purpose analogue computer capable of solving differential equations, the mathematics used across science to model change. Their ‘differential analyser’ represented a way to ‘mechanise calculus’, but Victorian technology was not sufficiently advanced to turn their ideas into reality.

Decades later, in the late 1920s, Vannevar Bush and Harold Hazen at the Massachusetts Institute of Technology (MIT) in Cambridge, near Boston, succeeded. Their differential analyser coupled ‘integrators’, each a large glass disc rotated by an electric motor that drives a smaller spinning metal wheel, whose speed of rotation depends on how far from the centre of the disc a carriage places it. The accumulated rotation of the metal wheel on the glass disc represented the integral in an equation: with these spinning components, the mathematical process of integration could be carried out mechanically.
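To make the principle concrete, here is a toy numerical sketch, my own illustration rather than anything from the paper, of what a single wheel-and-disc integrator computes: the disc's rotation plays the role of the independent variable, the carriage position supplies the function being integrated, and the wheel's accumulated angle approximates the integral.

```python
import numpy as np

# Toy model of a wheel-and-disc integrator (illustrative values).
# The disc's rotation advances the independent variable x; the wheel,
# held by the carriage at distance y(x) from the disc's centre, turns
# at a rate proportional to y, so its accumulated angle tracks ∫ y dx.

a = 1.0                                 # wheel radius (arbitrary units)
x = np.linspace(0.0, np.pi, 10_000)     # disc rotation: the variable x
y = np.sin(x)                           # carriage position: y(x) = sin(x)

dtheta = (y[:-1] / a) * np.diff(x)      # wheel turn per small disc turn
theta = dtheta.sum()                    # accumulated wheel angle

print(theta)  # ≈ 2.0, since ∫ sin(x) dx from 0 to π is exactly 2
```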

During a visit to MIT in 1933, Hartree found himself gawping at the latest version of Bush’s differential analyser, a room full of gears and shafts driven by electric motors. The visitor joked that the machine looked as if ‘someone had been enjoying himself with an extra-large Meccano set’, the popular children’s construction toy of the day.

Sure enough, on his return to the United Kingdom, Hartree got hold of Meccano with his research student, Arthur Porter, so they could build their own version at a fraction of the cost. It took 18 months, but the result improved the accuracy and speed of solving equations in fluid mechanics compared with the traditional ‘computer’ of that day: a person, usually a woman.

Though Hartree told Meccano Magazine the differential analyser ‘exceeded all expectations’, Ritchie has reproduced the machine and found ‘it breaks down constantly.’ Even so, its spinning wheels bring mathematics to life. Bonita Lawrence, who worked with Arthur Porter before Porter’s death in 2010, has been building differential analysers at Marshall University since 2007 to help her students visualise calculus.

The success of this Meccano model led Hartree to construct (not from Meccano) a larger differential analyser, known as the Manchester machine, which could be used to solve many problems, from the electronic structure of atoms to power distribution across electrical networks. Such machines could also help with railway timetables, generate anti-aircraft and ballistics trajectories, and calculate pension fund derivatives.

Many analogue machines were built in the United Kingdom before the Second World War. Three of these computers – the Manchester machine and two differential analysers built at the University of Cambridge – were used during the war for a variety of applications, according to Ritchie. These included the development of GEE (a radio navigation system for Allied aircraft) and the magnetron, two projects led by Phyllis Nicholson and the rest of Hartree’s team at the University of Manchester. They also served the Tube Alloys Project, the British equivalent of the Manhattan Project, in which Rudolf Peierls and Otto Frisch used these analogue computers to determine the decay rate of uranium-235.

In the mid-twentieth century, digital computers began to be adopted and, by the late 1970s, had taken over from analogue. However, digital computing, for all its advantages, rests on manipulating binary numbers and, because these are rounded, there are concerns about the fidelity with which digital machines can simulate reality. ‘With colleagues at Tufts University in America, I did a study that suggests that the behaviour of the real world is richer than any digital computer can capture,’ comments Peter Coveney. ‘Here analogue computers offer advantages over digital, along with much lower power consumption.’
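The rounding Coveney refers to is easy to demonstrate. Here is a minimal Python illustration, my own example rather than anything from the Tufts study, showing that even a number as simple as 0.1 has no exact binary representation:

```python
# 0.1 cannot be stored exactly in binary floating point, so repeated
# arithmetic accumulates tiny errors - a concern when long-running
# simulations of chaotic systems amplify them.

total = sum(0.1 for _ in range(10))
print(total)            # 0.9999999999999999, not 1.0
print(total == 1.0)     # False

print(f"{0.1:.20f}")    # 0.10000000000000000555..., the value actually stored
```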

One such effort to create a new generation of analogue computers is led by Nader Engheta, with colleagues in Texas and Italy, in the Department of Electrical and Systems Engineering at the University of Pennsylvania – coincidentally the birthplace of ENIAC, the pioneering general-purpose digital computer.

I first came across Nader’s research in 2005, when discussing what he then called ‘transparency’ in metamaterials, today known as cloaking. Metamaterials (‘meta’ from the Greek, meaning ‘beyond’) are designer materials engineered to have properties not found in nature, allowing them to manipulate light in unusual ways. Around that time he had what he describes as a ‘very crazy idea’: to develop metamaterials into basic circuit elements at the nanoscale – which he called nanoinductors, nanocapacitors and nanoresistors – but for optics, not electronics.

By 2012, Engheta had demonstrated these optical circuit elements for information processing. Two years later, he showed how, instead of using bulky lenses and optics, a metamaterial just one light wavelength thick could perform a suite of mathematical functions.

Rather than turning information into ones and zeros, as in a conventional computer, Engheta encoded the input in the complex shape of a light wave. The tailored properties of the metamaterial – notably the profile of its refractive index, along with its magnetic and electric properties – shape how this wave propagates, and the engineers could craft the material so that the wave does something mathematically useful, such as solving particular equations, as it passes through.

His team demonstrated a proof-of-concept experiment using microwaves, whose long wavelength (microwave wavelengths range from about one millimetre to 30 centimetres) allowed for an easier-to-construct metamaterial device at much bigger length scales. They were able to solve equations using what they call the ‘Swiss cheese’, a two-square-foot block of polystyrene plastic with a precise and specific distribution of air holes.

The pattern of holes in the Swiss cheese was predetermined to solve an integral equation with a given ‘kernel’, the part of the equation that describes the relationship between two variables. After a microwave propagates through the ‘cheese’ and its air holes, the solution is read out from the shape, intensity and phase of the exiting wave, for whatever input wave serves as the input function of the integral equation.
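To spell out the mathematics for interested readers: the equations the team reported solving are of the Fredholm kind, in which the unknown output appears both on its own and inside the integral with the kernel. Below is a minimal digital sketch of the same calculation; the kernel and input are arbitrary choices of mine for illustration, not the ones engineered into the cheese.

```python
import numpy as np

# A Fredholm integral equation of the second kind:
#     g(u) = I_in(u) + integral of K(u, v) * g(v) dv
# Discretised on n grid points it becomes (Id - h*K) g = I_in,
# an ordinary linear system. Kernel and input are illustrative only.

n = 200
u = np.linspace(0.0, 1.0, n)
h = u[1] - u[0]                                # quadrature grid spacing

K = np.exp(-np.abs(u[:, None] - u[None, :]))   # illustrative kernel K(u, v)
I_in = np.sin(2 * np.pi * u)                   # illustrative input wave I_in(u)

g = np.linalg.solve(np.eye(n) - h * K, I_in)   # output wave profile g(u)

# Sanity check: g really satisfies the discretised equation.
assert np.allclose(g, I_in + h * (K @ g))
```

The attraction of the analogue device is that the physics carries out the equivalent of this linear algebra directly, as the wave settles inside the material, rather than step by step in binary.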

This month, Engheta reported another ‘Swiss cheese’ that can handle two different integral equations simultaneously (using two microwave frequencies). He told me that ‘this work shows that metamaterials have the promise of making analogue computing as parallel processor/parallel computing.’

Light waves would behave in a similar way but, because of their much shorter wavelength, the devices can be scaled down to a few microns, a fraction of the thickness of a human hair. An interconnected cluster of these devices could form an analogue computer chip that requires no digital computation. Engheta and his colleagues are currently working on a photonic chip a few microns across.

By using metamaterials that can alter their properties, it might also be possible to program these devices, said Engheta, likening this to the way that laser light was used to write information on old-fashioned CDs. He is investigating another approach, using the Mach-Zehnder interferometer, named after two nineteenth-century physicists, in which a beam is split into two halves and then recombined so that the beams interfere. By stringing these interferometers together and altering the phase of the light passing through them, it is possible to implement any kernel, offering another way to program these devices to do what Engheta calls ‘photonic calculus’.
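As a sketch of why a string of interferometers is programmable, here is the standard textbook model of a single ideal Mach-Zehnder interferometer, my own illustration rather than Engheta’s design: each one acts as a small tunable matrix on the two beams entering it, and meshes of such matrices can build up an arbitrary linear operation, in other words an arbitrary discretised kernel.

```python
import numpy as np

# Ideal Mach-Zehnder interferometer: two 50:50 beam splitters with
# tunable phase shifts theta (between the arms) and phi (on one input).
# Tuning (theta, phi) programs the 2x2 matrix the device applies.

BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)   # ideal 50:50 beam splitter

def mzi(theta, phi):
    """Transfer matrix of one interferometer with the given phases."""
    internal = np.diag([np.exp(1j * theta), 1.0])  # phase in one arm
    external = np.diag([np.exp(1j * phi), 1.0])    # phase on one input
    return BS @ internal @ BS @ external

U = mzi(theta=0.7, phi=1.3)

# Energy is conserved: the transfer matrix is unitary.
assert np.allclose(U.conj().T @ U, np.eye(2))

# A field entering port 1 is redistributed across both output ports.
print(U @ np.array([1.0, 0.0]))
```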

Engheta believes it may be possible to use analogue photonic chips – high-speed, ultra-small and very low-power – as subsystems of the next generation of computers: so-called exascale machines capable of performing a million million million (10^18) operations per second. (Your laptop or desktop is likely capable of several teraflops, or several million million calculations per second.) For such behemoths, power consumption – predicted to be 60 megawatts in the case of the Aurora exascale computer in Illinois – is becoming a problem, along with how to keep them cool.

In recent months, a Chinese team demonstrated a remarkable analogue optical quantum computer, Jiuzhang, which took minutes to do what a conventional supercomputer would take billions of years to complete. It could be that, in years to come, computers will increasingly rely on dancing beams of light.
