“The King is Dead. Long Live the King.” That is, the new one! The same could be said of the car around 1900: “The steam-engine car is dead. Long live the petrol-driven car.” What about computers?
Is the computer dead? Not quite, but it’s probably dying. Last week, Brian Krzanich, Intel’s chief executive, said that Intel is now reaching the end of making smaller chips. Memory devices are probably about as small as they will ever be. For 50 years, ever since the transistor was invented, transistors have been printed ever closer together on chips, so that the speed (and thus the power) of computers has been able to double every two years. This is what is known as Moore’s Law, named after an early observation by Gordon Moore, one of the co-founders of Intel, and it has held remarkably true ever since. The “Law” was never intended to apply forever, but its end has now arrived.
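It is worth pausing on how dramatically that doubling compounds. A back-of-envelope sketch, using the well-known figure of roughly 2,300 transistors for Intel’s 4004 chip of 1971 as a starting point (my own assumed reference figure, not one from this post):

```python
# Back-of-envelope illustration of Moore's Law: transistor counts
# doubling roughly every two years.

def transistors(start_count: int, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward assuming a fixed doubling period."""
    return start_count * 2 ** (years / doubling_period)

# Intel's 4004 (1971) had about 2,300 transistors -- a standard
# reference figure, assumed here rather than taken from the post.
projected = transistors(2_300, years=45)
print(f"{projected:,.0f}")
```

Forty-five years of doubling every two years multiplies the count by about six million, putting the projection in the region of ten billion transistors, broadly the scale of the largest chips of the mid-2010s.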
Yet even as Intel is still building massive chip-making facilities in India, China and Israel, the computer is reaching the end of its growth phase in chip design and development. The consequences, according to Matthew Lynn, writing in the Daily Telegraph today, are that there will be less innovation, that in a world accustomed to steadily falling prices we should expect higher (real) prices than before, and that computers will no longer disrupt traditional industries as they have been doing for at least 30 years, ever since they became embedded in them.
Matthew Lynn is evidently unaware of what is going on in biological research labs, particularly in making synthetic genes which can be strung along new types of DNA. These will be capable of storing data at far higher density than present-day chips. Furthermore, unlike chips, which inevitably degrade over time due to stray radiation from the sky or the ground, DNA is self-repairing. Very many of our genes, for example, are identical to those which came into existence billions of years ago and have been maintained ever since. The central core of DNA is also capable of superconductance, much faster than conduction in chips.
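The density claim can be made concrete with some rough arithmetic of my own (the figures below are standard textbook approximations, not taken from this post): each base pair encodes two bits, and an average base pair weighs about 650 daltons.

```python
# Rough upper bound on DNA data density. The constants are my own
# back-of-envelope assumptions: 2 bits per base pair, and an average
# base-pair mass of about 650 daltons.

AVOGADRO = 6.022e23      # particles per mole
BP_MASS_DALTONS = 650    # approximate mass of one DNA base pair
BITS_PER_BP = 2          # four possible bases encode two bits each

base_pairs_per_gram = AVOGADRO / BP_MASS_DALTONS
bytes_per_gram = base_pairs_per_gram * BITS_PER_BP / 8

print(f"{bytes_per_gram:.2e} bytes per gram")
```

That works out to roughly 10^20 bytes, i.e. hundreds of exabytes, per gram of DNA: many orders of magnitude beyond any present-day chip, even before error-correction overheads are taken into account.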
Biologists such as Shawn Douglas (mentioned in my posting of 15 July, “Extending the natural environment”) are developing new sorts of software, such as caDNAno, which uses ‘nanobot’ algorithms quite different from those in normal software. This new language is already being taught to biologists wishing to make new types of DNA. When will DNA-based computers start coming on the scene? My guess, as a lay observer of biological research, is between 20 and 50 years; bearing in mind that biological research is now attracting more brilliant young scientists than ever before, very possibly within 20 years. So Intel will still have a good run from its existing investments until DNA computers take over. More than likely, Intel is already keeping fully abreast of DNA research, or may even have DNA research labs of its own.