2014 : WHAT SCIENTIFIC IDEA IS READY FOR RETIREMENT?

Editor, the Feuilleton (arts and essays) of the German daily newspaper Sueddeutsche Zeitung, Munich
Moore's Law

Gordon Moore's 1965 prediction that the number of transistors on integrated circuits would double every two years has become the most popular scientific analogy of the digital age. Despite being a mere conjecture, it has become the go-to model for framing complex progress in a simple formula. There are good technological reasons to retire Moore's Law. One is the general consensus that it will effectively cease to hold once transistors shrink below 5 nanometers, which would mean a peak and a sharp drop-off in ten to twenty years. Another is the potential of quantum computers, expected to become reality in three to five years, to push computing into new realms. But Moore's Law should be retired before it reaches its technological limits, because it has pushed the perception of progress in the wrong direction. Allowing its end to become an event would only amplify the errors of reasoning.

First and foremost, Moore's Law has allowed us to perceive the development of the digital era as a linear narrative. Its simple curve of progression is the digital equivalent of the ancient wheat and chessboard problem (with a potentially infinite chessboard). Like the grains the Persian inventor of chess demanded from the king, doubled on each successive square of the board, digital technology seems to develop exponentially. This model ignores the parallel nature of digital progress, which encompasses not only technological and economic development but also scientific, social, and political changes that can rarely be quantified.
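The arithmetic behind the chessboard analogy can be sketched in a few lines (an illustrative calculation only; the Intel 4004 transistor count used as a starting point is a commonly cited figure, not from this essay):

```python
# Wheat and chessboard: one grain on the first square, doubled on each
# of the 64 squares, gives 2**64 - 1 grains in total.
grains_per_square = [2 ** n for n in range(64)]  # square k holds 2**(k-1) grains
total_grains = sum(grains_per_square)
assert total_grains == 2 ** 64 - 1  # 18,446,744,073,709,551,615 grains

# The same doubling logic, applied to Moore's Law: transistor count
# doubling every two years (a simplified model).
def transistors(start_count, years, doubling_period=2):
    """Project a transistor count forward under strict periodic doubling."""
    return start_count * 2 ** (years // doubling_period)

# Starting from roughly 2,300 transistors (Intel 4004, 1971), forty years
# of two-year doublings already yields counts in the billions.
projected = transistors(2300, 40)
```

The point of the sketch is how quickly a strict doubling rule outruns intuition, which is exactly what makes it such a seductive, and misleading, narrative template.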

Still, the Moore's Law model of perception has already found its way into the narrative of biotechnological history, where change becomes ever more complex. Proof of progress is claimed through a simplistic line of reasoning: the sharp decline in the cost of sequencing a human genome, from three billion dollars in the year 2000 to the August 2013 cancellation of the Genomics X Prize for the first 1,000-dollar genome because the challenge had been outpaced by innovation.

For both digital and biotechnological history, the linear narrative has been insufficient. The prowess of the integrated circuit was the technological spark for a development as massive as the one the wheel set off when it allowed the rise of urban society. Both technologies have been perfected over time, but their technological refinement falls short of illustrating the impact each has had.

It was about 25 years ago that scientists at MIT's Media Lab told me about a paradigmatic change in computer technology. In the future, they said, the number of other computers connected to a computer would matter more than the number of transistors on its integrated circuits. For a writer interested in, but not part of, the forefront of computer technology, that was groundbreaking news in 1988. A few years later, a demo of the Mosaic browser was as formative as listening to the first Beatles record or seeing the first man on the Moon had been for my parents.

Changes since then have been so multilayered, interconnected, and rapid that comprehension has lagged ever further behind. Scientific, social, and political changes occur in random patterns, and their results have been mixed in equally random patterns. The slowdown of the music industry and the news media has not been matched in publishing and film. The failed Twitter revolution in Iran had quite a few things in common with the Arab Spring, but even within the Maghreb the results differed wildly. Social networks have affected societies in exactly opposite ways: while their rise has fostered cultural isolation in Western societies, in China it has created a counterforce of collective communication against the party apparatus's strategies to isolate its citizenry from within.

Most of these phenomena have so far only been observed, not explained. It is mostly in hindsight that a linear narrative is constructed, if not imposed. The inability to monetize many of the greatest digital innovations, such as viral videos or social networks, is just one of many proofs of how difficult it is to get a comprehensive grasp on digital history. Moore's Law and its numerous popular applications to other fields of progress thus create an illusion of predictability in the least predictable of all fields: the course of history.

These errors of reasoning will be amplified if Moore's Law is allowed to come to its natural end. Peak theories have become the lore of cultural pessimism. If Moore's Law is allowed to become a finite principle, digital progress will be perceived as a linear progression toward a peak and an end. Neither will become reality, because the digital is not a finite resource but an infinite realm of mathematical possibilities reaching out into the analog worlds of science, society, economics, and politics. Because this progress has ceased to depend on a quantifiable basis and on linear narratives, it will not be brought to a halt, or even slowed down, if one of its strains comes to an end.

In 1972 the wheat and chessboard problem became the mythological basis for the Club of Rome's Malthusian "The Limits to Growth". Moore's Law will create the disillusionment of a finite digital realm, and that disillusionment will become as popular as the law's illusion of predictability. After all, there have been no loonies carrying signs saying "The End Is Not Near".