Inventor and Technologist; Author, The Age of Spiritual Machines

We will find ways to circumvent the speed of light as a limit on the communication of information.

We are expanding our computers and communication systems both inwardly and outwardly. Our chips use ever smaller feature sizes, while at the same time we deploy greater amounts of matter and energy for computation and communication (for example, we're making a larger number of chips each year). In one to two decades, we will progress from two-dimensional chips to three-dimensional self-organizing circuits built out of molecules. Ultimately, we will approach the limits of matter and energy to support computation and communication. 

As we approach an asymptote in our ability to expand inwardly (that is, using finer features), computation will continue to expand outwardly, using readily available materials on Earth such as carbon. But we will eventually reach the limits of the resources available on our planet, and will expand outwardly to the rest of the solar system and beyond. 

So how quickly will we be able to do this? We could send tiny self-replicating robots at close to the speed of light along with electromagnetic transmissions containing the needed software. These nanobots could then colonize far-away planets. 

At this point, we run up against a seemingly intractable limit: the speed of light. Although a billion feet per second may seem fast, the Universe is spread out over such vast distances that this appears to represent a fundamental limit on how quickly an advanced civilization (such as we hope to become) can spread its influence. 
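To make those vast distances concrete, here is a quick back-of-the-envelope calculation of one-way light travel times to a few destinations; the distances used are standard approximate astronomical values:

```python
# Rough light-travel times illustrating why the speed of light is a
# practical limit on spreading influence across the Universe.
C_M_PER_S = 299_792_458          # speed of light in m/s (~0.98 billion ft/s)
LIGHT_YEAR_M = 9.4607e15         # metres in one light-year
SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Approximate distances in light-years to some destinations.
destinations_ly = {
    "Proxima Centauri (nearest star)": 4.24,
    "Centre of the Milky Way": 26_000,
    "Andromeda galaxy": 2_500_000,
}

for name, ly in destinations_ly.items():
    seconds = ly * LIGHT_YEAR_M / C_M_PER_S
    years = seconds / SECONDS_PER_YEAR
    print(f"{name}: ~{years:,.0f} years one way")
```

Even a signal (or a stream of nanobots) moving at exactly the speed of light would need millions of years to reach a neighboring galaxy, which is why this limit matters so much for any expanding civilization.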

There are suggestions, however, that this limit is not as immutable as it may appear. Physicists Steve Lamoreaux and Justin Torgerson of the Los Alamos National Laboratory have analyzed data from an ancient natural nuclear reactor that, two billion years ago, sustained a fission reaction lasting several hundred thousand years in what is now West Africa. By analyzing radioactive isotopes left over from the reactor and comparing them to isotopes from similar nuclear reactions today, they determined that the physical constant "alpha" (also called the fine-structure constant), which determines the strength of the electromagnetic force, has apparently changed over the past two billion years. The speed of light is inversely proportional to alpha, and both have been considered unchangeable constants. Alpha appears to have decreased by 4.5 parts in 10^8. If confirmed, this would imply that the speed of light has increased. Other studies point in a similar direction, and a tabletop experiment is now under way at Cambridge University to test whether a small change in the speed of light can be engineered. 
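The inverse relationship between alpha and the speed of light follows from the standard definition of the fine-structure constant (here in SI units), assuming the other constants in it are held fixed:

```latex
\alpha \;=\; \frac{e^2}{4\pi\varepsilon_0\,\hbar c} \;\approx\; \frac{1}{137}
```

With the electron charge $e$, the vacuum permittivity $\varepsilon_0$, and the reduced Planck constant $\hbar$ unchanged, $c \propto 1/\alpha$, so a measured decrease in alpha would indeed correspond to an increase in the speed of light.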

Of course, these results will need to be carefully verified. If they hold up, they may be of great importance to the future of our civilization. If the speed of light has increased, it has presumably done so not merely because of the passage of time but because certain conditions have changed. This is the type of scientific insight that technologists can exploit. It is the nature of engineering to take a natural, often subtle, scientific effect and control it with a view toward greatly leveraging and magnifying it. If the speed of light has changed due to changing circumstances, that cracks open the door just enough for the capabilities of our future intelligence and technology to swing it wide open. That is the nature of engineering. As one of many examples, consider how we have focused and amplified the subtle effect described by Bernoulli's principle (that faster-moving air exerts slightly lower pressure than slower-moving air) to create the entire world of aviation. 

If it turns out that we are unable to actually change the speed of light, we may nonetheless circumvent it by using wormholes (which can be thought of as folds of the universe in dimensions beyond the three visible ones) as shortcuts to faraway places. 

In 1935, Einstein and physicist Nathan Rosen described "Einstein-Rosen" bridges as a way of describing electrons and other particles in terms of tiny space-time tunnels. In 1955, physicist John Wheeler described these tunnels as "wormholes," introducing the term for the first time. His analysis of wormholes showed them to be fully consistent with the theory of general relativity, which describes space as essentially curved in another dimension. 

In 1988, California Institute of Technology physicists Michael Morris, Kip Thorne, and Ulvi Yurtsever described in some detail how such wormholes could be engineered. Owing to quantum fluctuations, so-called "empty" space is continually generating tiny wormholes the size of subatomic particles. By adding energy and satisfying other requirements of both quantum physics and general relativity (two fields that have been notoriously difficult to integrate), these wormholes could in theory be expanded to allow objects larger than subatomic particles to travel through them. Sending humans through would be extremely difficult, though perhaps not impossible. However, as I pointed out above, we really only need to send nanobots plus information, which could pass through wormholes measured in microns rather than meters. Anders Sandberg estimates that a one-nanometer wormhole could transmit a formidable 10^69 bits per second. 

Thorne and his Ph.D. students Morris and Yurtsever also describe a method, consistent with general relativity and quantum mechanics, that could establish wormholes between Earth and faraway locations quickly even if the destination were many light-years away. 

Physicist David Hochberg and Vanderbilt University's Thomas Kephart point out that shortly after the Big Bang, gravity was strong enough to have provided the energy required to spontaneously create massive numbers of self-stabilizing wormholes. A significant portion of these wormholes is likely still around and may be pervasive, providing a vast network of corridors reaching far and wide throughout the Universe. It might be easier to discover and use these natural wormholes than to create new ones. 

Would anyone be shocked if some subtle ways of getting around the speed of light were discovered? The point is that if there are even subtle ways around this limit, the technological prowess of our future human-machine civilization will be sufficient to discover those means and leverage them to great effect.