"Information Is The Resolution Of Uncertainty"
Nearly everything we enjoy in the digital age hinges on this one idea, yet few people know about its originator or the foundations of this simple, elegant theory of information.
Einstein is firmly rooted in popular culture as the developer of the theory of relativity. Watson and Crick are associated with the visual spectacle of DNA's double-helix structure.
How many know that the information age was not the creation of Gates or Jobs but of Claude Shannon in 1948?
The brilliant mathematician, geneticist and cryptanalyst formulated what would become information theory in the aftermath of World War II, a conflict that had made it clear that wars were no longer fought with steel and bullets alone.
If World War I was the first mechanized war, the second could be considered the first struggle built around communication technologies. Combat in the Pacific and Atlantic theaters was as much a battle of information as it was of guns, ships and planes.
Consider the advances of the era that transformed the way wars were fought.
Unlike previous conflicts, World War II saw heavy use of radio communication among military forces. This rapid remote coordination quickly pushed the war to all corners of the globe, and the field of cryptography advanced apace in order to keep those messages secret from adversaries. Also, for the first time in combat, radar was used to strategically detect and track aircraft, surpassing conventional visual observation that ended at the horizon.
One researcher, Claude Shannon, was working on the problem of anti-aircraft targeting, designing fire-control systems to work directly with radar. How could you determine the current and future position of an enemy aircraft, so that you could properly time artillery fire to shoot it down? Radar reports of a plane's position were a breakthrough, but they were "noisy": they provided an approximation of its location, yet not one precise enough to be immediately useful.
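Shannon's actual fire-control work drew on far more sophisticated smoothing and prediction theory, but a toy sketch conveys the flavor of the problem. Here, under a constant-velocity assumption and with made-up numbers, noisy one-dimensional radar samples are smoothed with a least-squares fit and extrapolated to an aim point:

```python
# Illustrative only: not Shannon's method, just the shape of the problem.
# Fit a straight line (constant-velocity assumption) to noisy radar samples,
# then extrapolate to where the aircraft will be when the shell arrives.
import numpy as np

def predict_position(times, positions, intercept_time):
    """Least-squares linear fit of noisy 1-D positions, extrapolated forward."""
    velocity, start = np.polyfit(times, positions, deg=1)
    return start + velocity * intercept_time

rng = np.random.default_rng(0)
t = np.arange(0.0, 5.0, 0.5)                          # radar pings every 0.5 s
true_path = 1000.0 + 100.0 * t                        # plane moving at 100 m/s
observed = true_path + rng.normal(0.0, 25.0, t.size)  # noisy radar readings

# True position at t = 8 s is 1800 m; the smoothed estimate lands close to it.
print(predict_position(t, observed, intercept_time=8.0))
```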
After the war, this problem inspired Shannon and many others to think about the nature of filtering and propagating information, whether radar signals, voice for a phone call, or video for television.
He knew that noise was the enemy of communication, so any way to store and transmit information that rejected noise was of particular interest to his employer, Bell Laboratories, the research arm of the mid-century American telephone monopoly.
Shannon considered communication "the most mathematical of the engineering sciences," and turned his intellectual sights towards this problem. Having worked on the intricacies of Vannevar Bush's differential analyzer, an analog computer, in his early days at MIT, and having written a mathematics-heavy Ph.D. thesis, "An Algebra for Theoretical Genetics," Shannon was particularly well suited to understanding the fundamentals of handling information, drawing on knowledge from a variety of disciplines.
By 1948 he had formed his central, simple and powerful thesis:
Information is the resolution of uncertainty.
As long as something can be relayed that resolves uncertainty, that is the fundamental nature of information. While this sounds deceptively obvious, it was an important point, given how many different languages people speak and how one utterance could be meaningful to one person and unintelligible to another. Until Shannon's theory was formulated, no one knew how to account for these "psychological factors" properly. Shannon built on the work of fellow researchers Ralph Hartley and Harry Nyquist to show that coding and symbols were the key: what matters is that both sides of a communication share a common understanding of the uncertainty being resolved.
Shannon then considered: what was the simplest resolution of uncertainty?
To him, it was the flip of a coin: heads or tails, yes or no, an event with only two outcomes. Shannon concluded that any type of information could therefore be encoded as a series of fundamental yes-or-no answers. Today we know these answers as bits of digital information, the ones and zeroes that represent everything from email text and digital photos to compact disc music and high-definition video.
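Shannon also made this quantitative: an event with probability p carries log2(1/p) bits of information, and a source's average information per symbol is its entropy. A minimal Python sketch (the example probabilities are illustrative, not from the article):

```python
# Entropy: the average number of yes/no answers (bits) needed per symbol.
from math import log2

def entropy(probabilities):
    """H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))   # fair coin flip: exactly 1 bit
print(entropy([0.9, 0.1]))   # biased coin: ~0.47 bits, less uncertainty to resolve
print(entropy([0.25] * 4))   # four equal outcomes: 2 bits, i.e. two yes/no answers
```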
That any and all information could be represented in discrete bits, not just approximately but perfectly, and with the right coding transmitted over a noisy channel with as few errors as desired, was a breakthrough that astonished even his brilliant peers at academic institutions and Bell Laboratories, who had previously thought a simple, universal theory of information unthinkable.
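Shannon's theorem guarantees codes far more efficient than anything this simple, but the crudest error-correcting scheme, a repetition code with majority voting, shows the principle that redundancy can defeat noise. A rough sketch:

```python
# The simplest error-correcting code: send each bit three times and let the
# receiver take a majority vote. Crude, but it shows redundancy rejecting noise.
import random

def encode(bits):
    return [b for b in bits for _ in range(3)]           # 1 -> 1, 1, 1

def noisy_channel(bits, flip_prob=0.05):
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(coded):
    triples = (coded[i:i + 3] for i in range(0, len(coded), 3))
    return [1 if sum(t) >= 2 else 0 for t in triples]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = decode(noisy_channel(encode(message)))
print(message == received)  # almost always True despite the noisy channel
```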
The compact disc, the first ubiquitous digital encoding system for the average consumer, brought the legacy of Shannon's work to the masses in 1982. It reproduces sound by dividing each second of a musical audio wave into 44,100 slices (sampling) and recording the height of each slice as a digital number (quantization). Higher sampling rates and finer quantization raise the quality of the sound. Converting this digital stream back to audible analog sound with modern circuitry allowed consistently high fidelity, in contrast to the generation loss people were accustomed to in analog systems such as the compact cassette.
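A toy sketch of those two steps in Python. The tone, duration and scaling are my own illustration; the 44,100 Hz rate and 16-bit sample size are the actual CD parameters:

```python
# Digitize a pure tone the way a CD does: 44,100 samples per second, each
# sample rounded (quantized) to a 16-bit signed integer.
import math

SAMPLE_RATE = 44_100        # slices per second
BITS = 16                   # size of each recorded number
PEAK = 2 ** (BITS - 1) - 1  # largest positive 16-bit value (32767)

def digitize(frequency_hz, seconds):
    """Sample a sine wave and quantize each sample to an integer."""
    n = int(SAMPLE_RATE * seconds)
    return [round(PEAK * math.sin(2 * math.pi * frequency_hz * i / SAMPLE_RATE))
            for i in range(n)]

samples = digitize(440.0, 0.01)   # 10 ms of concert-pitch A
print(len(samples), samples[:5])  # 441 integers, ready to store as bits
```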
Similar digital approaches have been applied to images and video, so that today we enjoy a universe of MP3 audio, DVDs, HDTV broadcasts and AVCHD files that can be stored, transmitted and copied without any loss of quality.
Shannon became a professor at MIT, and over the years his students went on to build many of the major breakthroughs of the information age, including digital modems, computer graphics, data compression, artificial intelligence and digital wireless communication.
But without a sensational origin myth, bombastic personality or iconic tongue-wagging photo, Shannon's contribution is largely unknown to today's digital users.
Shannon was a humble man and an intellectual wanderer who shunned public speaking and rarely granted interviews. He once remarked, "After I had found answers, it was always painful to publish, which is where you get the acclaim. Many things I have done and never written up at all. Too lazy, I guess."
Shannon was perhaps lazy to publish, but he was not a lazy thinker. Later in life, he occupied himself with puzzles he found personally interesting, such as designing a Rubik's Cube-solving device and modeling the mathematics of juggling.
The 20th century was a remarkable scientific age, in which the fundamental building blocks of matter, life and information were all revealed in interconnected areas of research centered on mathematics. The ability to manipulate the atomic and genetic structures found in nature has provided breakthroughs in energy and medical research. Less expected was the discovery of the fundamental nature of communication. Information theory, a novel, original and "unthinkable" discovery, has transformed nearly every aspect of our lives to digital: how we work, live, love and socialize.
Beautiful, elegant and deeply powerful.