His ground-breaking work on information theory, foreshadowed by his remarkable master's thesis, made today's digital revolution possible
Claude Shannon, who has died aged 84, perhaps more than anyone else laid the groundwork for today's digital revolution. His single-handed exposition of information theory, showing that all information could be represented mathematically as a succession of noughts and ones, made possible the digital manipulation of data without which today's information society would be unthinkable.
More precisely, his theory applied not only to data but to communication of all kinds and, combined with the pulse code modulation system of digital transmission devised in 1937 by the Englishman Alec Reeves, effectively laid the foundations for today's digital communications and broadcasting networks, the internet and much more.
Claude Shannon's childhood was spent in Gaylord, a small town in Michigan where he had a normal state school education. Inspired by his hero, Thomas Edison (whom he later learned was a distant cousin), the young Shannon showed an early gift for science and mathematics as well as a talent for constructing models and gadgets, including a private telegraph system to a friend's house half a mile away. These interests served him well at the University of Michigan, where he graduated in 1936 with twin BSc degrees in electrical engineering and mathematics.
Shannon's master's thesis, for which the Massachusetts Institute of Technology awarded him his degree in 1940, reflected work he had done at MIT and Bell Telephone Laboratories on the growing complexity of telephone switching systems. Entitled A Symbolic Analysis of Relay and Switching Circuits, it demonstrated that problem solving could be achieved by manipulating just two symbols -- 1 and 0 -- in a process that could be carried out automatically with electrical circuitry: a switch turned on could represent the symbol 1, a switch turned off the symbol 0. That dissertation has since been hailed as one of the most significant master's theses of the 20th century. To all intents and purposes, its use of binary code and Boolean algebra paved the way for the digital circuitry that is crucial to the operation of modern computers and telecommunications equipment.
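Shannon's insight -- that circuits of switches obey Boolean algebra -- can be sketched in modern code. This is an illustration, not his original notation: the function names are invented here, and 1 stands for a closed (on) switch, 0 for an open (off) one.

```python
def series(a: int, b: int) -> int:
    """Two switches in series conduct only if both are closed: logical AND."""
    return a & b

def parallel(a: int, b: int) -> int:
    """Two switches in parallel conduct if either is closed: logical OR."""
    return a | b

def inverter(a: int) -> int:
    """A relay wired to open when energised inverts its input: logical NOT."""
    return 1 - a

# More complex conditions are built by combining the three, e.g. exclusive-or:
def xor(a: int, b: int) -> int:
    return parallel(series(a, inverter(b)), series(inverter(a), b))

print([xor(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]
```

Series and parallel wiring give AND and OR directly, which is why relay networks could evaluate logical expressions automatically.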
Eight years later, Shannon published another landmark paper, A Mathematical Theory of Communication, generally taken as his most important scientific contribution. This paper put information theory on the map, establishing terminology and a framework that are still used today. He coined the term bit for a binary digit, and explained that the maximum effectiveness of a communications channel would be achieved only when the source rate of the information carried matched the capacity of that channel, both being measured in bits per second. This "mathematical theory" relied heavily on statistical analysis, and in the process Shannon also developed a measure of the average information content of a message source, which he called its entropy. At the same time, he separated the technical problem of delivering a message from the problem of understanding what that message means, allowing engineers to focus on the delivery system alone. Given that most communications channels are also subject to interference, Shannon established how best a message could be delivered in sub-ideal circumstances. Overlying all this was the knowledge that the nature of the information transmitted was irrelevant. Whatever its nature -- speech, text or images -- it could be represented numerically in binary form.
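Shannon's entropy has a compact formula: for a source emitting symbols with probabilities p, the average information is -sum(p * log2(p)) bits per symbol. A minimal sketch (the function name is illustrative, not Shannon's):

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the nonzero p."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin toss carries exactly one bit; a biased coin carries less,
# because its outcome is more predictable.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
print(entropy([1.0]))       # a certain outcome carries no information
```

The entropy of a source sets the floor on how few bits per symbol any lossless encoding of it can use -- the sense in which it measures information content.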
To engineers accustomed to continuous waveforms, this revolutionary technique of splitting signals into discrete units of ones and zeros was both alien and abstract, but it found favour. Indeed, the very abstraction of Shannon's model of communications systems was no disadvantage; on the contrary, it gave the theory application to a broad range of communication functions, both analogue and digital. It was adopted enthusiastically by communications engineers and prompted attempts to apply similar techniques in other branches of the sciences.
Shannon applied the same radical approach to cryptography, a field in which he later became a consultant to the US government; his paper Communication Theory of Secrecy Systems put the subject on a mathematical footing. He also showed, in his communication theory, that messages could survive interference -- or scrambling -- on a noisy channel if sufficient "redundancy" (extra bits) were added. Over successive decades, this insight has developed into sophisticated error-correction codes that ensure the integrity of data prone to unwanted corruption in transmission.
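The simplest illustration of redundancy protecting a message is a repetition code -- far cruder than the codes Shannon's theory promises, but it shows the principle. A sketch (names and parameters are illustrative):

```python
from collections import Counter

def encode(bits, n=3):
    """Repeat each bit n times: pure redundancy, no cleverness."""
    return [b for bit in bits for b in [bit] * n]

def decode(coded, n=3):
    """Majority vote over each group of n repeats; corrects up to
    (n - 1) // 2 flipped bits per group."""
    return [Counter(coded[i:i + n]).most_common(1)[0][0]
            for i in range(0, len(coded), n)]

message = [1, 0, 1, 1]
sent = encode(message)
sent[4] ^= 1            # one bit corrupted in transit
assert decode(sent) == message   # the vote recovers the original
```

Triple repetition spends two extra bits per bit of message; Shannon's noisy-channel result showed that much more efficient codes must exist, and modern error-correction schemes approach that limit.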
Many of Shannon's pioneering insights were developed long before they could be put into practice. Along with Alec Reeves's concept of telephone conversations transmitted as pulses and Alan Turing's dream of all-purpose computing devices, Shannon's ideas had to wait for solid-state electronics to mature. In fact, only with the spread of integrated circuits in the 1970s did the commercial exploitation of technology based on his theories become possible.
His work was not confined to academic research, however, and his eclectic extracurricular interests and activities were notable. They included juggling while riding a unicycle down the halls of Bell Labs, devising a calculator to perform arithmetic operations in Roman numerals and designing a mechanical device for solving Rubik's cube puzzle. He was truly a remarkable man, yet unknown to most of the world. He is survived by his wife, a son and daughter.
Andrew Emerson, Thursday March 8, 2001 © Guardian Newspapers Limited