Claude Shannon

Playful genius who invented the bit, separated the medium from the message, and laid the foundations for all digital communications

Claude Shannon, information theorist, was born on April 30, 1916. He died on February 24, 2001, aged 84.

Eccentric and at times even erratic, Claude Shannon singlehandedly laid down the general rules of modern information theory, creating the mathematical foundations for a technical revolution. Without his clarity of thought and sustained ability to work his way through intractable problems, such advances as e-mail and the World Wide Web would not have been possible.

Something of a loner throughout his working life, he was individually responsible for two of the great breakthroughs in understanding which heralded the convergence of computing and communications. To colleagues in the corridors at the Massachusetts Institute of Technology who used to warn each other about the unsteady advance of Shannon on his unicycle, it may have seemed improbable that he could remain serious for long enough to do any important work. Yet the unicycle was characteristic of his quirky thought processes, and became a topsy-turvy symbol of unorthodox progress towards unexpected theoretical insights.

The ability to make astonishing leaps beyond the intellects of his colleagues (all the more remarkable at MIT, the forcing house of technological theory) had to be acknowledged as genius, and Claude Shannon was recognised as a giant throughout the industry.

Born in Petoskey, Michigan, Claude Elwood Shannon graduated from the University of Michigan in electrical engineering and mathematics in 1936. He was not considered particularly distinguished as a mathematician (later on it was said that he had to invent any maths that was needed) but he was fascinated by the work of the 19th-century English mathematician and logician George Boole. In his great work The Laws of Thought, Boole had formalised logical expressions in what is still known as Boolean algebra.

These yes/no expressions seemed to Shannon to offer a practical basis for electrical circuits, where a switch set to on or off could stand for a yes or a no. While working at MIT for his master's degree, Shannon was fortunate to be assigned to work under Vannevar Bush, the inventor of the differential analyser. This device was at the heart of all analogue computers at the time, and Shannon was steered towards the task of bringing Boole's precision to the increasingly complex world of telephone networks.

In 1938 he published his master's thesis, A Symbolic Analysis of Relay and Switching Circuits, in which he showed how Boole's logical symbols could be treated as a series of on or off switches, and how binary arithmetic -- the manipulation of strings of 0s and 1s -- could be carried out by electrical circuits. This was immediately recognised as the springboard from analogue computing to digital computers, and because of its particular importance to the telephone industry, Shannon was given a job at Bell Laboratories (though during the war he worked on fire-control systems for anti-aircraft guns).
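The core idea of the thesis can be sketched in present-day terms. The sketch below is a modern paraphrase in Python, not Shannon's own relay notation: binary addition assembled entirely from the Boolean operations AND, OR and XOR, each of which a relay or switch can realise.

    # A modern paraphrase of the idea in Shannon's thesis, not his notation:
    # binary addition built purely from Boolean (on/off) operations.

    def full_adder(a, b, carry_in):
        """Add three bits using only XOR, AND and OR."""
        sum_bit = a ^ b ^ carry_in
        carry_out = (a & b) | (carry_in & (a ^ b))
        return sum_bit, carry_out

    def add_bits(x_bits, y_bits):
        """Ripple-carry addition of two equal-length bit lists, most significant bit first."""
        carry = 0
        result = []
        for a, b in zip(reversed(x_bits), reversed(y_bits)):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        result.append(carry)
        return list(reversed(result))

    # 0110 (six) plus 0011 (three) gives 01001 (nine)
    print(add_bits([0, 1, 1, 0], [0, 0, 1, 1]))

Each of these operations corresponds to a small circuit of switches, which is precisely the correspondence Shannon made rigorous.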

Shannon's rather offhand approach to problems was not always popular with his earnest colleagues, who felt he lacked the rigour of the true researcher. Unabashed, though, he continued to work on his own, concentrating on the use of his two-symbol language in the development of switching circuits to replace human operators in telephone networks.

In 1948 this work resulted in a second landmark paper. Here he dismissed the universally accepted view that messages were synonymous with the waveforms by which they were sent over lines, and that different lines were needed for the transmission of voice and telegraph messages. Telephone engineers saw the waveform as both the medium and the message, but Shannon recognised that by treating all messages as strings of binary digits -- or bits as he called them -- it was possible to distinguish the message from the medium it travelled on. This allowed engineers to focus on the delivery mechanism without worrying about the message itself, which could be voice, data or telegraph.

Like its predecessor, this paper, A Mathematical Theory of Communication, was instantly recognised as having enormous practical importance. It provided a basis for information theory, as well as showing a method of measuring the efficiency of a communications channel, which, adapting Clausius's term from thermodynamics, Shannon called entropy.

He showed that entropy, a measure of the disorder of a system, can equally be seen as a measure of the information missing about the system. From this he deduced that the patterns within a message -- its redundancy -- are themselves information that can be exploited, allowing garbled signals to be "cleaned up" and messages to be squeezed into fewer bits. The paper was highly publicised, and for a decade Shannon was a national figure, with no discouragement from Bell Laboratories.
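In modern notation, the quantity Shannon defined for a source producing symbols with probabilities p_i is

    H = - \sum_i p_i \log_2 p_i    (bits per symbol)

so that, for example, a fair coin toss, with two equally likely outcomes, carries exactly one bit, while a heavily biased or patterned source carries less than one bit per symbol -- the slack that compression schemes exploit.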

As early as 1950 he wrote a paper on programming a computer to play chess, and 15 years later he had an opportunity to discuss the question with Mikhail Botvinnik, for many years the world chess champion. The discussion was interesting, he said, but unfortunately it was carried on through a "noisy channel", since the interpreters knew little about either chess or computers. Another 15 years on, in 1980, Shannon was on hand when Bell Labs' chess-playing computer, Belle, won the International Computer Chess Championship.

Like Charles Babbage, Shannon was known by his contemporaries as "the Irascible Genius". When he returned to MIT in 1958, he continued to threaten corridor-walkers on his unicycle, sometimes augmenting the hazard by juggling. No one was ever sure whether these activities were part of some new breakthrough or whether he just found them amusing. He worked, for example, on a motorised pogo-stick, which he claimed would mean he could abandon the unicycle so feared by his colleagues, and moving away a little from binary arithmetic, he delighted his friends with THROBAC-1, which computed in Roman numerals.

For some time he devoted his attention to a mechanical mouse trapped in a maze. Fitted with copper whiskers and a magnet on its wheels, this could find its way through the maze to a "piece of cheese" (an electrical terminal that rang a bell when the whiskers touched it). Shannon was fascinated by artificial intelligence, and the electromechanical mouse turned out to be one of the earliest attempts to teach a machine to learn.

Shannon's colleagues knew that he worked alone, behind his closed door, but that if anyone had a problem he was always happy to break off his own work to offer advice. They also knew that he could always understand their problems at lightning speed, with no time wasted in extended explanations. On the other side of the coin, he was not patient with those who did not readily follow his line of thought. A practical theorist, he was always interested in solving problems, not, as he said, "creating new ones for someone else to solve".

In his twenty years at MIT -- from 1957 as Donner Professor of Science -- Shannon continued to develop important ideas in information science, a number of which bear his name. The Shannon Capacity, for instance, defines the maximum rate at which data can be carried over a channel such as a local copper loop, taking account of the noise on the line; and the Shannon Limit, recognising that this capacity is bounded only by the channel's bandwidth and signal-to-noise ratio, pointed the way to current high-order modulation schemes.
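The limit in question is the channel-capacity formula usually credited jointly to Shannon and to Ralph Hartley: for a channel of bandwidth B and signal-to-noise ratio S/N,

    C = B \log_2 (1 + S/N)    (bits per second)

so that, as a worked example, a 3 kHz telephone channel with a signal-to-noise ratio of about 1,000 (30 decibels) can carry no more than roughly 30,000 bits per second, however ingenious the modem.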

His concept of adding so-called redundant bits to a message to allow reconstruction of a corrupted message led to the widespread use of error detection and correction codes in data transmission.
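The simplest illustration of the idea -- a toy scheme rather than one of the practical codes that followed -- is to send every bit three times and let the receiver take a majority vote, so that any single corrupted copy is silently repaired. A minimal Python sketch:

    # Toy repetition code: each data bit is sent three times and decoded by
    # majority vote, so a single flipped copy of any bit is corrected.

    def encode(bits):
        return [b for bit in bits for b in (bit, bit, bit)]

    def decode(received):
        decoded = []
        for i in range(0, len(received), 3):
            triple = received[i:i + 3]
            decoded.append(1 if sum(triple) >= 2 else 0)
        return decoded

    message = [1, 0, 1, 1]
    sent = encode(message)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
    sent[4] = 1                     # noise flips one copy of the second bit
    print(decode(sent) == message)  # True: the corruption has been repaired

Practical codes achieve the same effect far more economically, but the principle is the one described above: redundancy deliberately added so that errors can be detected and undone.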

He was responsible for many developments in cryptography, too, harking back to a digital encryption system he had worked on during the war for use by Winston Churchill and Franklin Roosevelt. He also continued with the less serious side of research and development. It is certainly true that he once invented a two-seater version of his unicycle, and it is probably true that no one was anxious to share it with him. A later invention, the unicycle with an off-centre hub, would bring people out into the corridors to watch him as he rode it, bobbing up and down like a duck.

A selection of his work appeared in Russian in 1963, but it was not until 1993 that his Collected Papers, running to a thousand pages, appeared in English (edited by Neil J. A. Sloane and Aaron D. Wyner). The papers, some of which had previously been classified as secret, are in three groups: information theory and cryptography; computers, circuits and games; and his doctoral dissertation on population genetics. His design for a rocket-powered frisbee, however, remains unpublished.

As the stream of important insights and wacky developments continued, so did the flow of laurels from universities all over the world. It was rumoured that Shannon's games room held a modified dry-cleaner's rack, from which hung all his ceremonial gowns. His honours included the National Medal of Science in 1966 and the Kyoto Prize in 1985.

Alzheimer's disease claimed him, and took him to a Massachusetts nursing home for his last years. It was somehow in keeping that he was too ill to attend a recent ceremony held by the Institute of Electrical and Electronics Engineers, the Information Theory Society and the University of Michigan, at which a statue of him was unveiled by his wife. Many of his peers attended, pleased that he should be recognised as a genius at doing as well as thinking.

He is survived by his wife, Mary Elizabeth Shannon, and by their son and daughter.

© The Times, 2001