Mathematicians We Lost in 2020


On 31 December 2020 Dan Rockmore published Mathematicians We Lost in 2020 in The New Yorker. He wrote about

John Conway
Ronald Graham
Freeman Dyson



Life in the real world is complicated. It's much simpler on the computer. As the Game of Life begins, the screen is filled with a vast latticework of squares, only a few of them filled in. The magic is in the algorithms, which determine, on the basis of the current pattern, what will happen next. As time ticks on, whether any given square will be vacant or occupied, dead or alive, depends on its present state, as well as the states of its nearest neighbors, and possibly of their neighbors, and of their neighbors twice or three times removed. Change the first pattern, rewrite, delete, or add an algorithmic rule, and the pattern may grow unbounded and crenellated or recede to a tiny, moving archipelago, or evolve only to cycle back to its initial configuration, so that it can start over again. The wheel spins on.
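For the computationally inclined, the whole game fits in a few lines of code. What follows is a minimal sketch, assuming the standard rule Conway settled on: a live square survives with two or three live neighbors, and an empty square comes to life with exactly three. The function names here are placeholders, not anything from the article.

```python
from collections import Counter

def neighbors(cell):
    """The eight squares surrounding a given (x, y) square."""
    x, y = cell
    return [(x + dx, y + dy)
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)]

def step(live):
    """One tick of the game: live is a set of occupied (x, y) squares."""
    counts = Counter(n for cell in live for n in neighbors(cell))
    return {cell for cell, c in counts.items()
            if c == 3 or (c == 2 and cell in live)}

# A "blinker": three squares in a row flip between horizontal and vertical.
blinker = {(0, 0), (1, 0), (2, 0)}
print(step(blinker))                    # the vertical version
print(step(step(blinker)) == blinker)   # True: back where it started
```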

The "moral" that excited everyone when this computer game was invented, in 1970, was that simplicity could beget complexity. It provided users with a computational version of the molecular primordial soup, with its ramifications intact but time collapsed. These days, of course, it's hard not to see our real lives in the game's simple, abstract terms. Variations on Life are used to model epidemics like the one we're experiencing. The coronavirus is among the simplest life forms, living only to reproduce, occupying each host only to find the next, through some form of basic contact transmission. We are the squares, some of us occupied and some vacant, all of us doing what we can to avoid being a nearest, or even second-nearest, neighbor.

The inventor of Game of Life, John Conway, is among those we've lost to the coronavirus this year. (He died, in April, of covid-19.) I sometimes wonder if, given his ironic and dark sense of humor, he would've appreciated the symmetry. He was an extraordinarily creative mathematician, one who needed to see a problem as a puzzle or a game in order for it to seize his interest.

Early on, Conway made his name by solving a complicated puzzle about symmetries in twenty-four dimensions. For most of us, it's easier to start with two. Imagine that you have a pile of identical Frisbees. You want to lay as many of them down on the floor as possible--no stacking allowed! Try it, and you'll probably find quite quickly that setting them out in rows that are shifted just a bit, so that the boundary of one Frisbee dips into the cleavage between the two below, is the best that you can do. Nestle a few rows together in this way, and you'll find that any given Frisbee is surrounded by six others. At this point, a mathematician might place an imaginary peg in the center of a group of six Frisbees, then connect the pegs with imaginary lines. Do this, and you get a perfect hexagon, a shape that's symmetrical in a number of ways: you can flip it across various axes or rotate it around its center in steps of sixty degrees, and, except for which corners are where, it remains unchanged.
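Put into symbols (standard facts, stated here only for concreteness): the Frisbee arrangement covers as large a fraction of the floor as any arrangement can, and the hexagon it traces has exactly twelve symmetries.

```latex
% Hexagonal packing of the plane: the best possible density, and the
% order of the hexagon's symmetry group (six rotations, six reflections).
\[
  \delta_{2} \;=\; \frac{\pi}{2\sqrt{3}} \;\approx\; 0.9069,
  \qquad
  \lvert D_{6} \rvert \;=\; 12 .
\]
```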

In mathematical lingo, we've taken a few interesting steps. We started with a "packing problem"; by solving it, we uncovered a symmetrical shape; that shape, in turn, contains its own "group" of symmetries. We could take the same steps in three dimensions. Suppose that you replace the Frisbees with perfectly spherical, identical oranges. (A mathematician's oranges are always perfect spheres.) Now we're contemplating the so-called greengrocer's problem--the question of the best way to stack fruit in a market. In the usual greengrocer's arrangement, layers of oranges are stacked such that each orange touches twelve others. The three-dimensional polyhedron created when we connect the centers of the neighboring oranges also has its own, much larger, group of symmetries.
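One dimension up, the corresponding standard figures (again offered as a gloss, not spelled out in the article): the greengrocer's stacking fills roughly three-quarters of space, each orange touches twelve neighbors, and the polyhedron spanned by those neighbors' centers, a cuboctahedron, has forty-eight symmetries.

```latex
% The greengrocer's (face-centered-cubic) packing of space.
\[
  \delta_{3} \;=\; \frac{\pi}{3\sqrt{2}} \;\approx\; 0.7405,
  \qquad
  \text{kissing number} \;=\; 12,
  \qquad
  \lvert \operatorname{Sym}(\text{cuboctahedron}) \rvert \;=\; 48 .
\]
```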

The mathematics of symmetry is called "group theory," and its modern origins are generally traced back to the nineteenth-century mathematician Évariste Galois, who -- in one of the most romantic of mathematical legends -- is said to have feverishly organized his definitive manuscripts the night before a duel in which he died. Galois wasn't interested in symmetries in space, but in symmetries among and within solutions to equations. A given solution to an equation might have a mirror solution, differing only in its sign: √2 and -√2, for instance. Galois realized that the complexity involved in solving an equation was intimately related to the complexity of the "group" of its solutions' symmetries. His discovery initiated over a century's worth of work aimed at ferreting out groups of symmetries hidden in ever-more-complicated mathematical and geometric structures. By the late nineteen-sixties, mathematicians were racing to fill out a complete catalog.
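In the tiniest instance of Galois's idea, the one hinted at above, the two square roots of two can be exchanged without disturbing any relation between them that uses only rational numbers, so that equation's group of symmetries has just two elements. (The notation below is a standard gloss, not part of the original essay.)

```latex
% The Galois group of x^2 - 2 over the rational numbers: the identity,
% plus the symmetry that swaps the two roots.
\[
  \operatorname{Gal}\bigl(\mathbb{Q}(\sqrt{2})/\mathbb{Q}\bigr)
  \;=\; \{\,\mathrm{id},\ \sigma\,\} \;\cong\; \mathbb{Z}/2\mathbb{Z},
  \qquad
  \sigma(\sqrt{2}) \;=\; -\sqrt{2}.
\]
```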

Conway, who had been hunting around for a good problem, had followed the work of the British mathematician John Leech, who had explored the packing of spheres in twenty-four-dimensional space. Leech had found that, in this fantastical grocery store, each sphere simultaneously touches 196,560 others. But the complete symmetries of the gemlike object obtained by connecting the centers of those neighbors were still unknown, and Conway decided to take a crack at finding them. He set out a deliberate schedule of work, anticipating that it would go on for weeks, but then blazed to a solution in a single, Galois-worthy night of intellectual frenzy: he found that his twenty-four-dimensional crystal was symmetrical in 8,315,553,613,086,720,000 distinct ways. This work would ultimately find applications in the creation of codes useful for communication between satellites and Earth. The so-called Conway Group, meanwhile, paved the way for the uncovering of an even larger group, which mathematicians call the Monster--a group of symmetries as large as the number of kilograms of matter in the observable universe, which lives in a space of over a hundred and ninety thousand dimensions. The Monster has helped mathematicians to understand prime numbers, and given physicists new insights into quantum gravity.
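For readers who want to see where that nineteen-digit figure comes from, it factors cleanly into primes; the Monster's vital statistics, included below for scale, are standard values rather than part of Conway's original computation.

```latex
% The order of Conway's group Co_0 (all symmetries of the Leech lattice),
% and, for comparison, the approximate size of the Monster.
\[
  \lvert \mathrm{Co}_{0} \rvert
  \;=\; 2^{22}\cdot 3^{9}\cdot 5^{4}\cdot 7^{2}\cdot 11\cdot 13\cdot 23
  \;=\; 8{,}315{,}553{,}613{,}086{,}720{,}000,
\]
\[
  \lvert \mathbb{M} \rvert \;\approx\; 8\times 10^{53},
  \qquad
  \text{smallest faithful representation: } 196{,}883 \text{ dimensions}.
\]
```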

Conway was also a showman and a showoff and an intellectual competitor. A favorite parlor trick of his was to tell you the day of the week on any date, something he could do faster than anyone else. At Princeton, he could usually be found not in his office--which resembled a mathematical apothecary shop hit by a tornado--but in the large and somewhat soulless common room of Fine Hall, the massive looming tower, on the edge of the Princeton campus, that is the home of the mathematics department. When I was an undergraduate math major at Princeton, in the early nineteen-eighties, the common room would come to life only in the mid to late afternoon, just as things were revving up for the daily "tea," a small box-cookie reception roughly marking the time when most classes had ended and a few seminars were about to start. Conway would often hold court there, hard to miss, a cross between Rasputin and a Middle Ages minstrel, loudly talking philosophy and mathematics, playing the board game Go, or engaging in some other kind of mathematical competition, surrounded by adoring and admiring students, faculty, and visitors. I was a shy and unhappy undergraduate, not a game player, and I would watch the small whirlwind of activity from a distance, eat my cookies, drink the terrible coffee, and then disappear back to the bowels of the mathematics library to work on my problem sets.

After graduating from college, I don't think I saw Conway again until five years later, in 1989. I was in the audience at a conference at M.I.T., where Conway gave a lecture, titled "Computers and Frivolity," to a packed house. I remember little about the content--something about the way in which a spirit of curiosity and fun, mixed with a little computing, could be a pathway to some deep mathematics. What I do remember quite clearly was that Conway gave the talk using an overhead projector with a single transparency; each time he filled the transparency, he picked it up and then, to the horrified delight of the audience, licked it clean, then resumed writing. To Conway, mathematics was a game -- so much so that, later in his career, he discovered, or invented, a new class of numbers that can be infinitely large and infinitely small: the "surreal numbers," which include the real ones.

Conway had both a disciplined and an undisciplined mind. He was childlike in many ways, and he took advantage of the kind of leeway that we grant to geniuses. His method was to make mathematics out of whatever caught his fancy, but to do so with laser focus. You and I might notice that brickwork often has a pattern to it; only Conway could turn that into a deep exploration of symmetry. (After I heard him speak on this subject, walks through Central Park were never the same.) Some say that his contributions to mathematics peaked with his discovery of the Monster. That's a little like saying that, after "Anna Karenina," it was all downhill for Tolstoy; still, Conway himself often fretted about losing his mathematical powers. In the early two-thousands, Conway was among the mathematicians my co-producers and I interviewed for a documentary, "The Math Life." We had a broad and fascinating conversation, ranging from deep mathematics to word origins ("numb" and "number" are very likely connected!) and wordplay. He reflected on a life of intellectual privilege, the joys of teaching, and the wild highs and dark lows of perpetual thinking. Our talk of his achievements was tinged with melancholy as he reminisced about the "white hot" creative moments now in his rearview mirror. While Conway's life was full of honors and mathematical achievement, it was also just a life, and a complicated one, one as messy as his office: several marriages, bouts of depression, and even a suicide attempt. Mathematics was both a passion and an escape. "You know the saying 'Euclid alone has looked on beauty bare'?" he asked me. (It's a line from Edna St. Vincent Millay.) "Well, what does that mean? I think it means that, you know, in Euclidean geometry, because it's stripped--stripped of cats and twigs and palaver--there's just something pure, and clean, and simple, and exact, and precise."

In Conway's Game of Life, chaos emerges from order. In reality, it's generally the other way around: life is lived forward and understood backward, which is to say that the search for order in randomness is a very human endeavor. In that sense, we're all mathematicians -- pattern seekers and pattern creators at heart, from the search for meaning in our terrestrial wanderings to the imposition of constellational structures as we scan the night sky. In the vast heavens over a pitch-black town in ancient Greece, how could your eye not connect a line of stars into Orion's Belt, or a group of them into the Big and Little Dippers? There are mathematical manifestations of this impulse. The game of connect-the-dots that our forebears played with the stars is a precursor to a zone of inquiry that mathematicians call Ramsey Theory--named for the British mathematician Frank Ramsey--which explores the inevitability of finding preordained structures in collections of random dots. Essentially, it investigates the conditions under which structure is unavoidable.

Social events can sometimes feel like settings populated by random people (or random points, to a mathematician). Here's a question: how many people do you need to invite to a party to guarantee that three of them will be either mutual friends or mutual strangers? You might visualize a gathering as a network, with lines connecting friends or strangers; in either case, their constellations will form triangles. The "Ramsey number" associated with this scenario tells us the minimum gathering size that guarantees one of those triangles will emerge. In this case, five people is too few, but six will do the trick--so the Ramsey number for this odd social-engineering task is six.
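Both halves of that claim can be checked by brute force. The sketch below, an illustration rather than anything from the article, seats five guests around a pentagon with no three mutual friends or strangers, and then verifies that all 32,768 ways of sorting the fifteen pairs among six guests force such a triangle.

```python
from itertools import combinations, product

def has_mono_triangle(n, color):
    """color maps each pair (a, b), a < b, to 0 ('friends') or 1 ('strangers')."""
    return any(color[(a, b)] == color[(a, c)] == color[(b, c)]
               for a, b, c in combinations(range(n), 3))

# Five guests seated in a circle: neighbors are friends, everyone else strangers.
# This arrangement has no three mutual friends and no three mutual strangers.
pentagon = {tuple(sorted((i, (i + 1) % 5))): 0 for i in range(5)}
for pair in combinations(range(5), 2):
    pentagon.setdefault(pair, 1)
print(has_mono_triangle(5, pentagon))   # False: five people are not enough

# Six guests: every one of the 2**15 colorings of the 15 pairs forces a triangle.
pairs = list(combinations(range(6), 2))
print(all(has_mono_triangle(6, dict(zip(pairs, bits)))
          for bits in product((0, 1), repeat=len(pairs))))   # True
```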

For more complex scenarios, Ramsey numbers are notoriously difficult to calculate. They seem to require the listing out, for each guest, of the other guests they do and don't know--an enumeration that quickly becomes an unmanageable task. Instead of making lists, mathematicians have tended to reframe the question in terms of an upper bound: we might conclude that the Ramsey number, whatever it is, is no higher than a certain other number. Finding these bounds can quickly take us into the numerical stratosphere. It was through such a quest that Ron Graham, who also died this year, arrived at Graham's number, once called "the largest number ever to have a use."

Graham's early life was one of great peregrination in which a love of mathematics was a steady and portable source of comfort. A few years of school here and a few there led to entry into the University of Chicago at fifteen, through a program for precocious teen-agers; he studied philosophy and literature in the school's "great books" program--Carl Sagan was a classmate--then left, a few credits shy of a degree, to study mathematics at Berkeley. He left Berkeley early, too, to enlist in the Air Force--"The brochures looked great!" he told me, when I interviewed him for "The Math Life"--and, while stationed in Fairbanks, Alaska, earned a degree in physics as a part-time student. Later, he would return to Berkeley to finish his doctorate in math, becoming one of the great "combinatorialists" of our time. Many parlor-room questions are combinatorial: "How many ways can we seat these people at this table so that no one is sitting next to someone she knows?" But there are less familiar questions, and the beautiful formulas that answer them inform probability theory and computer science.

Graham's number mixes party planning with geometry. Imagine a party held on a jungle gym with eight guests; each guest sits on a corner of a cube. By slicing the cube through any two parallel edges, it's possible to isolate a four-person "table" -- a plane on which four guests sit. Six of these "four-tops" are made by the sides of the cube; six more are made by diagonal slices through it. You might ask yourself whether, at such a party, you're guaranteed to find that the guests at any of these four-tops will either all know one another or all be strangers. The answer is no: with eight guests at a cubical party, such a social arrangement isn't guaranteed.
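The count of twelve four-tops can be confirmed directly. The sketch below, again only an illustration, places the eight guests at the corners of a unit cube and keeps every set of four that lies in a common plane.

```python
from itertools import combinations, product

corners = list(product((0, 1), repeat=3))  # the eight corners of a cube

def coplanar(p, q, r, s):
    """Four points are coplanar exactly when the box they span has zero volume."""
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    w = [s[i] - p[i] for i in range(3)]
    det = (u[0] * (v[1] * w[2] - v[2] * w[1])
           - u[1] * (v[0] * w[2] - v[2] * w[0])
           + u[2] * (v[0] * w[1] - v[1] * w[0]))
    return det == 0

four_tops = [quad for quad in combinations(corners, 4) if coplanar(*quad)]
print(len(four_tops))  # 12: six faces plus six diagonal rectangles
```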

Now suppose that, instead of being held on a cube, the party is held on a hypercube -- a jungle gym built in four dimensions instead of three. A hypercube has sixteen corners, so the number of guests doubles. The number of four-tops also grows: a few of my math friends and I have determined that there are now a hundred four-person "tables." Finally, imagine that the party keeps expanding, again and again -- it's a rager in ever-higher dimensions. Graham wanted to know if, at some point in this dimensional growth, it would be guaranteed that a four-top would pick up four people who either all knew one another or all were strangers. He determined that, yes, this would happen -- but only after the party had grown to the point that it took place in at least a Graham's number's worth of dimensions. The number is so large that not only can't it be written down on a piece of paper, it can't be written out in any way, on any scale, in the observable universe. Even microscopic digits would simply stretch too far. Graham had to invent a whole new notation to express the number in something resembling words. That's some party, and some number. One lesson of this work is that, while mathematical order eventually emerges out of the chaos, you have to wait a while. It's easier to find an archer or a lion in the skies: just squint.
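The notation most often used to pin the number down today is Knuth's up-arrow notation (Graham's own original formulation was somewhat different, and even larger). In that shorthand, each layer of the definition uses the previous layer's value as its number of arrows, and the tower is sixty-four layers tall.

```latex
% Graham's number G, written in Knuth's up-arrow notation.
\[
  g_{1} \;=\; 3\uparrow\uparrow\uparrow\uparrow 3,
  \qquad
  g_{n} \;=\; 3\,\underbrace{\uparrow\uparrow\cdots\uparrow}_{g_{n-1}\ \text{arrows}}\,3
  \quad (n \ge 2),
  \qquad
  G \;=\; g_{64}.
\]
```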

Graham was for many years a fixture at Bell Labs; eventually, he ran its mathematics research group, to the extent that one can actually "run" a collection (or is it a "set"?) of mathematicians. He finished his career as a professor of mathematics and computer science at the University of California, San Diego. He had a sharklike intellect, ravenous and always on the move; at any given moment, he was likely to be learning Chinese, or working on a new trick on the trampoline, as well as researching a math problem. Fittingly, he was a world-class juggler (even president of the International Jugglers Association for a time), and a conversation with him could be a juggling act. Topics would come and go, and, just as you were ready to dwell on one idea, another would be handed to you, only to be quickly taken away.

As I read over the interview he did for our little documentary, I see that there were lots of balls in the air: anecdotes about this or that mathematician, and about Graham's early and somewhat peripatetic life. The math worked its way in -- maybe every third ball or so. We discussed how he'd created the subject of computational geometry, as well as his groundbreaking work in "worst-case analysis." Both research efforts grew out of problems the military tendered to Bell Labs in disguised, somewhat abstracted form; the latter, presented as a problem in "scheduling theory," actually derived from the question of how we might defend ourselves against a salvo of multiple warheads of varying capabilities. (The same math describes how a collection of microprocessors might coöperate most effectively when there are multiple tasks to execute, each with different costs and considerations.) "You've got certain things that have to be done before others, and we've noticed that, when you change the order that you try to do these things in, or you decrease the times that it took to actually do it, sometimes it takes longer," Graham explained. "And so I tried to make a careful model of it, and slowly began to understand how bad these anomalies could be." The work of Graham and others was the beginning of the study of computational complexity and the development of a taxonomy of algorithmic difficulty, whose subtleties are still being worked out today. The most significant problem in the field is an open question: it asks whether a task whose solution is easy to check ("Did we intercept all the rockets?") is also, necessarily, easy to solve. A million-dollar prize awaits the person who determines the answer.
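A toy version of the setting Graham analyzed makes the anomaly easier to believe. In the sketch below (an invented instance, not one of his), a "list schedule" hands each job, in the order given, to whichever machine frees up first, and merely reordering the same six jobs changes how long the whole batch takes. Graham's celebrated worst-case bound says that no list schedule on m machines ever takes more than 2 - 1/m times as long as the best possible one.

```python
import heapq

def list_schedule(jobs, machines):
    """Makespan of a list schedule: each job, in order, goes to the
    machine that becomes free earliest."""
    free = [0.0] * machines            # time at which each machine frees up
    heapq.heapify(free)
    for duration in jobs:
        earliest = heapq.heappop(free)
        heapq.heappush(free, earliest + duration)
    return max(free)

jobs = [5, 3, 3, 2, 2, 1]              # six tasks, sixteen hours of work, two machines
print(list_schedule(sorted(jobs, reverse=True), 2))  # 8: longest-first order
print(list_schedule(sorted(jobs), 2))                # 10: shortest-first order
```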

The "worst-case" element of Graham's work didn't reflect his generally sunny personality. In conversation, he was both charming and disarming; with his fine blond hair and good looks, he always reminded me of a choirboy who had just snacked on the sacrament. For him, the world was a never-ending source of puzzles and problems, which he attacked with joy and energy. "In my grand scheme of things, the Great Being is kind of a great mathematician," he said. "The universe is really running on mathematical principles, and we just don't understand very much of them. It's just that we get a glimpse, now and then."

Like Ron Graham, Freeman Dyson had a lot of stories. One of my favorites was about how he decided to turn from studying mathematics to studying physics. As he told me in his "Math Life" interview, he was walking down the King's Parade in Cambridge with a friend, the future mathematician Harish-Chandra. "Harry said to me, you know, physics is so messy -- I think I'll switch from physics to mathematics. And I said, you know, it's funny, because I just decided to switch from mathematics to physics for the same reason." They would eventually become colleagues at Princeton's Institute for Advanced Study.

Dyson also died this year, in February. While he was alive, I always viewed him as the physicist most deserving of a Nobel Prize who didn't receive one. He was among the raft of outstanding scientists set adrift by the Nobel Committee's rule of three: no more than three scientists can share the prize in a given area. In Dyson's case, he lost out on the prize awarded, in 1965, for the creation of the field of quantum electrodynamics, which recognized the work of Richard Feynman, Julian Schwinger, and Sin-Itiro Tomonaga. These three physicists had each figured out a way to reconcile the classical theory of electromagnetic fields -- the theory that makes sense of, and makes possible, the electrical work in your home -- with the complexities of the quantum-mechanical behavior of the electron. It was one of the earliest steps in the overarching grand goal of modern physics -- the reconciling of the various fundamental forces (electricity, magnetism, the weak and strong forces of the atom, and gravity) as the necessary outcomes of a single mathematical framework. Dyson's role in the story was one of "translator." He took Feynman's odd and idiosyncratic pictorial "Feynman diagram" approach to physics and turned it into the mathematics that Schwinger and Tomonaga created. Others might have been bitter at being passed over for the prize, but Dyson would always speak of his luck at having arrived at Cornell University just as Feynman was embarking on his revolutionary remaking of modern physics.

The role of translator is one that Dyson took on throughout his life: he turned physics into math, and those subjects into English for the general public. One bridge that he built between math and physics came from showing that the discrete energy levels achieved by a random quantum mechanical system bear an extraordinary resemblance to certain properties of prime numbers--a statistical mystery that gave new hope for solving one of the most important open problems in mathematics, regarding the pattern with which the prime numbers appear among the integers. Like Graham, Dyson delighted in glimpsing the mathematical principles on which the world runs. "I mean, how could the electron know that it was supposed to have a six in the ninth decimal place?" he said. "To me, it's absolutely amazing!" He shared this amazement on camera, during the recording for "The Math Life"; while he was laughing about it, his tooth fell out.
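The resemblance has a precise conjectural form, usually called the Montgomery-Dyson pair correlation, stated here only as a gloss: the normalized gaps between zeros of the Riemann zeta function appear to follow the same law as the gaps between eigenvalues of large random matrices.

```latex
% Montgomery's pair-correlation conjecture, which Dyson recognized as the
% eigenvalue statistic of large random (GUE) matrices.
\[
  R_{2}(u) \;=\; 1 - \left(\frac{\sin \pi u}{\pi u}\right)^{2}.
\]
```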

Dyson's scientific breadth, bottomless curiosity, and quick mind were surely among the reasons that he was so often sought as a consultant on a wide range of scientific projects -- including an effort to build an interplanetary spaceship powered by nuclear bombs, a fascinating story well told by his son, George Dyson, in "Project Orion: The True Story of the Atomic Spaceship." My last extended exchange with Dyson had to do with the Concinnitas Project, a collection of aquatint prints of mathematical equations to which he was a contributor. The ten participants, all prize-winning mathematicians and physicists, were asked to write out whatever expression they had discovered that they considered the most beautiful; it would then become a white-on-black fine-art print. Among the many expressions that Dyson might have chosen, he picked the Macdonald equation, a result from number theory that captures a deep property of symmetry among the integers that also extends to the physical world.

Dyson loved this equation not only for the beauty of its mathematics but also for the memories he had of its discovery. He derived the equation for himself just a little after another mathematician, Ian Macdonald, did so; all the while, without knowing that they were working on the same subject, Dyson and Macdonald were seeing each other almost every day when they picked up their children at the day-care facility at Princeton's Institute for Advanced Study. No matter: "My friend Ian Macdonald had the joy of discovering it first," Dyson wrote, "and I had the almost equal joy of discovering it second." The Macdonald equation is a long, horizontal piece of fine calligraphy to those who can't read it, and many naturally focus on the multiple exclamation marks that it contains. These factorial marks have significant mathematical meaning, but, whether or not you understand them, they appear like tiny bursts of joy in a landscape of numbers, letters, and Greek symbols. They remind me of the chuckles and laughs that peppered each of the conversations I was lucky enough to share with this marvellous man.

Mathematics, like life, is complicated. But, for those who do mathematics, it is a source of joy. "The main thing is just astonishment that there's such a rich world out there--a wonderful, abstract, very beautiful, simple world," Conway said. "It's like Pizarro standing on the shores of the Pacific or whatever ... I can sit here in this chair and go on a voyage of exploration. A very different voyage of exploration, but, still, there are things to be discovered, things to be seen, that you can quite easily be the first person ever to see."

So many of us now sit in our rooms, bound in space while time drips away. It can be a bit of a comfort to know that, as long as you are able to sit still and think, your creative spirit can be an engine of exploration. On their journeys, these playful, curious mathematicians discovered Monsters and numbers so large that they can hardly be written down. We're grateful for the lively stories of their expeditions, and for the thinkers who led them. They'll be missed.

Last Updated January 2021