David Donoho Awards



1. Presidential Young Investigator Award 1985.
1.1. The Presidential Young Investigator Award.

The Presidential Young Investigator Award was awarded by the National Science Foundation of the United States Federal Government. The program operated from 1984 to 1991, and was replaced by the NSF Young Investigator Awards and Presidential Faculty Fellows Program. The award gave a minimum of $25,000 a year for five years, with the possibility of up to $100,000 annually if the Presidential Young Investigator obtained matching funds from industry.

1.2. The 1985 Award.

The 1985 Presidential Young Investigator Award was awarded to David L Donoho.
2. MacArthur Fellow 1991.
2.1. David Donoho: Class of 1991.

David Donoho is a statistician whose work ranges from statistical computing to classical mathematical analysis.

His theoretical work includes the nonlinear recovery of signals despite massively incomplete data; recovery of curves from noisy statistical data in a theoretically optimum manner; and robust methods for treating severely contaminated high-dimensional data. Underlying this work is mathematics ranging from inequalities in the theory of entire functions to geometric properties of high-dimensional convex sets. Donoho has interests in large-scale statistical applications, particularly signal analysis in geophysical and medical problems. In presenting his findings, Donoho not only describes the intellectual underpinnings for his results, he also makes software available for others to learn from and expand his research. For example, he was a designer of MacSpin, a package that visualises statistical data as rotating, three-dimensional point clouds; Donoho also coordinated the development of mathematical software tools that expand on his initial work in wavelet theory.

2.2. Stanford University News Service Report.

Statistics Professor David Donoho is one of 31 new fellows named Tuesday, 18 June 1991, by the John D and Catherine T MacArthur Foundation.

Donoho, 34, is a professor at Stanford University and the University of California-Berkeley.

He will receive a total of $225,000 over the next five years in what is widely known as the MacArthur "genius award." Recipients are free to use the funds from the fellowship in any way they wish; the fellowship is the foundation's way of supporting talented and creative individuals.

Donoho said the award is ideal for his style of learning.

"This award gives a scientist a chance to shoot for the clear blue sky," he said, "to try projects that may seem too risky for regular funding channels. With a normal grant, you have to know what you're going to do a year in advance. I do research on a project when the compulsion for it takes over. Instead of formalising and justifying it, I'd much rather use my energy actually doing the project."

He said his first thought about how to spend some of the award is to sponsor a meeting of scholars in his field, including the Soviet scholar M S Pinsker, a mathematical statistician whose work has profoundly influenced Donoho.

Donoho has done ground-breaking work in statistical theory and has applied it to a broad range of practical applications.

"He's really a renaissance man in our field," says Jerome Friedman, chair of Stanford's department of statistics. "He's done some of the best statistical theory of anyone in the last five years."

One of Donoho's interests is "robust statistics." He has worked out ways to detect errors in a data base containing many dissimilar types of data - for example, the blood pressure, cholesterol level, blood sugar and pulse rate of a group of medical patients. Using his methods, a computer could be programmed to visualise each type of fact as a different dimension and to look at all the dimensions from different angles. It thus could detect "outliers" - facts outside the norm that may be errors.

The same concept can be turned around to solve "signal recovery" problems. In this case, the "outliers" are the desired facts - for astronomers looking for the internal structure of a mysterious object such as a pulsar, or geologists looking for the blip in earth-scanning data that may mean the presence of oil.

Donoho has been working on another problem of incomplete or partially inaccurate data: how to de-blur an image when there is no way to know how it was blurred to begin with. It's a problem important to scientists who use instruments to observe regions they cannot reach - scanning the ocean from satellites or reading deep into the earth with seismographs, for example.

Donoho also is interested in computer graphics that scientists can use to display multi-dimensional data. With his brother, Andrew Donoho, and his wife, Miriam Gasko, he developed software for the Macintosh that can be used to rotate data displays to show patterns from various perspectives. The program, MacSpin, was based on the PRIM-9 program developed for mainframe computers by Stanford's Friedman, Mary Ann Fisherkeller and John Tukey. MacSpin was designated the best scientific/engineering software of 1987 by MacUser magazine.

Donoho earned his bachelor's degree from Princeton in 1978 and his doctorate from Harvard in 1984. He has been a member of the faculty at Berkeley since 1984 and a professor at Stanford since September.

Donoho is the 11th MacArthur Fellow currently at Stanford. He is the fifth statistician to be named a fellow, and the third with ties to Stanford's Department of Statistics. The previous two are Bradley Efron and Persi Diaconis.

Donoho won a Presidential Young Investigator award for 1985-90. He used part of that award to sponsor a meeting of statisticians - in that case, researchers from Eastern Europe, Germany and France.

"All sorts of ideas that had been locked up in Eastern Europe were exposed to us and vice versa," he said.

Inspiration for some of his own work has come from such international contacts. "You have scholars who spend 10 to 15 years investing every bit of their personalities in a single project," he said. "Real progress in research comes not from having many people know your work, but from having a few people understanding your work deeply. This is why small-scale meetings are so important. When it really works it's intense, something like the Vulcan mind-meld on 'Star Trek.' "
3. Committee of Presidents of Statistical Societies Presidents' Award 1994.
3.1. COPSS Presidents' Award.

The Committee of Presidents of Statistical Societies (COPSS) sponsors and presents the COPSS Presidents' Award annually to a young member of the statistical community in recognition of outstanding contributions to the profession of statistics. The award, established in 1976, consists of a plaque and a cash honorarium of $2000 and is presented annually at the Joint Statistical Meetings. Prior to 2005 the award was given to a statistician not yet having reached his or her 41st birthday during the calendar year of the award. The prize is sponsored by five statistical societies: The American Statistical Association; The Statistical Society of Canada; The Institute of Mathematical Statistics; The Eastern North American Region of the International Biometric Society; and The Western North American Region of the International Biometric Society.

3.2. Presidents' Award 1994

The Committee of Presidents of Statistical Societies Presidents' Award for 1994 is made to David Donoho.
4. SIAM John von Neumann Lecture Prize 2001.
4.1. The John von Neumann Prize.

The John von Neumann Prize, established in 1959, is awarded by the Society for Industrial and Applied Mathematics for outstanding and distinguished contributions to the field of applied mathematical sciences and for the effective communication of these ideas to the community. The recipient receives a monetary award and presents a survey lecture at the annual meeting.

4.2. The 2001 Lecture Prize.

David L Donoho of Stanford University was honoured with the John von Neumann Lectureship, which carries a cash award of $2,500. He gave the lecture "What lies behind Wavelets?" in July 2001.
5. Information Theory Society Paper Award 2008.
5.1. The Information Theory Society Paper Award.

The IEEE Information Theory Society Paper Award is given annually for an outstanding publication in the fields of interest to the Society appearing anywhere during the preceding four calendar years.

The purpose of the Information Theory Paper Award is to recognise exceptional publications in the field and to stimulate interest in and encourage contributions to fields of interest of the Society. The Award consists of an appropriately worded certificate(s) and an honorarium of $1,000 for a paper with a single author, or an honorarium of $2,000 equally split among multiple authors. To be eligible, the paper must have appeared in the preceding four calendar years.

5.2. The 2008 Awards.

The IEEE Information Theory Society Paper Award for 2008 was given to David Donoho, for the paper:

David Donoho, Compressed Sensing, IEEE Transactions on Information Theory (April 2006).

Also in 2008 Emmanuel Candès and Terence Tao received the IEEE Information Theory Society Paper Award for their paper:

Emmanuel Candès and Terence Tao, Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies, IEEE Transactions on Information Theory (December 2006).
6. Norbert Wiener Prize in Applied Mathematics 2010.
6.1. The Norbert Wiener Prize.

The Norbert Wiener Prize in Applied Mathematics was established in 1967 in honour of Professor Norbert Wiener and was endowed by a fund from the Department of Mathematics of the Massachusetts Institute of Technology. The prize is awarded for an outstanding contribution to "applied mathematics in the highest and broadest sense." The award is made jointly by the American Mathematical Society and the Society for Industrial and Applied Mathematics. The recipient must be a member of one of these societies and a resident of the United States, Canada, or Mexico. The prize is awarded every three years.

6.2. Citation for the 2010 prize for David L Donoho.

The 2010 Norbert Wiener Prize is awarded to David L Donoho for introducing novel fundamental and powerful mathematical tools in signal processing and image analysis. His many outstanding contributions include those to compressed sensing and the construction of multiscale analysis techniques that take advantage of the specific mathematical and physical properties of the problems under consideration. His methods are very deep mathematically and very efficient computationally. This explains their success with both theoreticians and practitioners, and has made him one of the most cited applied and computational mathematicians of our time.

6.3. Biographical note.

David Donoho received his A.B. in statistics (summa cum laude) from Princeton University, where his undergraduate thesis adviser was John W Tukey. After working in seismic signal processing research at Western Geophysical under Ken Larner, he obtained the Ph.D. in statistics at Harvard, where his thesis adviser was Peter Huber. He held a postdoctoral fellowship at MSRI, then joined the faculty at the University of California, Berkeley, advancing to the rank of professor. He later moved to Stanford University, rising to the position of Anne T and Robert M Bass Professor in the Humanities and Sciences. He has also been a visiting professor at Université de Paris, University of Tel Aviv (Sackler Professor and Sackler Lecturer), National University of Singapore, Leiden University (Kloosterman Professor), and University of Cambridge (Rothschild Visiting Professor and Rothschild Lecturer). Donoho is proud of his more than twenty-five Ph.D. students and postdocs, many of whom have become very successful in academia and industry. Donoho is a member of the U.S. National Academy of Sciences and of the American Academy of Arts and Sciences, and he is a recipient of the honorary Doctor of Science degree from the University of Chicago. Donoho cofounded two companies while in Berkeley: D2 Software, makers of MacSpin for high-dimensional data visualisation, and BigFix, makers of remote network management software. Donoho has served on the research staff of Renaissance Technologies, a prominent quantitative hedge fund.

6.4. Response by David Donoho.

Norbert Wiener means a lot to me; I am a proud owner of his Collected Works [Norbert Wiener, Collected Works, Vols. 1-3 (The MIT Press, Cambridge, 1976, 1979, 1981)] and have dived into them regularly for more than two decades. They allowed me to survey Wiener's career from close-up: I became intimately familiar with many of Wiener's visionary achievements, including the generalised harmonic analysis, the work on Brownian motion and chaos, and the work on prediction and smoothing of signals as well as his technical achievements, such as the algebra of absolutely convergent Fourier series and the space PW of bandlimited functions. From the non-mathematical fourth volume [Norbert Wiener, Collected Works, Vol. 4: Cybernetics, Science, and Society; Ethics, Aesthetics, and Literary Criticism; Book Reviews and Obituaries (The MIT Press Cambridge, 1985)] of his Collected Works, I learned that Wiener had a "wild side" in his later career - a vision of the future; he aimed to be broader and to see farther than any other mathematician of comparable stature.

I am also the proud owner of a beaten-up old copy of a special issue of the Bulletin of the American Mathematical Society dedicated to Norbert Wiener [Special issue on Norbert Wiener, 1894-1964, Bull. Amer. Math. Soc. 72 (1, part 2) (1966), 1-125.]. I have studied carefully what scholars of that time had to say about Wiener. Mathematicians were partially at a loss to assess Wiener's significance, for he was by then a public intellectual and, in some sense, a seer of our future; mathematics simply was too narrow a forum for discussing and evaluating some of his insights. Has any other issue of the Bulletin ever had an article with a title like "From philosophy to mathematics to biology"? It seems unlikely to me.

When Wiener did his great work on prediction and filtering in the early 1940s, he realised that the coming convergence of mathematics and computers was going to have great impacts on society and human life. Others had related insights at the time, notably von Neumann. But Wiener saw farther. He saw three things coming together: mathematical insights, computational power, and the capture of signals sensing the world around us and our position in and effects on the world. Wiener communicated the feeling that the convergence of these three elements was a great adventure for humankind, with great potential benefit but also some complexity and even moral peril.

I am fortunate to have lived part of Wiener's adventure: I have the good fortune to be inspired by mathematical analysis; to have rendered some inspiring mathematics operational through computers, and to actually use the resulting computer codes for processing some of the massive bodies of signals data our civilisation is now capturing. I have been fortunate to be part of research teams imaging the earth seismically, probing molecular structure by NMR spectroscopy, using magnetic resonance imaging in novel clinical applications, and processing financial signals in markets worldwide. I have been particularly fortunate to find collaborators willing to do new things in those areas, inspired by mathematical criteria. Wiener must have envisioned that mathematical scientists would someday be so fortunate, but he was able to experience only limited opportunities of this kind in his own lifetime.

Wiener's vision has "caught on"; while his enthusiasm for the convergence of mathematics, computing, and signals must have seemed odd to mathematicians sixty years ago, today there are many mathematical scientists who implicitly assume this convergence as a central ingredient in their world view. The journals Inverse Problems and the SIAM Journal on Imaging Sciences are two venues where mathematical scientists are engaged actively in this convergence. I personally am very fortunate to have had students, co-authors and postdocs who were as inspired as I was by this same convergence. I'd like to mention three mentors: John Tukey, who foresaw the data-drenched world of today and the importance of data analysis; Yves Meyer, who inspired me to work in multiscale analysis through his eloquent writings and broad scientific attitude; and Raphy Coifman, who foresaw so many of the interactions between harmonic analysis and signal processing that we see today.

We are still only at the beginning of Wiener's adventure. The full convergence of mathematics, computing, and ubiquitous signal capture is still in the future. Perhaps future Wiener awardees will, from time to time, contribute in their own way to Wiener's adventure.
7. Shaw Prize for Mathematics 2013.
7.1. Announcement.

The Shaw Prize in Mathematical Sciences 2013 is awarded to David L Donoho:-
... for his profound contributions to modern mathematical statistics and in particular the development of optimal algorithms for statistical estimation in the presence of noise and of efficient techniques for sparse representation and recovery in large data-sets.
7.2. Press Release.

The Shaw Prize in Mathematical Sciences for 2013 is awarded to David L Donoho, Anne T and Robert M Bass Professor of the Humanities and Sciences, and Professor of Statistics at Stanford University, USA for his profound contributions to modern mathematical statistics and in particular the development of optimal algorithms for statistical estimation in the presence of noise and of efficient techniques for sparse representation and recovery in large data-sets.

The dramatic developments in technology in the last half century present fundamental new challenges in theoretical and applied mathematical statistics. David Donoho has played a major role in developing new mathematical and statistical tools to deal with such problems ranging from large data-sets in high dimensions to contamination with noise. His work provides fast, efficient and often optimal algorithms which are founded on rigorous mathematical analysis.

Key themes introduced in his works, and which today are standard features of the theory, include the exploitation of sparseness of representation of complex objects, related adaptive nonlinear thresholding techniques and the deep relation between sparseness and certain penalty functions that are being minimised (specifically L^{1} norms).

Many of these emerge from his development of algorithms for statistical estimators in the presence of noise. These are remarkable in that they overcome the difficulties associated with noise, with very little loss of efficiency or reliability. Along the way, he demonstrated the power of the mathematical theory of wavelets in dealing with such problems in statistics. The Donoho-Johnstone soft-thresholding algorithm has been widely used in statistical and signal processing applications.

During the last 15 years Donoho has developed a theory of sparse and multi-scale representations of signals and data-sets using nonlinear L^{1} optimisation methods. These combine very well with techniques of unstructured and redundant dictionaries of functions and provide a fundamental approach to lower the dimensionality of complex problems. Along with Candès and Tao, he made fundamental contributions to the development of "compressed sensing". In terms of sparseness and recovery, this method, which "compresses while sensing the data", uses dramatically fewer data points while retaining the ability to recover the correct signal, and yields strikingly efficient and even optimal algorithms for compressing and decompressing complex signals (e.g. images). This remains a very active area of research, especially in view of its wide applications.

7.3. Biographical Note of David Donoho.

David L Donoho was born in 1957 in Los Angeles, USA and is currently Anne T and Robert M Bass Professor of the Humanities and Sciences, and Professor of Statistics at Stanford University, USA. He graduated from Princeton University in 1978 and received his PhD from Harvard University in 1983. From 1984 to 1990, he was on the faculty of the University of California, Berkeley before moving to Stanford. He is a fellow of the American Academy of Arts and Sciences, a SIAM Fellow, a foreign associate of the French Academy of Sciences, and a member of the US National Academy of Sciences.

7.4. Autobiography of David Donoho.

My father Paul was a physics professor at Rice University. I remember the chalk dust, slate blackboards and marble hallways of his academic office, and the lasers and low-temperature gadgets in his laboratory. Paul took our family on sabbatical to Grenoble, France, where I attended 6th grade - a transforming educational experience.

For university, my mother Julia chose Princeton, a hotbed of bright and ambitious students (e.g. Eric Lander!). My father's advice was 'learn computers'; so I developed data analysis software for the Statistics Department. John Tukey, inventor of the Fast Fourier Transform and the words 'bit' and 'software', was my undergraduate thesis adviser; Tukey advocated robust statistical methods, such as fitting equations by minimising the L^{1} norm of residuals rather than the L^{2} norm. He criticised 'classical' mathematical statistics as searching for polished answers to yesterday's problems.

After college, I worked in oil exploration research for Western Geophysical, and witnessed the seismic signal processing revolution driven by digital measurement and exciting computer algorithms. Ongoing experiments in blind deblurring and signal recovery from highly incomplete measurements were phenomenally successful. Experimentally, minimising the L^{1} norm of the reconstructed signal (rather than the L^{2} norm of the residual) was miraculously effective. The strange interaction of sparse signals, undersampled data, and L^{1}-norm minimisation inspired me (age 21!) to pursue 'non-classical' mathematics, to one day explain and use such phenomena.

At Berkeley during my postdoctoral and junior faculty years (1985-1990), I was in the Mecca of classical mathematical statistics, but I pursued my 'non-classical' interests. Iain Johnstone and I showed how to optimally 'denoise' sparse signals observed in noise, injecting 'sparsity' into top statistics journals. A sparse signal sticks up here and there above the noise, like daisies above weeds; our denoiser, based on L^{1}-minimisation, chops away the weeds while leaving the daisies. Jeff Hoch and Alan Stern successfully applied such ideas in Magnetic Resonance spectroscopy.

To publish 'non-classical' work on undersampled measurements of sparse signals, I turned to applied mathematics journals. Philip Stark and I found that a highly sparse signal could be perfectly recovered from randomly chosen (slightly) incomplete Fourier measurements, by L^{1}-minimisation on the recovered signal. Ben Logan and I extended Logan's PhD thesis, and work of Santosa and Symes, to show that sparse signals missing low frequency information could be perfectly recovered, again by L^{1}-minimisation on the recovered signal. Finally, sparse signals missing high frequencies need really very few measurements, if the sparse signal is nonnegative.

'Wavelets' swept over applied mathematics in 1988-1992, led by Yves Meyer, Ingrid Daubechies and Stephane Mallat. The wavelet transform could sparsify signals and images; I was inspired! Iain Johnstone, Dominique Picard, Gérard Kerkyacharian and I used wavelets to optimally remove noise from certain kinds of signals and images. The wavelet transform turned noisy images into sparse signals embedded in noise, and we could rescue the daisies from among the weeds, using L^{1}-minimisation.

Coifman and Meyer, and Mallat and Zhang, proposed in the early 1990's to represent signals by combining several transforms - controversial to most, because the equations would be underdetermined, but inspiring to me. Scott Chen, Michael Saunders and I showed that L^{1} minimisation ('Basis Pursuit') could often successfully find sparse solutions to such systems.

By the late 1990's, I sought theory explaining such successes. Xiaoming Huo, Michael Elad and I gave 'incoherence' conditions on underdetermined systems guaranteeing that L^{1} minimisation found sufficiently sparse solutions perfectly; soon, many others entered this area. In 2004, Emmanuel Candès and Terence Tao got dramatically stronger guarantees by interesting and deep mathematics, triggering a tidal wave of interest.

My 2004 paper 'Compressed Sensing' explained that, because the wavelet transform sparsifies images, images can be recovered from relatively few random measurements via L^{1} minimisation. Michael Lustig's 2007 Stanford Thesis used compressed sensing in medical imaging (MRI), and, with co-authors, pushed MRI to solve seemingly intractable problems, where L^{1}-minimisation from undersampled measurements enables totally new application areas, like MRI movies of the beating heart, or MR spectroscopic imaging (revealing metabolism). In recent MRI conferences, compressed sensing became the most popular topic.

During 2004-2010, Jared Tanner and I discovered the precise trade-off between sparsity and undersampling, showing when L^{1}-minimisation can work successfully with random measurements. Our work developed the combinatorial geometry of sparse solutions to underdetermined systems, a beautiful subject involving random high-dimensional polytopes. What my whole life I thought of privately as 'non-classical' mathematics was absorbed into classical high-dimensional convex geometry.

Arian Maleki, Andrea Montanari and I discovered in 2009 a new approach to the sparsity/undersampling trade-off. Solving random underdetermined systems by L^{1}-minimisation was revealed as identical to denoising of sparse signals embedded in noise. Two separate threads of my research life became unified.

This brutally compressed account omits extensive work by many others, often more penetrating than my own, and much prehistory. I thank my mentors Peter Bickel, Raphy Coifman, Persi Diaconis, Brad Efron, Peter Huber, Lucien Le Cam and Yves Meyer, all my co-authors, students, and postdocs, and my wife Miki and son Daniel.

7.5. An Essay on the Prize.

For more than two decades David Donoho has been a leading figure in mathematical statistics. His introduction of novel mathematical tools and ideas has helped shape both the theoretical and applied sides of modern statistics. His work is characterised by the development of fast computational algorithms together with rigorous mathematical analysis for a wide range of statistical and engineering problems.

A central problem in statistics is to devise optimal and efficient methods for estimating (possibly non-smooth) functions based on observed data which has been polluted by (often unknown) noise. Optimality here means that, as the sample size increases, the error in the estimation should decrease as fast as that for an optimal interpolation of the underlying function. The widely used least squares regression method is known to be non-optimal for many classes of functions and noise that are encountered in important applications, for example non-smooth functions and non-Gaussian noise. Together with Iain Johnstone, Donoho developed provably almost optimal (that is, up to a factor of a power of the logarithm of the sample size) algorithms for function estimation in wavelet bases. Their "soft thresholding" algorithm is now one of the most widely used algorithms in statistical applications.
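
As a rough illustration of the soft-thresholding idea - a sketch only, not Donoho and Johnstone's published procedure; the sizes, noise level and helper names below are purely illustrative - one can shrink noisy coefficients toward zero and set the small ones exactly to zero:

import numpy as np

def soft_threshold(coeffs, threshold):
    # Shrink each coefficient toward zero by `threshold`; anything smaller than the
    # threshold in magnitude becomes exactly zero, which enforces sparsity.
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - threshold, 0.0)

rng = np.random.default_rng(0)
n, sigma = 1024, 0.5
true_coeffs = np.zeros(n)                                  # sparse vector: mostly zeros
true_coeffs[rng.choice(n, size=20, replace=False)] = rng.normal(0, 5, size=20)
noisy_coeffs = true_coeffs + sigma * rng.normal(size=n)    # observed in Gaussian noise

threshold = sigma * np.sqrt(2 * np.log(n))                 # the "universal" threshold
denoised = soft_threshold(noisy_coeffs, threshold)

print("error before thresholding:", np.linalg.norm(noisy_coeffs - true_coeffs))
print("error after thresholding: ", np.linalg.norm(denoised - true_coeffs))

In practice the shrinkage is applied to the wavelet coefficients of the noisy data and the result is transformed back; the universal threshold sigma sqrt(2 log n) is one of the threshold choices Donoho and Johnstone analysed.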

A key theme in Donoho's research is the recognition and exploitation of the fundamental role of sparsity in function estimation from high dimensional noisy data. Sparsity here refers to a special property of functions that can be represented by only a small number of appropriately chosen basis vectors. One way to characterise such sparsity is to minimise the L^{0}-norm of the coefficients in such representations. Unfortunately, the L^{0}-norm is not convex and is highly non-smooth, making it difficult to develop fast algorithms for its computation. In addition to pioneering the exploitation of sparsity, Donoho also introduced the computational framework for using the L^{1}-norm as a convexification of the L^{0}-norm. This has led to an explosion of efficient computational algorithms realising this sparsity framework which have been used effectively in a wide variety of applications, including image processing, medical imaging, data mining and data completion.

A recent and much celebrated development along this sparsity-L^{1} theme is Compressed Sensing (a term coined by Donoho). Data compression is widely used nowadays - for example the JPEG standard for compressing image data. Typically, the data is gathered from sensors (for example a camera) and the data is then compressed (that is represented by a much smaller number of coefficients in an appropriate basis, while preserving as much accuracy as possible). Corresponding de-compression algorithms are then used to recover the original data. The revolutionary idea in Compressed Sensing is to shortcut this standard approach and to "compress while sensing", that is to collect a small number of appropriately chosen samples of the data, from which the original data can be recovered (provably exactly under appropriate assumptions) through corresponding de-compression algorithms. The key ingredients are again sparsity (most typical in a wavelet basis), use of the L^{1}-norm for recovery, and the use of random averaging in sensing. Along with Emmanuel Candès and Terence Tao, Donoho is widely credited as one of the pioneers of this exploding area of research, having contributed fundamental ideas, theoretical frameworks, efficient computational algorithms and novel applications. This is still a thriving area of research with wide applications, but already many stunning results have been obtained (both theoretical and practical).

7.6. Princeton Alumnus Donoho receives Shaw Prize.

Princeton University alumnus David Donoho, the Anne T and Robert M Bass Professor of Humanities and Sciences and a professor of statistics at Stanford University, today was named the 2013 Shaw Laureate in mathematics. Awarded by the Hong Kong-based Shaw Foundation, the Shaw Prize honours recent breakthroughs by active researchers in the fields of mathematics, astronomy and life and medical sciences.

Donoho, a member of Princeton's Class of 1978, was recognised for his work to get a more detailed analysis out of large numerical data sets. Specifically, the prize citation noted his "profound contributions to modern mathematical statistics and in particular the development of optimal algorithms for statistical estimation in the presence of noise and of efficient techniques for sparse representation and recovery in large data sets."

Donoho graduated from Princeton with a bachelor's degree in statistics and completed his doctorate at Harvard University in 1983. Donoho joined the Stanford faculty in 1990 after six years on the faculty of the University of California-Berkeley.

In addition, Steven Balbus, who shared the 2013 Shaw Prize in astronomy, was a Princeton postdoctoral researcher in astrophysics before joining the University of Virginia faculty in 1985. In 2010, Balbus served as the department's Bohdan Paczynski faculty visitor.

The Shaw Prize was founded in 2002 by media mogul and philanthropist Run Run Shaw and includes a $1 million award. The 2010 Shaw Prize in astronomy went to Princeton faculty members Lyman Page, the Henry De Wolf Smyth Professor of Physics, and David Spergel, the Charles A Young Professor of Astronomy on the Class of 1897 Foundation, for their leadership of the Wilkinson Microwave Anisotropy Probe (WMAP) experiment.
8. Gauss Prize 2018.
8.1. The Gauss Prize for 2018.

The Gauss Prize was awarded at the ICM for the first time in 2006 and is now awarded at every ICM. It was established by the International Mathematical Union and the Deutsche Mathematiker-Vereinigung. The Gauss Prize for 2018 was awarded to David Donoho:-
... for his fundamental contributions to the mathematical, statistical and computational analysis of signal processing.
The Gauss Prize is to honour scientists whose mathematical research has had an impact outside mathematics - either in technology, in business, or simply in people's everyday lives.

The award ceremony for the 2018 Gauss Prize took place at 7:30 pm on Saturday 4 August before the Social Dinner.

8.2. Citation for David Donoho.

Large amounts of data are today generated at an ever-accelerating pace. Processing them by sampling, compression and denoising has become an essential undertaking.

David Donoho has throughout his remarkable research career helped us make sense of data, which often is in the form of signals and images. His research transcends boundaries between mathematics, statistics and data science. The contributions range from deep mathematical and statistical theories to efficient computational algorithms and their applications.

Already in his early research Donoho was reaching outside the mainstream of classical applied mathematics and statistics. He understood the importance of sparse representation and optimisation in signal processing. He also recognised the power of wavelet type representations for a variety of tasks in signal and image analysis. One example is his work with Johnstone, where they exploit sparsity in wavelet representations together with soft thresholding for enhanced signal estimation and denoising.

The development, analysis and application of curvelets by Donoho and collaborators introduced another powerful tool in sparse image representation. While wavelets can be seen as a generalisation of delta functions and Fourier expansions by using a basis that represents both location and frequency, curvelets go further by adding localisation in orientation. Much of what wavelets do for one-dimensional signals curvelets can do for multi-dimensional data. Efficient representation and processing of images with edges are natural and successful applications.

Compressed sensing is a technique for efficiently sampling and reconstructing a signal by exploiting sparsity in an incoherent representation and thus beating the classical limit on the required sampling rate imposed by the Nyquist-Shannon sampling theorem. This technique has, for example, been applied to shorten magnetic resonance imaging scanning sessions. Donoho and collaborators have contributed to developing and refining this powerful theory. He showed that one could solve some types of underdetermined linear systems via L^{1}-minimisation provided that the solution is sufficiently sparse. He derived the existence of sharp transitions for the recovery of sparse signals from special kinds of random measurements.

David Donoho stands out in his ability to bring together pure and applied aspects of mathematics and statistics. He has had a fundamental influence by his original research, and also by his writing and mentoring of students and postdocs.

8.3. Essay on David Donoho's contributions.

Magnetic Resonance Imaging scans (MRIs) are crucial to high-tech medical care: Their three-dimensional view inside your body allows doctors to spot an aneurysm on the edge of bursting, to fly through your brain to plan a surgery, or to pinpoint a hairline crack in a bone - all with nary a scalpel or dose of radiation. However, when you are the patient getting scanned, MRI scans impose very low-tech demands. You are asked to lie perfectly still for as much as an hour, inside a cramped tube that clangs and thumps.

But now the seemingly endless scanning process is about to go by ten times faster, thanks to a new generation of MRI scanners now entering clinical use. The new technology can save time and money on the 80 million MRIs performed each year globally, and will make MRIs practical for fidgety children who can't stay still for long scans. The speed-up also enables ambitious 3D scans and MRI "movies" of the beating heart.

The engineers and doctors who brought these new devices to market were inspired by insights crystallised in mathematics journals in 2006. Those insights today go by the name "compressed sensing," a term coined by the 2018 Gauss Prize winner David Donoho. The Gauss Prize recognises mathematical work with impacts beyond mathematics.

Along with Emmanuel Candès, Terence Tao and other mathematicians, Donoho published mathematical analyses in the mid-2000's showing that compressed sensing (CS) might work practically, and proposed algorithms that were successful enough to inspire further research. A massive outpouring of mathematical and experimental work soon followed, with applied mathematicians, harmonic analysts, and information theorists pushing the theory; numerical analysts and computer scientists creating fast algorithms to enable computation; and MRI researchers adding their own profound understanding of MR physics and many additional creative insights.

FDA approval of new medical devices sets a high bar for any would-be innovation. Yet FDA approvals for compressed-sensing devices were reached in 2017, barely a decade after initial appearance of CS articles in mathematics journals. That's impact!

I. Sparsity.

The 2018 Gauss Prize winner Donoho was one of the first researchers to develop math describing signals that are sparse. Such signals are zero most of the time, with occasional non-zero wiggles. Examples are all around you. Think of the night sky: an occasional star (represented, say, by a very lonely "1") punctuates the vast blackness (represented by a sea of "0"). Think of the human genome: two people differ only once every 300 nucleotides.

Donoho first encountered sparsity just after university, while working in oil exploration. To find oil deep underground, geophysicists would set off explosions, sending seismic waves into the earth. Each time the wave hit a new rock layer, it sent back an echo; from the echo signal, the scientists could reconstruct an image of the layers below. The seismic echo series was sparse because layer changes were relatively rare.

II. L^{1}-norm

At age 21, Donoho stumbled on a puzzle that marked his career. In those days, raw seismic measurements only offered a vague, blurry sense of where the rock layers were. But geophysicists had developed methods that seemed amazingly effective at "deblurring" the signal and identifying the layer changes precisely.

Those methods measured distance in a nontraditional way. Ever since foundational work by the mathematical giant C.F. Gauss - as in Gauss Prize - scientists traditionally use the so-called L^{2} distance in data processing. This distance is also called the crow's-flight or Euclidean distance because it measures the length of a path that goes straight between two points, like in high school geometry. However, the surprising new methods used instead the "L^{1} norm", also called the Manhattan distance, because it measures how many city blocks you would walk if you have to travel between two points on a rectangular grid of city streets (diagonals like Broadway not allowed!).
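
As a tiny numerical illustration of the two distances (the points and the library used are arbitrary choices for this sketch):

import numpy as np

p, q = np.array([1.0, 2.0]), np.array([4.0, 6.0])
print(np.linalg.norm(p - q))         # L^2, crow's-flight distance: 5.0
print(np.linalg.norm(p - q, ord=1))  # L^1, Manhattan distance: 3 + 4 = 7.0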

There was something mysteriously effective in this combination of sparse signals with the L^{1} norm - but no one knew why. When Donoho returned to graduate school for his Ph.D., he was determined to solve the puzzle. In the coming years, he developed mathematical theory showing the unreasonable effectiveness of the L^{1} norm with sparse signals.

Some of the phenomena seemed miraculous. He first used L^{1} + sparsity techniques to recover a sparse signal that has been blurred in an unknown, arbitrary way (today called 'blind deconvolution'). He next used them to recover totally missing data. Often in signal processing, some part of a signal can go missing - think of an old acoustic recording with no highs or lows. Donoho, Philip Stark and Ben Logan showed that for certain special signals - sparse ones - L^{1} + sparsity techniques could perfectly recover missing low-frequency signal. In other work, Donoho and collaborators Jeffrey Hoch and Alan Stern developed L^{1} + sparsity techniques to recover missing high-frequencies - in acoustic terms, missing 'high notes'.

L^{1} + sparsity also allowed 'de-noising' of signals: if you add noise to a sparse signal and then look at the plot, you will see 'daisies' - signal - sticking up above 'weeds' - noise; L^{1} minimisation gives a way of chopping out the weeds while keeping the daisies. Donoho and Iain Johnstone showed that for sparse signals this was essentially optimal.

III. Harmonic Analysis

The 80's/90's 'wavelet revolution' in applied mathematics further transformed Donoho's thinking. At the time, computational harmonic analysts such as 2010 Gauss Prize Winner Yves Meyer and collaborators Ronald Coifman, Ingrid Daubechies, and Stephane Mallat were building many new tools for mapping digital signals into more useful forms. Their new wavelet transforms literally blew Donoho's mind. Transforming digital data using these new tools revealed that sparsity was everywhere - in images and other media we now use daily. To Donoho's mind, this dramatically expanded the stage for applications of L^{1} + Sparsity.

Donoho's mathematical results placed a premium on being as sparse as possible. They drove him to 'sparsify' even better than wavelets could - where possible. He searched for systems 'beyond wavelets' that would expose the hidden sparsity of geometric phenomena such as edges, sheets, and filaments in images. His collaborators Emmanuel Candès and Jean-Luc Starck were soon also aiming beyond wavelets, for "curvelets", "beamlets", and other 'X'-lets.

Donoho worked to sparsify signals even more by combining several different systems of harmonic analysis. For example, a sine wave contaminated with several spikes would not be sparse under traditional Fourier analysis, but it could be sparsely synthesised using both Fourier analysis and wavelets together. With collaborators Michael Saunders and Scott Chen, he developed an algorithm called Basis Pursuit to solve the synthesis problem by minimising the L^{1} norm. Its success seemed miraculous because the task - solving a system of underdetermined equations, and algorithmically getting the sparsest possible answer - seemed forbiddingly complex. With collaborators Xiaoming Huo, Michael Elad and Vladimir Temlyakov, Donoho gave a series of foundational mathematical results showing that the L^{1} norm could truly find the sparsest possible such synthesis.
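
The following sketch shows the standard linear-programming reformulation behind Basis Pursuit on a toy 'cosine plus spikes' signal. It is only an illustration of the idea; the dictionary, sizes and names are assumptions made for this example, not details taken from the Chen-Donoho-Saunders paper.

import numpy as np
from scipy.fft import idct
from scipy.optimize import linprog

n = 64
dct_atoms = idct(np.eye(n), axis=0, norm="ortho")   # columns are cosine atoms
dictionary = np.hstack([dct_atoms, np.eye(n)])      # overcomplete: cosines plus spikes

# Toy signal: one cosine atom plus two spikes - sparse in the combined dictionary,
# but not sparse under Fourier analysis alone.
signal = dct_atoms[:, 5].copy()
signal[10] += 2.0
signal[40] -= 1.5

# Basis Pursuit: minimise the L^1 norm of the coefficients subject to exact synthesis.
# Splitting coefficients into positive and negative parts gives a linear programme.
m = dictionary.shape[1]
result = linprog(c=np.ones(2 * m),
                 A_eq=np.hstack([dictionary, -dictionary]),
                 b_eq=signal,
                 bounds=[(0, None)] * (2 * m))
coefficients = result.x[:m] - result.x[m:]
print("nonzero coefficients found:", np.sum(np.abs(coefficients) > 1e-6))  # typically 3 here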

IV. Compressed Sensing

The 3 strands of research described in Sections I-III above converged in the mid-2000's to produce Compressed Sensing, the mathematical theory that inspired those fast MRI's that are now coming to market. The sparsity of images when viewed in the wavelet basis; the use of the L^{1} norm; the use of underdetermined equations - all three ingredients came together in work by Donoho and by Candès, Romberg, and Tao, mentioned earlier. Their mathematical analyses showed clearly why all three ingredients must be present to allow speed ups, and how, under certain assumptions, this combination is guaranteed to work. Such clear mathematical understanding was transformational, and inspired rapid progress, in MRI research, and elsewhere.
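
A minimal numerical sketch of the compressed-sensing phenomenon, under the simplest assumptions (a generic sparse vector and Gaussian random measurements, not an MRI model; all sizes are arbitrary): far fewer measurements than unknowns can still allow exact recovery by L^{1} minimisation.

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
n, m, k = 200, 60, 8                        # unknowns, measurements, nonzeros
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)    # random "sensing" matrix, m much less than n
b = A @ x_true                              # the undersampled measurements

# Recover by minimising the L^1 norm of x subject to A x = b,
# using the same positive/negative split as in the Basis Pursuit sketch above.
result = linprog(c=np.ones(2 * n),
                 A_eq=np.hstack([A, -A]),
                 b_eq=b,
                 bounds=[(0, None)] * (2 * n))
x_hat = result.x[:n] - result.x[n:]
print("maximum reconstruction error:", np.max(np.abs(x_hat - x_true)))

With these proportions of sparsity to measurements the recovery is exact up to solver tolerance; increasing the number of nonzeros or reducing the number of measurements eventually crosses the sharp phase transition that Donoho and Tanner characterised.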

V. Unreasonable Effectiveness

Since his university days, Donoho has believed that mathematicians would contribute during the information era by providing new models for data, new processing algorithms, and subtle but powerful theoretical insights. And he himself has done all three.

What Donoho did not know as a youngster, and could not have known, is that the continuing growth of pure mathematics would be so important. For example, in his own work on compressed sensing, Donoho found that essential roles were played by the theories of random matrices, of high-dimensional Banach spaces, of random convex polytopes, and of mathematical spin glasses - in all cases, pure mathematics unrelated to signal processing and much younger than Donoho himself!

50 years ago, Eugene Wigner coined the phrase 'unreasonable effectiveness of mathematics' to refer to the surprising tendency of pure mathematics to inspire practical applications.

If, some day, you enjoy a fast MRI scan, you may perhaps also remember Wigner's dictum!

8.4. David Donoho's Gauss Award.

David Donoho is the recipient of the 2018 Carl Friedrich Gauss Prize, the major prize in applied mathematics awarded jointly by the International Mathematical Union (IMU) and the German Mathematical Society (DMV). Bestowed every four years since 2006, the prize honours scientists whose mathematical research has generated important applications beyond the mathematical field - in technology, in business, or in people's everyday lives - and this award acknowledges David's impact on a whole generation of mathematical scientists.

David Donoho was commended by the IMU President Shigefumi Mori for his "fundamental contribution to mathematics" during the opening ceremony of ICM 2018 in Rio de Janeiro, Brazil.

After the award was announced, David spoke of the joy he has experienced when theories he has developed earlier in his career are applied to everyday life. "There are things I've done decades ago, and when I see things happen in the real world, it makes me so proud. The power we have in moving the world gives me a great deal of satisfaction in my career choice."

He said that a career in mathematics is not limited to pure mathematics, and publication in journals. "There are so many relations between mathematics and the rest of the world. We see more and more relations over time; so much in the modern world is underpinned by mathematics," he said, citing the example of smartphones, and the vast level of mathematical fundamentals intertwined, such as prime factorisation.

David Donoho, who was born in California in 1957, dedicates his professional life to statistics, information theory and applied mathematics. He has made fundamental contributions to theoretical and computational statistics throughout his career, as well as to signal processing and harmonic analysis. His algorithms have made significant contributions to the understanding of the maximum entropy principle, the structure of robust procedures, and sparse data description.

David Donoho is the Anne T and Robert M Bass Professor in the Humanities and Sciences, and Professor of Statistics, at Stanford University; he previously taught at UC Berkeley, and he holds a summa cum laude degree from Princeton University, as well as a PhD in statistics from Harvard University. He has worked in various industries, including oil exploration, information technology, and quantitative finance. He has previously been awarded the MacArthur Fellowship (1991), the COPSS Presidents' Award (1994), the Norbert Wiener Prize (2010), and the Shaw Prize (2013).

The Gauss prize is a tribute to the German mathematician Carl Friedrich Gauss (1777-1855), who made important contributions to number theory, statistics, mathematical analysis, differential geometry, geophysics, astronomy and optics.

8.5. Excerpt from the Gauss Lecture by David Donoho.

Here, I review a June 2017 Congressional Briefing in Washington, organised by the American Mathematical Society and the Mathematical Sciences Research Institute. That briefing has been described at length in an article published in January 2018 [From Blackboard to Bedside: High-dimensional Geometry is Transforming the MRI Industry - Aide Mémoire for a Congressional Briefing, Notices of the AMS 65 (1) (2018)]. The interested reader might consult that article before proceeding.

The Congressional Briefing took place in an atmosphere of uncertainty for US Federal policy, where there was some possibility that funding for US Mathematics research might see noticeable declines under the budgets soon to be put forward by the new administration and congressional majority.

David Eisenbud, director of MSRI, had in mind that I review, for an audience of Congresspersons, staff and science officials, a recent transition from research to practice, showcasing the contribution that Mathematical research can make to issues of national concern.

I told the audience on Capitol Hill how the US Food and Drug Administration had very recently (Spring 2017) given bioequivalence certification to a new generation of Magnetic Resonance Imaging (MRI) scanners that the major manufacturers were each bringing to market. Depending on the application, the new generation of scanners can get equivalent images in much less time - in some cases one-tenth of the time. Applications I mentioned included:

- Imaging fidgety children in much less time, without sedation and without ionising radiation.

- Commercialising new diagnostics, for example dynamic images (movies) of the beating heart.

- Enabling new clinical procedures - for example MRI guided biopsy - and advanced surgical interventions.

The individual examples that I showed really only scratch the surface; the industrial developers of the new technology envision it spreading, over time, throughout the whole range of MRI applications. MRI imaging is today a large field: there are 40 million MRI scans annually in the US alone, and 80 million globally. We can envision that as the world grows more wealthy and developed there will eventually be billions of MRI scans yearly, all of which are made much less expensive and much more convenient by these speedups. While many of these scans might indeed be looking at knees, hips, and ankles of ageing wealthy Americans and Europeans, many others will be peering deep into the brains of needy younger patients, as MRI imaging is also essential for modern Neurology and Neurosurgery.

The interested viewer might watch the documentary The English Surgeon about the UK Neurosurgeon Henry Marsh's travels to perform surgery in less advanced countries. In a wrenching pivotal scene, Marsh diagnoses what will inevitably become permanent blindness in a young girl who did not receive an MRI scan when it would still have been helpful. Marsh remarks that the young girl will live forever after with the consequences of lack of availability of prompt MRI scanning in her country. As MRI becomes more available and cheaper globally, we can expect such human tragedies to diminish in frequency.

In marketing the new generation of scanners, the manufacturers all say they are using Compressed Sensing, a term of art introduced by mathematicians a little over 10 years ago.

In a series of papers submitted starting in 2004, mathematicians put forward theorems showing (under a variety of assumptions) that one does not need to actually make one million measurements to reconstruct a million-pixel image.

Such theorems inspired efforts by many engineers, doctors and physicists to develop new algorithms in some way related to the mathematics, to run experiments testing the new algorithms, and to present their work at major conferences. ... In some individual research hospitals, such as Lucile Packard Children's Hospital, faster MRI scanning entered regular use within a few years. Rolling out a technology for widespread use in the US medical market entails regulatory scrutiny from the US Food and Drug Administration. In the present case, the transition to the marketplace has happened in not much more than 10 years.

In my talk, I made the point that in this instance, the Federal research funding system worked exactly as lawmakers had intended. I specifically mentioned Federal funding of individual investigators that supported the many mathematicians and engineers who worked to penetrate the mysteries of randomly-undersampled measurements; I pointed to Michael Lustig, who in 2004 was a Federally-supported graduate student in EE at Stanford but is today a Federally-supported EECS Professor at UC Berkeley; and to Emmanuel Candès, Justin Romberg, and Terence Tao, who in 2004 collaborated on some of the early and inspiring results, while Candès had an office on the UCLA campus at IPAM, one of the NSF-funded Mathematics Institutes, during an IPAM long program on multiscale geometric analysis.

I also tried to make the point that 'serious' mathematics was involved. I formulated the intellectual question in the following terms: Consider a million-dimensional vector which has 10,000 entries scattered at random among many zeros. Can one reconstruct this vector from only 100,000 measurements rather than 1,000,000? If the measurements are in an appropriate sense random, and we reconstruct by L^{1}-minimisation of the reconstructed vector, this is the same thing as asking if a random 900,000-dimensional linear subspace typically intersects a certain simplicial cone transversely.

An instructive but very elementary version of the same problem is to ask if a random 1-dimensional linear subspace of \mathbb{R}^{N} has nontrivial intersection with a simplicial cone. In the N = 3 dimensional case, this amounts to asking if a point randomly distributed on the surface of a sphere lies inside a given spherical triangle. This of course is the ratio of the area of the triangle to the surface area of the sphere.

Gauss himself gave us the formula that solves this elementary problem; he showed that the area of a spherical triangle is given in terms of the sum of the 3 interior angles, which will not be 180 degrees owing to the curvature of the sphere.
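
In modern notation, the spherical-excess formula for a triangle with interior angles \alpha, \beta, \gamma on a sphere of radius R, together with the probability discussed above, reads:

\text{Area} = R^{2}(\alpha + \beta + \gamma - \pi), \qquad
\Pr(\text{random point lies in the triangle}) = \frac{\text{Area}}{4\pi R^{2}} = \frac{\alpha + \beta + \gamma - \pi}{4\pi}.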

Today there is a field called combinatorial geometry which develops generalisations of Gauss' formula, involving interior and exterior angles of polyhedral cones and which allows us to solve the more ambitious problem involving 1,000,000-dimensional simplicial cones, mentioned above. Such mathematical results allow us to see that, for the combination of parameters outlined above, the probability is effectively 1.0 that the subspace and cone intersect transversely! ... In short, with overwhelming probability, we can recover such a vector from 10× fewer measurements than one might have naïvely supposed.

8.6. Extract from Donoho's Gauss Lecture about ICMs.

The ICM.

With the 2018 ICM, I have now attended 4 ICM's spanning 32 years (1986, 1994, 2002, 2018). During that period, this event has changed utterly. The 1986 and 1994 ICM's were held on university campuses and so had the flavour of traditional academic meetings. The opening ceremonies were straightforward affairs. The 2018 Rio ICM was held in a convention centre away from any university, and the opening ceremony was polished in a way that would have been inconceivable in 1986. The ceremony had extensive professional video presentations of the Fields Medalists and even entertainment with Samba Dancers from Brazil. The new persona is more fitting to today's situation. The Fields Medals are now a global brand.

What has stayed constant, in my view, is the quality of the ICM presentations. I remember very clearly some of the talks from the 1986 Congress; they were given by deservedly prominent figures and for me were era-defining. The talks I saw at the 2018 ICM met the same standard.

The Gauss Medal, and the Gauss Prize.

At the 1986 ICM, I remember hallway discussions about the fact that Applied Mathematics was not traditionally recognised at events like the ICM, for example no Applied Mathematician had ever been awarded a Fields Medal. The question young people were asking at the time was whether a young person should go into applications when there was no recognition. Apparently young people are very attuned to cues provided by who gets recognised.

This supposed situation of Applied Mathematics was partially addressed in 1994, with the award of the Fields Medal to Pierre-Louis Lions. It has now been directly addressed with the creation of the Gauss prize of the IMU, that directly recognises impactful work in applications.

During the Rio Congress, I learned that Martin Grötschel had the idea and took the initiative to found the Gauss Prize, using the support of the German Mathematical Society, DMV. In a sense, the previous situation - of the supposed lack of importance of Applied Mathematics - was not some inherent fact about mathematics. It was simply due to the fact that no-one previously had Grötschel's organisational skills, energy, and clear vision. Take note.

The Gauss Prize medal has a story relevant to the 2018 Lecture. The artist Jan Arnold actually undersampled the image of Gauss! At the IMU website we read
Dissolved into a linear pattern, the Gauss effigy is incomplete. It is the viewer's eye which completes the barcode of lines and transforms it into the portrait of Gauss.
This really caught my attention, because the same scheme of undersampled vertical lines is precisely the k-space sampling pattern that is often used with compressed sensing. Here art anticipates science!

8.7. Extract from Donoho's Gauss Lecture about his collaborators.

My Collaborators.

I have had wonderful collaborators. In the context of the ICM, note that my two close collaborators, Iain Johnstone and Emmanuel Candès, have each given Plenary Addresses at the ICM (2006 and 2014, respectively). They are each very serious and powerful scientists in their own right. I am very lucky to have been able to work with them.

This is only the start. At the 2018 ICM itself, my co-author Raphy Coifman (Yale) gave a plenary lecture and my co-author Andrea Montanari (Stanford) gave an invited talk, as did my former PhD student Noureddine El Karoui (UC Berkeley). It is my great good fortune to have been able to interact with, and learn from, such leading figures. The broad range of work these different scientists presented was very striking.

The Long Tail.

The professional writers working with the ICM to prepare profiles of the Fields, Nevanlinna, Chern and Gauss awardees are inclined to portray these otherwise fallible individuals as heroes. But in the case of the Gauss Award this idea is poorly conceived.

An award for applications, one that specifically recognises real-world impact, ought also to recognise that applications only arise from a 'long tail' of mathematical scientists who, while not professional mathematicians themselves, are very mathematically minded and talented, and are much more applications-facing than any mathematician would be. Such a long tail needs to be inspired and to be creative and driven in its own right, or no impact will result.

In this case, clearly Michael Lustig and John Pauly have distinguished themselves as electrical engineers with exceptional feel for mathematics and its proper deployment into MRI, and exceptional capacity for being inspired by mathematics. Lustig mentions several other MRI researcher colleagues who did major work translating Compressed Sensing into practice, including Tobias Block, Zhi-Pei Liang, Ricardo Otazo, Alexey Samsonov, Daniel Sodickson, Joshua Trzasko, and Julia Velikina. Doubtless there are many others that could also be mentioned.
9. IEEE Jack S Kilby Signal Processing Medal 2022.
9.1. Jack S Kilby Signal Processing Medal 2022.

David Donoho was awarded the Jack S Kilby Signal Processing Medal by the Institute of Electrical and Electronics Engineers:-
... for groundbreaking contributions to sparse signal recovery and compressed sensing.
The medal is named for Jack St Clair Kilby (1923-2005), an American electrical engineer who, together with Robert Noyce of Fairchild Semiconductor, took part in the realisation of the first integrated circuit while working at Texas Instruments in 1958. The Kilby Award Foundation was founded in 1980 in his honour, and the IEEE Jack S Kilby Signal Processing Medal was created in 1995.

9.2. Citation for David Donoho.

David L Donoho's groundbreaking work in sparse signal recovery and compressed sensing revolutionised signal processing and helped change the way engineers think about data acquisition, profoundly impacting fields ranging from wireless communications to medical imaging. Donoho's early work on blind deconvolution showed that sufficiently non-Gaussian signals (sparse signals) can be recovered despite blurring by an unknown filter, which has been applicable to oil exploration, image processing, and wireless communications. He introduced the celebrated wavelet shrinkage algorithm with Iain Johnstone, which became one of the most important methods for separating sparse signals from noise. This work has very concrete significance for signal estimation and has impacted a number of applied fields, including astronomy. Donoho realised that transforming digital data using the wavelet transform and other tools from applied harmonic analysis revealed that sparsity was everywhere - in images and other media we routinely use - and that enhanced sparsity leads to enhanced estimation, giving us far sharper signals and images to work with. His work on compressed sensing demonstrated that one can exploit sparsity or compressibility when acquiring signals of general interest, and that one can design non-adaptive sampling techniques that condense the information in a compressible signal into a small amount of data. The medical imaging research community has found ways to use the technology to speed up and improve the quality of medical imaging for millions of patients. Compressed sensing has impacted magnetic resonance imaging (MRI) by enabling scan times to be accelerated ten-fold, and a new generation of MRI scanners based on this technology has entered clinical use. Compressed sensing is also being used to improve radio intelligence gathering capability by orders of magnitude, which has impacted the development of radio-frequency sensing and spectral applications over bandwidths exceeding multiple GHz for scientific instrumentation and electronic intelligence.

An IEEE Fellow and member of the U.S. National Academy of Sciences, Donoho is the Anne T and Robert M Bass Professor of Humanities and Sciences and professor of statistics with the Department of Statistics at Stanford University, Stanford, CA, USA.

Last Updated June 2024