BY ROBERT M. GRAY
THOMAS M. COVER, one of the past half-century’s most brilliant and prolific contributors to information and communications theory, pattern recognition and learning, and the analysis of gambling and investment strategies, died on March 26, 2012, at the age of 73.
Tom was born on August 7, 1938, in San Bernardino in California’s “Inland Empire.” His interest in sports and games developed early; he played champion-level tennis and Little League baseball, including a trip to the 1951 Little League World Series (where his team came in fourth).
In high school his interests expanded to nonathletic games and probability, and he learned to play poker. These interests presaged a lifelong love of sports and his eventual interest in the statistical analysis of games and sports and in algorithms for predicting and betting on random phenomena.
Tom received his BS in physics from MIT in 1960 and then moved to Stanford for an MS in 1961 and a PhD in 1964 (under Norm Abramson), both degrees in electrical engineering. While a graduate student he became interested in statistics and the theory of games.
Perhaps as a practical application, he and several friends learned the counting technique for playing blackjack from a preview copy of Edward Thorp’s book Beat the Dealer: A Winning Strategy for the Game of Twenty-One. They became sufficiently adept to become officially unwelcome in Nevada casinos. This anecdote highlights Tom’s early and deep expertise in probability theory and its application, and his interest in algorithms for winning at stochastic games.
In 1964 he joined the faculty of Stanford University as an assistant professor of electrical engineering (EE), in 1967 he became a tenured associate professor, and in 1971 he was awarded a joint appointment in EE and statistics. He remained at Stanford for his entire career, becoming the endowed Kwoh-Ting Li Professor of Engineering, Electrical Engineering, and Statistics in 1994.
He also had visiting appointments at MIT and Harvard and served as a consultant to SRI, the RAND Corporation, AT&T Bell Labs, and the California State Lottery. His dissertation and subsequent early-career work on systems of linear inequalities describing networks of linear threshold devices laid a foundation for the study of artificial neural networks and their application to pattern recognition. A specific concern, described in his dissertation abstract, was the ability of such systems “to generalize with respect to past data”; this later became known as statistical or machine learning.
His work with his first PhD student, Peter Hart, yielded a widely acknowledged classic paper on nearest neighbor pattern classification, demonstrating that the simple nearest neighbor rule, fundamental to communications and statistics, provides performance close to that of the optimal detector. This technique remains one of the simplest and most powerful approaches to pattern recognition, classification, detection, and learning in engineering systems and other applications that make inferences from data.
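The nearest neighbor rule that Cover and Hart analyzed is simple enough to sketch in a few lines. The function and toy data below are illustrative, not drawn from their paper; they show the rule's core idea of classifying a query by the label of its closest training example.

```python
import math

def nearest_neighbor_classify(train, query):
    """Return the label of the training point closest to `query`.

    `train` is a list of (feature_vector, label) pairs; distances are
    Euclidean. Cover and Hart showed that, given enough data, this
    rule's error rate is at most twice the optimal (Bayes) rate.
    """
    best_label, best_dist = None, float("inf")
    for features, label in train:
        dist = math.dist(features, query)  # Euclidean distance
        if dist < best_dist:
            best_dist, best_label = dist, label
    return best_label

# Toy data: two well-separated clusters in the plane.
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"),
         ((3.0, 3.0), "B"), ((2.9, 3.1), "B")]
print(nearest_neighbor_classify(train, (0.2, 0.1)))  # -> A
print(nearest_neighbor_classify(train, (3.2, 2.8)))  # -> B
```

The appeal of the rule is exactly what made the Cover-Hart result striking: no model is fitted and no distribution is estimated, yet the asymptotic error is within a factor of two of the best any classifier could achieve.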
His paper “Broadcast Channels,” published in 1972, announced a new direction for his work and spawned a widespread shift in emphasis of Shannon information theory from the classic single-user point-to-point communications systems to multiuser systems.
In 1948 Claude Shannon had revolutionized the theory and eventual practice of electronic communications by quantifying the optimal performance of probabilistic models of communication systems; these results led decades later to the modern digital communication revolution.
Although Shannon emphasized single-user systems, he introduced the idea of the more general systems that would be required in networks of users—but he did not extend his basic results to such systems. Nearly a quarter-century later, Cover’s broadcast channel was the first major breakthrough in this area, providing theoretical performance bounds and actual constructions for good codes in a system involving a single transmitter with multiple receivers.
As in Shannon’s original work, it was clearly demonstrated that sophisticated coding could provide better performance than naïvely applying traditional methods. Tom’s award-winning paper is a classic both in Shannon information theory and in the theory and practice of multiuser communication systems, such as wireless and satellite. The simple model provided a good fit for certain practical systems and inspired many extensions and variations suitable for modern communication systems.
Tom’s next major shift in direction began soon after, with his interest in Kolmogorov/Chaitin/Solomonoff complexity as a complement to Shannon information theory and the implications for universal coding and universal gambling schemes, algorithms with predictable and nearly optimal performance in incompletely known statistical environments. It should be emphasized that as Tom moved into new areas, his contributions to his earlier interests continued with new insight and a historical appreciation of the development of the fields. As an illustrative example, his 1979 paper with Abbas El Gamal on capacity theorems for the relay channel became his second most cited paper.
The next year he published with El Gamal the highly cited survey “Multiple User Information Theory” in the Proceedings of the IEEE (reprinted in Multiple Access Communications: Foundations for Emerging Technologies, 1993, ed. Norm Abramson). Tom became and remained a primary expositor of open problems in information theory. In the late 1970s, Tom and his students published papers on sports statistics and on applications of gambling ideas to the estimation of random phenomena.
In 1980 he and his collaborators began a series of publications applying variations of these ideas to investment portfolios, building on the obvious similarity between gambling and investing. In 1982 Tom, again working with El Gamal, published the foundational results of multiple description coding.
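The gambling-investment analogy can be made concrete with a constant-rebalanced portfolio, the building block of the log-optimal growth ideas Cover pursued. The function and market data below are a hypothetical illustration of the principle, not a reconstruction of his published results.

```python
def crp_growth(price_relatives, weights):
    """Wealth multiple earned by a constant-rebalanced portfolio.

    `price_relatives[t][i]` is asset i's price ratio over period t;
    `weights[i]` is the fixed fraction of wealth rebalanced into
    asset i at the start of each period. Each period the portfolio's
    wealth is multiplied by the weighted sum of the price relatives,
    mirroring how a gambler's bankroll compounds under repeated bets.
    """
    wealth = 1.0
    for period in price_relatives:
        wealth *= sum(w * x for w, x in zip(weights, period))
    return wealth

# Two assets: one alternately doubles and halves; the other is cash.
periods = [(2.0, 1.0), (0.5, 1.0)] * 5

print(crp_growth(periods, (1.0, 0.0)))  # all-in on the volatile asset: 1.0
print(crp_growth(periods, (0.5, 0.5)))  # 50/50 rebalanced: (1.5 * 0.75)**5
```

Holding the volatile asset alone ends exactly where it started, while rebalancing to 50/50 each period grows wealth by a factor of 1.125 per up-down cycle: a compact example of why treating investing as repeated, proportion-based gambling can beat buy-and-hold.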
These are codes designed for communication systems such as packet networks where information is spread in pieces over many channels and the receiver must reconstruct signals based on some of the pieces (e.g., packets that are not lost). Again, Tom’s work involved elegant models and analysis and proved seminal to a new field of theory and application of fundamental importance to the emerging area of network communications.
Tom put his expertise in the theory of gambling and his skills in statistics to practical use as a contract statistician for the California State Lottery from 1986 to 1994, designing tests for lottery balls and wheels, analyzing the payoff structure, and seeking vulnerabilities to fraud. Throughout his career, Tom developed many new simple proofs for famous results, both on his own and in collaboration with others.
The associated papers were marvels of pedagogy, enhancing the understanding and appreciation of many tools of information theory and more general probabilistic analysis. His pedagogical talents reached their apex in his 1991 book Elements of Information Theory, coauthored with Joy A. Thomas, a book that quickly became and remains the best-selling text by far on the subject. Tom’s list of professional awards began in 1974 with the IEEE Information Theory Group Outstanding Paper Award for “Broadcast Channels.”
He became a Fellow of the IEEE (1974), the Institute of Mathematical Statistics (1981), and the American Association for the Advancement of Science (1991). In 1990 he was selected as the IEEE Claude E. Shannon Lecturer, considered the highest honor for contributions to information theory, and in 1997 he received the IEEE Richard W. Hamming Medal “for fundamental contributions to information and communication theory, statistics and pattern recognition.”
He was elected to the National Academy of Engineering in 1995 and served on the organizing committee for the National Research Council’s Workshop on Statistical Analysis of Networks, the Committee on Applied and Theoretical Statistics, and a joint Academies Briefing Panel on Pattern Recognition. In 2003 he was elected to the American Academy of Arts and Sciences. Tom was a natural raconteur who enjoyed sports both as a participant and as a spectator. His wit was famous and his laughter infectious.
His ability to grasp the essence of an issue was as evident in nontechnical conversations and committee meetings as it was in his mathematics, and his humor had the added advantage in committees of speeding convergence and smoothing the path. His talks were thoughtful and thought provoking, his lectures informative, popular, and entertaining. His many students became outstanding engineers, statisticians, and teachers, and Tom’s reputation as a mentor and fount of ideas is unequaled in his professional sphere. His loss was shockingly sudden to us all and leaves a large hole in the lives of those who knew him. He was unique.
Tom is survived by his wife, Karen; son, Bill; daughter, Cindy Black; brothers Bill, Chuck, and John; and grandchildren Carolina, Gabriella (“Gailie”), Marina, Jon, Brian, and Laura.