BY ROBERT GALLAGER
SUBMITTED BY THE NAE HOME SECRETARY
PROFESSOR PETER ELIAS, probably the most important early researcher in information theory after Claude Shannon, died from Creutzfeldt-Jakob disease at his Cambridge, Massachusetts, home on December 7, 2001. His three children, Daniel, of Lincoln, Massachusetts; Paul, of Cambridge, Massachusetts; and Ellen Elias-Bursac, of Cambridge, Massachusetts, were with him. His wife, Marjorie (Forbes), predeceased him in 1993 after 43 years of marriage.
Pete was distinguished not only for his research but also for his leadership of the Electrical Engineering Department at the Massachusetts Institute of Technology (MIT) from 1960 to 1966, a crucial transition period when the emphasis changed from engineering practice to engineering science and when computer science was initially recognized as a central part of electrical engineering.
Among his many honors and awards, Pete was a fellow of IEEE, a charter fellow of the Association for Computing Machinery (ACM), and a fellow of the American Academy of Arts and Sciences. He was elected to the National Academy of Sciences in 1975 and the National Academy of Engineering in 1979. He received the Claude E. Shannon Award, the highest honor of the IEEE Information Theory Society, in 1977, and the Richard W. Hamming Medal, a major IEEE award, shortly before his death.

Pete was born on November 26, 1923, in New Brunswick, New Jersey, where his father was an engineer at the Thomas Edison Laboratory.
After two years at Swarthmore College, Pete transferred to MIT, where he received an S.B. in management in 1944. After serving as an instructor for radio technicians in the U.S. Navy for the remainder of World War II, he attended Harvard University, where he received a master’s degree in computation. While searching for a Ph.D. topic in 1948, Pete came upon Claude Shannon’s just-published masterpiece, “A Mathematical Theory of Communication,” and was hooked for life by its intellectual power and beauty. From the beginning, he realized that information theory provided the proper conceptual basis for digital communication, but that practical utilization would require much additional work.
After completing his Ph.D. thesis, Pete was appointed a Junior Fellow in the Harvard Society of Fellows and spent the next 3 years doing research on a wide variety of subjects. This included several pioneering papers on optical communication and some collaboration with Noam Chomsky on linguistic theory, but Pete’s interests were increasingly directed toward information theory. At the time, Bell Telephone Laboratories and MIT were the main centers of research on information theory, and Robert Fano at MIT persuaded Pete to become an assistant professor of electrical engineering at MIT in 1953.
Information theory created a heady atmosphere of intellectual beauty and importance that attracted the very best graduate students at MIT, and the next seven years were extremely productive for Pete as well as for information theory and MIT. The cornerstone of Shannon’s theory is an existence proof that data can be encoded to assure essentially error-free transmission over arbitrary noisy channels at any rate less than their capacity.
It would take another 40 years to learn how to reach capacity in practice, but Pete’s 1954 paper, “Error-Free Coding,”1 provided a major step in this evolution by developing the concepts of product codes and iterative decoding. The paper used these concepts to invent the first algorithm for achieving error freedom at a strictly positive transmission rate. Pete’s paper “Coding for Noisy Channels”2 was perhaps the most influential early information theory paper after Shannon’s original work.
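The product-code idea can be illustrated with the simplest possible components: a single parity bit on each row and each column of a data block. The sketch below is only a minimal illustration (function names are my own, not from Elias's paper): a single bit error is located at the intersection of the failing row check and the failing column check. Elias's construction went much further, iterating stronger component codes across many dimensions.

```python
# Toy product code: append a parity bit to every row and every column
# of a square block of data bits. Every row and column of the result
# has even parity, so one flipped bit is pinpointed by the unique
# failing row check and failing column check.

def encode(block):
    """Add a parity bit to each row, then a parity row over all columns."""
    n = len(block)
    rows = [row + [sum(row) % 2] for row in block]
    col_parity = [sum(rows[i][j] for i in range(n)) % 2 for j in range(n + 1)]
    return rows + [col_parity]

def correct_single_error(code):
    """Locate and flip a single erroneous bit via row/column parity checks."""
    bad_rows = [i for i, row in enumerate(code) if sum(row) % 2 == 1]
    bad_cols = [j for j in range(len(code[0]))
                if sum(code[i][j] for i in range(len(code))) % 2 == 1]
    if bad_rows and bad_cols:
        code[bad_rows[0]][bad_cols[0]] ^= 1
    return code
```

Decoding row and column codes alternately, and repeating, is the "iterative decoding" idea that resurfaced decades later in turbo and LDPC decoding.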
“Coding for Noisy Channels” provided three giant steps toward the central problem of reliable coding and decoding over noisy channels (here restricted to the simple but easily generalized case of binary symmetric channels). The first step was an upper bound on the probability of error, averaged over all codes of a given rate R and block length n. This was accompanied by a lower bound for the best code of given R and n. The upper and lower bounds were effectively the same and decreased exponentially in n.
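The exponential decay in block length n can be seen even in a toy code. The sketch below is illustrative only (a repetition code has rate shrinking to zero, unlike the fixed-rate codes in Elias's bounds): it computes the exact probability that majority-vote decoding of an n-bit repetition code fails on a binary symmetric channel with crossover probability p.

```python
from math import comb

def repetition_error_prob(n, p):
    """P(majority-vote decoding fails) for an n-bit repetition code on a
    binary symmetric channel with crossover probability p (n odd).
    Decoding fails when more than half the bits are flipped."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range((n + 1) // 2, n + 1))

# Error probability falls roughly exponentially with block length n:
for n in (1, 5, 11, 21):
    print(n, repetition_error_prob(n, 0.1))
```

For positive-rate codes the same qualitative picture holds; the exponent as a function of rate is what Elias's upper and lower bounds pinned down.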
These bounds showed that error probability is insensitive to code choice and that modest n could provide sufficient error freedom in practice. The second step was to show that parity check codes (a class of codes that are particularly simple to implement) are just as effective as arbitrary codes. The third step was the invention of convolutional codes, accompanied by a proof that they are at least as effective as the block codes of all earlier research. The majority of current practical coding systems have evolved through the use of convolutional rather than block codes.
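A concrete member of the parity-check class is Hamming's (7,4) code, which predates this work; Elias's result concerned the whole class, but the example below shows why such codes are "particularly simple to implement." With the standard parity-check matrix whose j-th column is the binary representation of j, the syndrome of a received word directly names the position of a single flipped bit (the code is a sketch for illustration, not from the paper).

```python
# Hamming (7,4) parity-check code: three parity checks over seven bits.
# Column j of H (1-based) is the binary representation of j, so the
# syndrome of a word with one flipped bit equals that bit's position.

H = [
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
]

def syndrome(word):
    """Return the syndrome as an integer 0..7 (0 means all checks pass)."""
    s = 0
    for row in H:
        s = 2 * s + sum(r * w for r, w in zip(row, word)) % 2
    return s

def correct(word):
    """Flip the single erroneous bit, if any (syndrome gives 1-based position)."""
    s = syndrome(word)
    if s:
        word = word[:]
        word[s - 1] ^= 1
    return word
```

Implementation cost is just a handful of parity computations per word, which is what made this class so attractive for early hardware.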
Other early papers that became classics were “Channel Capacity without Coding” and “List Decoding for Noisy Channels.” In the first, Pete provided a concrete example of how feedback can be used to greatly simplify transmission at capacity. The second illustrated how error probability can be reduced if the decoder can provide several possibilities rather than decoding to a single message. Both of these papers appear to be highly specialized, but both have led to a number of significant later uses. It was characteristic of Pete’s best papers that many appeared in non-archival venues, allowing rapid dissemination.
This was an era when the field was small and collegial, and Pete was singularly uninterested in getting credit for his work. Rather, he was interested in helping other researchers and being part of the research community. He set an excellent example for the graduate students of the time, and information theory has remained a highly collegial field. His classic papers have also been republished in anthologies.

In 1960, Pete was promoted to full professor and, at the same time, was appointed head of the Electrical Engineering Department.
He was 37 at the time, a remarkably tender age to be appointed head of the largest department at MIT. He was chosen partly because of his widely recognized tact, good will, and integrity, and partly because he was central to the growth areas of the coming information age. Pete’s research was in high gear at the time and he was ideally situated to solve important fundamental research problems. Accepting the appointment meant putting his research on hold and leading a department of 72 faculty members, many older and more experienced than he.
Pete was an academic and intellectual at heart, but he was also a generalist and humanist who enjoyed interacting with others and the challenge of helping an outstanding group of engineers working on a wide variety of important problems. Despite his qualms, Pete accepted the appointment, and the department changed and prospered enormously over the next 6 years. His style of leadership was to help people develop their own ways of contributing, within the constraints on the department. As one of Pete’s Ph.D. students at the time, I didn’t realize what a gift it was to have a mentor who actively contributed, but also let me develop my own skills in formulating and doing research.
During Pete’s tenure, the department grew by more than 50 percent, and research topics changed even more. At the beginning, the department had a dual focus on the processing and transmission of energy and the processing and transmission of information. By the end of his tenure in 1966, the information side, particularly computer science, had dwarfed the energy side.

In 1966, Pete returned to a more academic life of research and teaching. His research shifted somewhat toward computer science, particularly questions concerning storage, organization, and retrieval for large files. His papers in this area laid part of the groundwork for the later development of universal data compression algorithms.
Pete was also a senior statesman after 1966 and in considerable demand for government, MIT, and professional committees requiring people of wisdom and tact. Years later, he chaired the Ad hoc Committee on Family and Work at MIT. The report of this committee in 1990 is generally credited with a major improvement in the rules and sensitivities at MIT for balancing family needs and work pressures. Pete became an emeritus professor in 1991.
Although he was “retired,” he still enjoyed coming to his office most days. He continued to advise students, organize department colloquia, and participate in the intellectual life of the community until sickness overcame him. He was always a wonderful conversationalist, so well informed and well balanced that everyone just enjoyed talking to him. His many colleagues miss him greatly.