Author: Butler Lampson
Rapid changes in computing will continue for the foreseeable future.
The field of computing has always changed rapidly, and it is still doing so. The changes are driven, more than anything else, by Moore’s law. Many people think the pace of change is slowing, or even that because we already have the Internet and Google, there is not much left to do. I hope these papers will convince you that this view is entirely wrong.
For the last 50 years, new applications of computers have followed a pattern, as one manual activity after another has become automated. In the 1940s it became possible to automate the calculation of ballistic trajectories, and in the 1950s payroll processing and nuclear weapon simulations. By the 1970s, it was possible to create reasonably faithful representations of paper documents on computer screens. In the 1990s, we had the equivalent of a telephone system for data, in the form of the Internet. In the next two decades we will have embodied computers, machines that can interact with the physical world.
Hardware and Software
The factor that determines whether an activity can be automated is whether the hardware is up to the task. According to Moore’s law, the cost-performance of computers improves by a factor of 2 every 18 months, or a factor of 100 every 10 years; this applies to processing, storage, and communication. Moore’s law is not a law of physics, but it has held roughly true for several decades and seems likely to continue to hold for at least another decade. Indeed, today some things are developing much faster than that. Storage capacity, for example, is doubling every 9 months, not every 18 months. Wide-area communication bandwidth is also improving faster than Moore’s law. Sometimes, as with speech recognition and web search engines, the cheaper cycles or bytes can be applied directly. Often, however, by spending more hardware resources we can minimize programming effort; this is true for applications that use web browsers or database systems.
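The arithmetic behind these growth rates can be checked with a short sketch (a hypothetical helper, using only the doubling periods cited above):

```python
def improvement_factor(years: float, doubling_months: float) -> float:
    """Cost-performance multiplier after `years`,
    given one doubling every `doubling_months` months."""
    return 2 ** (years * 12 / doubling_months)

# Moore's law: doubling every 18 months gives roughly 100x in 10 years
print(round(improvement_factor(10, 18)))   # -> 102

# Storage doubling every 9 months compounds to the square of that
print(round(improvement_factor(10, 9)))    # -> 10321
```

Note how sensitive the compounding is: halving the doubling period does not double the 10-year gain, it squares it.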
Hardware is the raw material of computing, but software gives it form. Our ability to write software is limited by complexity. People have been complaining about a "software crisis" at least since the early 1960s, and many predicted in the 1960s and 1970s that software development would grind to a halt under the increasing complexity of software. Needless to say, this has not happened. The software "crisis" will always be with us, however (so it isn’t really a crisis). There are three reasons for this: