Large programs posed a different problem, because the instructions of a program must be in memory before they can be fetched and executed. But like datasets, programs usually break into convenient chunks called subprograms or subroutines, and these could be saved on disk until the moment they were called. The technique was called overlaying, because these subroutines overlaid old code in memory when they were executed. Then, around 1961, several groups succeeded in making overlay and data management entirely automatic and invisible. Programmers would no longer be constrained by memory size, nor would they have to break up their code into overlays. They could write programs that crunched ridiculously huge data areas and contained vast armies of subroutines, and never know they were using a much smaller computing engine, except that it was a little slower. Thus, virtual memory was born.
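For readers who want a concrete picture of manual overlaying, here is a rough modern analogue sketched in C using POSIX dlopen(): a subroutine stays on disk in a shared object until the moment it is called, and its memory is released afterward so another overlay can take its place. The file name overlay_a.so and the entry point crunch are hypothetical, and overlay systems of the era actually loaded raw code segments from disk into a fixed region of memory rather than shared libraries, so this is only an illustration of the idea.

/* Sketch of the overlay idea: keep a subroutine on disk until it is
   called, then reclaim its memory afterward. Hypothetical names. */
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    /* Load the "overlay" from disk only at the moment it is needed. */
    void *overlay = dlopen("./overlay_a.so", RTLD_NOW);
    if (!overlay) {
        fprintf(stderr, "load failed: %s\n", dlerror());
        return 1;
    }

    /* Find the subroutine's entry point and call it. */
    void (*crunch)(void) = (void (*)(void))dlsym(overlay, "crunch");
    if (crunch)
        crunch();

    /* Release the memory so the next overlay can reuse the space. */
    dlclose(overlay);
    return 0;
}

In the manual scheme the programmer had to decide which routines shared the same region and when each was brought in; the automatic systems of the early 1960s made those decisions invisible, which is exactly the step the paragraph above describes.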