CHM Ref: X5142.2009 © 1980 Computer History Museum Page 28 of 54
Another thing… I was lucky that each year I did my two weeks' training duty. And usually that
meant I came down to Washington and spent two weeks learning the latest things the Navy was
doing. And that was an assist all the way along the line… constantly keeping me up to speed on
what was happening in the Navy.
During those 20 years I was at Univac, I saw the development in industry through Univac, and
the development in government through the Navy. I didn't get isolated. And of course I was
always teaching. So that left me in touch with the academic world.
Pantages:
What course did you teach?
Hopper:
It was a seminar on computers, new developments, future, usually. A little bit of
history, a little bit of now, and a little bit of the future. There’s always a little bit of management in
the background, but never explicitly. I’ve also always hit the need to continue to offer training to
our young people. I think the sad thing right now is that when people start cutting back, the first
thing they cut is training and books and periodicals. And in an industry that’s moving as rapidly
as this one is, it’s terribly important to keep people up to speed. Otherwise, they begin to die on
the vine.
Now in this transition period, they also mention that one of the places to cut is on travel. But in
the travel budget is the money that sends people to seminars and the meetings. And that means
we are cutting our people off from learning what’s going on in their field. They always think they
are going to stop the high mucky mucks from traveling around the countryside, but what they do
is cut off our ability to send people to conferences and seminars and meetings.
Pantages:
Profitability is in new ideas, and it always seems strange that they run counter to
that.
Hopper:
I think that the people who cut the travel in the government hear so much about
the high mucky mucks who commandeer a plane to fly around the country for some reason or
other. They don’t hear about the junior people that are being sent to courses, for instance, about
the whole business about database machines. The first seminars were given at places like Utah
and Kansas State. You need travel money to get there.
They don’t see the low-level travel and what it’s for. And when you put a new computer in,
you’ve got to send someone from headquarters to look over and see that they got all the right
things. And the inspection trips are cut back. One of the things I always did if I got put on one of
the inspection trips was see that they set up a library, which is awfully important. At least they
are getting free periodicals.
Battling General Purpose Software
Pantages:
Let’s go back to the Eckert and Mauchly era. I am aware of the compiler
developments there, but there were some arguments – interpretive techniques vs. compiler
techniques. You mentioned that you and Mauchly didn’t see eye to eye on some…
Hopper:
Well it wasn’t so much Mauchly as it was Tolly [Anatol] Holt and that group. They
built an entirely different kind of compiler in which you could constantly generate new code
which was fed into the compiler. My point was that it would become a monstrosity. Every time
they wrote something new using some of the old subroutines or something, it was made a part
of the system. General purpose coding, GP they called it, I think. And it was just going to
mushroom forever. And you couldn’t do that. We’d get beyond the capacity of the machines,
sooner or later. It was a nice idea, but… I also objected to generating code at object time.
Because then every time I ran the program, I’d generate the code again. I wanted to store it and
use it over again.
Again, good old Scots attitude, I didn’t want anything in my final program that didn’t pay off for
me. That’s why I went after the operating systems. Because it was perfectly clear that they
contained general purpose for everything. And then in running any one given program, that was
not what I needed. That’s why today I’d like to rip our compilers apart and have a generator for
handling the data and a generator for this and a generator for that and get it all done ahead of
time until I had the specific program for the specific job. Nothing general purpose left in it. You
could build a generator that would write you a particular code for the particular types of
interrupts that you are going to get in a particular problem… Instead of using a general-purpose
interrupt handler.
And at object time I would like to execute only what was necessary for that problem. I am still
fighting that one – because we are going to have to go faster and we can’t afford to use general-
purpose stuff when it’s possible to write special purpose.
Pantages:
Again, are the economics viable for that?
Hopper:
Yes. Sure we had things like that built into our original compilers. The original
UNIVAC II COBOL, for instance, generated the specific subroutines for handling the input and
output data for that specific problem when you compiled the program. It had an I/O generator in
the compiler. You can afford that because you only compile once. That’s only one step in the
thinking. You are going to use the program again and again afterwards. So it has to be the
tightest thing you write.
Pantages:
Yet IBM, for instance, has been pushing “hardware prices are coming down,
don't fool around with the software.”