Zuse and stored-program

From: Tom Jennings <tomj_at_wps.com>
Date: Sun Sep 26 02:07:24 2004


Funny, one of the aforementioned URLs on the Z3 states that it wasn't
stored-program, but there's also this tantalizing tidbit from the page:

"Due to the fact that he wanted to calculate thousands of instructions
in a meaningful order, he only used the memory to store values or [...]"

I wish I could remember where I read that Zuse thought stored-program
operation and self-modification of programs were, in effect, Wrong, not
simply 'not a good technical idea'.

The above quote is strange; it implies to me that if programs and data
are in the same store, the resulting program would 'get out of order'.
We know that of course this IS possible (bugs, self-modification ruining
a program, etc.) but somehow we've learned to carry on in spite of this
horrible possibility.

Even von Neumann originally thought self-modification a problem -- if
you read the original (not later edits) of the EDVAC paper you'll see he
specifies a bit of memory content to differentiate data vs. program, to
prevent programs from changing themselves. It was soon dropped, though;
I'd have to look up the dates, but by the time anything got built the
data/prog bit was gone.

JvN got a lot of cred for the 'stored program' concept, but I can tell
you from reading original materials (1938 - 1945) that it was a very
well-understood concept at the time. (Hell, it's bloody obvious when you
sit down to write up a design: memory is a huge stumbling block, and
re-using your sole store cuts size in half at a slight increase in
complexity, no matter how insightful you were about what you were
doing.) He was the first to record it, maybe, but he didn't think it up.
Note that he never claimed to have, either.

In fact I'd go so far as to say that JvN didn't really get computing *at
all*. He was a terrifyingly brilliant man, but he also was savvy enough
not to pursue areas in which he was not top-of-the-pile; he curtailed
his involvement in computing pretty early.

The data/prog mem bit thing is in here somewhere


but I don't feel like reading it all again.

In fact, I'll go so far as to say that few people, even those BUILDING
machines, really had any idea what they really were. Turing knew, and
said so -- the ACE REPORT even states that electronic computers don't
"do" arithmetic, they "simulate" it; that's good enough, it's all about
manipulating symbols, not counting and arithmetic. Yet look at the
instruction sets of machines ACTUALLY BUILT; they SUCKED at symbol
processing. Hell, my LGP-21 barely has an AND instruction, no OR, no
right-shift (except hardware divide). Look at all the EDVAC-thread
machines that *got built*; look at Turing's ACE design vs. what got
built in the Pilot ACE (they took out all the logical instructions he
designed in!). Hardware complexity is not the reason; the low-level
logical stuff is the easiest to implement -- it's basically free:
without carry propagation an adder does XOR, with the augend held
all-ones it does NOT, set the carry in and it does INCREMENT,
2's-COMPLEMENT, etc. etc.
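That "basically free" claim is easy to check on paper; here's a minimal
sketch in Python (my illustration, not any period machine's circuitry)
of a ripple adder whose carry chain can be suppressed, showing how XOR,
NOT, INCREMENT, and 2's-complement fall out of the same hardware:

```python
# Sketch: how a bare n-bit ripple adder yields logical operations
# "for free". Illustrative code only, not modeled on real hardware.

WIDTH = 8
MASK = (1 << WIDTH) - 1

def ripple_add(a, b, carry_in=0, propagate_carry=True):
    """Full adder per bit position. With the carry chain suppressed,
    each bit reduces to sum = a ^ b."""
    result, carry = 0, carry_in
    for i in range(WIDTH):
        x = (a >> i) & 1
        y = (b >> i) & 1
        result |= (x ^ y ^ carry) << i
        carry = ((x & y) | (carry & (x | y))) if propagate_carry else 0
    return result & MASK

a = 0b10110010

# Carry suppressed: the adder computes XOR.
assert ripple_add(a, 0b01100110, propagate_carry=False) == a ^ 0b01100110

# Carry suppressed, augend held all-ones: XOR with 1s is NOT.
assert ripple_add(a, MASK, propagate_carry=False) == (~a) & MASK

# Normal carry, addend zero, carry-in set: INCREMENT.
assert ripple_add(a, 0, carry_in=1) == (a + 1) & MASK

# NOT, then increment: 2's-complement negate.
neg = ripple_add(ripple_add(a, MASK, propagate_carry=False), 0, carry_in=1)
assert neg == (-a) & MASK
```

Same adder, four "instructions" -- the only extra hardware is a gate on
the carry path and a way to force the augend input.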

In fact, I asked a few of our grad students, programmers all (!), 'what
a computer is' and/or 'what makes it unique', and even today few have
either the knowledge or the perspective to know it's 'the machine that
modifies itself'. Sheesh.
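'The machine that modifies itself' can be made concrete with a toy
single-store machine (my sketch, not any historical design): program and
data share one memory, so an ordinary STORE can rewrite an instruction
the machine is about to execute.

```python
# Toy single-store machine: instructions and data live in one memory,
# so a STORE can patch the program itself. Illustrative sketch only.

HALT, LOAD, ADD, STORE, JMP = range(5)

def run(mem):
    """Accumulator machine; each instruction is two cells: opcode, address."""
    acc, pc = 0, 0
    while True:
        op, addr = mem[pc], mem[pc + 1]
        pc += 2
        if op == HALT:
            return acc
        elif op == LOAD:
            acc = mem[addr]
        elif op == ADD:
            acc += mem[addr]
        elif op == STORE:
            mem[addr] = acc   # may land inside the program region!
        elif op == JMP:
            pc = addr

mem = [
    LOAD, 12,   # 0: acc = mem[12] (= 13, an address we'll patch in)
    STORE, 7,   # 2: overwrite the address field of the ADD at cell 6
    LOAD, 14,   # 4: acc = mem[14] (= 1)
    ADD, 0,     # 6: this 0 is rewritten to 13 before it executes
    HALT, 0,    # 8:
    0, 0,       # 10: (unused)
    13,         # 12: data: the address to patch into the ADD
    99,         # 13: data
    1,          # 14: data
]
print(run(mem))  # the patched ADD reads mem[13] = 99, so this prints 100
```

Nothing in the hardware distinguishes cell 7 as 'program'; that, and not
the arithmetic, is the point the grad students kept missing.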
Received on Sun Sep 26 2004 - 02:07:24 BST

This archive was generated by hypermail 2.3.0 : Fri Oct 10 2014 - 23:37:31 BST