> Why not just write a microcode emulator?
I'm trying to compare the Alto to the Apple ][ (which is what I'm most
familiar with), and they may be about the same order of magnitude in raw
speed. (If you
think about it, it's pretty incredible if it's true, since this little
machine ran such revolutionary programs! That's not to say it ran them
_quickly_ -- the manuals gloss over that sort of thing. That's why I'm
waiting to hear from someone who can describe the actual speed of the Alto.)
The microcycle time is 170 ns. That works out to a microcode clock rate of
just under 6 MHz. Suppose there are 5-10 microinstructions per
machine-language instruction. Then the figure of 1-2 usec per instruction
(which is on Al's site -- see below) makes good sense.
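Here's a throwaway C snippet that just re-does that arithmetic (the 170 ns
figure is from the hardware manual; 5-10 microcycles per emulated
instruction is purely my guess):

    #include <stdio.h>

    int main(void)
    {
        const double microcycle_ns = 170.0;   /* one microinstruction */

        /* ~5.88 MHz, i.e. "just under 6 MHz" */
        printf("microcode clock: %.2f MHz\n", 1000.0 / microcycle_ns);

        /* my guessed range of microcycles per emulated instruction */
        for (int n = 5; n <= 10; n += 5)
            printf("%2d microcycles -> %.2f usec/instruction\n",
                   n, n * microcycle_ns / 1000.0);   /* 0.85 and 1.70 */
        return 0;
    }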
But it doesn't make much sense to do these kinds of rough calculations.
The main reason is that as far as emulation goes, microcode is a totally
different animal from machine code. Things happen in parallel; each part of
the CPU is doing something simple (like basic arithmetic), but all the parts
are working together every cycle. You would have to duplicate them precisely
and make a
single processor act as if it could do things in parallel. As Tony pointed
out, that may even involve accounting for certain individual logic gates.
The gate-level emulation by itself would be doable; the parallel-processing
emulation would be doable; both together would be much less doable.
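To make the "parallel hardware on a serial machine" point concrete, here's
roughly how I'd expect the inner loop of a cycle-accurate emulator to look.
This is a sketch of the general compute-then-latch trick, not the Alto's
actual datapath; the registers and the decode are invented:

    #include <stdint.h>

    typedef struct {
        uint16_t r[8];    /* some registers (sizes invented)  */
        uint16_t l, t;    /* a couple of latches              */
        uint16_t mpc;     /* micro program counter            */
    } CpuState;

    /* One emulated microcycle: every "unit" reads only the OLD state,
     * then all of the results are committed at once, the way real
     * hardware latches everything on the clock edge. */
    void micro_cycle(const CpuState *cur, CpuState *next, uint32_t ui)
    {
        /* phase 1: compute, using cur only */
        uint16_t bus = cur->r[ui & 7];                       /* invented bus-source field */
        uint16_t alu = (ui & 8) ? (uint16_t)(bus + cur->t)   /* invented ALU function bit */
                                : (uint16_t)(bus & cur->t);

        /* phase 2: commit everything together */
        *next = *cur;
        next->l   = alu;
        next->t   = bus;
        next->mpc = (uint16_t)(cur->mpc + 1);   /* ignoring branches here */
    }

Multiply that by every functional unit, and remember it all has to happen
once per 170 ns of emulated time.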
There are two other snags. The first is that the Alto was designed with a
motley assortment of hardware, all of which would need to be emulated -- the
hard drive, the bitmapped display, the keyboard and mouse (and maybe the
5-key keyset), and the Ethernet. The hardware is exotic enough that it
could be hard to coax a modern PC to duplicate it exactly. (For example,
the Ethernet is from a previous generation, running at about 3 Mbit/sec.)
(Honestly, the pun "coax"="convince" vs. "coax"="coaxial" was NOT intended!)
The second snag is that the microcode drives the hardware. There are some
fancy capabilities that user programs depend on.
For example, the display hardware will take a series of blocks and put them
on the display. The blocks may be inverted. They may have left margins
(which involves shifting blocks to the right on the display). They may have
right margins. This does NOT involve shifting each block to the left! It
involves giving the block a certain size (a 1-dimensional range of memory
addresses, from low to high) and changing the number of memory words per
display line. Therefore, you're changing the width of a 2-dimensional array
which is represented as a contiguous (1-dimensional) run of words. The
relationship between the offset of a word (its location in the 1-dimensional
memory) and the placement of its bits (their location on the 2-dimensional
screen) could get rather intricate.
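Here's a sketch of the kind of mapping I mean. The structure and field
names are my guesses at what such a display block might contain, not the
Alto's actual format:

    #include <stdint.h>

    /* A guessed-at display block: a contiguous run of words in memory,
     * plus the parameters that decide where its bits land on screen. */
    typedef struct {
        uint32_t base;             /* first word of the bitmap data       */
        uint16_t words_per_line;   /* 16-bit words per scan line          */
        uint16_t left_margin;      /* blank pixels inserted on the left   */
        uint16_t lines;            /* scan lines this block occupies      */
        int      inverted;         /* black-on-white vs. white-on-black   */
    } DisplayBlock;

    /* Where does a given word of the block end up on the screen?
     * Changing words_per_line (say, to get a right margin) changes the
     * answer for every word after the first line -- that's the intricate
     * 1-D-to-2-D relationship. */
    void word_to_screen(const DisplayBlock *b, uint32_t word_offset,
                        uint32_t first_scan_line, int *x, int *y)
    {
        *y = (int)(first_scan_line + word_offset / b->words_per_line);
        *x = (int)(b->left_margin + 16 * (word_offset % b->words_per_line));
    }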
And the microcode has to manage all of the display capabilities, as well as
managing the disk and keyboard and Ethernet, all with precise timings. I
think the microcode actually has a task-switching executive in it.
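If that's true, then on top of everything else the emulator has to decide,
every microcycle, which device's microcode gets to run. Purely as an
illustration of what I mean (the number of tasks, the priorities, and the
wakeup mechanism here are my assumptions, not the real design):

    #include <stdint.h>

    #define NUM_TASKS 16

    /* Pick the highest-priority task whose hardware has raised a wakeup
     * request; fall back to the lowest-priority background task (the one
     * that would spend its time interpreting ordinary instructions). */
    int pick_task(uint16_t wakeup_requests)   /* one bit per task */
    {
        for (int t = NUM_TASKS - 1; t > 0; t--)
            if (wakeup_requests & (1u << t))
                return t;
        return 0;
    }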
I wouldn't say an emulator is impossible. Atari and Commodore emulators
emulate display hardware that's more advanced than the Alto's. But they can
emulate the hardware directly. An Alto emulator would have to preserve the
intimate and delicate relationship between the Alto's microcode and its
hardware. The timings involved might be slow enough that a modern PC could
keep up with them, but they are detailed enough that I wouldn't want to
write an Alto emulator without a LOT of spare time and documentation. Of
course, I have neither of those things. :)
The other alternative is not emulating the microcode, just the instructions
that sit on top of it. (I guess that'd be an implicit answer to your
question.) Unfortunately those instructions can _change at run-time_.
Take loading Smalltalk as an example. First you have to boot off the
Ethernet or disk, which is purely microcode.
(As an aside, a parallel thread is about the keyboard combinations used to
select which program you want sent over the Ethernet to your workstation.
The combinations are very arbitrary. That's because the bits from the
keyboard scanning get plopped right into the appropriate field of the
Ethernet boot packet. It's ugly because it's so low-level. Obviously there
weren't any microinstructions to spare to make the interface prettier.)
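In emulator terms that aside boils down to something like this; the packet
layout here is entirely invented, the point is just that the raw key bits
go into the request untranslated:

    #include <stdint.h>
    #include <string.h>

    struct boot_request {               /* invented layout */
        uint16_t dest_host;
        uint16_t boot_file_number;      /* raw keyboard bits land here */
    };

    void build_boot_request(struct boot_request *pkt, uint16_t key_bits)
    {
        memset(pkt, 0, sizeof *pkt);
        pkt->dest_host = 0xFFFF;            /* "anyone who will answer" */
        pkt->boot_file_number = key_bits;   /* no translation at all    */
    }

So the "menu" of boot files is really just whatever bit patterns your
fingers happen to produce.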
Then I think you use the base instruction set (a clone of the NOVA, made for
BCPL) to run the OS and executive. That loads Smalltalk. I suspect that
Smalltalk uses its OWN microcode! (There's a big chunk, maybe not in the
base instruction set, that does fancy BitBlt instructions. Think VGA cards. Of
course since this is Xerox, not IBM, the capability is done _right_, but I
digress.)
I don't know if the details are right, but you can imagine the overhead that
would be involved in writing such an emulator. At the very least, you'd
have to emulate certain "canned" instruction sets (BCPL, Smalltalk) and some
boot microcode. That might prevent you from playing the games, which may
use their own microcode.
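If you went the instruction-level-only route, I imagine the dispatcher
would end up looking something like this. Every name below is invented;
it's only meant to show where the pain is:

    #include <stdint.h>
    #include <stdio.h>

    typedef void (*insn_handler)(uint16_t opcode);

    static void nova_insn(uint16_t op)      { printf("base set: %04x\n", op); }
    static void smalltalk_insn(uint16_t op) { printf("BitBlt set: %04x\n", op); }

    /* Which "canned" set is live right now.  It changes when a program
     * loads its own microcode -- exactly the event a pure
     * instruction-level emulator has no way of observing. */
    static insn_handler current_set = nova_insn;

    void load_custom_microcode(insn_handler new_set) { current_set = new_set; }
    void execute(uint16_t opcode)                    { current_set(opcode); }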
On another subject, this Apple Lisa web page:
http://galena.tjs.org/lisa/People/
points to Al's page (Eric said he's the person who's selling the Alto on
eBay that started this whole discussion):
http://www.spies.com/aek/xerox.html
and implies that an emulator is in progress.
But only Eric, or Al, or someone who has experience with an Alto (and who
wants to come forward) can say for sure.
<ferris-bueller>
Anyone? Anyone?
</ferris-bueller>
I would rank Tony's opinion next in terms of credibility. Tony, do my rough
not-based-on-experience calculations seem plausible? Do the prospects of a
PERQ emulator sound similar to my description?
So there you have it, Sam. This is why computer science is such a
fascinating and frustrating subject. Add bad marketing, entropy, and
general greed and selfishness and things become REALLY interesting. :)
-- Derek
Received on Mon Jan 25 1999 - 22:52:54 GMT