Altair benchmark

From: Jim Battle <frustum_at_pacbell.net>
Date: Sat Sep 1 22:37:43 2001

>...
> I already know that the Altair32 is painfully slow, primarily
> because of
>the graphics routines used to draw the front panel LEDs. Disabling LED
>updating improves the speed greatly. Doing this, unfortunately, results in
>your having a Turnkey system...
>
> Thanks to all.


Rich --

I looked at your Altair32 code a while back, but it was a pretty cursory
look and a while ago, so I don't recall how the program is structured.

I'd hope that there is a section of code that emulates the hardware, and a
pretty clean separation of the code that handles the windows GUI
stuff. I'd imagine that after each cycle, some part of the CPU emulator
section calls the GUI part to indicate what each LED should be set to.

It sounds like right now the GUI part requests a BLIT of each LED state,
either on or off, each cycle, which would eat up lots and lots of time
since, in theory, there should be millions of these blits per second when
the CPU isn't single stepping.

As it stands, you might BLIT a given LED thousands of times a second, but
you only get to see the one blit that was most recent at the time the
screen is scanned at a 60 or 72 Hz rate. That is, 99% of the current BLITs
get overwritten before any of them show up on the screen.

I have two suggestions that vary in complexity and results.

If you are happy with the current visual appearance of the LEDs and can
live with the temporal aliasing, there is something very simple you can
do. As the CPU part emulates the processor, it just goes on its merry way
and doesn't tell the GUI anything. Set up a 60 or 72 (or whatever) Hz
timer on the Windows side of the house. When the timer fires, the handler
reads the most recent LED state and does the BLIT of on or off for each
LED, skipping thousands of updates that would never be seen anyway. It is
not important to synchronize the update to the actual screen refresh rate.
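In C, that first scheme might look like the sketch below. All the names here are hypothetical (the real Altair32 routines will differ), and I'm assuming the LED state fits in a 32-bit word; the timer handler would be hooked to a Win32 SetTimer/WM_TIMER callback, shown here with printf standing in for the actual BLIT:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical names -- not the real Altair32 code. */
static volatile uint32_t g_led_state;   /* written by the CPU core each cycle */
static uint32_t g_drawn_state;          /* what is currently on the screen   */

/* CPU core side: cheap -- just record the state, no GUI call at all. */
static void cpu_set_leds(uint32_t state)
{
    g_led_state = state;
}

/* ~60 Hz timer handler: blit only the LEDs whose state differs
   from what was last drawn. */
static void on_timer(void)
{
    uint32_t now  = g_led_state;
    uint32_t diff = now ^ g_drawn_state;
    for (int i = 0; i < 32; i++)
        if (diff & (1u << i))
            printf("blit LED %d %s\n", i, ((now >> i) & 1) ? "on" : "off");
    g_drawn_state = now;
}
```

Note that however many times cpu_set_leds() runs between ticks, only the most recent state ever gets drawn, which is exactly the "skip the invisible updates" behavior.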

Here is a second way that is more complicated, but gives superior results.

Every time the GUI is called with a "turn LED on" or "turn LED off"
request, don't do the BLIT. Instead, just tally how many "on" requests you
get out of the total number of requests. In the GUI part of the code, set
up a Windows timer to trigger every 1/60th or 1/72nd of a second. When the
timer goes off, the timer handler looks at what percentage of the time
each LED was supposed to be on vs. off. You should then BLIT an image of
the LED that is x% on (how fine-grained you want to make this is up to
you, but probably 16 levels of on-ness is good enough). Also reset the LED
duty cycle counters to 0.
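The tally-and-quantize part could be sketched like this -- again all names are made up, it assumes one update per emulated cycle with the LEDs packed into a 32-bit word, and it maps each LED's duty cycle onto 16 brightness levels as suggested above:

```c
#include <stdint.h>

#define NUM_LEDS 32
#define LEVELS   16   /* 16 pre-rendered images per LED, level 0..15 */

/* Hypothetical names -- not the real Altair32 code. */
static uint32_t g_on_count[NUM_LEDS];  /* cycles each LED was on       */
static uint32_t g_total;               /* cycles since the last redraw */

/* CPU core calls this once per emulated cycle instead of blitting. */
static void led_tally(uint32_t state)
{
    g_total++;
    for (int i = 0; i < NUM_LEDS; i++)
        g_on_count[i] += (state >> i) & 1;
}

/* Timer handler side: map a LED's duty cycle to one of LEVELS images. */
static int led_level(int i)
{
    if (g_total == 0)
        return 0;
    /* scale duty cycle 0..1 onto 0..LEVELS-1 with integer math */
    return (int)(((uint64_t)g_on_count[i] * (LEVELS - 1)) / g_total);
}

/* Called by the timer handler after blitting, per the text above. */
static void reset_tallies(void)
{
    g_total = 0;
    for (int i = 0; i < NUM_LEDS; i++)
        g_on_count[i] = 0;
}
```

A LED that was on for half the cycles lands mid-scale, one that was on the whole interval gets the full-brightness image, and one that never lit gets the off image.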

This assumes that the LED update gets called every cycle. If the update
rate is variable, then you also need to know how many cycles have passed
since the last update so that you can properly calculate the weighted
average of the on vs. off time.
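For the variable-rate case, the weighting might look like this for a single LED -- a minimal sketch with hypothetical names, where the caller passes how many emulated cycles the reported state was held:

```c
#include <stdint.h>

/* Hypothetical names -- not the real Altair32 code. */
static uint64_t g_on_cycles;     /* cycles this LED spent on          */
static uint64_t g_total_cycles;  /* cycles since the last screen draw */

/* Called whenever the CPU core reports an LED state; 'cycles' is how
   many emulated cycles that state was held since the previous call. */
static void led_weighted_tally(int led_on, uint32_t cycles)
{
    g_total_cycles += cycles;
    if (led_on)
        g_on_cycles += cycles;
}

/* Duty cycle over the whole interval, 0.0 .. 1.0, read by the timer
   handler before it picks which LED image to blit. */
static double led_duty(void)
{
    return g_total_cycles
         ? (double)g_on_cycles / (double)g_total_cycles
         : 0.0;
}
```

So a state reported on for 30 cycles and off for 70 averages out to a 30% duty cycle, regardless of how irregular the update calls were.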

Not only will this save 99% of the BLITs you are doing, but it will also
perform temporal antialiasing. In your existing code and my first simple
suggestion, you are effectively sampling a 2 MHz signal at 72
Hz. Depending on what the 2 MHz signal looks like, you could get some
strange output. This second more complicated suggestion effectively
implements a box lowpass filter.

Once your emulator runs faster than real time, then implementing automatic
rate regulation gets interesting, and is a lot more heuristic. But that's
another subject...

-----
Jim Battle == frustum_at_pacbell.net
Received on Sat Sep 01 2001 - 22:37:43 BST

This archive was generated by hypermail 2.3.0 : Fri Oct 10 2014 - 23:34:23 BST