y2k stuff

From: Eric Smith <eric_at_brouhaha.com>
Date: Tue Jan 5 13:10:49 1999

John Foust <jfoust_at_threedee.com> wrote:
> There are a dozen lame excuses as to why They Did It That Way. Few of
> them make any sense. If they'd stored the year as a seven- or eight-bit
> offset from their earliest year, instead of as two ASCII or BCD digits,
> they'd have halved their storage requirements. There is a good
> discussion of this at <http://language.perl.com/news/y2k.html>.
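On a modern byte-addressable binary machine, that proposal boils down to
something like the following sketch (plain C here; the eight-bit field,
the 1900 epoch, and the helper names are placeholders of my own, not
anything from John's message):

    #include <stdio.h>

    /* Year kept as a binary offset from an assumed epoch (1900 here),
     * in one character-sized cell instead of two ASCII/BCD digit
     * characters, which is where the claimed halving of storage
     * comes from. */
    #define EPOCH_YEAR 1900

    static unsigned char encode_year(int year)
    {
        return (unsigned char)(year - EPOCH_YEAR);  /* 0..255 covers 1900..2155 */
    }

    static int decode_year(unsigned char offset)
    {
        return EPOCH_YEAR + offset;
    }

    int main(void)
    {
        unsigned char stored = encode_year(1999);
        printf("stored byte %d decodes to %d\n", stored, decode_year(stored));
        return 0;
    }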

I'd love to see code to do this for an IBM business computer of the
late 50s or early 60s: say, the IBM 702, 705, 705-III, 1401, 1410, 7010,
7070, 7080, or even the 650.

AFAIK, *NONE* of those machines had any way of storing a binary number
in a character of memory. I suppose it might have been possible to wedge
a six-bit number into a character, but it would probably have required a
64-character lookup table to encode (not too bad) and a loop to decode
(horribly inefficient).
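Just to make that concrete, a rough modern model of the wedge looks like
this (again C; the 64-character alphabet and the 1900 epoch are stand-ins
of my own, not anything a real BCD character set or application used):

    #include <stdio.h>

    /* Stand-in 64-character alphabet; a real machine's BCD character
     * set would differ.  Encoding a six-bit value is an indexed table
     * lookup; decoding is a linear scan of the same table. */
    static const char code_table[] =
        "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz+/";

    static char encode6(int value)            /* 0..63 -> character */
    {
        return code_table[value & 63];
    }

    static int decode6(char c)                /* character -> 0..63, or -1 */
    {
        int i;
        for (i = 0; i < 64; i++)              /* up to 64 compares per value */
            if (code_table[i] == c)
                return i;
        return -1;
    }

    int main(void)
    {
        int year = 1962;                      /* offset from an assumed 1900 epoch */
        char stored = encode6(year - 1900);
        printf("%d stored as '%c', decoded back to %d\n",
               year, stored, 1900 + decode6(stored));
        return 0;
    }

The table lookup in encode6 is the "not too bad" part; the per-character
scan in decode6 is the part that would have been horribly inefficient on
machines built for decimal character arithmetic.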

I think it is necessary to understand a bit more about what was actually going
on at the time, rather than just reasoning that since modern computers are
good at storing binary integers, the programmers in the 50s and early 60s
must have been idiots.

Eric