On Fri, 2004-06-25 at 16:47, David V. Corbin wrote:
> "What significant advantage did octal have over hex notation (especially in
> the late '60s timeframe)?"
I'm a bit skeptical of the printer-hardware answer. Printing calculators
don't care about notation; only humans do.
Historically -- that is, before computers -- characters were logically
defined by six positions: five for the "character" itself and one for the
case (FIGS, LTRS, etc.). Many early electronic representations of symbols
used 6 bits to define the symbol set.
(Why five? how many fingers you got?)
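To illustrate the shift mechanism -- a rough sketch of my own, and the
letter/figure assignments below are invented, not the real Baudot/ITA2
table -- the current shift state effectively supplies that sixth position:

    # Rough sketch (mine); code assignments are invented, not real ITA2.
    LTRS_SHIFT, FIGS_SHIFT = 0x1F, 0x1B      # shift-control codes (assumed values)

    letters = {0x01: 'E', 0x03: 'A'}          # hypothetical 5-bit letter case
    figures = {0x01: '3', 0x03: '-'}          # same codes in figures case

    def decode(stream):
        table = letters                       # assume we start in letters case
        out = []
        for code in stream:
            if code == LTRS_SHIFT:
                table = letters               # shift characters change the case...
            elif code == FIGS_SHIFT:
                table = figures
            else:
                out.append(table.get(code, '?'))   # ...for every code that follows
        return ''.join(out)

    print(decode([0x01, 0x03, FIGS_SHIFT, 0x01, LTRS_SHIFT, 0x03]))   # -> EA3A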
In fact, word widths of computing machines were occasionally described in
terms of how many characters wide they were, a character being 6 bits.
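That, I'd guess, is where octal's edge came from: a 6-bit character is
exactly two octal digits, so a word printed in octal breaks cleanly at
character boundaries, while 4-bit hex digits straddle them. A quick sketch
of my own (the six character values are arbitrary, just illustration):

    # My own illustration; the six 6-bit character values are arbitrary.
    chars = [0o01, 0o23, 0o45, 0o12, 0o34, 0o56]   # six 6-bit values

    word = 0
    for c in chars:                  # pack them into one 36-bit word
        word = (word << 6) | c

    print(format(word, '012o'))      # 012345123456 -- two octal digits per character
    print(format(word, '09x'))       # 05394a72e    -- hex digits straddle the 6-bit boundaries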
All this character notation crap was inherited from telecomm gear.
Telecom gear (ttys, etc.) was adapted to computers for I/O because it was
cheap, plentiful, and (sort of) easy to use, not the other way 'round.
The other historic data paradigm was Hollerith's cards, which IBM
bought. Both were adapted to electronic computing use; neither was
designed for it.
http://wps.com/projects/codes/index.html