"Nobody programs in machine language",ReRe: "Who you callin Nobody?", ReReRe(...

From: David V. Corbin <dvcorbin_at_optonline.net>
Date: Tue Jun 22 11:00:26 2004

Paul,

Thanks for your comments.

First, you left out my intentional use of the word "consistently" when
discussing performance. I was not discussing raw performance. If you ask a
given programmer of any experience level (except a novice) to code the EXACT
same algorithm once a week for a year, I strongly doubt that you will get 52
identical binaries. If nothing else, a different register will be used
somewhere, etc.

When I have been in a marathon coding session [fortunately not as common
these days; the body will not really take it any more], I am much more likely
to code a register overwrite, or to forget that a given instruction affects
some condition flag [or to code an unnecessary "update the condition flags"
type of instruction], than the compiler is when compiling a piece of
C/C++/C#/VB/Cobol/Fortran code.

Turning the discussion to raw performance: on simpler microprocessors [PIC,
6809, 6502, etc.] I will concede that an excellent assembly-level programmer
can beat the compiler in many cases. From a business standpoint, however, it
may very well not be worth the additional cost. On a multi-processor
Pentium-class machine with L1 and L2 caches, working in a virtual memory
environment, etc., my experience has been that most senior programmers with
years of experience cannot even come close to matching the compiler! In fact
I have only worked with one individual who really had the "feel" to be able
to create code that ran noticeably faster than the code I produced in C++
[after MUCH study of the code generation, benchmarking of different
constructs, etc.]. Since none of the other [8] team members could code in
anything other than C/C++, his code was "untouchable" if he and I were both
unavailable. However, ANY of the team members could easily read my code and
make changes. The result was that he concentrated on the one area of the code
that DID require his skills [the DSP/PC data exchange] while the rest of the
team developed the remainder of the project [Otari's first 100% digital audio
mixing console...Advanta!].

btw: I did not mean to imply that a non-stack-oriented architecture implies
that "C" [or other languages] cannot be implemented, simply that the
*vendor name deleted* compiler (which is available as a free download....)
did not. Since the program was fairly trivial, with low performance
requirements, this was perfectly acceptable and struck a nice cost /
programming-time balance for the specific project.
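
For anyone curious how a C compiler copes with such a part at all: the usual
trick is a "compiled stack", in which every function's parameters and locals
are assigned fixed static addresses at build time, and only return addresses
go on the small hardware return stack. A rough sketch of the idea, expressed
back in C [the names below are made up purely for illustration, not taken
from any particular vendor's compiler]:

    /* What the programmer writes: */
    int scale(int x)
    {
        int y = x * 3;
        return y + 1;
    }

    /* Roughly what a compiled-stack compiler produces, shown here in C:
       one fixed static "frame" per function instead of space pushed on
       a data stack at call time. */
    static int scale_arg_x;   /* parameter at a fixed address        */
    static int scale_local_y; /* local at a fixed address            */
    static int scale_retval;  /* return value handed back statically */

    void scale_compiled(void)
    {
        scale_local_y = scale_arg_x * 3;
        scale_retval  = scale_local_y + 1;
    }

    /* Since every call to scale() shares the same fixed frame, recursion
       (and re-entrancy from an interrupt) is out -- which is why such
       compilers either forbid it outright or document it as unsupported. */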

David.

>>> -----Original Message-----
>>> From: cctalk-bounces_at_classiccmp.org
>>> [mailto:cctalk-bounces_at_classiccmp.org] On Behalf Of Paul Koning
>>> Sent: Tuesday, June 22, 2004 11:01 AM
>>> To: cctalk_at_classiccmp.org
>>> Subject: RE: "Nobody programs in machine language",ReRe:
>>> "Who you callin Nobody?", ReReRe(...
>>>
>>> >>>>> "David" == David V Corbin <dvcorbin_at_optonline.net> writes:
>>>
>>> David> 1) Reliability and Maintainability are more important
>>> David> than Memory/CPU Cycles these days. 2) Good Code or Bad
>>> David> code can be written in ANY environment. 3) Compilers are
>>> David> better than humans at doing things consistently.
>>>
>>> (1) Reliability is always more important. But memory/CPU
>>> cycles cannot be ignored when your customers are running
>>> benchmarks, and when you're trying to beat the competition
>>> using less expensive hardware than they are. (2) Yes
>>> indeed -- but being skilled at assembly language
>>> programming imposes a useful discipline that carries over
>>> into other languages. (3) Not true. A compiler will beat
>>> a poor assembly programmer all the time, and an average one
>>> much of the time. But a programmer can know more about the
>>> problem than the compiler can ever know (because the higher
>>> level language can't express everything there is to say
>>> about the problem) so an excellent programmer can always
>>> tie the compiler, and in selected spots can beat the
>>> compiler by a very large margin. It's important to know
>>> when to spend the effort, and that is also part of what
>>> marks an excellent programmer.
>>>
>>> David> The end result is that it is not necessary to spend the
>>> David> hours/days/weeks [many fondly remembered] to make a
>>> David> program a few bytes smaller or a few cycles quicker.
>>>
>>> True, but about a year ago I spent a week or so on a
>>> routine that takes about 30% of the CPU, and (with the help
>>> of a CPU expert) made it 50% faster. It started out faster
>>> already than what the compiler could do; the end result is
>>> way beyond any compiler designer's fondest imagination.
>>>
>>> David> Having a basic understanding of machine architecture and
>>> David> "what goes on under the hood" is critical to writing good
>>> David> code. Knowing the details [e.g. specific directives for
>>> David> implementing a macro] is not. I recently finished a job
>>> David> using a major microprocessor [PIC] that DOES NOT HAVE AN
>>> David> [accessible] STACK.
>>>
>>> So? A 360 doesn't have a stack either, but that doesn't
>>> prevent it from implementing C. Nor does a Cyber have a
>>> stack; its subroutine call is like the PDP-8's. But it
>>> supported Algol, which taught recursion to C.
>>>
>>> paul
>>>
Received on Tue Jun 22 2004 - 11:00:26 BST
