"Geeks" and licensing

From: Chris Kennedy <chris_at_mainecoon.com>
Date: Sat Dec 15 13:17:26 2001

On Sat, 15 Dec 2001, Jim Davis wrote:

> IMHO: All software development should be performed as such:
>
> 1) Requirements - what should it do, and not do. Spin this till
> everybody signs off.
> 2) Prelim design - Ok, a rough outline of the design, data structures
> and control/data flow defined here.
> 3) Detailed design - Define all the modules and their function, break it
> down.
> 4) Test plan - integrate testing into detailed design, make it unit
> testable. A unit is something that has input and output and side
> effects, like a function.
> 5) Finally, coding - build modules in parallel with test code.
> 6) Unit testing - verify that modules comply with detailed design.
> 7) Integration testing - hook it all together, make sure it works, apply
> the test plan developed in step 4 to the fully integrated application.
>
> Do 1-3 until marketing decides what they want, 4-7 until you find no
> errors.

Ah, creationism (see the Jargon Dictionary if you're not familiar
with the term).

What is described is all very textbook. That then begs the question:
"why is it that software isn't, in general, done in this fashion?"

The answer is found in step one: the assumption that innovative
hunks of software can be fully specified in advance. For software
that is truly innovative this is empirically very rarely the case,
as what the customer _thinks_ they want is frequently not what they
end up wanting/needing in the end. Trotting out signed-off specs
in no way improves the situation.

In practice the solution is to accept the fact that specs, no matter
how much they are labored over, are for most efforts soft, and then
to build mechanisms for dealing with that uncertainty into the
development process. Rapid prototyping, stepwise refinement, the
understanding that one cannot test one's way to correctness, and a
willingness to throw at least one implementation completely away,
coupled with small, agile teams that work with an active user
community, empirically produce more correct results in less time
than alternative techniques.
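
To make the testing point concrete, here is a contrived C sketch (the
function, the values, and the bug are all invented for illustration):
the unit passes every test in its plan, so coverage looks fine, yet it
is still wrong for inputs the plan never anticipated.

#include <assert.h>

/* Intended spec: return the average of a and b. */
static int average(int a, int b)
{
    return (a + b) / 2;   /* overflows when a + b exceeds INT_MAX */
}

int main(void)
{
    /* Every test in the plan passes, and statement coverage is 100%... */
    assert(average(2, 4) == 3);
    assert(average(0, 0) == 0);
    assert(average(-2, 2) == 0);

    /* ...yet average(2000000000, 2000000000) still overflows a 32-bit
     * int, so the passing tests never demonstrated correctness. */
    return 0;
}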

Not that any of this is new. Brooks described this eons ago
in _The Mythical Man-Month_...

--
Chris Kennedy
chris_at_mainecoon.com
http://www.mainecoon.com
PGP fingerprint: 4E99 10B6 7253 B048 6685  6CBC 55E1 20A3 108D AB97

-----Original Message-----
From: owner-classiccmp_at_classiccmp.org
[mailto:owner-classiccmp_at_classiccmp.org] On Behalf Of Jeffrey S. Sharp
Sent: Saturday, December 15, 2001 8:51 AM
To: classiccmp_at_classiccmp.org
Subject: Re: "Geeks" and licensing

And possibly add some planning where you use historical data and basic
statistics to figure out how big the product will likely be, how long it
will take, and what resources need to be allocated to complete it.
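
Something like this back-of-the-envelope C sketch is the idea; every
number in it is invented for illustration. Derive a mean productivity
figure from past projects, then use it to size up the next one.

#include <stdio.h>

struct project { double loc; double person_months; };

int main(void)
{
    /* Historical data: delivered size and effort of past projects. */
    struct project history[] = {
        { 12000.0, 10.0 },
        {  8000.0,  7.5 },
        { 20000.0, 18.0 },
    };
    int n = sizeof history / sizeof history[0];

    /* Mean productivity across the historical projects. */
    double loc_per_pm = 0.0;
    for (int i = 0; i < n; i++)
        loc_per_pm += history[i].loc / history[i].person_months;
    loc_per_pm /= n;

    /* Apply it to a size guess for the new product. */
    double estimated_loc = 15000.0;
    double effort = estimated_loc / loc_per_pm;

    printf("mean productivity: %.0f LOC/person-month\n", loc_per_pm);
    printf("estimated effort:  %.1f person-months\n", effort);
    return 0;
}
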
> For safety-critical work, you should /have/ to perform statement and
> decision coverage in steps 6 and 7, and the detailed design should have
> a one-to-one correspondence with the detailed design document. Jim
> Davis.
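
For the coverage distinction, a small made-up C example: a single test
case can execute every statement, but decision coverage additionally
requires taking the branch both ways.

#include <assert.h>

/* Double x, but never exceed limit. */
static int scale(int x, int limit)
{
    int y = x * 2;
    if (y > limit)      /* the decision */
        y = limit;
    return y;
}

int main(void)
{
    /* This one case executes every statement (full statement coverage)
     * but only takes the decision the "true" way. */
    assert(scale(80, 100) == 100);

    /* Decision coverage also requires the "false" way. */
    assert(scale(10, 100) == 20);
    return 0;
}
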
The software process course I just got out of also heavily pushed peer
review of designs and code as a way to filter out defects before testing.
These have their benefits, but I remain unconvinced of their status as the
great panacea of software engineering that the course touted them as.

For a little bit of on-topic goodness, what is the group's opinion on the
trend of software engineering quality starting from ancient times?  Have
we improved (practically, not academically) or worsened?
--
Jeffrey S. Sharp
jss_at_subatomix.com
Received on Sat Dec 15 2001 - 13:17:26 GMT
