At the risk of opening old wounds, I'd point out that there's a significant
difference between microprocessors and microcomputing. While the one requires
the other, and while the Apple][ is an adequate platform for microcomputING, it
doesn't offer much support for learning about microPROCESSORS, since those are
hardware devices, and the Apple][ environment considerably limits the user's
access to such hardware features.
Microprocessors were developed, in large part, out of a need to reduce the
amount of hardware required to implement various functions. A good example
would be a video terminal. The benefit a microprocessor provides is that it
allows the same logic, i.e. that in the CPU, to be used for several purposes,
whereas in a comparable device fabricated from MSI/SSI logic, as was the custom
before microprocessors became available, each function had to be provided
discretely with dedicated logic.
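The terminal example makes the point nicely: where a discrete design needs
separate circuitry for the keyboard, the serial line, and the screen, a
microprocessor-based one handles them all with a single CPU in a polling loop.
Here's a toy sketch in C of that idea (every "device" function is a
hypothetical stub, not any real terminal's firmware):

#include <stdio.h>

/* Toy model of a microprocessor-based terminal's main loop: one CPU
 * doing jobs that discrete MSI/SSI logic would need separate circuits
 * for.  Every device function below is a hypothetical stub. */
static int  serial_byte_ready(void) { return 0; }    /* stub */
static int  key_pressed(void)       { return 0; }    /* stub */
static int  read_serial(void)       { return 'A'; }  /* stub */
static int  read_key(void)          { return 'B'; }  /* stub */
static void display_char(int c)     { putchar(c); }
static void send_serial(int c)      { (void)c; }

int main(void)
{
    for (int tick = 0; tick < 1000; tick++) {  /* forever, on real iron */
        if (serial_byte_ready()) display_char(read_serial());
        if (key_pressed())       send_serial(read_key());
        /* ...cursor blink, scrolling, etc., all on the same CPU... */
    }
    return 0;
}

The same ALU and registers get time-shared across every function, which is
precisely the hardware savings the microprocessor bought.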
The business of learning about microPROCESSORs involves lots of things other
than BASIC programming, assemblers, or the availability of monitor ROMs, and
most of those things are hardware-related rather than software-related. BASIC,
assemblers, and monitors are all very important with respect to microCOMPUTING,
but, after all, the differences between microCOMPUTING and computing in general
are simply a matter of the choice of platform.
The business of learning about microPROCESSORs involves issues such as system
timing (not an easy thing to change in a video-dedicated environment) and
effective decoding techniques (not an option in a fixed environment). A
platform that imposes limitations on hardware/software interactions, as the
Apple][ does, doesn't lend itself to learning about the microPROCESSOR and its
use.
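To illustrate what I mean by decoding, here's a quick C sketch of the general
idea (nothing from the Apple's firmware; the region boundaries are my own
illustration, loosely following the familiar Apple][ layout):

#include <stdint.h>
#include <stdio.h>

/* Partial address decoding: a few high-order address bits select a
 * chip, so one '138-style decoder replaces a pile of discrete gates. */
typedef enum { SEL_RAM, SEL_IO, SEL_ROM } chip_select;

static chip_select decode(uint16_t addr)
{
    if (addr < 0xC000) return SEL_RAM;  /* $0000-$BFFF: RAM        */
    if (addr < 0xD000) return SEL_IO;   /* $C000-$CFFF: I/O space  */
    return SEL_ROM;                     /* $D000-$FFFF: ROM        */
}

int main(void)
{
    const char    *name[]   = { "RAM", "I/O", "ROM" };
    const uint16_t probes[] = { 0x0400, 0xC030, 0xFFFC };
    for (int i = 0; i < 3; i++)
        printf("$%04X -> %s\n", probes[i], name[decode(probes[i])]);
    return 0;
}

On raw hardware, how many address lines to decode and how finely is YOUR
choice; on a finished machine like the Apple][, that choice was made for you
long ago.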
MicroCOMPUTING can be investigated on nearly any platform, but learning about
microPROCESSORs is more a hardware process: certainly one requiring an
understanding of hardware/firmware interaction, but nevertheless one grounded
in the way the microPROCESSOR works, and not so much dependent on whether one
is using a resident monitor or a cross-development software suite on a
mainframe to generate the code under study.
The Apple][ is a fine place to do your software/firmware development, if you
have the tools to support your effort. It won't allow you to do much of the
stuff that microprocessors were used for back in the '70s and '80s, however.
Microprocessors were, after all, not developed for the benefit of computer
hobbyists. MicroCOMPUTING was not widely accepted as a form of COMPUTING until
the PC was legitimized by the placement of the IBM logo on a microCOMPUTER.
MicroPROCESSORS, however, were in wide use in appliances, switching systems,
communication devices, etc. by then.
The Apple was a nice video-based and clearly graphics-targeted system. At the
time it was developed, there were few applications for the graphics, aside from
games, and that's what Apple promoted early on. One important thing that made it
different from many other systems was that its designer(s) were clever enough to
see that it could be applied in a much more general way than many of their
predecessors had done, or even than their successors would do. They wisely
cooked up a memory map that left fairly large contiguous regions of memory
available for use by user programs, albeit interrupted by segments of graphics
memory.
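Those graphics segments aren't even laid out in a straightforward way. As a
concrete illustration, here's the widely documented hi-res row interleave for
page 1 at $2000, sketched in C (my own rendering of the mapping, not code from
the machine):

#include <stdint.h>
#include <stdio.h>

/* Apple][ hi-res page 1: rows are NOT stored consecutively. */
static uint16_t hires_row_base(int row)   /* row: 0..191 */
{
    return 0x2000
         + 0x0400 * (row % 8)            /* fine interleave        */
         + 0x0080 * ((row / 8) % 8)      /* groups of eight rows   */
         + 0x0028 * (row / 64);          /* three 64-row thirds    */
}

int main(void)
{
    for (int row = 0; row < 4; row++)
        printf("row %3d starts at $%04X\n", row, hires_row_base(row));
    return 0;
}

Rows 0, 1, 2, 3 land at $2000, $2400, $2800, $2C00, not at consecutive
addresses, which is exactly the sort of video-driven compromise I'm talking
about.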
Although the Apple was a decent platform for microCOMPUTING, the very features
that made it desirable for the home user were likely to interfere with the
process of learning about the details of microPROCESSORs and their application.
Development of computing platforms, after all, was then a really small part of
the microprocessor market.
No system as complex as the Apple could provide a particularly good environment
for learning about the generalities of microPROCESSORs, if for no other reason
than the number of compromises that are made in getting a system to completion.
Microprocessors were devised in order to minimize the extent of the hardware
required to implement a given function. Making a circuit general enough to
provide a computing platform was probably not an initial goal. The fact that it
was so easy to get there from the basic concept is, of course, one of the great
things about this technology.
As in that infamous previous thread, it seems some people have chosen to read
what they wanted to read rather than what was written. That's not my fault,
though.
Dick
----- Original Message -----
From: "Eric Chomko" <chomko_at_greenbelt.com>
To: <classiccmp_at_classiccmp.org>
Sent: Friday, July 13, 2001 9:35 PM
Subject: Re: Apple II for into to microprocessors
>
>
> Sellam Ismail wrote:
>
> > On Thu, 12 Jul 2001, Richard Erlacher wrote:
> >
> > > The Apple][ is not a terribly good system for introducing someone to
> > > microcomputers for a couple of pretty simple reasons. (1) it was
> > > designed from the ground up as a video game, with emphasis and many
> > > compromises on the graphics and little real attention to the more
> > > basic aspects of computing. (2) it was designed around BASIC, rather
> >
> > Here we go again. At the risk of offending Dick with historical fact, the
> > Apple ][ was NOT "designed from the ground up as a video game".
> > Certainly it implemented graphics and sound features, but these were just
> > clever hacks by Woz that added these powerful features without significant
> > additional circuitry.
> >
> > I don't know why I even welcome the eventual flood of nonsense from our
> > friend Dick by even bothering to respond to his message, but nonsense
> > coming from anyone should not go unanswered.
> >
>
> Sellam, while I would agree with you regarding Dick politically, in this
> instance, though you disagree with him regarding the Apple II and its
> ability to serve as a microprocessor test system, I think we should at least
> acknowledge the on-topic aspect of his post.
>
> >
> > > than around a more elementary debugger/assembler, though there were,
> > > in the later models, provisions for assembler, which is probably the
> >
> > BZZZT. Wrong again: the original Apple ][, as I just previously
> > mentioned, had an assembler built into the ROM.
> >
>
> Okay, this I'd like to play around with.
>
> >
> > Thanks for playing. Your consolation prize is a vat of molten iron.
> >
> > Just to keep things in perspective, when Woz designed the Apple-1, and
> > subsequently, by way of evolving the design, the Apple ][, he very much
> > had in mind the design elements and structure of mini-computers of the
> > day. Woz told me personally that he was very inspired by the design of
> > the Data General Nova (mostly because of the simplicity of its
> > circuitry). The big difference was that the Apple was designed around a
> > cheap microprocessor, rather than implementing his own processor, which I
> > think even you will agree makes more sense, considering the time
> > (1975-76). The monitor feature was there from the start, and was the main
> > interface by which the user interacted with or programmed the computer.
> > Creating a BASIC interpreter was obviously an attempt to make the computer
> > more immediately useful to the average computer geek of the time.
> >
>
> Yes, that made me interested. I think I sold the first one in No. VA, prior to
> Computerland.
>
> >
> > > best tool for learning about the architecture and about microcomputers
> > > in general. That doesn't make it a bad choice as a first computer,
> > > but it does mean one has to take a number of things into
> > > consideration. I don't think it matters terribly whether one has an
> > > Apple][, ][+, ][c, or ][e, in that regard. They all have the same
> > > entanglements with the video hardware, hence, don't allow much
> > > understanding of the workings of the system until a pretty complete
> > > understanding of how NTSC video works is acquired.
> >
> > More nonsense. The Apple ][ is a very good introduction to modern day
> > PCs. It's cheap, abundant, and easy to use and program.
> >
> > > The video-targeted compromises made in the Apple][, e.g. splitting the
> > > video memory into separate portions, serve to make the process of
> > > learning about the interaction of the video subsystem and rest of the
> > > machine more cumbersome, though, ultimately, that's not a bad thing.
> >
> > What better way to learn about video than to HAVE to confront the video
> > system limitations head on and basically write code to emulate what would
> > amount to video hardware in other machines?
> >
> > Anyway, I'm putting my armor back on in anticipation of the impending
> > battle.
> >
>
> Came from a family feud in Texas. Can't we all just get along?
>
> Eric
>
> >
> > Sellam Ismail                                   Vintage Computer Festival
> > ------------------------------------------------------------------------------
> > International Man of Intrigue and Danger        http://www.vintage.org
>
>
>
>