Tim's own version of the Catweasel/Compaticard/whatever

From: Richard Erlacher <richard_at_idcomm.com>
Date: Tue Jul 4 22:05:29 2000

Please see embedded remarks below.

Dick

----- Original Message -----
From: Chuck McManis <cmcmanis_at_mcmanis.com>
To: <classiccmp_at_classiccmp.org>
Sent: Tuesday, July 04, 2000 7:34 PM
Subject: Re: Tim's own version of the Catweasel/Compaticard/whatever


> At 06:11 PM 7/4/00 -0700, someone wrote:
> > > The CPLD offers a number of advantages over FPGAs. They're not the
> > > same. An FPGA uses a RAM lookup table to define the function executed
> > > by a logic cell. CPLDs use EPROM/EEPROM cells to store the logic
> > > configuration. Most FPGAs require an external configuration PROM of
> > > one sort or another, from which it boots on reset.
>
> Not all FPGAs use RAM; in fact, it is one of the suggested advantages of
> Xilinx FPGAs, but it isn't a requirement. CPLDs generally have simpler
> configuration blocks (or macrocells, or whatever the manufacturer wants
> to call them).
>
Well, the majority of them are RAM-based, because RAM-based processes yield
faster silicon at lower cost. The real issue is routability. With FPGAs, be
they RAM-LUT types or antifuse types, you're doing well to get 60%
utilization of the claimed available resources. It's lower still if you're
redesigning a pin-locked design or one in which you need to control timing
very closely.

CPLDs, OTOH, are almost always nearly 100% usable because of their
architecture. It's noticeably simpler, but that's to its credit.
Manufacturers of both make misleading claims, however. A typical Xilinx
part has a bunch of logic cells, normally consisting of pairs of 16-bit
lookup tables, easily concatenated into a 32-bit table and, likewise,
easily usable as RAM in the later architectures. These cells abound in
great numbers, but they're hard to interconnect. Making a design fit in the
larger arrays at more than 50% utilization takes weeks, if not months, and
would have to be done for every configuration. I agree that you can program
them from a serial stream or a port as well as from a PROM of some sort. I
just find it easier to use the PROM. A RAM would work, too, but why bother
if you can do it from a parallel port?
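To make the "pairs of 16-bit lookup tables" point concrete: a sketch of the idea, not any vendor's actual cell. A 4-input LUT is nothing more than a 16-bit truth-table mask, and two of them plus a mux give one 5-input (32-entry) function:

```python
def lut4(mask, a, b, c, d):
    """Evaluate a 4-input LUT: the input bits index into a 16-bit mask."""
    index = (d << 3) | (c << 2) | (b << 1) | a
    return (mask >> index) & 1

def lut5(mask32, a, b, c, d, e):
    """Two 4-input LUTs plus a mux on e make one 32-entry function."""
    low, high = mask32 & 0xFFFF, mask32 >> 16
    return lut4(high, a, b, c, d) if e else lut4(low, a, b, c, d)

# Example: a 4-input AND is the mask with only the top bit set.
AND4 = 1 << 15
assert lut4(AND4, 1, 1, 1, 1) == 1
assert lut4(AND4, 1, 0, 1, 1) == 0
```

The mask is exactly what the configuration bitstream loads into the cell, which is why a RAM-based LUT can double as user RAM in the later architectures.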

I think you'll find the FPGAs a mite more expensive than the CPLDs. I once
looked for a small part on Avnet's site and found that only one of the four
available packages was under $1k each. Consequently, it's been a long time
since I used an FPGA from Xilinx. Their 5200 series is pretty inexpensive,
though still often over $50 per device. Moreover, DigiKey doesn't carry the
full line.

> >Actually it's pretty easy to get a Xilinx FPGA to read a program from a
> >serial or parallel line, so all you need to do to reprogram is assert
> >reset and feed in the data correctly.
>
> This is true, and the ultimate "Catweasel"-type board would have a nice fat
> RAM-loaded FPGA connected to the interface logic and a high-speed RAM
> buffer. Probably a programmable clock too. That way you could run anything
> from an actual WD1793-type chip to an HP logic analyzer just by downloading
> a different setup.
>
Making all the functions come out in the same, or at least similar, FPGAs
on the same pins with the correct timing would be a lot of work. This
function is, by comparison, quite simple, and, after all, the CPLDs are
in-situ-programmable. You could do the same trick, i.e. load them from your
parallel port. I submit, however, that making a design work out in an FPGA,
with its limited routing resources, on a predetermined pinout would be a
really big job. Assuming you want to put in the functions of a 179x and a
765, plus the PLL and precomp logic, plus the drive selects and steering
logic, I'd say you're looking at a number of man-years unless you are able
to get in-house logic diagrams from Western Digital and NEC. Getting a
predetermined pinout to work in a CPLD isn't always easy either, but a
small job like this one is a week's work.
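Loading a device from the parallel port, as mentioned above, comes down to bit-banging the bitstream: set a data pin, pulse a clock pin, repeat. A minimal sketch of the sequencing idea, with assumed pin wiring (real devices also need PROGRAM/DONE handshaking, which is omitted here):

```python
DATA_PIN = 0x01   # assumed wiring: parallel-port D0 -> configuration data
CLOCK_PIN = 0x02  # assumed wiring: parallel-port D1 -> configuration clock

def bitstream_writes(bitstream):
    """Return the sequence of port values that clocks a bitstream
    out one bit at a time, MSB first."""
    writes = []
    for byte in bitstream:
        for bit in range(7, -1, -1):
            data = DATA_PIN if (byte >> bit) & 1 else 0
            writes.append(data)              # set data with clock low
            writes.append(data | CLOCK_PIN)  # rising edge latches the bit
    return writes
```

Each byte of configuration data becomes sixteen port writes (data setup, then clock high), which is why even a slow parallel port can reconfigure a small device in well under a second.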
>
> --Chuck
>
>
>
Received on Tue Jul 04 2000 - 22:05:29 BST

This archive was generated by hypermail 2.3.0 : Fri Oct 10 2014 - 23:32:56 BST