Let's develop an open-source media archive standard

From: Fred Cisin <cisin_at_xenosoft.com>
Date: Wed Aug 11 14:41:11 2004

> On Wed, 2004-08-11 at 13:13, Steve Thatcher wrote:
> > I would encode binary data as hex to keep everything ascii. Data size would expand,
> > but the data would also be compressible, so things could be kept in ZIP files or
> > whatever a person would prefer for their archiving purposes.
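
Hex in that sense is just two ASCII characters per byte, and it
round-trips exactly. A minimal sketch in Python (the filename is
made up for illustration):

    import binascii

    raw = open("track00.img", "rb").read()        # raw track bytes
    text = binascii.hexlify(raw).decode("ascii")  # two ASCII chars per byte
    assert binascii.unhexlify(text) == raw        # exact round trip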

On Wed, 11 Aug 2004, Jules Richardson wrote:
> "could be kept" in zip files, yes - but then that's no use in 50 years
> time if someone stumbles across a compressed file and has no idea how to
> decompress it in order to read it and see what it is :-)

Keeping documentation of how LZW compression works alongside the
archives would not be hard, and writing the decompression routines
is not very hard.
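
For the record, the decompression side really is small. A sketch in
Python of one common LZW variant (fixed-width codes, initial table of
all 256 byte values; unpacking the codes from the bit stream is left
out):

    def lzw_decompress(codes, code_width=12):
        # Rebuild the original bytes from a list of integer LZW codes.
        # Assumes the compressor started from a table of all 256
        # single-byte values, a common variant.
        if not codes:
            return b""
        table = {i: bytes([i]) for i in range(256)}
        next_code = 256
        max_code = (1 << code_width) - 1
        prev = table[codes[0]]
        out = [prev]
        for code in codes[1:]:
            if code in table:
                entry = table[code]
            elif code == next_code:       # the KwKwK corner case
                entry = prev + prev[:1]
            else:
                raise ValueError("bad LZW code: %d" % code)
            out.append(entry)
            if next_code <= max_code:     # stop growing when codes run out
                table[next_code] = prev + entry[:1]
                next_code += 1
            prev = entry
        return b"".join(out)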

But if you want to keep it VERY simple, then just use run-length
encoding instead of LZW.
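
Something like this, say (count-prefixed runs; the exact framing is
only illustrative):

    def rle_encode(data):
        # Each run of identical bytes becomes a (count, value) pair,
        # with the count capped at 255 so it fits in one byte.
        out = bytearray()
        i = 0
        while i < len(data):
            run = 1
            while i + run < len(data) and data[i + run] == data[i] and run < 255:
                run += 1
            out += bytes([run, data[i]])
            i += run
        return bytes(out)

    def rle_decode(data):
        # Inverse of the above: expand each (count, value) pair.
        out = bytearray()
        for i in range(0, len(data), 2):
            count, value = data[i], data[i + 1]
            out += bytes([value]) * count
        return bytes(out)

A few lines each way, and the whole scheme can be described in a
single paragraph kept right inside the archive.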

> Hence keeping the archive small would seem sensible so that it can be
> left as-is without any compression. My wild guesstimate on archive size
> would be to aim for 110 - 120% of the raw data size if possible...
>
> But the data describing all aspects of the disk image would be readable by a
> human; it's only the raw data itself that wouldn't be - for both
> efficiency and ease of use. The driving force for having
> human-readable data in the archive is so that it can be reconstructed at
> a later date, possibly without any reference to any spec, is it not? If
> it was guaranteed that a spec was *always* going to be available, having
> human-readable data at all wouldn't make much sense as it just
> introduces bloat; a pure binary format would be better.
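
To make that concrete, such an archive might look something like the
following. Every field name here is invented, purely for illustration:

    format:      example-archive 1.0
    media:       5.25in DSDD floppy, 40 cylinders, 2 heads, 9 sectors/track
    sector-size: 512
    encoding:    hex          ; sector data is two ASCII chars per byte
    compression: none

    [cyl 0, head 0, sector 1]
    EB3C904D53444F53352E30...   ; truncated here for illustration

Everything above the data lines reads as plain text; only the sector
dumps themselves need decoding.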

A human-readable form is not going to hit your goal of
110 - 120% unless there is SOME form of compression.
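
To put numbers on it: hex is two characters per byte, so 200% of the
raw size before any metadata, and even Base64 runs about 133% (four
characters per three bytes). Getting from 200% down to 110 - 120%
means the encoded text must compress to roughly 55 - 60% of its own
size, which RLE alone will only manage where the media has long runs
of identical bytes (blank or freshly formatted sectors, say).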