[lug] File Compression Question

Tkil tkil at scrye.com
Sat Feb 3 01:25:32 MST 2001


>>>>> "David" == David Morris <boru at frii.com> writes:

David> I have a few directories filled mostly with text files.
David> Combined, I have about 800 MB to 900 MB of plain text files
David> that can be compressed down to 200 MB or so at best.

David> I was just wondering if Linux offers any way to compress
David> files, but still allow read access to the files...though at a
David> slower rate, of course.  Sort of like the native compression
David> available in NTFS...not the best compression in the world, but
David> it can at times save a huge amount of space on the hard disk.

there have been various compressed filesystems made for linux.  do a
search for "e2compr", for instance; also, "cramfs" has some of these
attributes, but it's aimed at a very different purpose (read-only
compressed images for embedded systems and boot media).  it doesn't
look like e2compr has been worked on much lately, tho.  :(
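if you do get an e2compr-patched kernel going, my understanding is
that you flag files or directories with the standard ext2 "compress"
attribute; i haven't run e2compr myself in a while, so take this as a
sketch rather than gospel (the path is made up):

    # mark a directory so files created in it get compressed
    chattr +c /data/textfiles

    # see which entries have the 'c' attribute set
    lsattr /data/textfiles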

my personal tactic for data like this is simply to put it into tar.gz
files, and then use emacs (which can uncompress and un-tar on the
fly) to browse the contents when needed.
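concretely, something like this (archive and directory names made up
for illustration):

    # roll the directories of text files into one compressed archive
    tar czf textfiles.tar.gz textfiles/

    # emacs shows the tar listing transparently; hit RET on an entry
    # to view that member (you may need auto-compression-mode enabled)
    emacs textfiles.tar.gz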

it becomes a little more cumbersome to do the equivalent of a "grep"
in this situation, but it's much more lightweight than any filesystem
could be.  also, you get much better compression on a single tar
stream than on block-sized chunks (which is all any sane compressed
FS can work with, since it has to support random access), and
certainly better than most small files would manage if compressed
individually.
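for the grep case, GNU tar's -O (--to-stdout) flag gets you most of
the way in one pipe, at the cost of losing track of which member
matched; same made-up names as above:

    # stream every member to stdout and grep the whole lot
    tar xzOf textfiles.tar.gz | grep -i 'pattern'

    # slower (re-reads the archive once per member, and assumes no
    # spaces in filenames), but tells you which files matched
    for f in $(tar tzf textfiles.tar.gz); do
        tar xzOf textfiles.tar.gz "$f" | grep -q 'pattern' && echo "$f"
    done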

t.



