[lug] File Compression Question
D. Stimits
stimits at idcomm.com
Sat Feb 3 23:38:01 MST 2001
Hugh Brown wrote:
>
> if you compress the files individually, you can also use zless to browse
> them.
This is what I prefer...it doesn't work on executables, of course. As an
addition, check out zcat, bzless, and bzcat as well. It's useful to make
aliases with "less -r -M...." variations. The bz versions are for bzip2
(and bunzip2), which compresses considerably better than gzip does.
And if you want the best compression possible, use the "-9" option with either:
gzip -9 somefile.txt
bzip2 -9 somefile.txt
One source code package I recently distributed went from over 10 meg down
to just over 1 meg, and part of that space is icon images, which don't
compress well at all. That was using bzip2 -9.
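For example (the file name here is just for illustration):

    bzip2 -9 notes.txt                  # leaves notes.txt.bz2 in place of notes.txt
    bzless notes.txt.bz2                # read it without decompressing to disk
    bzcat notes.txt.bz2 | grep -i foo   # or search it in place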
>
> Hugh
>
> Tkil wrote:
> >
> > >>>>> "David" == David Morris <boru at frii.com> writes:
> >
> > David> I have a few directories filled mostly with text files.
> > David> Combined, I have about 800 MB to 900 MB of plain text files
> > David> that can be compressed down to 200 MB or so at best.
> >
> > David> I was just wondering if LINUX offers any way to compress
> > David> files, but still allow read access to the files...though at a
> > David> slower rate, of course. Sort of like the native compression
> > David> available in NTFS...not the best compression in the world, but
> > David> it can at times save a huge amount of space on the hard disk.
> >
> > there have been various compressed filesystems made for linux. do a
> > search for "e2compr", for instance; also, "cramfs" has some of these
> > attributes, but it's for a very different purpose. it doesn't look
> > like e2compr has been worked on much lately, tho. :(
> >
> > my personal tactic for these is to simply put them into tar.gz files,
> > and then use emacs (which can uncompress and un-tar on the fly) to
> > browse them when needed.
> >
> > it becomes a little more cumbersome to do the equivalent of a "grep"
> > in this situation, but it's much more lightweight than any filesystem
> > could be. also, you get much better compression on a single tar file
> > than you would on block-sized chunks (as with any sane compressed FS),
> > and it will definitely be higher than most small files could compress
> > on their own.
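A rough way to get that "grep" without unpacking anything to disk, assuming
GNU tar and a made-up archive name:

    tar tzf docs.tar.gz                        # list what's in the archive
    tar xzOf docs.tar.gz | grep -i 'pattern'   # stream every member through grep

You lose the file names in the output that way, but it works for a quick look.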
> >
> > t.
> >