[lug] ssh compression

D. Stimits stimits at idcomm.com
Wed May 1 12:56:13 MDT 2002


"Sexton, George" wrote:
> 
> The best solution I have found for things like this is to write a cron job
> that bzips the data, pipes the output through gpg, and then use FTP to move
> the encrypted data. You could use NCFTPGET at the destination site to
> retrieve the file. I move a client's SQL Database to my site once a week
> this way. 2.5GB compresses down to 350MB. I also do it early in the morning,
> so it doesn't use bandwidth during production hours.
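The pipeline George describes might look something like this; every path, host name, user, and key here is a made-up example, and the recipient's public key is assumed to already be in the gpg keyring:

```shell
#!/bin/sh
# Sketch of the nightly job: compress, encrypt, ship by plain FTP.
# All paths, hosts, and credentials below are illustrative only.
DUMP=/var/backups/client-db.sql
OUT=/var/backups/client-db.sql.bz2.gpg

# bzip2 -9 for best compression; gpg encrypts to the backup key.
bzip2 -9 -c "$DUMP" | gpg --batch --yes --encrypt \
    --recipient backups@example.com --output "$OUT"

# Move the encrypted file by FTP (ncftpput is from the same NcFTP
# suite as the ncftpget George mentions for the receiving end).
ncftpput -u backupuser -p 'secret' ftp.example.com /incoming "$OUT"
```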

One advantage here is that bzip2 compresses better than ssh's built-in
compression, so any ssh tunneling should then have compression turned
off. bzip2 with -9 (or even just -7) compresses very well, and a cron
job can run it at a low nice level to cut down how much CPU it uses,
whereas you might run into problems if you renice the ssh process
itself to a very low priority. Separating compression from transmission
is appealing for batch processes.
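As a minimal sketch of that separation (file names and host are illustrative, not from the original posts): pre-compress under a high nice value, then transfer with ssh-layer compression explicitly disabled, since recompressing bzip2 output just wastes CPU on both ends:

```shell
#!/bin/sh
# Compress at the lowest CPU priority so production work is unaffected;
# the path and remote host are made-up examples.
nice -n 19 bzip2 -9 /var/backups/mailspool.tar

# Transfer with ssh compression turned off -- the data is already
# bzip2-compressed, so ssh's zlib pass would gain nothing.
scp -o Compression=no /var/backups/mailspool.tar.bz2 \
    backup@remote.example.com:/srv/backups/
```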

D. Stimits, stimits at idcomm.com

> 
> -----Original Message-----
> From: lug-admin at lug.boulder.co.us [mailto:lug-admin at lug.boulder.co.us]On
> Behalf Of Hugh Brown
> Sent: 01 May, 2002 10:04 AM
> To: LUG
> Subject: [lug] ssh compression
> 
> I am doing an over-the-net backup of a mail server through an ssh tunnel.
> The bandwidth it is consuming is killing me.  Is there any way to
> throttle it down?  Would using the compression option in ssh help me or
> hurt me (the man page says it would slow things down, but I don't know if
> that means less bandwidth with a longer download, or the same bandwidth
> and a longer download)?
> 
> This is being done over a 1Mbps DSL line (on both ends).
> 
> TIA,
> 
> Hugh


