[SGVLUG] How to split a "big backup" [that uses tar]

Michael Proctor-Smith mproctor13 at gmail.com
Thu Sep 29 11:28:28 PDT 2005


On 9/29/05, Emerson, Tom <Tom.Emerson at wbconsultant.com> wrote:
> > -----Original Message-----
> > Behalf Of Michael Proctor-Smith
> >
> > tar -cf - STUFFGETSCOMPRESSED | gzip -c | split -b 2048m - backup
>
> It took a moment to think about it, but just to be certain, would "recovering a file" from this entail the following:
>
>    cat backup* | gunzip | tar -xf - <files-to-restore>
>
> I'm still a little fuzzy on whether some "temporary" file might be involved (which might run into the 4GB file size limit), or is it the nature of "pipes" that these programs actually work "concurrently", so that the number of bytes "in the queue" remains trivially small?

Yeah, that is the power of pipes: tar writes to stdout, gzip reads
from stdin and writes to stdout, and split reads from stdin and
writes to files. So only the output from split ever hits the disk;
each pipe buffers just a few kilobytes at a time, so no large
temporary file is ever created.
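To see it in action, here is a minimal round-trip sketch; the path
/home/user/data and the 2048m chunk size are just example values,
not from the original mail:

   # create, compress, and split in one pass; no intermediate file
   tar -cf - /home/user/data | gzip -c | split -b 2048m - backup
   # the only things written to disk are the split pieces
   ls -lh backup*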

Well, to restore everything it would be:

cat backup* | gunzip | tar -xf -
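If you only want certain files back, you can list the contents first
and then name the members to extract. A quick sketch (the member path
path/to/file is just an illustration, not from the original mail):

   # see what is in the archive without extracting anything
   cat backup* | gunzip | tar -tf -
   # restore just one file; the path must match what tar -t printed
   cat backup* | gunzip | tar -xf - path/to/file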

I have to ask: what kernel is running on this server? 2.4 and above
should not have a 4GB file size limit, unless it is a FAT32
filesystem. Or is it that he cannot access the files from a Windows
machine?
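Two quick checks would answer both questions (df -T prints the
filesystem type; the mount point /backup is just an example):

   # kernel version
   uname -r
   # filesystem type where the backup pieces land
   df -T /backup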

