[SGVLUG] How to split a "big backup" [that uses tar]

Emerson, Tom Tom.Emerson at wbconsultant.com
Thu Sep 29 10:06:35 PDT 2005


My reluctant-sysadmin brother-in-law has run into a problem: one of his backups seems to be hanging, and he tells me there are over 4GB of data being backed up.  The backup method is tar (which is then gzip'd), and the resulting file is physically copied to another system.  The other hitch (besides the 4GB+ of data) is that he claims there are some 5,000+ files, "all in a single directory" (which I find hard to believe), so he cannot re-factor the backup into, say, one run for directories beginning with "a" through "m" and another for "n" through "z".
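My first thought was that the hang might be the old 2GB/4GB large-file limit biting somewhere, and that piping tar through split would sidestep it by never creating one huge file.  Something like this, completely untested -- /share/public, /backup, and the 1GB chunk size are just placeholders I made up:

    # create the archive on stdout, compress it, and split into ~1GB pieces
    tar cf - /share/public | gzip | split -b 1000m - /backup/public.tgz.part-

    # to restore, concatenate the pieces back together first
    cat /backup/public.tgz.part-* | gunzip | tar xf -

The nice part is that split doesn't care what's inside the stream, so the single-directory problem doesn't matter here.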

The system is acting as a Windows file server, and this directory is the "catch-all" directory for everyone on the network (which is why I find it hard to believe there are no subdirectories -- human nature being what it is, people using this resource would naturally want to separate "their" files from "everyone else's" files, and thus create a folder to establish their own boundary...)  BUT, that is the claim he is making, so I'm sticking to that story for the moment.

Any (other) suggestions on how to use tar to back up 4GB+ of data?  (I could get the details of the actual command he is using, if that would help.)
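The best alternative I've come up with so far is to split the file *list* rather than the directory tree, and tar each chunk separately.  Again untested, and it assumes GNU tar's -T option and no newlines embedded in the file names:

    # break the 5000-file directory into lists of 1000 names each
    find /share/public -maxdepth 1 -type f -print > /tmp/filelist
    split -l 1000 /tmp/filelist /tmp/chunk.

    # tar up each chunk as its own archive
    for f in /tmp/chunk.*; do
        tar czf /backup/$(basename $f).tgz -T $f
    done

That would get him several smaller archives without anyone having to create subdirectories.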

For that matter, any suggestions on a (better?) way to do backups altogether?  [Yeah, I know of Mike's rsync method, but this is the same guy who can't wait to get rid of the "linux" server and replace it with a "windows" server because he is more comfortable with a GUI, and the previous sysadmin was a dyed-in-the-wool servers-are-always-command-line type.]
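For reference, the rsync approach I'd pitch to him would be roughly this (backuphost is a made-up name for the machine the copies currently land on, and I'm assuming rsync and ssh are available on both ends):

    # mirror the share to the backup host; only changed files get sent
    rsync -av --delete /share/public/ backuphost:/backups/public/

Since rsync only transfers what changed since the last run, the nightly window should shrink dramatically -- but I suspect the lack of a GUI will be a hard sell.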


