Zip, download, and unzip. That’s what I have been doing for more than 6 hours now…transferring data from one server to another…and the worst part is…the data size is 30 GB, and that too from one single site…
so tar -cvzf, then wget, then tar -xvzf
that’s what I am doing…it sucks…
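In case it helps picture it, the whole round trip is basically this (the server name and paths here are just placeholders, not the real ones):

    # on the old server: pack and compress the site into one archive
    tar -cvzf site-backup.tar.gz /var/www/mysite

    # on the new server: pull it over and unpack it
    wget http://old-server.example.com/site-backup.tar.gz
    tar -xvzf site-backup.tar.gz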
and it sucks even more when you zip 6 GB of data, go to download it on the other server, and only 2 GB of it comes down; you try to unzip it and that’s when it shows an error…grrr…you think there might be a problem with the download, so you download the file again…and again, the same thing…so now you think the file itself is corrupt…so you go to the old server…delete the old zip file…compress the data again…download it again…and the result…the same…f****ng 2 GB file…nothing works…so now it’s time to compress the files in smaller bunches…
this totally sucks…
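If I go the “smaller bunches” route, the plan is roughly this: split the finished archive into fixed-size chunks, pull them over one by one, and stitch them back together on the new server (the 500 MB size and filenames below are just an example):

    # on the old server: cut the big archive into 500 MB pieces
    split -b 500M site-backup.tar.gz site-backup.tar.gz.part_

    # on the new server: fetch every piece, reassemble, test, unpack
    cat site-backup.tar.gz.part_* > site-backup.tar.gz
    gzip -t site-backup.tar.gz && tar -xvzf site-backup.tar.gz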
This can’t be helping you get that much-needed sleep, now can it?
My condolences to you. Anyway, nice weblog, one of the very few that I bother visiting.
I slept at 6 in the morning and woke up just now…
btw glad to hear that you liked my blog 🙂
Regards
Deep
Did you ever figure out what the problem was with the Zip files? Did you try a different archive format?
– Sean
Actually, I couldn’t figure it out…I was doing gz compression…
I actually had 2 big files…this one was exactly 6 GB and the other one was around the same size, though I am not sure whether it was 5.xx or 6.xx GB.
The 1st file worked fine but the 2nd didn’t, so I am assuming the problem could be some limit on the max. number of files in an archive, or a limit on the size of the archive…
I did not bother to find out the reason because it was getting too late, and we had taken the sites down so that users would not update data on the old server…
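For next time, here is a rough sketch of how I could at least tell whether the copy or the archive itself is the problem (filenames are placeholders again): compare sizes and checksums on both ends before unpacking, and test the gzip stream.

    # on the old server
    ls -l site-backup.tar.gz
    md5sum site-backup.tar.gz

    # on the new server, after the download
    ls -l site-backup.tar.gz
    md5sum site-backup.tar.gz     # must match the old server’s sum
    gzip -t site-backup.tar.gz    # checks the gzip stream end to end

If the sizes already differ, the file never made it across in one piece (older wget builds without large-file support are known to give up around the 2 GB mark); if the sizes match but gzip -t fails, the archive itself is bad.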
Deep
Seems like RARing the gz file into hundreds of smaller pieces, and then putting them back together on the other end would work well. Rarlabs makes a Unix version that you can get from http://www.rarlab.com/download.htm.
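For anyone trying that route, rarlab’s command-line rar can create fixed-size volumes directly and unrar puts them back together on extraction; something like this (the 100 MB volume size and names are just an example):

    # on the old server: pack the big archive into ~100 MB RAR volumes
    rar a -v100m site-backup.rar site-backup.tar.gz

    # on the new server: fetch every volume, then extract from the first one
    # (volume names vary by rar version: site-backup.part1.rar, or site-backup.rar + .r00, .r01, ...)
    unrar x site-backup.part1.rar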