Archive

Posts Tagged ‘Backup’

Backup files of multiple users with rsync

December 17th, 2008

root problem with rsync

Imagine that you want to back up the /home directory of server ‘A’ to server ‘B’ using rsync.

There are two ways to do this:

  • You can run rsync on server ‘A’, but to back up the files correctly (I mean, with the right uid/gid/permissions on the backed-up files) you would have to connect to server ‘B’ as root. I’m sure you don’t want to do that.
  • You can run rsync on server ‘B’, but then you have to connect to ‘A’ as a user that can read every file in /home. This can get complicated depending on your uid/gid management.

When tar starts to be your best friend

So what can you do? The solution is to store the (uid/gid/permission/...) information in a dedicated file, so that you can re-apply it if you ever need to restore the data.
How? I’m sure you are too lazy to write a shell/perl/python/... script for that. You’re right! Use tar.
What? What? You want me to tar /home and rsync it? Are you mad? I don’t use rsync just to end up transferring 20 GB at every backup.

When 1 option and 2 lines can save you

Tar has an incremental option: you make a first tar file of /home, then a second tar file containing only the files modified since the previous one. The option is -g (--listed-incremental).
Here is a two-line shell script that does the job:

gtar -g /var/backup/home/home-backup.snar -cpvzf /var/backup/home/home-backup.`/bin/date +%s`.tgz /home/
rsync --delay-updates -avz -e ssh /var/backup/home backupuser@'B':/var/backup/

--delay-updates is very important: without it, if ‘A’ crashes while rsync is copying the .snar file (which stores the incremental state), that file will be missing on ‘B’ and you won’t be able to restore the tar files correctly.
-g only exists in GNU tar. You may have to install it if you’re running *BSD; first check whether you already have a gtar binary.
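
For the record, restoring from such a chain just means extracting the archives in order, oldest first, with -g pointed at /dev/null so the snapshot file is left untouched. A rough sketch, assuming the same paths as the script above:

# Restore the full backup, then each incremental, oldest first (GNU tar).
cd /
for archive in `ls /var/backup/home/home-backup.*.tgz | sort -t. -k2 -n`; do
    gtar -g /dev/null -xpzvf "$archive"
done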

Categories: Backup, Unix

Remote Web Site Incremental Backup

December 17th, 2008

Problem: you have a web site on a server you can only reach over FTP (no SSH), you want to back it up daily, but the site is huge, too huge to make a complete backup every day.

My solution is to put a PHP script on the web server that lists the files modified in the last X seconds.
A shell script then fetches the output of that page and retrieves each listed file over FTP.

This is, I think, the most generic solution. I can see two other approaches:

  • You could create a tar archive from within the PHP script and download the .tar directly. This has two problems: it is less secure unless you can put an .htaccess auth on the .tar file, and you may not be allowed to execute a binary on the remote server to build the tar.
  • You could also use FTPFS, which is nice, but it requires that the host making the backup can use ftpfs 🙂 (a rough sketch follows this list)
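
For the FTPFS route, a minimal sketch using curlftpfs could look like this (the host, credentials and paths below are placeholders, not part of my setup):

# Mount the remote FTP account locally, mirror it, then unmount.
mkdir -p /mnt/domain.com
curlftpfs ftp://login:password@ftp.domain.com /mnt/domain.com
rsync -av /mnt/domain.com/ /some/path/backup/domain.com/mirror/
fusermount -u /mnt/domain.com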

This is the source of my shell script:

#! /bin/sh

BK_YEAR="`date +%Y`"
BK_MONTH="`date +%b`"
BK_DAY="`date +%d`"

# Fetch the list of recently modified files from the PHP page (behind basic auth),
# then retrieve each file over FTP, preserving its relative path.
for file in `lynx -auth=login:password -source http://www.domain.com/admtool/last-modified.php`
do
    ADL="`echo $file | sed "s|/path/to/www.domain.com/root/document/or/the/ftp/chroot/path/||"`"
    mkdir -p "/some/path/backup/domain.com/$BK_YEAR/$BK_MONTH/$BK_DAY/web-page/`dirname $ADL`"
    wget -q --output-document="/some/path/backup/domain.com/$BK_YEAR/$BK_MONTH/$BK_DAY/web-page/$ADL" -r "ftp://login:pass@ftp.domain.com/$ADL"
done

tar -czf "/some/path/backup/domain.com/$BK_YEAR/$BK_MONTH/$BK_DAY/web-page.tgz" "/some/path/backup/domain.com/$BK_YEAR/$BK_MONTH/$BK_DAY/web-page"
rm -r "/some/path/backup/domain.com/$BK_YEAR/$BK_MONTH/$BK_DAY/web-page"
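
To make the backup daily, just schedule the script with cron; for example (assuming you saved it as /usr/local/bin/website-ftp-backup.sh, a name of your choosing):

# Run the FTP backup every night at 03:30.
30 3 * * * /usr/local/bin/website-ftp-backup.sh > /dev/null 2>&1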

This is the source of my PHP script:

<?php

$path = '.';
$time = 86400;      // list every file modified in the last 86400 seconds

GetFileList($path, $time);

function GetFileList($path, $time) {
        $curtime = time();
        $handle = opendir($path);
        while (($file = readdir($handle)) !== false) {
                if ($file == '.' || $file == '..') continue;
                if (is_dir($path . '/' . $file)) {
                        // Recurse into subdirectories.
                        GetFileList($path . '/' . $file, $time);
                } else {
                        // Print the path of any file modified in the last $time seconds.
                        if (($curtime - filemtime($path . '/' . $file)) < $time) {
                                echo $path . '/' . $file . "\n";
                        }
                }
        }
        closedir($handle);
}

?>
Categories: Backup, Unix

Backup through SSH

December 17th, 2008

Why use SSH to transfer data when you can use SCP/SFTP?
Because sometimes SCP is disabled in the SSH configuration.

So here is an easy way to transfer data with ssh. Run something like this:

ssh -C <host> "cd /path/to/folder/to/transfer; tar cvf - *" | tar xvf -

-C compresses the transfer with gzip at the SSH level. I didn’t run any test to see whether it’s better to compress there or to compress the tar itself with tar cvzf; if you do, please share the results!
“| tar xvf -” extracts the files into your current directory; of course, you may prefer to keep them in a .tar file instead.
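
If you do want to keep the archive, just redirect the stream into a file instead of piping it into a second tar. A small sketch (the host and paths are placeholders; the z makes tar do the compression, so -C is dropped to avoid compressing twice):

# Keep the remote data as a compressed tar file instead of extracting it.
ssh <host> "cd /path/to/folder/to/transfer; tar czf - ." > backup-`date +%Y%m%d`.tgz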

Categories: Backup, Unix