One of my customers is hosted at a crappy hosting provider that I do not trust at all. In fact, I have seen changes I made to the website get reverted a couple of days later.
To make sure no data on the FTP server is ever lost, I wrote a script that backs it up without wasting too much bandwidth or disk space. The script is based on the same principle rsnapshot uses: hardlinks and rotation.
#!/bin/bash
# Run this from the backup root (e.g. /backups), where the numbered snapshot
# directories 1, 2, 3, ... live.

# Rotate old snapshots: 100 -> 101, 99 -> 100, ..., 2 -> 3
for i in $(seq 100 -1 2); do
    if [ -d "$i" ]; then
        echo mv "$i" "$((i+1))"
        mv "$i" "$((i+1))"
    fi
done

# Hardlink-copy the newest snapshot into slot 2
echo cp -al 1 2
cp -al 1 2

HOST="type-hostname-here.com"
USER="type-username-here"
PASS="type-password-here"
LCD="/backups/1"
RCD="/remote/path/httpdocs"

mkdir -p "$LCD"
lftp -c "set ftp:list-options -a;
set ftp:ssl-force;
open ftp://$USER:$PASS@$HOST;
lcd $LCD;
cd $RCD;
mirror --verbose \
       --delete \
       --exclude-glob __old \
       --exclude-glob phpmyadmin"
In this example the directories __old and phpmyadmin are not copied. What the script does is move directory 100 to 101, then 99 to 100, 98 to 99, and so on, until 2 is moved to 3. It then hardlinks directory 1 to 2 with cp -al. This way, a 100 MB file that is never modified can exist in all 100 directories while occupying only a single 100 MB block of disk space.
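The hardlink trick can be demonstrated on any Linux box with GNU coreutils; the directory names and file below are made up for the demonstration:

```shell
# Hypothetical demonstration of how cp -al shares disk space.
# Run this in an empty scratch directory.
mkdir -p 1
dd if=/dev/zero of=1/big.bin bs=1M count=1 2>/dev/null  # stand-in for a 1 MB site asset
cp -al 1 2                      # "snapshot": directory 2 is all hardlinks, no data is copied
ls -li 1/big.bin 2/big.bin      # same inode number and a link count of 2: one copy on disk
```

Since both directory entries point at the same inode, deleting one of them later frees no space; the data disappears only when the last snapshot referencing it is rotated out.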
Finally, the script uses lftp to download all modified files from the remote FTP server. Luckily, lftp doesn't open a local file and modify its contents in place: remotely modified files are first unlinked locally, then re-downloaded. This way, lftp does not interfere with the hardlink scheme.
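Why unlink-then-recreate keeps old snapshots intact can be shown with plain shell, no lftp required; the file name below is made up:

```shell
# Sketch of the unlink-then-recreate behavior that lftp relies on.
# Run this in an empty scratch directory.
mkdir -p 1
echo "version 1" > 1/index.html
cp -al 1 2                       # snapshot: 2/index.html is a hardlink to 1/index.html
rm 1/index.html                  # what lftp effectively does before re-downloading
echo "version 2" > 1/index.html  # the "freshly downloaded" file gets a new inode
cat 2/index.html                 # the old snapshot still reads "version 1"
```

Had the file been opened and overwritten in place instead, both directory entries would have pointed at the same modified inode, silently corrupting every older snapshot.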
This method does NOT back up your database. Don't forget to back up your database!

© GeekLab.info — "Backup your website over FTP" is a post from GeekLab.info. You are free to copy materials from GeekLab.info, but you are required to link back to http://www.geeklab.info
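If the site uses MySQL, one option is to dump the database into the newest snapshot directory before (or instead of) the FTP mirror, so the dump rotates along with the files. This is only a sketch: the hostname, credentials, and database name are placeholders, and it assumes mysqldump is available:

```shell
# Hypothetical companion step; db.example.com, dbuser, dbpass and mydb are placeholders.
# --single-transaction gives a consistent dump of InnoDB tables without locking them.
mysqldump --single-transaction -h db.example.com -u dbuser -pdbpass mydb \
    | gzip > /backups/1/mydb.sql.gz
```

Because the dump lands in /backups/1, each rotation keeps an older compressed copy of the database alongside the matching files.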