I don't know about you, but I just don't trust my web host with my website data. Over the past few years my website has disappeared quite a few times and I've had to upload a backup copy.
Backing up my website can be a pain and I'm lazy, so I don't do it as often as I should.
So I thought I'd look into a way of backing it up automatically.
My initial idea was to use a batch file and the built-in FTP client in Windows (XP), scheduling the batch file to run every week:
backupwebsite.bat:

@Echo Off
@Echo Deleting old copy
del BACKUP_DRIVE_LETTER:\Website
@Echo Backing up current version
ftp -s:backup.txt

backup.txt:

open MY_HOST
USERNAME
PASSWORD
lcd BACKUP_DRIVE_LETTER:\Website
mget *
quit

But it just hangs on the mget * command.
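For the weekly scheduling I was planning to create a task with the built-in schtasks tool, roughly along these lines (the task name, script path, day, and time below are just placeholders, and I'm not 100% sure of the exact time format XP expects):

schtasks /create /tn "WebsiteBackup" /tr "C:\Scripts\backupwebsite.bat" /sc weekly /d SUN /st 02:00:00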
Does anyone know what I'm doing wrong? Could someone possibly suggest another (free and simple) way to back up my website, please?
Cheers.