Let's say I have two *nix servers: Server1, which hosts many websites, and Server2, which I want to use as an offsite backup.
I am trying to work out the best backup method. Obviously I could write a script to compress each user's home directory to a tgz and then FTP it to Server2, but that would probably take quite a while with, say, 100+ accounts.
Or should I somehow diff all the files and FTP only the new/changed ones, and not bother compressing?
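The "only transfer what changed" idea can be sketched in plain shell using a timestamp marker file and `find -newer`; the directory layout below is a self-contained demo, not your actual `/home`:

```shell
# Sketch: list only files changed since the last backup run, using a
# timestamp marker file. All paths here are demo placeholders.
WORK=$(mktemp -d)
SRC="$WORK/home"; STAMP="$WORK/.last-run"
mkdir -p "$SRC"
echo one > "$SRC/a.txt"

# First run: no marker yet, so everything counts as changed.
if [ -f "$STAMP" ]; then
    find "$SRC" -type f -newer "$STAMP"
else
    find "$SRC" -type f
fi > "$WORK/changed.txt"
touch "$STAMP"

# Later: a new file appears; only it is newer than the marker.
sleep 1
echo two > "$SRC/b.txt"
find "$SRC" -type f -newer "$STAMP" > "$WORK/changed2.txt"
# changed2.txt lists only b.txt; that list could then be tarred and sent:
#   tar czf - -T "$WORK/changed2.txt" | ssh server2 'cat > incr.tgz'
```

In practice rsync (mentioned in an answer below) does this comparison for you, file by file and even block by block, so you rarely need to hand-roll it.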
For starters, the backup site should connect to the main site if you care about security: if the main server can automatically connect to the backup server, then any compromise of the main server also compromises your backups.
There are plenty of programs that can automatically figure out what changed since the last run and back things up intelligently. I use a program called duplicity for my personal backups (desktop -> home server). I definitely would not back everything up every time, since that's incredibly wasteful and can become prohibitive once you have enough data being backed up.
There are other programs that can do incremental backups with periodic full backups. This lets you delete old backups from time to time to save disk space. It's what we do on our family server, backing up to one of my servers: a full backup every month, and an incremental backup every day. I don't know offhand what program my father used to set this up, though.
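One common way to implement that monthly-full/daily-incremental scheme (not necessarily what our server uses) is GNU tar's snapshot files; the directories below are a throwaway demo:

```shell
# Sketch of full + incremental backups with GNU tar snapshot files.
# Paths are demo placeholders created on the fly.
WORK=$(mktemp -d)
DATA="$WORK/data"; mkdir -p "$DATA"
echo jan > "$DATA/report.txt"

# "Monthly" full backup: a fresh snapshot file means everything is included.
tar czf "$WORK/full.tgz" -g "$WORK/snapshot" -C "$WORK" data

# "Daily" incremental: reusing the snapshot file archives only changes.
echo feb > "$DATA/new-report.txt"
tar czf "$WORK/incr.tgz" -g "$WORK/snapshot" -C "$WORK" data

# Restoring replays the full backup first, then each incremental in order:
#   tar xzf full.tgz -g /dev/null && tar xzf incr.tgz -g /dev/null
```

Rotating the snapshot file once a month effectively starts a new full backup, after which the previous month's chain can be deleted.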
In both cases, the dumps are fully compressed.
If you want the simple approach, I recommend duplicity. IIRC it can be set up to do periodic full backups, if nothing else by tweaking the cron job yourself.
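A cron setup along those lines might look like the following; the `duplicity full`, `incremental`, and `remove-all-but-n-full` subcommands are real, but the paths, target URL, and schedule are placeholders you would adapt:

```shell
# Hypothetical crontab on Server1, pushing to Server2 over SFTP.
# Full backup on the 1st of each month at 03:00:
0 3 1 * *    duplicity full /home sftp://backup@server2//backups/home
# Incremental backup every other day of the month at 03:00:
0 3 2-31 * * duplicity incremental /home sftp://backup@server2//backups/home
# Once a month, drop everything older than the last two full chains:
0 4 1 * *    duplicity remove-all-but-n-full 2 --force sftp://backup@server2//backups/home
```

Duplicity also encrypts and compresses its archives by default, which covers the compression question from the original post.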
I hear rsync is pretty nice. I've never taken the time to learn it well myself, but the rsync-ftpsync script used to power all the Debian mirrors uses rsync with restricted public-key logins. Safe, secure, and incremental.
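A restricted public-key pull setup (which also follows the "backup connects to main" advice above) might look roughly like this; the key, paths, and host names are placeholders, and `rrsync` is the restricted-rsync helper script shipped with rsync:

```shell
# On Server1 (the machine being backed up), limit what the backup key can
# do via ~/.ssh/authorized_keys -- one line, shown here as a comment:
#   command="rrsync -ro /home",no-pty,no-port-forwarding,no-agent-forwarding ssh-ed25519 AAAA... backup@server2

# On Server2, pull the data; rsync transfers only new/changed files:
rsync -az -e "ssh -i ~/.ssh/backup_key" server1:/home/ /backups/server1/home/
```

The `-ro` flag makes the key read-only on Server1, so even a compromised Server2 can only read, never modify, the originals.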