Everything depends on the level of access you have, and the level of backups you want.
For me, at home it is often a live CD (if dumping /), followed by something like dump [args] -f - what | ssh -i keyfile user@host 'cat > /srv/Backups/host/what-YYYY-MM-DD.dump', with gzip or another compressor on the fastest side. Once a year the backups get burned to CD-R and stuffed in a shoe box at a safer location. Web systems I need to care for from afar usually get done by sftp'ing a backup to a server on the other side of the planet.
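To make the pipe shape of that one-liner concrete, here's a sketch that uses tar standing in for dump and a local directory standing in for the remote host (the real command would pipe into ssh as above; all paths here are made-up demo paths):

```shell
# Same pipeline shape as the dump-over-ssh backup: archiver writes to
# stdout, compressor sits in the middle, the far end just cats to a file.
mkdir -p /tmp/demo_src /tmp/demo_dst
echo "hello" > /tmp/demo_src/file.txt

# tar plays the role of dump here; 'cat > file' plays the role of the
# remote 'ssh -i keyfile user@host cat > ...' side.
tar -cf - -C /tmp/demo_src . | gzip \
    | cat > "/tmp/demo_dst/src-$(date +%F).tar.gz"

# Sanity-check the resulting archive.
gzip -t /tmp/demo_dst/src-*.tar.gz
```

Putting the compressor on the faster machine matters because gzip, not the disk or the network, is usually the bottleneck in a pipe like this.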
Whether permissions/ownership are retained depends on the software you use, and how you use it. I believe OpenBSD's file sets are installed by extracting tarballs over /, preserving permissions.
I've always been partial to tar.
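A quick illustration of tar retaining permissions on extraction (demo paths are made up; note that restoring ownership, as opposed to mode bits, requires extracting as root):

```shell
# Create a file with a non-default mode.
mkdir -p /tmp/tar_demo/src
echo "config" > /tmp/tar_demo/src/app.conf
chmod 640 /tmp/tar_demo/src/app.conf

# Archive it; modes are recorded in the archive by default.
tar -cf /tmp/tar_demo/backup.tar -C /tmp/tar_demo/src .

# Extract with -p (--preserve-permissions) so the archived mode is
# restored rather than being filtered through the umask.
mkdir -p /tmp/tar_demo/restore
tar -xpf /tmp/tar_demo/backup.tar -C /tmp/tar_demo/restore
```

After extraction, /tmp/tar_demo/restore/app.conf comes back with mode 640, which is the property that makes "untar over /" restores work.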
__________________
My Journal
Thou shalt check the array bounds of all strings (indeed, all arrays), for surely where thou typest ``foo'' someone someday shall type ``supercalifragilisticexpialidocious''.