I use the simpler RCS and mail a tarred copy of the RCS directory to one of my Gmail accounts.
On OpenBSD that could be automated with a /etc/rc.shutdown script.
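A minimal sketch of that, assuming the project lives in /home/user/project and the box is already able to deliver outgoing mail to Gmail -- the path and address are only placeholders:

Code:
# Appended to /etc/rc.shutdown: tar up the RCS directory and mail it out.
tar czf - /home/user/project/RCS | \
    uuencode rcs-backup.tar.gz | \
    mail -s "RCS backup $(hostname) $(date +%Y%m%d)" someone@gmail.com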
__________________
You don't need to be a genius to debug a pf.conf firewall ruleset, you just need the guts to run tcpdump
One utility you might want to look at is rsync. I've yet to employ it myself, but it might be useful. I have a similar problem to the one you've brought up, Oko, and have been using a CVS+SSH solution for months across OpenBSD, FreeBSD, and Windows XP machines. Over the years, I've made use of scp, tar, cvs, nfs, samba, and sshfs.

For small projects, I usually use a simple tar & scp operation to move them from workstation to file server, and then pull them from the next machine that needs them. Important stuff now gets stored under revision control, either locally or committed to a central repository at home. Any data that I carry around with me regardless of machine (like my vimrc) gets placed under CVS: local modifications are tested, then committed back to the repository on the OpenBSD box, and the other machines periodically update from it. Anything that isn't suitable for use on all machines has a date with the m4 preprocessor or with symbolic links right after checkout. Conflicts are, of course, dealt with in the usual manner.

There are many version control systems available. If you have any intention of shopping around for one, bzr, svn, and git are worth checking out, especially if you'll be dealing with conflicting changes. My file server runs OpenBSD, which ships with CVS because that is what the OpenBSD project uses, so that is what my personal repository runs -- I can live with CVS perfectly well under that condition. All physical data is of course backed up separately from the repository to avoid a single point of failure. It is also possible to compress the CVS repository and bring a copy along via e-mail or a flash drive, which makes working on an important project much nicer. With a setup like git, every machine that has pulled a copy of the files in effect becomes a living backup of your repository.

Trust me -- using a VCS/SCM is worth it. The advantages you mention are there, and the disadvantages mostly depend on the software you choose and the precise situation you face. Getting the current copy of the files is trivial, as long as you've committed it and can access the repository. Getting older versions of a file, or of all the files, is pretty much the point of such software, not just a side benefit. You also gain the ability to maintain a log of your changes, which you wouldn't have with a file synchronization or network mounting solution. File sync tools like rsync, or network mounts like SSHFS or NFS, help you avoid maintaining current copies by hand, but they are painful during times without network access to your server. Things like RCS and CVS help deal with files that will change, need history, and may have multiple versions floating around.

The rcs program in the base system is a very basic version control system (VCS), but IMHO it is less well suited to files that are in regular use. You could think of rcs, cvs, and svn (Subversion) as the ed, ex, and vi of their problem domain. Everything tools like CVS do can be emulated with the file system alone, but then nothing protects you from getting lazy, or from "dang, I forgot to ...", the way CVS does. Distributed source code management systems (DSCM), or SCM systems capable of distributed work flows (git, Bazaar, Perforce), can also be handy, especially if you want to keep a file set under version control on your own machine while playing around, then push the entire set of commits to a central repository once you are sure that what you are committing is what you want.
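As a rough illustration of that last workflow -- a sketch only, with the repository path, host name, and file names made up -- the cycle with git might look something like this:

Code:
# Work locally with full history; no network is needed to commit.
cd ~/projects/notes
git init
git add notes.txt
git commit -m "first pass at the notes"     # recorded locally only
# ...more edits and commits while offline...
git commit -am "rework the second section"

# Later, when the home server is reachable, publish the whole series.
git remote add origin ssh://user@homeserver/home/git/notes.git
git push origin master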
A good system also makes it easy to pick and choose which parts you want to commit; some will even let you 'rewrite history' before pushing it out, so to speak. Using an SCM this way is also much better suited to situations where you have to live without network access to your home server.
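For what it's worth, a bare-bones sketch of the CVS-over-SSH dotfile routine described above -- the host name, repository path, and module name are all made up:

Code:
# On any machine: point CVS at the repository on the home server.
export CVS_RSH=ssh
export CVSROOT=user@homeserver:/home/cvs

cvs checkout dotfiles         # first time on a new machine
cd dotfiles
vi vimrc                      # test a local change...
cvs commit -m "tweak vimrc"   # ...then record it on the server

# On the other machines, periodically:
cvs update -dP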
__________________
My Journal
Thou shalt check the array bounds of all strings (indeed, all arrays), for surely where thou typest ``foo'' someone someday shall type ``supercalifragilisticexpialidocious''.
Last edited by TerryP; 2nd February 2009 at 05:55 AM. Reason: spell check
Quote:
However, if mere synchronization is the fundamental problem, then setting up NFS is even simpler; you also don't have to worry about committing the local sandbox copy all the time, because with NFS whatever is being edited is the only copy.
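If that route appeals, the setup on an OpenBSD file server of that era is small -- a sketch only, with made-up paths, host names, and network range:

Code:
# On the server: export the directory in /etc/exports ...
#   /home/projects -network 192.168.1.0 -mask 255.255.255.0
# ... and enable the NFS daemons in /etc/rc.conf.local:
#   portmap=YES
#   nfs_server=YES

# On each client: mount it and edit the files in place.
mount -t nfs fileserver:/home/projects /mnt/projects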