Reputation: 15559
Like most *nix people, I tend to play with my tools and get them configured just the way that I like them. This was all well and good until recently. As I do more and more work, I tend to log onto more and more machines, and have more and more stuff that's configured great on my home machine, but not necessarily on my work machine, or my web server, or any of my work servers...
How do you keep these config files updated? Do you just manually copy them over? Do you have them stored somewhere public?
Upvotes: 23
Views: 6507
Reputation: 18467
lra/mackup is a utility for Linux and Mac systems that will sync application preferences using almost any popular shared-storage provider (Dropbox, iCloud, Google Drive). It works by replacing the dot files with symlinks.
It also has a large library of hundreds of supported applications: https://github.com/lra/mackup/tree/master/mackup/applications
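For reference, typical usage is just `mackup backup` on the first machine and `mackup restore` on the others; the storage engine is chosen in ~/.mackup.cfg. A minimal example, assuming Dropbox as the provider:

```
# ~/.mackup.cfg -- minimal example configuration
[storage]
engine = dropbox
```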
Upvotes: 0
Reputation: 4072
briefcase is a tool to facilitate keeping dotfiles in git, including those with private information (such as .gitconfig).
By keeping your configuration files in a public git repository, you can share your settings with others. Any secret information is kept in a single file outside the repository (it's up to you to back up and transport this file).
Upvotes: 0
Reputation: 6948
I would definitely recommend homesick. It uses git and automatically symlinks your files. homesick track tracks a new dotfile, while homesick symlink symlinks new dotfiles from the repository into your home folder. This way you can even have more than one repository.
Upvotes: 5
Reputation: 1523
I put all my dotfiles in to a folder on Dropbox and then symlink them to each machine. Changes made on one machine are available to all the others almost immediately. It just works.
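A minimal sketch of the same idea, run here against scratch directories so it's safe to try; in real use, src would be your Dropbox dotfiles folder (e.g. ~/Dropbox/dotfiles) and dst would be $HOME:

```shell
src=$(mktemp -d)    # stands in for ~/Dropbox/dotfiles
dst=$(mktemp -d)    # stands in for $HOME
touch "$src/.bashrc" "$src/.vimrc"

# symlink every dotfile from the shared folder into the target directory
for f in "$src"/.[!.]*; do
  [ -e "$f" ] || continue                # skip if the glob matched nothing
  ln -sfn "$f" "$dst/$(basename "$f")"   # -f replaces any existing file/link
done

ls -A "$dst"
```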
Upvotes: 1
Reputation: 1603
I also use subversion to manage my dotfiles. When I login to a box my confs are automagically updated for me. I also use github to store my confs publicly. I use git-svn to keep the two in sync.
Getting up and running on a new server is just a matter of running a few commands. The create_links script just creates the symlinks from the .dotfiles folder items into my $HOME, and also touches some files that don't need to be checked in.
$ cd
# checkout the files
$ svn co https://path/to/my/dotfiles/trunk .dotfiles
# remove any files that might be in the way
$ .dotfiles/create_links.sh unlink
# create the symlinks and other random tasks needed for setup
$ .dotfiles/create_links.sh
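A hypothetical create_links.sh along the lines described above might look like this (a sketch, not the answerer's actual script; the touched filename is illustrative):

```shell
#!/bin/sh
# Sketch of a create_links script: symlinks everything from the svn checkout
# in $DOTFILES (default ~/.dotfiles) into $HOME; "unlink" mode clears the way.
DOTFILES="${DOTFILES:-$HOME/.dotfiles}"

create_links() {
  for f in "$DOTFILES"/.[!.]*; do
    [ -e "$f" ] || continue
    target="$HOME/$(basename "$f")"
    if [ "$1" = "unlink" ]; then
      rm -f "$target"            # remove anything that might be in the way
    else
      ln -sfn "$f" "$target"     # symlink the checked-out file into $HOME
    fi
  done
  # also touch files needed locally but not checked in (name is illustrative)
  [ "$1" = "unlink" ] || touch "$HOME/.hushlogin"
}
# the real script would end with: create_links "$@"
```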
Upvotes: 8
Reputation: 6232
Speaking of storing dotfiles in public, there are
and
But it would be really painful to update your files manually, as (AFAIK) none of these services provides an API.
The latter is really minimalistic (no contact form, no information about who made/owns it, etc.).
Upvotes: 0
Reputation: 242100
I've had pretty good luck keeping my files under a revision control system. It's not for everyone, but most programmers should be able to appreciate the benefits. Read
for an excellent description, including how to handle non-dotfile configuration (like cron jobs via the svnfix script) on multiple machines.
Upvotes: 17
Reputation: 51693
Depending on your environment you can also use (fully backed-up) NFS shares ...
Upvotes: 0
Reputation: 9619
Now I use Live Mesh which keeps all my files synchronized across multiple machines.
Upvotes: 1
Reputation: 16926
Svn here, too. Rsync or unison would be a good idea, except that sometimes stuff stops working and I wonder what was in my .bashrc file last week. Svn is a life saver in that case.
Upvotes: 1
Reputation: 218
I use git for this.
There is a wiki/mailing list dedicated to the topic.
Upvotes: 6
Reputation: 2488
I store mine in a git repository, which allows me to easily merge everything beyond the system-dependent changes, yet still share the changes that I want.
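One way to get that effect (a sketch of a possible workflow, not necessarily this answerer's): keep shared settings on the main branch and a branch per machine for local tweaks, merging the shared branch in as it moves. Demonstrated in a scratch repo:

```shell
# Shared dotfiles on the main branch, per-machine tweaks on a branch.
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email you@example.com
git config user.name "you"
main=$(git symbolic-ref --short HEAD)      # "master" or "main", depending on git

echo 'alias ll="ls -l"' > .bashrc          # shared config
git add .bashrc; git commit -qm "shared bashrc"

git checkout -qb laptop                    # per-machine branch
echo 'export PATH="$PATH:/opt/local/bin"' > .bashrc_local
git add .bashrc_local; git commit -qm "laptop-only tweaks"

git checkout -q "$main"
echo 'alias gs="git status"' >> .bashrc    # shared config evolves
git commit -qam "more shared config"

git checkout -q laptop
git merge -q "$main" -m "pull in shared changes"   # local tweaks survive
```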
Upvotes: 2
Reputation: 10685
I use svn, having a public and a private repository, so as soon as I get on a server I just
svn co http://my.rep/home/public
and have all my dot files.
Upvotes: 2
Reputation: 9938
It seems like everywhere I look these days I find a new thing that makes me say "Hey, that'd be a good thing to use DropBox for"
Upvotes: 7
Reputation: 994649
There is netskel, where you put your common files on a web server, and the client program then maintains the dot-files on any number of client machines. It's designed to run on any level of client machine, so the shell scripts are proper sh scripts with a minimal amount of dependencies.
Upvotes: 1
Reputation: 131
Rsync is about your best solution. Examples can be found here:
http://troy.jdmz.net/rsync/index.html
Upvotes: 6
Reputation: 755026
I keep master versions of the files under CM control on my main machine, and where I need to, arrange to copy the updates around. Fortunately, we have NFS mounts for home directories on most of our machines, so I actually don't have to copy all that often. My profile, on the other hand, is rather complex - and has provision for different PATH settings, etc, on different machines. Roughly, the machines I have administrative control over tend to have more open source software installed than machines I use occasionally without administrative control.
So, I have a random mix of manual and semi-automatic process.
Upvotes: 1
Reputation: 3302
You could use rsync. It works over ssh, which I've found useful since I only set up new servers with ssh access.
Or, create a tar file that you move around everywhere and unpack.
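The tarball route can be as simple as the following, demonstrated locally with scratch directories; the commented scp/ssh step is the part you would run against a real server:

```shell
# Bundle dotfiles, move the tarball, unpack on the other end.
work=$(mktemp -d); cd "$work"
touch .bashrc .vimrc
tar czf dotfiles.tar.gz .bashrc .vimrc

# On a real server you'd do something like:
#   scp dotfiles.tar.gz user@newserver:
#   ssh user@newserver 'tar xzf dotfiles.tar.gz'
other=$(mktemp -d)                  # stands in for the new server's $HOME
tar xzf dotfiles.tar.gz -C "$other"
ls -A "$other"
```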
Upvotes: 3