Reputation: 27235
I'm preparing to upgrade my main development system to Ubuntu 9.10. Previously it was running 8.10 with no virtualization. I'd like to start taking advantage of the virtualization technologies built into the newer release, but am unsure what kinds of best practices there are for doing this. Although I'm using Ubuntu, answers for other platforms could be useful.
I typically use the following during development:
Does it make sense to set up the base system with no software installed? And then set up virtual machines for doing development? Or should I just install maven/eclipse/etc. into the main host system? Or should I use a VM per project? Or a VM for each tool I use? How can I best take advantage of snapshots, etc?
For instance, when testing/debugging an app in Chrome, I might restart the browser many times. It would be nice to run it in a separate VM so I could keep another Chrome instance open with tabs for my issue tracker, javadocs, Google research, etc. without having to restart it repeatedly.
I guess I'm looking for examples of how others have things configured and the pros and cons to doing it that way.
Upvotes: 2
Views: 240
Reputation: 6360
What do you want to use virtualization for? It's a technology; what do you actually need it for?
If you want to learn and understand the technology, just create a single virtual machine, install BSD/Linux/OpenSolaris/etc., and the playground is yours. Experiment with networking, snapshots, etc.
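As one way to set up such a playground, here is a minimal sketch using VirtualBox's command-line interface, assuming the VirtualBox tools are installed (the `virtualbox-ose` package on Ubuntu of that era). The VM name `sandbox` and the memory/disk sizes are arbitrary examples, not anything prescribed:

```shell
# Create and register an empty VM
VBoxManage createvm --name sandbox --ostype Ubuntu --register

# Give it some memory and a virtual disk to install onto
VBoxManage modifyvm sandbox --memory 1024
VBoxManage createhd --filename ~/sandbox.vdi --size 10000

# ... attach the disk and an install ISO, install the OS, then
# take a snapshot of the clean state before experimenting
VBoxManage snapshot sandbox take clean-install

# Break things freely; roll back when you want the clean state again
VBoxManage controlvm sandbox poweroff
VBoxManage snapshot sandbox restore clean-install
```

The snapshot/restore cycle at the end is what makes the VM a safe playground: each experiment starts from the same known-good state.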
Separating a development environment between virtual machines, just for kicks, does not make any sense to me. Snapshots are quite heavyweight and are no replacement for Subversion and regular backups.
Scenarios where virtualization is worth considering:
Use virtualization only if you would otherwise need to separate things onto different physical machines; it simply lets you do that virtually.
Upvotes: 2