zenna

Reputation: 9176

Best practices for (php/mysql) deployment to shared hosting?

I have worked at a web development company where we had our local machines, a staging server and a number of production servers. We worked on Macs in Perl, used SVN to commit to stage, and used Perl scripts to load to the production servers. Now I am working on my own project and would like to find good practices for web development when using shared web hosting and not working from a Unix-based environment (with all the magic I could do with Perl / Bash scripting / cron jobs, etc.).

So, given my conditions, my question is:

What setup do you suggest for testing, deployment and migration of code/data? I have a XAMPP server installed on my local machine, but was unsure which methods to use to migrate data etc. under Windows.

Upvotes: 6

Views: 2773

Answers (3)

Pascal MARTIN

Reputation: 400912

I have some personal PHP projects on shared hosting; here are a couple of thoughts, based on what I'm doing for one of them (the most active one, which needs at least a semi-automated way of synchronizing):

A few words about my setup:

  • Some time ago, I had everything in SVN; now I'm using Bazaar, but the idea is exactly the same (except that with Bazaar I have local history and all that)
  • I have SSH access to the production server, like you do
  • I work exclusively on Linux (so what I do might not be as easy on Windows)

Now, how I work:

  • Everything that has to be on the production server (source code, images, ...) is committed to SVN/Bazaar/whatever
  • I work locally, with Apache/PHP/MySQL (I use a dump of the production DB that I import locally once in a while)
  • I am the only one working on that project; it would probably be OK for a small team of two or three developers, but not more.

What I did before :

  • I had a PHP script that checked the SVN server for modifications between the "last revision pushed to production" and HEAD
    • I'm guessing this home-made PHP script looked like the Perl scripts you are currently using ^^
  • That script built a list of directories/files to upload to production
  • And uploaded those via FTP
  • This was not very satisfying (there were bugs in my script, I suppose; I never took the time to fix them), and it forced me to remember the revision number from the last time I pushed to production (well, it was automatically stored in a file by the script, so not that hard ^^)
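For what it's worth, the revision-diff idea described above can be sketched in a few lines of shell rather than a home-made PHP script. This is only an illustration of the approach, not the original script; `REPO_URL` and the `last-pushed.rev` file are assumed names:

```shell
#!/bin/sh
# Build a list of files changed since the last push to production.
# REPO_URL and last-pushed.rev are placeholder names for this sketch.
REPO_URL="http://svn.example.com/project/trunk"
LAST_REV=$(cat last-pushed.rev)

# `svn diff --summarize` prints one status letter per changed path;
# skip deletions (D) and keep everything else as upload candidates.
svn diff --summarize -r "$LAST_REV":HEAD "$REPO_URL" \
    | awk '$1 != "D" { print $2 }' > files-to-upload.txt

# Remember where we are for next time, so nobody has to memorize it.
svn info "$REPO_URL" | awk '/^Revision:/ { print $2 }' > last-pushed.rev
```

The files listed in `files-to-upload.txt` would then be fed to whatever FTP upload step you use.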

What I do now:

  • When switching to Bazaar, I didn't want to rewrite that script, which didn't work very well anyway
  • I have dropped the script entirely
  • As I have SSH access to the production server, I use rsync to synchronize from my development machine to the production server, once what I have locally is considered stable/production-ready.

A couple of notes about that way of doing things:

  • I don't have a staging server: my local setup is close enough to production's
  • Not having a staging server is OK for a simple project with one or two developers
  • If I had a staging server, I'd probably:
    • do an "svn update" on it when I want to stage
    • when everything is OK, launch the rsync command from the staging server (which will be at the latest "stable" revision, so OK to push to production)
  • With a bigger project and more developers, I would probably not go with that kind of setup, but I find it quite OK for a (not too big) personal project.

The only "special" thing here, which might be "Linux-oriented", is the use of rsync; a quick search indicates there is an rsync executable that can be installed on Windows: http://www.itefix.no/i2/node/10650

I've never tried it, though.


As a side note, here's what my rsync command looks like:

rsync --checksum \
    --ignore-times \
    --human-readable \
    --progress \
    --itemize-changes \
    --archive \
    --recursive \
    --update \
    --verbose \
    --executability \
    --delay-updates \
    --compress --skip-compress=gz/zip/z/rpm/deb/iso/bz2/t[gb]z/7z/mp[34]/mov/avi/ogg/jpg/jpeg/png/gif \
    --exclude-from=/SOME_LOCAL_PATH/ignore-rsync.txt \
    /LOCAL_PATH/ \
    USER@HOST:/REMOTE_PATH/

Incidentally, I'm using a public/private key mechanism, so rsync doesn't ask for a password.

And, of course, I generally run the same command with the "--dry-run" option first, to see what is going to be synchronized.
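For example, a trimmed-down preview of the same transfer might look like this (same placeholder paths as above); nothing is written to the remote side:

```shell
# Preview the transfer without touching the remote side;
# --itemize-changes lists what would be sent, --dry-run prevents any change.
rsync --dry-run \
    --archive \
    --itemize-changes \
    --verbose \
    --exclude-from=/SOME_LOCAL_PATH/ignore-rsync.txt \
    /LOCAL_PATH/ \
    USER@HOST:/REMOTE_PATH/
```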

And ignore-rsync.txt contains the list of files that I don't want pushed to production:

.svn
cache/cbfeed/*
cache/cbtpl/*
cache/dcstaticcache/*
cache/delicious.cache.html
cache/versions/*

Here, I just prevent the cache directories from being pushed to production -- it seems logical not to send those, as production data is not the same as development data.

(I'm just noticing there's still ".svn" in this file... I could remove it, as I don't use SVN any more for that project ^^)

Upvotes: 11

troelskn

Reputation: 117417

One option is to use a dedicated framework for the task. Capistrano fits very well with scripting languages such as PHP. It's based on Ruby, but if you do a search you should be able to find instructions on how to use it to deploy PHP applications.

Upvotes: 1

aleemb

Reputation: 32065

Regarding SVN, I would suggest you go with a dedicated SVN host like Beanstalk, or run an SVN server on the same server machine so both developers can work off it.

In the latter case, your deployment script would simply move the bits to a staging web folder (accessible via beta.mysite.com), and then another deployment script could move that to the live web directory. Deploying directly to the live site is obviously not a good idea.
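That second script can be close to a one-liner; here's a sketch, where the beta/ and live/ paths are placeholders:

```shell
#!/bin/sh
# Promote the verified staging tree to the live directory.
# --delete removes files that no longer exist in staging;
# --exclude keeps VCS metadata out of the live site.
rsync --archive --delete --exclude='.svn' /var/www/beta/ /var/www/live/
```

Running it only after you've clicked through the beta site gives you a cheap approval gate between staging and production.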

If you decide to go with a dedicated host, or want to deploy from your machine to the server, use rsync. This is also my current setup. rsync does differential syncs (over SSH), so it's fast, and it was built for just this sort of thing.

As you grow you can start using build tools with unit tests and whatnot. This leaves only the data sync issue.

I only sync data from remote to local, using a DOS batch file that runs mysqldump over SSH. Cygwin is useful on Windows machines, but you can skip it. The SQL import script also runs a one-line query to update a few cells, such as the hostname and web root, for local deployment.
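A shell sketch of that remote-to-local flow (the batch-file version is analogous); user@host, proddb and localdb are placeholders, and the sed step stands in for the one-line fix-up query mentioned above:

```shell
#!/bin/sh
# Pull a dump of the production database over SSH.
# user@host, proddb and localdb are placeholder names for this sketch.
ssh user@host 'mysqldump --single-transaction proddb' > dump.sql

# Rewrite production values for the local environment before importing;
# this plays the role of the one-line UPDATE query described above.
sed 's#http://www\.example\.com#http://localhost#g' dump.sql > dump.local.sql

mysql localdb < dump.local.sql
```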

Once you have this set up, you can focus on just writing code, and remote deployment or local sync and deployment becomes a one-click process.

Upvotes: 1
