Reputation: 2696
I want to deploy a Laravel app from GitLab to a server with no downtime. On the server, I serve the app with `php artisan serve`. Currently, I'm thinking that I would first copy all the files, then stop the old `php artisan serve` process on the server and start a new one in the directory with the new files. However, this introduces a small downtime. Is there a way to avoid this?
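For reference, a rough sketch of that copy-then-restart flow as a shell script; the paths, port, and release layout are hypothetical, not part of the original setup:

```bash
#!/usr/bin/env bash
# Sketch of the copy-then-restart deploy described above.
# Paths and port are assumptions for illustration only.
set -e
NEW_RELEASE="/var/www/app-$(date +%s)"   # directory holding the newly copied files
cp -r /tmp/uploaded-release "$NEW_RELEASE"
# Stop the old server, then start a new one in the new directory.
# The brief downtime mentioned in the question happens between these two steps.
pkill -f "php artisan serve" || true
cd "$NEW_RELEASE"
nohup php artisan serve --host=0.0.0.0 --port=8000 > serve.log 2>&1 &
```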
Upvotes: 1
Views: 294
Reputation: 1202
If you are serving with a single server, you cannot achieve zero downtime. If downtime is a crucial concern for your system, use two servers and load balance between them. Remember, no hosting or VPS provider guarantees 100% availability, so even if your deployment process itself causes no downtime, your site may still go down at some other time. What I'm saying is: if the tiny moment of restarting `php artisan serve` matters, scale up to more than one server, as in the sketch below.
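As an example, a minimal nginx reverse-proxy sketch that balances two app instances, so one can be restarted while the other keeps answering requests; the addresses and port are assumptions:

```nginx
# Minimal sketch: nginx load balancing two `php artisan serve` instances.
# The IP addresses and port are hypothetical.
upstream laravel_app {
    server 10.0.0.11:8000;   # app server 1
    server 10.0.0.12:8000;   # app server 2
}

server {
    listen 80;

    location / {
        proxy_pass http://laravel_app;
        proxy_set_header Host $host;
    }
}
```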
A workaround would be to use a third-party service (like CloudFlare) that can detect when the server is down and notify users when it is back up; I personally use that.
If you really want full uptime, Docker with Kubernetes is the technology to look at.
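As a rough illustration (the names, labels, and image reference are hypothetical), a Kubernetes Deployment can be configured so old pods are only removed once their replacements are ready, which is what gives a zero-downtime rollout:

```yaml
# Minimal sketch of a rolling update that never drops below the desired replica count.
# All names and the image reference are assumptions for illustration.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: laravel-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: laravel-app
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 0   # keep every existing pod until its replacement is ready
      maxSurge: 1
  template:
    metadata:
      labels:
        app: laravel-app
    spec:
      containers:
        - name: app
          image: registry.gitlab.com/example/laravel-app:latest
          ports:
            - containerPort: 8000
```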
Upvotes: 1