Reputation: 2233
So I have a Laravel 5.2 project deployed on a remote server, and the problem is that all HTTP requests take a huge amount of time, sometimes ending in a Connection Timed Out error.
When I was testing on my local server, everything worked great, so I'm fairly sure the problem is not in my code: there are no heavy loops, big queries, or anything like that.
I suppose there were too many simultaneous connections to the remote server, and the load on it was huge. Today I added more CPU and the problem seems to be solved.
But I'm still curious: how can I avoid situations like this in the future? How can I optimize my code and my requests?
For now I use Redis as the cache driver, try to work with models instead of querying the database each time, and try to use loops so as not to make repetitive requests, but I still have many POST and GET requests and a lot of work with the database.
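To make it concrete, here is roughly the pattern I follow (Post, the comments relation, and the 10-minute TTL are just placeholder examples, not my real code):

use App\Post;
use Illuminate\Support\Facades\Cache;

// Cache the result of an expensive query in Redis so repeated
// requests don't hit the database every time (the second argument
// is minutes in Laravel 5.2)
$posts = Cache::remember('posts.recent', 10, function () {
    // Eager-load the relation up front so the loop below doesn't
    // fire one extra query per post (the N+1 problem)
    return Post::with('comments')->latest()->take(50)->get();
});

foreach ($posts as $post) {
    // comments are already in memory, no further queries here
    echo $post->comments->count();
}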
How can I optimize my code so that it isn't so heavy to run?
Upvotes: 1
Views: 3758
Reputation: 2855
Looking at my own deployed projects, I used DigitalOcean + Forge. The server that ran my project had 512 MB of RAM and one CPU core. This was the least expensive option, and my project ran blazingly fast.
I recommend you SSH into your production server and run the following commands:
composer self-update
composer update
php artisan config:cache
and just to be safe
php artisan optimize
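One caveat with config caching (this is general Laravel behaviour, not specific to your project): once the config is cached, the .env file is no longer loaded on each request, so env() returns null anywhere outside the config/ files. Read values through config() instead. A minimal sketch, with a made-up PAYMENT_KEY setting:

// config/services.php -- env() calls belong only in config files
return [
    'payment_key' => env('PAYMENT_KEY'), // hypothetical env variable
];

// In a controller or job, read the value via config() instead
$key = config('services.payment_key');
// env('PAYMENT_KEY') here would be null once config:cache has run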
Hope this helps.
Upvotes: 2