Reputation: 296
I am continually running a few server scripts (on different ports) with nodejs using forever.
There is a considerable amount of traffic on some of these servers. The console.log commands I have for tracking connection anomalies result in bloated log files that I don't need all of the time - only for debugging. I have been manually stopping the scripts late at night, truncating the files, and restarting them. This won't do for the long term, so we decided to find a solution.
Someone else on my system deleted the log files I had set up for each of the servers without my knowledge. Calling forever list on the command line shows that the server scripts are still running but now I can't tail the log files to see how the nodes are doing.
Node downtime should be kept to a bare minimum, so I'm hesitant to stop the servers during daylight hours for longer than a few minutes. Initial testing from the client side seems to indicate that the scripts are doing fine, but I can't be 100% sure there are no errors due to failed attempts at logging to a nonexistent file.
I have a few questions actually:
Thanks for your time.
Upvotes: 11
Views: 7790
Reputation: 1002
1) Just build in a periodic function or admin option to clear the forever logs. From the manual: forever cleanlogs. (A sketch of the periodic approach follows the command below.)
2) At least for Linux, send each log file to /dev/null. Each log type is specified by an option: -l, -o, and -e. The -a option (append log) will stop it complaining about the log already existing.
forever start -a -l /dev/null -o /dev/null -e /dev/null your-server.js
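If you go the periodic route, here is a minimal sketch of option 1 as a small standalone Node script. The interval, the file name, and the idea of shelling out to forever cleanlogs from a script are my own assumptions, not something forever provides:

// clear-forever-logs.js -- hypothetical helper script, run alongside your servers
const { exec } = require('child_process');

// Clear forever's historical log files once a day so they never bloat.
const ONE_DAY_MS = 24 * 60 * 60 * 1000;

setInterval(() => {
  exec('forever cleanlogs', (err, stdout, stderr) => {
    if (err) {
      console.error('forever cleanlogs failed:', stderr);
      return;
    }
    console.log('forever logs cleared');
  });
}, ONE_DAY_MS);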
Perhaps employ your own logging system. I use log4js; it doesn't complain if I delete the log file while the node process is still running.
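A minimal sketch of that kind of setup, assuming the log4js v2+ configuration API; the file name and category names are placeholders of my own choosing:

// logging.js -- example log4js setup
const log4js = require('log4js');

log4js.configure({
  appenders: {
    server: { type: 'file', filename: 'server.log' }
  },
  categories: {
    default: { appenders: ['server'], level: 'info' }
  }
});

const logger = log4js.getLogger();
logger.info('server started');       // written to server.log with timestamp and level
logger.error('connection anomaly');  // same file; raise the level to silence routine output

Routing everything through a logger like this also means you can turn verbose output down to error level in production instead of deleting log files.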
Upvotes: 2
Reputation: 675
According to https://github.com/foreverjs/forever, try passing -s to silence all logs.
forever -s start YOURSCRIPT
Of course, before doing this, try updating forever to the latest version:
sudo curl -L https://npmjs.com/install.sh | sudo sh
sudo npm update -g
Upvotes: 4