Reputation: 4363
I've got a RabbitMQ queue that might, at times, hold a considerable amount of data to process.
As far as I understand, channel.consume
will push messages into the Node program as fast as they arrive, even as it approaches its RAM limit (and, eventually, crashes).
What is the best way to ensure workers get only as many tasks to process as they are capable of handling?
I'm thinking about using a chain of (transform) streams together with channel.get
(which gets just one message). If the first stream's buffer is full, we simply stop getting messages.
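Here is a minimal sketch of the pull-based idea I have in mind, assuming the amqplib package; the queue name 'tasks' and handleTask are placeholders, and the connection is only attempted when AMQP_URL is set:

```javascript
// Pull-based consumption with channel.get: fetch one message, process it,
// ack it, then fetch the next. Backpressure comes from awaiting the work.
async function pullLoop(amqpUrl, queue) {
  const amqp = require('amqplib'); // lazy require so the sketch loads without the dep
  const conn = await amqp.connect(amqpUrl);
  const channel = await conn.createChannel();
  await channel.assertQueue(queue, { durable: true });

  for (;;) {
    // channel.get fetches a single message, or resolves to false when the queue is empty
    const msg = await channel.get(queue, { noAck: false });
    if (msg === false) {
      await new Promise((resolve) => setTimeout(resolve, 1000)); // back off when idle
      continue;
    }
    await handleTask(msg.content); // only pull the next message once this one is done
    channel.ack(msg); // acknowledge after processing succeeds
  }
}

async function handleTask(buffer) {
  // placeholder for the real (stream-based) processing
  console.log('processing', buffer.toString());
}

// Only connect when a broker URL is supplied
if (process.env.AMQP_URL) {
  pullLoop(process.env.AMQP_URL, 'tasks').catch(console.error);
}
```

One concern is that a network round trip per channel.get call could make this noticeably slower than a push-based consumer.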
Upvotes: 2
Views: 1589
Reputation: 2691
I believe what you want is to specify the consumer prefetch. This indicates to RabbitMQ how many messages it should "push" to the consumer at once.
An example is provided here
channel.prefetch(1);
is the lowest value you can provide, and should result in the lowest memory consumption for your Node program.
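Putting that together with channel.consume, a sketch (assuming amqplib; the queue name 'tasks' and processTask are placeholders, and the connection is only attempted when AMQP_URL is set):

```javascript
// Consumer with prefetch(1) and manual acks: RabbitMQ holds back further
// messages until the outstanding one is acknowledged.
async function startWorker(amqpUrl, queue) {
  const amqp = require('amqplib'); // lazy require so the sketch loads without the dep
  const conn = await amqp.connect(amqpUrl);
  const channel = await conn.createChannel();
  await channel.assertQueue(queue, { durable: true });

  channel.prefetch(1); // at most one unacked message delivered to this consumer

  await channel.consume(queue, async (msg) => {
    if (msg === null) return; // consumer was cancelled by the server
    try {
      await processTask(msg.content); // do the actual work
      channel.ack(msg); // acking releases the next message from the broker
    } catch (err) {
      channel.nack(msg, false, true); // requeue on failure
    }
  }, { noAck: false });
}

async function processTask(buffer) {
  // placeholder for the real processing
}

// Only connect when a broker URL is supplied
if (process.env.AMQP_URL) {
  startWorker(process.env.AMQP_URL, 'tasks').catch(console.error);
}
```

With prefetch(1) and manual acknowledgements, the Node process buffers at most one message per consumer, regardless of how deep the queue gets.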
This is based on your description; if my understanding is correct, I'd also recommend renaming your question, since "parallel processing" relates more to multiple consumers on a single queue than to a single consumer receiving all the messages.
Upvotes: 2