Reputation: 333
I'm having issues enqueuing jobs with Python-RQ: the jobs seem to be enqueued correctly, but they never run, fail, or do anything at all.
These are the steps I'm following:
loren@RONDAN1:/mnt/c/Users/rondan$ sudo service redis-server restart
[sudo] password for loren:
Stopping redis-server: redis-server.
Starting redis-server: redis-server.
loren@RONDAN1:/mnt/c/Users/rondan$ rq info
default | 0
1 queues, 0 jobs total
72a81f7d4cde4e7c865b178772766aef (RONDAN1 95): idle default
1 workers, 1 queues
Updated: 2021-03-16 19:26:52.459427
loren@RONDAN1:/mnt/c/Users/rondan$ rq worker --with-scheduler
19:26:42 Worker rq:worker:72a81f7d4cde4e7c865b178772766aef: started, version 1.7.0
19:26:42 Subscribing to channel rq:pubsub:72a81f7d4cde4e7c865b178772766aef
19:26:42 *** Listening on default...
19:26:42 Trying to acquire locks for default
19:26:42 Scheduler for default started with PID 98
19:26:42 Cleaning registries for queue: default
main.py
import lightquery as lq
from datetime import datetime, timedelta
import time
from redis import Redis
from rq import Queue

queue = Queue(connection=Redis())

def queue_tasks():
    queue.enqueue(lq.print_task, 5)
    queue.enqueue_in(timedelta(seconds=10), lq.print_numbers, 5)

def main():
    queue_tasks()

if __name__ == "__main__":
    main()
lightquery.py
import time

def print_task(seconds):
    print("Starting task")
    for num in range(seconds):
        print(num, ". Hello World!")
        time.sleep(1)
    print("Task completed")

def print_numbers(seconds):
    print("Starting num task")
    for num in range(seconds):
        print(num)
        time.sleep(1)
    print("Task to print_numbers completed")
loren@RONDAN1:/mnt/c/Users/rondan$ rq worker --with-scheduler
19:26:42 Worker rq:worker:72a81f7d4cde4e7c865b178772766aef: started, version 1.7.0
19:26:42 Subscribing to channel rq:pubsub:72a81f7d4cde4e7c865b178772766aef
19:26:42 *** Listening on default...
19:26:42 Trying to acquire locks for default
19:26:42 Scheduler for default started with PID 98
19:26:42 Cleaning registries for queue: default
19:28:43 default: lightquery.print_task(5) (02a07849-64b5-4f44-8e0b-89bb466110ee)
I also tried printing the queue's job list right after enqueueing the job, and it comes back as an empty array (maybe that helps point to a solution).
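For reference, this is roughly how I'd inspect the queue contents from Python (a minimal sketch using RQ's public API; the connection and queue are assumed to match the setup above). Note that jobs scheduled with enqueue_in don't sit in the queue itself but in the scheduled-jobs registry until they become due:

from redis import Redis
from rq import Queue
from rq.registry import ScheduledJobRegistry

queue = Queue(connection=Redis())

# Jobs that are ready to be picked up by a worker right now
print(queue.job_ids)
print(len(queue))

# Jobs scheduled with enqueue_in / enqueue_at wait here until they are due
registry = ScheduledJobRegistry(queue=queue)
print(registry.get_job_ids())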
EDITED
loren@RONDAN1:/mnt/c/Users/rondan$ sudo service redis-server restart
Stopping redis-server: redis-server.
Starting redis-server: redis-server.
loren@RONDAN1:/mnt/c/Users/rondan$ rq info
default | 0
1 queues, 0 jobs total
0 workers, 1 queues
Updated: 2021-03-17 07:59:13.587043
loren@RONDAN1:/mnt/c/Users/rondan$
loren@RONDAN1:/mnt/c/Users/rondan$ rq worker --with-scheduler
08:00:15 Worker rq:worker:b1fcdaaf1c224e239d28f9fdf8509a1a: started, version 1.7.0
08:00:15 Subscribing to channel rq:pubsub:b1fcdaaf1c224e239d28f9fdf8509a1a
08:00:15 *** Listening on default...
08:00:15 Trying to acquire locks for default
08:00:15 Scheduler for default started with PID 65
08:00:15 Cleaning registries for queue: default
loren@RONDAN1:/mnt/c/Users/rondan$ rq info
default | 0
1 queues, 0 jobs total
b1fcdaaf1c224e239d28f9fdf8509a1a (RONDAN1 62): idle default
1 workers, 1 queues
Updated: 2021-03-17 08:02:29.590837
Upvotes: 0
Views: 4197
Reputation: 567
In my case I had simply forgotten to run the worker and was stuck on this issue for some time. Make sure a worker is running when your jobs are supposed to be executed. You can start one with:
rq worker --with-scheduler
See https://python-rq.org/docs/workers/ for reference.
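If you prefer starting the worker from Python instead of the CLI, something along these lines should be equivalent (a rough sketch; the queue name and Redis connection are assumptions based on the question's setup):

from redis import Redis
from rq import Queue, Worker

redis_conn = Redis()
queue = Queue('default', connection=redis_conn)

# This call blocks and processes jobs, so run it in its own terminal/process
worker = Worker([queue], connection=redis_conn)
worker.work(with_scheduler=True)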
Upvotes: 1
Reputation: 333
Solved: the problem was the way I imported the module.
Use
from lightquery import print_task
...
queue.enqueue(print_task, 5)
Instead of:
import lightquery as lq
...
queue.enqueue(lq.print_task(), 5)
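Worth noting: enqueue() needs a reference to the callable (or its dotted import path as a string), not the result of calling it, so the parentheses in lq.print_task() would also break this by enqueueing the function's return value instead of the function itself. Passing the dotted path as a string is another option RQ supports (a sketch, assuming the worker is started from the directory containing lightquery.py so it can import it):

queue.enqueue('lightquery.print_task', 5)  # RQ resolves the function from its import path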
Upvotes: 0