Alex

Reputation: 1

TypeError: cannot pickle '_thread.lock' object with RQ (Redis Queue) [Python 3.9, 3.8 & 3.7]

I have been trying to use RQ to queue API requests that pull data from BigQuery, since they take so long that I get an H12 (timeout) error. The code keeps breaking when I pass a DataFrame to the next enqueue.

Here's my worker.py file:

import os

import redis
from rq import Worker, Queue, Connection

# Queues to listen on, in priority order
listen = ['high', 'default', 'low']

# Use the Redis To Go URL on Heroku, or fall back to a local Redis instance
redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')

conn = redis.from_url(redis_url)

if __name__ == '__main__':
    with Connection(conn):
        worker = Worker(map(Queue, listen))
        worker.work()

This is where the error originates:

data_full = q.enqueue(furnish_stops)
daily_count = q.enqueue(furnish_daily, data_full)
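
(The creation of q is not shown above; a minimal sketch of how such a queue is typically built on the same Redis connection as the worker follows — the exact setup in app.py is an assumption.)

import os

import redis
from rq import Queue

# Assumed setup in app.py: a default RQ queue backed by the same Redis
# connection the worker listens on.
conn = redis.from_url(os.getenv('REDISTOGO_URL', 'redis://localhost:6379'))
q = Queue(connection=conn)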

The first function simply calls the API to download the data into the data_full DataFrame, which is then passed to the second function to build an array for visualization purposes.
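
A rough, simplified sketch of what the two functions do (the query, column names, and aggregation here are illustrative placeholders, not the actual code):

import pandas as pd
from google.cloud import bigquery

def furnish_stops():
    # Illustrative only: pull the raw data from BigQuery into a DataFrame
    client = bigquery.Client()
    query = "SELECT * FROM `project.dataset.stops`"  # placeholder query
    return client.query(query).to_dataframe()

def furnish_daily(data_full):
    # Illustrative only: aggregate the DataFrame into a per-day array
    daily = data_full.groupby('date').size()
    return daily.to_numpy()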

The full error report:

Traceback (most recent call last):
  File "app copy.py", line 29, in <module>
    daily_count = q.enqueue(furnish_daily, data_full)
  File "/home/alexis/.local/lib/python3.8/site-packages/rq/queue.py", line 502, in enqueue
    return self.enqueue_call(
  File "/home/alexis/.local/lib/python3.8/site-packages/rq/queue.py", line 400, in enqueue_call
    return self.enqueue_job(job, pipeline=pipeline, at_front=at_front)
  File "/home/alexis/.local/lib/python3.8/site-packages/rq/queue.py", line 560, in enqueue_job
    job.save(pipeline=pipe)
  File "/home/alexis/.local/lib/python3.8/site-packages/rq/job.py", line 648, in save
    mapping = self.to_dict(include_meta=include_meta)
  File "/home/alexis/.local/lib/python3.8/site-packages/rq/job.py", line 590, in to_dict
    'data': zlib.compress(self.data),
  File "/home/alexis/.local/lib/python3.8/site-packages/rq/job.py", line 270, in data
    self._data = self.serializer.dumps(job_tuple)
TypeError: cannot pickle '_thread.lock' object

I have tried this with Python versions 3.7.11, 3.8.10, and 3.9.6, all giving the same error.

The only other mention of a solution to a similar problem was in this thread, but the suggested fix of downgrading to 3.7 did not work for me.

Upvotes: 0

Views: 1010

Answers (0)
