Reputation: 589
I have this async Celery task call:
update_solr.delay(id, context)
where id is an integer and context is a Python dict.
My task definition looks like:
@task
def update_solr(id, context):
    clip = Clip.objects.get(pk=id)
    clip_serializer = SOLRClipSerializer(clip, context=context)
    response = requests.post(url, data=clip_serializer.data)
where clip_serializer.data is a dict and url is a string representing a URL.
When I try to call update_solr.delay(), I get this error:
PicklingError: Can't pickle <type 'instancemethod'>: attribute lookup __builtin__.instancemethod failed
Neither of the args to the task is an instance method, so I'm confused. When the task code is run synchronously, there is no error.
Thanks in advance :)
Update: Fixed per the comments about passing the pk instead of the object.
Upvotes: 4
Views: 4820
Reputation: 589
The context dict had an object in it, unbeknownst to me. To fix it, I executed the code that depends on the context before the async call and passed a dict containing only native types:
def post_save(self, obj, created=False):
    context = self.get_serializer_context()
    clip_serializer = SolrClipSerializer(obj, context=context)
    update_solr.delay(clip_serializer.data)
The task ended up like this:
@task
def update_solr(data):
    response = requests.post(url, data=data)
This works out perfectly fine because the only purpose of making this an async task is to make the POST non-blocking.
Thanks for the help!
Upvotes: 5
Reputation: 1666
Try passing the model instance's primary key (pk) instead of the instance itself. The pk is much simpler to pickle, reduces the payload size, and avoids race conditions.
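For illustration, a minimal sketch of that suggestion, reusing the Clip, SOLRClipSerializer, url and requests names from the question, and assuming the serializer can be built without the request context (otherwise, prepare the payload before the call, as the accepted answer does):
@task
def update_solr(clip_pk):
    # Re-fetch the row inside the worker so the data is current at POST time
    clip = Clip.objects.get(pk=clip_pk)
    clip_serializer = SOLRClipSerializer(clip)
    requests.post(url, data=clip_serializer.data)

# The caller now sends only an integer, which pickles trivially:
update_solr.delay(clip.pk)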
Upvotes: 1
Reputation: 133978
import pickle

class X:
    def y(self):
        pass

# Under Python 2 this raises:
# PicklingError: Can't pickle <type 'instancemethod'>: attribute lookup __builtin__.instancemethod failed
pickle.dumps(X.y)
Pickle works recursively, so the unpicklable object might be anywhere in your object graph. You were already given the solution: transfer only minimal data, such as primary keys, instead of Django model objects.
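If it is not obvious which entry is the culprit, one quick diagnostic (a hypothetical helper, not part of the original answer) is to try pickling each value of the dict on its own:
import pickle

def find_unpicklable(d):
    # Attempt to pickle each value individually and report the ones that fail
    for key, value in d.items():
        try:
            pickle.dumps(value)
        except Exception as exc:
            print("cannot pickle %r: %s" % (key, exc))

find_unpicklable(context)  # e.g. the context dict from the question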
Upvotes: 0