realshadow

Reputation: 2585

Celery: transfer command-line arguments to a Task

I am struggling with transferring additional command-line arguments to a Celery task. I can set the desired attribute in a bootstep; however, the same attribute is empty when accessed directly from the task (I guess it gets overridden).

from celery import bootsteps


class Arguments(bootsteps.Step):
    def __init__(self, worker, environment, **options):
        ArgumentTask.args = {'environment': environment}

        # this works: the value is visible here
        print ArgumentTask.args

Here is the custom task:

from celery import Task


class ArgumentTask(Task):
    abstract = True

    # class-level dict, exposed and updated through the property below
    _args = {}

    @property
    def args(self):
        return self._args

    @args.setter
    def args(self, value):
        self._args.update(value)

And the actual task:

@celery.task(base=ArgumentTask, bind=True, name='jobs.send')
def send(self):
    # this prints an empty dictionary
    print self.args
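
For completeness, the bootstep and the extra --environment option are registered on the app roughly like this. This is only a sketch following Celery 3.1's documented user_options/bootsteps API; the app instance name and broker URL are assumptions:

from celery import Celery
from celery.bin import Option

celery = Celery('jobs', broker='amqp://')  # assumed app instance used by @celery.task above

# expose the extra --environment option on the worker command line
celery.user_options['worker'].add(
    Option('--environment', dest='environment', default=None,
           help='Deployment environment handed to the bootstep.')
)

# register the bootstep so it runs on worker startup with the parsed option
celery.steps['worker'].add(Arguments)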

Do I need to use some additional persistence layer, e.g. persistent objects, or am I missing something really obvious?

Similar question

Upvotes: 3

Views: 1802

Answers (1)

Benoît Latinier

Reputation: 2110

It does not seem to be possible. The reason is that your task could be consumed anywhere, by any consumer of the queue, and each consumer may have been launched with different command-line parameters, so its processing should not depend on a worker's configuration.

If your problem is managing dev/prod environments, this is the way we handled it in our project:

Each environment is jailed in its own venv, with a configuration that makes the project aware of its environment (in our case it is only the database links in the configuration that change). Each environment also has its own queues, and its Celery workers are launched with this command:

/path/venv/bin/celery worker -A async.myapp --workdir /path -E -n celery-name@server -Ofair
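
As a rough illustration of that pattern (the file name and settings below are hypothetical, not taken from the answer), the app module stays identical everywhere and only the environment-local configuration it reads differs:

# myapp.py -- identical in every environment; only the env-local
# settings module it imports differs between dev and prod
from celery import Celery

import settings  # each venv/checkout ships its own settings.py

celery = Celery('myapp', broker=settings.BROKER_URL)
celery.conf.update(CELERY_DEFAULT_QUEUE=settings.QUEUE_NAME)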

Hope it helped.

If you really want to dig hard into that, each task can access .control (through its app), which allows launching control operations on Celery (like some monitoring). But I didn't find anything helpful there.
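
For what it is worth, a minimal sketch of poking at that API from a bound task (the task name is made up; ping() and inspect().stats() are two of the standard control calls):

@celery.task(base=ArgumentTask, bind=True, name='jobs.poke_control')
def poke_control(self):
    # the control API is reached through the task's app
    control = self.app.control

    # broadcast a ping and print replies from live workers
    print control.ping(timeout=1.0)

    # gather per-worker statistics through the inspect interface
    print control.inspect().stats()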

Upvotes: 3
