deecodameeko

Reputation: 505

celery task logger not writing

I have a problem with Celery's logger. I have a function that renders frames. I log the output of the subprocess I spawn, but it seems only the first job picked off the queue by each worker gets written to its log file. All subsequent tasks in the queue do not produce a log file. I've tried using Python's own logging module as well, and the same issue happens. Is there a configuration option I may be missing?

 @task(queue='rndr')
 def rndr(params):

     path      = get_logger_path(params)
     logger    = rndr.get_logger(logfile=path)    
     return render(params, logger)

I define my task this way because my retry logger is defined differently, i.e. rndr_retry.get_logger...
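For context, get_logger_path just builds a unique log path per job. A minimal sketch of what such a helper might look like (the directory and the 'job_id' key are hypothetical, not from the original code):

```python
import os

LOG_DIR = '/var/log/render'  # hypothetical log directory

def get_logger_path(params):
    # Build a unique log file path per job; 'job_id' is a hypothetical
    # key -- use whatever uniquely identifies the task in your params.
    return os.path.join(LOG_DIR, '%s.log' % params['job_id'])
```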

My celeryconfig looks like the following:

 BROKER_HOST = "xxx.xxx.xxx.xxx"
 BROKER_PORT = 5672
 BROKER_USER = "xxxx"
 BROKER_PASSWORD = "xxxx"

 CELERY_RESULT_BACKEND = 'amqp'
 CELERY_DISABLE_RATE_LIMITS = True
 CELERY_ACKS_LATE = True
 CELERY_IMPORTS = ['lib.tasks.concatenate', 'lib.tasks.encode', 'lib.tasks.render',  'lib.tasks.still_image', 'lib.tasks.retry']
 CELERY_ROUTES = {'lib.tasks.encode': {'queue': 'encode'},
                  'lib.tasks.concatenate': {'queue': 'encode'},
                  'lib.tasks.still_image': {'queue': 'encode'},
                  'lib.tasks.render': {'queue': 'rndr'},
                  'lib.tasks.retry': {'queue': 'retry'}}

Hoping someone can shed some light on why only the first task off the queue writes its log...

Thank you in advance.

update: as requested, here's a partial version of the render method without all the nitty-gritty details...

def render(params, logger):

    #load params to local values

    try:
        #create subprocess
        output = child_proc.communicate()[0]

        logger.info('output')
        logger.info(output)       
        ret = child_proc.wait()                
        if ret not in [0,1]:
            raise Exception('subprocess failed')              

    except Exception as exc:
        logger.info(' '.join(str(x) for x in exc.args))
        #mark as failed...
        return          

    return

I should add that for subsequent tasks, not only is the file not written to, the log file is never even created...

Upvotes: 0

Views: 3711

Answers (1)

deecodameeko

Reputation: 505

After some trials, I noticed the log file passed in wasn't being created. I added a method to ensure the file existed before passing it to get_logger(). Still no luck. Since most of my tasks run a subprocess, I decided to take a simpler approach: open a file object, pass it to the subprocess call as stdout and stderr, and close it where appropriate. This works no matter how many tasks I run. I should note each task writes to a unique file.

Anyhoo, I think I'll write to the Celery devs and mark it as a bug. I had read somewhere in the dev forums that Celery's logger was in need of some love.

Cheers.

Update:

After speaking with the Celery devs, it was concluded that the logger isn't meant to be used this way: a logging instance is started for a task, but repeated tasks would not log. I ended up simply writing to a file to circumvent issues with the logging module. It does the trick and doesn't cause any collisions, as each render task uses a unique file.

Upvotes: 5
