Sylvain Brohee

Reputation: 91

Limit the number of Luigi workers when several scripts are run concurrently

From what I have seen and understood, when several Luigi workflows are run at the same time, their worker counts add up. This means that if the number of workers is set to n in the luigi.cfg file and I run two workflows together, then, provided the workflows need more than n workers at the same time, the central scheduler will end up using 2xn workers.

In the Luigi manual, I could not find any way to restrict the total number of workers to n, even when running a dozen workflows at the same time.

This is my luigi.cfg file:

[core]
workers: 3
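
As far as I can tell (this is my own assumption from experimenting, not something I found stated in the documentation), this setting is applied per process: every call to luigi.build gets its own pool of 3 workers, the same as passing workers=3 explicitly. A toy illustration (the Sleepy task is hypothetical, just to show the idea):

import time
import luigi

class Sleepy(luigi.Task):
    n = luigi.IntParameter()

    def output(self):
        return luigi.LocalTarget('sleepy_%d.txt' % self.n)

    def run(self):
        time.sleep(30)
        with self.output().open('w') as f:
            f.write('done\n')

if __name__ == '__main__':
    # For this process only, this is (I believe) equivalent to "[core] workers: 3"
    luigi.build([Sleepy(n=i) for i in range(6)], workers=3)

Starting two such processes at the same time therefore gives 2x3 workers in total, which is the behaviour I described above.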

This is the example script I am using (it actually uses sciluigi, a layer on top of Luigi, but I don't think that makes any difference as far as the task and scheduler configuration are concerned). What I would like is that, when I run it several times concurrently, the last three workflows wait until the first three are done before starting.

import optparse
import luigi
import sciluigi
import random
import time
import sys
import os
import subprocess


class MyFooWriter(sciluigi.Task):
    # We have no inputs here
    # Define outputs:
    outdir = sciluigi.Parameter()
    def out_foo(self):
        return sciluigi.TargetInfo(self, os.path.join(self.outdir,'foo.txt'))
    def run(self):
        with self.out_foo().open('w') as foofile:
            foofile.write('foo\n')

class MyFooReplacer(sciluigi.Task):
    replacement = sciluigi.Parameter()  # What to replace 'foo' with
    outFile = sciluigi.Parameter()
    outdir = sciluigi.Parameter()
    # Here we have one input, a "foo file":
    in_foo = None

    # ... and an output, a "bar file":
    def out_replaced(self):
        return sciluigi.TargetInfo(self, os.path.join(self.outdir, self.outFile))

    def run(self):
        replacement = ""
        with open(self.in_foo().path, 'r') as content_file:
            content = content_file.read()
            replacement = content.replace('foo', self.replacement)
            for i in range(1,30):
                sys.stderr.write(str(i)+"\n")
                time.sleep(1)
        with open(self.out_replaced().path,'w') as out_f:
            out_f.write(replacement)



class MyWorkflow(sciluigi.WorkflowTask):
    outdir = luigi.Parameter()
    def workflow(self):
        #rdint = random.randint(1,1000)
        rdint = 100
        barfile = "foobar_" + str(rdint) +'.bar.txt'
        foowriter = self.new_task('foowriter', MyFooWriter, outdir = self.outdir)
        fooreplacer = self.new_task('fooreplacer', MyFooReplacer, replacement='bar', outFile = barfile,  outdir =  self.outdir)
        fooreplacer.in_foo = foowriter.out_foo
        return fooreplacer


# End of script ....
if __name__ == '__main__':
    parser = optparse.OptionParser()
    parser.add_option('-d', dest = "outdir", action="store", default=".")
    options, remainder = parser.parse_args()
    params = {"outdir" : options.outdir}    
    wf = [MyWorkflow(outdir = options.outdir)]
    luigi.build(wf)

This is a little Perl script I use to run the workflow script concurrently (Perl being my favourite language :-)).

#! /usr/bin/perl

use strict;

for (my $i = 0; $i < 6; $i++) {
  my $testdir = "test".$i;
  system("mkdir -p $testdir");
  system("python run_sciluigi.py -d $testdir&");
  sleep(2);
}

Upvotes: 4

Views: 3328

Answers (1)

MattMcKnight

Reputation: 8290

While this is not exactly a restriction on the number of workers, you can use the resources concept to put a global limit on how many tasks run concurrently.

In luigi.cfg:

[resources]
max_workers=5

In all of your tasks:

class MyFooReplacer(sciluigi.Task):
    resources = {'max_workers': 1}

http://luigi.readthedocs.io/en/stable/configuration.html#resources
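
For example, here is an untested sketch based on the tasks in the question (assuming sciluigi tasks pick up the resources attribute the same way plain luigi tasks do):

import os
import sciluigi

class MyFooWriter(sciluigi.Task):
    resources = {'max_workers': 1}  # each running instance consumes one unit
    outdir = sciluigi.Parameter()

    def out_foo(self):
        return sciluigi.TargetInfo(self, os.path.join(self.outdir, 'foo.txt'))

    def run(self):
        with self.out_foo().open('w') as foofile:
            foofile.write('foo\n')

class MyFooReplacer(sciluigi.Task):
    resources = {'max_workers': 1}
    # ... parameters, out_replaced() and run() exactly as in the question

With max_workers=5 in the [resources] section, the scheduler will then keep at most 5 of these tasks running at any time, however many workflow processes you start. Note that the limit is enforced by the scheduler, so all processes need to talk to the same central scheduler (luigid); with --local-scheduler each process only enforces it for itself.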

Upvotes: 5
