Sneha

Reputation: 1

In SimPy, how do I write parallel processes where one process's output is the input for the next process?

I have a problem with calculating the optimized processing time.

I have multiple machine types in my environment, let's say A, B, C, D, E, and all of them have different capacities, let's say 1, 3, 2, 5, 1. I also have multiple machines of the same kind, let's say A: 1, B: 2, C: 2, D: 1, E: 3 machines.

Using these machines I want to create components that go through a series of steps, let's say S1, S2, S3, S4, S5, which use the machines S1: A, S2: A, S3: D, S4: B, S5: E (not every machine in the environment is used, and one machine can be used in multiple steps). The components are created batch-wise, so each batch must go through all the steps sequentially. Now I want the output of the previous step to become the input of the next step, so that all the steps run in parallel and no machine is kept idle. If a machine is already in use for a previous step, component production must wait until that machine is available, and one machine cannot be used for a different step until all the components of the current batch have gone through its step.
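For concreteness, the setup above can be written as plain Python data (the dictionary names are only illustrative):

machine_count    = {'A': 1, 'B': 2, 'C': 2, 'D': 1, 'E': 3}   # machines available of each kind
machine_capacity = {'A': 1, 'B': 3, 'C': 2, 'D': 5, 'E': 1}   # capacity of each kind of machine
step_machine     = {'S1': 'A', 'S2': 'A', 'S3': 'D', 'S4': 'B', 'S5': 'E'}  # machine used by each step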

I want to create a model where, once a component enters a step, say step 1, it looks at the capacity of the machine used in step 2; as soon as the number of components that have completed step 1 matches the capacity of the machine in the next step, the next step should be executed in parallel.

As of now, I only get the required time when all the components finish step 1 before the batch moves on to the next step.

Upvotes: 0

Views: 173

Answers (1)

Michael

Reputation: 1969

I made a process that grabs a component from a queue, launches a sub-task for each item in the component, then waits for all the item sub-tasks to finish before sending the component to the next station/task.

Note where I use a yield, and where I do not use a yield.
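For context: in SimPy, env.process(...) starts a sub-process and returns an event. Yielding that event makes the caller wait for the sub-process to finish, while calling env.process(...) without a yield just starts it and lets the caller keep going. A minimal standalone sketch of the difference (separate from the model below, names are illustrative):

import simpy

def child(env, name):
    # a small sub-process that takes 5 time units
    yield env.timeout(5)
    print(f'{env.now} {name} done')

def parent(env):
    # with a yield: wait for the sub-process to finish
    yield env.process(child(env, 'waited-for'))
    # without a yield: start the sub-process and keep going
    env.process(child(env, 'fire-and-forget'))
    print(f'{env.now} parent continues without waiting')

env = simpy.Environment()
env.process(parent(env))
env.run()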

"""
A simulation where a batch of items gets processed at each station

Each station processes one batch at a time, but all the items in the batch compete
for the station's resources.  So if a station has an exclusive resource pool of two resources, two
items will be processed at a time.  A resource pool can be shared with
more than one station.

A batch does not advance to the next station until all of its items have been processed

Programmer: Michael R. Gibbs
"""

import simpy
import random

class Comp():
    """
    component class
    Has unique ids for tracking
    and a batch of items
    """

    next_id = 1

    def __init__(self, item_list):
        self.id = self.__class__.next_id
        self.__class__.next_id += 1
        
        self.item_list = item_list


class Item():
    """
    item class
    Has unique ids for tracking
    """

    next_id = 1

    def __init__(self):
        self.id = self.__class__.next_id
        self.__class__.next_id += 1

class Comp_task():
    """
    Models a component task
    where components are processed one at a time after being queued.
    All the items in the component are queued at once for a resource:
    each item seizes a resource,
    takes some time to do its processing,
    and releases the resource.
    When all the items finish processing, the component is sent to the next task.

    using a class to wrap all the parameters for the task and a queue.
    """

    def __init__(self, env, task_name, resource_pool, task_time_func, next_comp_task):
        self.env = env
        self.task_name = task_name
        self.resource_pool = resource_pool
        self.task_time_func = task_time_func
        self.next_comp_task = next_comp_task

        self.comp_q = simpy.Store(env)

        # start processing queue
        self.env.process(self.comp_task_loop())

    def comp_task(self, comp):
        """
        queues a component for this task to process
        """
        print(f'{self.env.now:.2f} task {self.task_name} queued comp {comp.id}' )
        yield self.comp_q.put(comp)

    def _item_task(self, comp, item):
        """
        Sub task to process each item
        """
        
        # requesting a resource also queues the item
        # until the resource is available
        with self.resource_pool.request() as req:
            yield req
            print(f'{self.env.now:.2f} task {self.task_name} started comp {comp.id} item {item.id}' )

            yield self.env.timeout(self.task_time_func())

        print(f'{self.env.now:.2f} task {self.task_name} finished comp {comp.id} item {item.id}' )

    def comp_task_loop(self):
        """
        Process the components from the queue one at a time
        """

        while True:
            comp = yield self.comp_q.get()

            print(f'{self.env.now:.2f} task {self.task_name} started comp {comp.id}')

            # list of all the item sub tasks
            item_tasks_list = []

            # start all the component's items processing
            # competition for a resource will result in the items getting queued up
            for item in comp.item_list:
                item_tasks_list.append(self.env.process(self._item_task(comp, item)))
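            # note: env.process() is called here without a yield, so all item sub-tasks start concurrently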

            # wait for all the items to get processed
            yield self.env.all_of(item_tasks_list)

            print(f'{self.env.now:.2f} task {self.task_name} finished comp {comp.id}' )

            # send component to next step/task
            self.env.process(self.next_comp_task(comp))
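            # note: no yield here, so this station is immediately free to start its next queued component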

def _comp_end_task(env, comp):
    """
    special end/sink comp task to log the completion of the component's processing
    """

    yield env.timeout(0) # need this to make it a simpy process
    print(f'{env.now:.2f} finished all processing for comp {comp.id}' )

def gen_comps(env, first_task):
    """
    Generates a stream of components, each with a list of items,
    and starts the component processing at the first task / step 1
    """
    while True:
        yield env.timeout(random.triangular(1,2,1))

        item_list = [Item() for _ in range(random.randint(5,10))]
        comp = Comp(item_list)

        env.process(first_task(comp))

# boot up
env = simpy.Environment()

# wrap the env param so it has the same signature as the other comp process tasks
comp_end_task = lambda comp: _comp_end_task(env, comp)

resource_pool_a = simpy.Resource(env,1)
resource_pool_b = simpy.Resource(env,2)
resource_pool_c = simpy.Resource(env,2)
resource_pool_d = simpy.Resource(env,1)
resource_pool_e = simpy.Resource(env,3)
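# (pool capacities above mirror the machine counts in the question: A:1, B:2, C:2, D:1, E:3)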

# build the path of tasks
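# tasks are wired back to front so each task can be given the next task's comp_task method
# note that S1 and S2 share resource_pool_a, i.e. one machine type serves two steps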
task_s5 = Comp_task(env, "task S5", resource_pool_e, lambda :random.triangular(3,9,6), comp_end_task)
task_s4 = Comp_task(env, "task S4", resource_pool_b, lambda :random.triangular(1,4,3), task_s5.comp_task)
task_s3 = Comp_task(env, "task S3", resource_pool_d, lambda :random.triangular(1,3,2), task_s4.comp_task)
task_s2 = Comp_task(env, "task S2", resource_pool_a, lambda :random.triangular(1,3,2), task_s3.comp_task)
task_s1 = Comp_task(env, "task S1", resource_pool_a, lambda :random.triangular(1,3,1), task_s2.comp_task)

env.process(gen_comps(env, task_s1.comp_task))

env.run(120)
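If you also need the total time each component spends going through all the steps (the "optimized time calculation" from the question), one minimal extension would be to stamp each component with its creation time and report the difference in the end task. A sketch, reusing the classes above (created_at is a new attribute, not part of the original code):

def gen_comps(env, first_task):
    # same generator as above, but stamps each component with its creation time
    while True:
        yield env.timeout(random.triangular(1, 2, 1))

        item_list = [Item() for _ in range(random.randint(5, 10))]
        comp = Comp(item_list)
        comp.created_at = env.now  # hypothetical attribute, used only for measurement

        env.process(first_task(comp))

def _comp_end_task(env, comp):
    # end/sink task that also reports the component's total cycle time
    yield env.timeout(0)  # need this to make it a simpy process
    cycle_time = env.now - comp.created_at
    print(f'{env.now:.2f} finished all processing for comp {comp.id}, cycle time {cycle_time:.2f}')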

Upvotes: 0
