Newbie0105

Reputation: 119

Multiprocessing doesn't seem to work

I wrote a simple multiprocessing script, but I don't think it's working.

When I ran this code on my laptop and checked the CPU through the Activity Monitor app, it showed that several cores were busy. So I ran the same code on a workstation (28 cores, of which I used 24) and checked again through Task Manager. But CPU usage didn't increase; only the number of processes increased.

# Multiprocessing
import multiprocessing

def multi(input_field):
    result = subvolume.label(input_field)
    return result

test_list = [resampled_sub_1, resampled_sub_2, resampled_sub_3,
             resampled_sub_4, resampled_sub_5]

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=24)
    results = pool.map(multi, test_list)
    pool.close()
    pool.join()

When multiprocessing works correctly, I would expect CPU usage to be higher than this. What am I doing wrong?

Upvotes: 1

Views: 529

Answers (1)

Darkonaut

Reputation: 21694

You have 24 processes in your pool, but your iterable test_list has only 5 items in it. When you pick calc_chunksize_info() from my answer here, you can calculate the generated and distributed chunks:

calc_chunksize_info(n_workers=24, len_iterable=5)
# Out: Chunkinfo(n_workers=24, len_iterable=5, n_chunks=5, chunksize=1, last_chunk=1)

Chunksize will be 1, so at most five worker processes can run in parallel. There simply aren't enough items in your input iterable to employ all worker processes.
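For reference, Pool.map's default chunksize logic in CPython works roughly like this; the function name below is my own, not part of the stdlib API:

```python
def calc_chunksize(n_workers, len_iterable, factor=4):
    """Approximate Pool.map's default chunksize calculation
    (mirrors the divmod logic in CPython's multiprocessing.pool)."""
    chunksize, extra = divmod(len_iterable, n_workers * factor)
    if extra:
        chunksize += 1
    return chunksize

calc_chunksize(n_workers=24, len_iterable=5)
# -> 1: five chunks of one item each, so only five workers get work
```

With 5 items and 24 workers, `divmod(5, 96)` gives a base chunksize of 0 with a remainder, which rounds up to 1.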

As a side note: test_list should be defined within the if __name__ == '__main__':-block.
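To see all workers busy, give the pool far more items than processes. Here is a minimal, self-contained sketch with a hypothetical CPU-bound dummy task standing in for `subvolume.label`:

```python
import multiprocessing

def busy(n):
    # CPU-bound dummy work (placeholder for the real labeling task)
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == '__main__':
    # 240 tasks for 24 workers: every process gets work,
    # and CPU usage across all cores should rise.
    inputs = [200_000] * 240
    with multiprocessing.Pool(processes=24) as pool:
        results = pool.map(busy, inputs)
    print(len(results))
```

The `with` block also replaces the manual `pool.close()` / `pool.join()` calls.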

Upvotes: 1
