Reputation: 404
I am working on a Django project in which I'll create an online judge: users can submit solutions to programming problems, and the solutions are autograded on the server using the Python subprocess library, through which I can call g++ (something similar to the Codeforces website).
When a user submits a solution, I take their solution file and have to call the grading function asynchronously (automatic grading may take time).
I didn't use Celery for this because I read it has overhead and would have been overkill for my project. So I built a queue object in Django and run the grading function on that queue in a separate thread, and it's working fine (I have read the "threads are evil" warnings).
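For reference, my setup is roughly like the following sketch (simplified: `grade_submission` is a stand-in for my real compile-and-run logic, and the file names are made up):

```python
import queue
import threading

def grade_submission(source_path: str) -> str:
    # Stand-in for the real grading logic, which compiles the
    # submission with g++ via subprocess and runs the test cases.
    return f"graded:{source_path}"

submissions: "queue.Queue" = queue.Queue()
results = []

def worker() -> None:
    # Single background thread that drains the submission queue.
    while True:
        path = submissions.get()
        if path is None:              # sentinel: shut the worker down
            submissions.task_done()
            break
        results.append(grade_submission(path))
        submissions.task_done()

t = threading.Thread(target=worker, daemon=True)
t.start()

for p in ["a.cpp", "b.cpp"]:
    submissions.put(p)
submissions.put(None)                 # stop signal
submissions.join()                    # wait until everything is graded
```

In the real project the thread is started once and the view just does `submissions.put(path)` when a solution comes in.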
I wanted to ask: should I use the Python multiprocessing library to spread the automatic grading across cores? For example, I would create 4 queues and run the grading function on each queue in a separate process. Would that be a good idea, or would the overhead of multiprocessing outweigh the benefit?
Also, would it be safe to do (even if it's not good programming style)?
Upvotes: 0
Views: 1134
Reputation: 11
I think using Celery is a much better and safer option. Run workers with the concurrency you want. Managing multiple processes at the application level yourself is hard; Celery handles that for you.
Upvotes: 1