Reputation: 4112
So I have an existing Spring library that performs some blocking tasks (exposed as services), which I intend to wrap in Scala Futures to showcase multi-processor capabilities. The intention is to get people interested in the Scala/Akka tech stack.
Here is my problem. Let's say I get two services from the existing Spring library. These services perform different blocking tasks (IO, DB operations). How do I make sure that these tasks (service calls) are carried out across multiple cores? For example, how do I make use of custom execution contexts? Do I need one per service call? How do execution contexts / thread pools relate to multi-core operation?
I'd appreciate any help in understanding this.
Upvotes: 1
Views: 345
Reputation: 853
You cannot ensure that tasks will be executed on different cores; assigning threads to cores is the job of the OS scheduler, not the JVM.
The rule of thumb for writing concurrent and seemingly parallel applications is: "Here, take my (e.g.) 10 threads and TRY to spread them across the cores."
There are some tricks, like tuning CPU affinity (low-level, very risky) or spawning a plethora of threads to increase the chance that some run in parallel (a lot of overhead and work for the GC). In general, though, the OS is usually not that overloaded, and if you create, e.g., two actors, one for DB work and one for network IO, they should run well in parallel.
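The same idea works with plain Futures: submit both blocking calls at once and let the thread pool (and ultimately the OS) run them concurrently. A minimal sketch, where `dbQuery` and `fetchFile` are hypothetical stand-ins for the blocking Spring services:

```scala
import scala.concurrent.{Await, Future, blocking}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object ParallelServices {
  // Hypothetical stand-ins for the blocking Spring service calls.
  // `blocking` hints to the global pool that the thread will be parked,
  // so it can spawn a compensating thread if needed.
  def dbQuery(): String   = blocking { Thread.sleep(200); "db rows" }
  def fetchFile(): String = blocking { Thread.sleep(200); "file bytes" }

  def main(args: Array[String]): Unit = {
    // Both Futures are submitted immediately; the pool's threads run them
    // concurrently, and the OS spreads those threads over the cores.
    val dbF = Future(dbQuery())
    val ioF = Future(fetchFile())

    val both = for (db <- dbF; io <- ioF) yield (db, io)
    println(Await.result(both, 2.seconds))
    // Takes roughly 200 ms rather than 400 ms, because the calls overlap.
  }
}
```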
UPDATE:
The global ExecutionContext manages the default thread pool. However, you can define your own and submit Runnables to it with myThreadPool.submit(runnable: Runnable). Have a look at the links provided in the comments.
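For example, you can back each kind of blocking work with its own fixed-size pool, so a slow DB cannot starve the IO calls. A sketch with assumed pool sizes and hypothetical service calls (tune both to your workload):

```scala
import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

object CustomPools {
  // Dedicated pools for the two kinds of blocking work.
  // The sizes (4 threads each) are illustrative assumptions.
  val dbPool = ExecutionContext.fromExecutorService(Executors.newFixedThreadPool(4))
  val ioPool = ExecutionContext.fromExecutorService(Executors.newFixedThreadPool(4))

  // Hypothetical blocking Spring service calls.
  def dbCall(): String = { Thread.sleep(100); "db ok" }
  def ioCall(): String = { Thread.sleep(100); "io ok" }

  def main(args: Array[String]): Unit = {
    // Pass the ExecutionContext explicitly so each call lands on its own pool.
    val dbF = Future(dbCall())(dbPool)
    val ioF = Future(ioCall())(ioPool)

    // An implicit EC is still needed for combining the results.
    implicit val ec: ExecutionContext = dbPool
    val both = for (db <- dbF; io <- ioF) yield s"$db / $io"
    println(Await.result(both, 2.seconds))

    dbPool.shutdown(); ioPool.shutdown()
  }
}
```

Whether the two pools' threads end up on different cores is still the OS scheduler's decision, but separating the pools guarantees the two service calls never queue behind each other.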
Upvotes: 2