Reputation: 2517
I have some Spark code that processes a queue: the first element of the queue is taken, processed inside a Scala Future, and on completion of that Future the next element is taken. Simplified, the code looks as follows:
import scala.concurrent.ExecutionContext.Implicits.global
def nextExperiment(): Unit = {
  Future { ... }.onComplete(_ => nextExperiment())
}
When running this code on a cluster (not locally), it only starts processing the first element of the queue and then quickly shuts down the context, before finishing the remaining elements. If I remove the Future { } wrapper, however, the code does exactly what it is supposed to do.
Why is this?
Upvotes: 1
Views: 1410
Reputation: 5999
Well, you don't show where you are calling nextExperiment
from. But given what you are seeing, the program probably just returns from that call, carries on to the end of the main method, and exits, taking the SparkContext down with it. Scheduling a Future does not keep the program alive: the call returns immediately while the work runs on a background thread pool. If you want all Futures to finish before the program itself finishes, you need to block until everything is done.
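One way to do that, sketched under the assumption that the queue and its contents look roughly like the asker's (the queue of Ints, the `allDone` promise, and the `processed` counter here are illustrative names, not from the original code): complete a Promise when the queue is drained, and have the main thread Await it before returning.

```scala
import scala.collection.mutable
import scala.concurrent.{Await, Future, Promise}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration

object QueueRunner {
  // Hypothetical work queue standing in for the asker's experiments.
  private val queue = mutable.Queue(1, 2, 3)

  // Completed once the queue is drained, so the main thread can block on it.
  private val allDone = Promise[Unit]()

  // Visible counter so the effect of draining the queue can be observed.
  @volatile var processed = 0

  def nextExperiment(): Unit = {
    if (queue.isEmpty) {
      allDone.success(())
    } else {
      val item = queue.dequeue()
      Future {
        // ... run the actual work for `item` here ...
        processed += 1
      }.onComplete(_ => nextExperiment())
    }
  }

  def main(args: Array[String]): Unit = {
    nextExperiment()
    // Block until every queued item has finished; without this the
    // program can return and shut the context down after the first item.
    Await.result(allDone.future, Duration.Inf)
  }
}
```

In real code a finite timeout and error handling in `onComplete` (a failed Future still triggers the continuation here, but you may want to record the failure) would be preferable to `Duration.Inf`.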
Upvotes: 4