Raghavan Narasimhan

Reputation: 11

Can slave processes be dynamically provisioned based on load using Spring Cloud Data Flow?

We are currently using Spring Batch remote chunking to scale our batch processing. We are considering Spring Cloud Data Flow, but would like to know whether slaves can be dynamically provisioned based on load. We are deployed on Google Cloud, so if Cloud Data Flow fits our needs we would also want to use its Kubernetes support.

Upvotes: 1

Views: 489

Answers (1)

Michael Minella

Reputation: 21483

When using the batch extensions of Spring Cloud Task (specifically the DeployerPartitionHandler), workers are dynamically launched as needed. That PartitionHandler allows you to configure a maximum number of workers; it then processes each partition as an independent worker up to that max, handling the remaining partitions as earlier workers finish. The "dynamic" aspect is really controlled by the number of partitions returned by the Partitioner: the more partitions returned, the more workers launched.
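For reference, a minimal sketch of what such a configuration can look like, assuming the spring-cloud-task batch extensions are on the classpath. The artifact path, step name, partition-context key, and worker count below are illustrative assumptions, not values from the answer:

    import java.util.HashMap;
    import java.util.Map;

    import org.springframework.batch.core.explore.JobExplorer;
    import org.springframework.batch.core.partition.support.Partitioner;
    import org.springframework.batch.item.ExecutionContext;
    import org.springframework.cloud.deployer.spi.task.TaskLauncher;
    import org.springframework.cloud.task.batch.partition.DeployerPartitionHandler;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.core.io.FileSystemResource;
    import org.springframework.core.io.Resource;

    @Configuration
    public class PartitionedJobConfiguration {

        @Bean
        public DeployerPartitionHandler partitionHandler(TaskLauncher taskLauncher,
                                                         JobExplorer jobExplorer) {
            // The same application artifact is launched once per partition;
            // "workerStep" is the step each worker executes (illustrative name).
            Resource workerArtifact =
                    new FileSystemResource("target/partitioned-job-1.0.0.jar"); // assumed path
            DeployerPartitionHandler handler =
                    new DeployerPartitionHandler(taskLauncher, jobExplorer,
                            workerArtifact, "workerStep");
            // Upper bound on concurrently running workers; remaining partitions
            // wait until a running worker completes.
            handler.setMaxWorkers(4);
            return handler;
        }

        @Bean
        public Partitioner partitioner() {
            // The number of ExecutionContexts returned here is what drives
            // how many workers get launched overall.
            return gridSize -> {
                Map<String, ExecutionContext> partitions = new HashMap<>();
                for (int i = 0; i < gridSize; i++) {
                    ExecutionContext context = new ExecutionContext();
                    context.putInt("partitionNumber", i); // hypothetical key read by the worker
                    partitions.put("partition" + i, context);
                }
                return partitions;
            };
        }
    }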

You can see a simple example configured for Cloud Foundry in this repo: https://github.com/mminella/S3JDBC. The main difference between it and what you'd need is that you'd swap out the CloudFoundryTaskLauncher for a KubernetesTaskLauncher and its appropriate configuration.
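As a rough illustration of that swap (not taken from the linked repo): a bean along these lines, using the spring-cloud-deployer-kubernetes artifact. Exact constructor arguments vary across deployer versions, and the namespace value is an assumption:

    import io.fabric8.kubernetes.client.DefaultKubernetesClient;
    import io.fabric8.kubernetes.client.KubernetesClient;
    import org.springframework.cloud.deployer.spi.kubernetes.KubernetesDeployerProperties;
    import org.springframework.cloud.deployer.spi.kubernetes.KubernetesTaskLauncher;
    import org.springframework.cloud.deployer.spi.task.TaskLauncher;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class KubernetesLauncherConfiguration {

        @Bean
        public TaskLauncher taskLauncher() {
            KubernetesDeployerProperties properties = new KubernetesDeployerProperties();
            properties.setNamespace("default"); // assumed target namespace
            KubernetesClient kubernetesClient = new DefaultKubernetesClient();
            // Workers launched by the DeployerPartitionHandler now run as
            // Kubernetes pods instead of Cloud Foundry tasks.
            return new KubernetesTaskLauncher(properties, kubernetesClient);
        }
    }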

Upvotes: 1
