YACINE GACI

Reputation: 145

Is it possible to change Spark parameters at runtime?

I am currently working on Spark and trying to design an adaptive execution plan. I am wondering whether it is possible to modify the parameters of the Spark engine at runtime. For example, can I use different compression codecs for two separate stages, or can I modify the memory fractions reserved for shuffling and computation at runtime? Say, for the map phase, could I reduce the memory fraction allocated for shuffling, and then increase it later when the shuffle actually occurs?

Thanks

Upvotes: 2

Views: 6929

Answers (1)

user10900284

Reputation: 21

It is not possible in general.

While a subset of configuration options can be changed at runtime through the RuntimeConfig object (see Customize SparkContext using sparkConf.set(..) when using spark-shell), core options cannot be modified unless the SparkContext is restarted.
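To illustrate the distinction, here is a minimal sketch for spark-shell (Spark 2.x+), assuming a running SparkSession named `spark`. The config keys shown are standard Spark settings; whether a given key is runtime-settable depends on your Spark version:

```scala
// Runtime-settable options (mostly spark.sql.*) go through spark.conf,
// which is backed by the RuntimeConfig object. Changes take effect for
// subsequently submitted jobs/stages:
spark.conf.set("spark.sql.shuffle.partitions", "400")
spark.conf.set("spark.sql.parquet.compression.codec", "gzip")

// Core ("static") options are read once when the SparkContext starts.
// Trying to change one at runtime raises an AnalysisException along the
// lines of "Cannot modify the value of a Spark config":
// spark.conf.set("spark.executor.memory", "4g")  // fails at runtime

// Static options must instead be set before the context is created, e.g.:
//   spark-shell --conf spark.executor.memory=4g
// or via SparkConf before building the SparkSession.
```

This is why per-stage control of things like the shuffle memory fraction is not possible through configuration alone: those settings belong to the static core configuration, fixed for the lifetime of the SparkContext.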

Upvotes: 2

Related Questions