under_the_sea_salad

Reputation: 1804

Can Spark's driver memory be set to something other than a whole number of gigabytes?

I am launching pyspark and can supply the driver-memory parameter on the command line to set the maximum memory the driver may use. Spark's online documentation usually shows values like 1g or 2g in its examples, so I am not sure whether a value such as 3300m or 4500m is legal.
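For example, this is the kind of invocation I have in mind (the 3300m value is just an illustration of a non-whole-gigabyte setting):

    pyspark --driver-memory 3300m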

I believe this parameter corresponds to the JVM's -Xmx option, which (as I understand it) must be a multiple of 1024m, and that only adds to my confusion.

Does Spark's driver-memory parameter properly handle values other than whole gigabytes?

Upvotes: 1

Views: 499

Answers (1)

Darshan

Reputation: 2333

Yes, it works. According to the documentation and my own experience, you can also set driver-memory in megabytes, e.g. 512m.

See: http://spark.apache.org/docs/latest/configuration.html

Properties that specify a byte size should be configured with a unit of size. The following format is accepted:

1b (bytes)
1k or 1kb (kibibytes = 1024 bytes)
1m or 1mb (mebibytes = 1024 kibibytes)
1g or 1gb (gibibytes = 1024 mebibytes)
1t or 1tb (tebibytes = 1024 gibibytes)
1p or 1pb (pebibytes = 1024 tebibytes)
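So a non-gigabyte value such as the 3300m or 4500m from your question should be accepted. A rough sketch of both ways to pass it (your_app.py is just a placeholder):

    # driver memory given in mebibytes rather than whole gigabytes
    spark-submit --driver-memory 3300m your_app.py

    # equivalent, via the spark.driver.memory property on the launcher command line
    spark-submit --conf spark.driver.memory=4500m your_app.py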

Upvotes: 4
