Reputation: 309
I'm trying to install Spark 1.5.1 on an Ubuntu 14.04 VM. After un-tarring the file, I changed into the extracted directory and ran "./bin/pyspark", which should start the PySpark shell. Instead, I got the following error:
OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000c5550000, 715849728, 0) failed; error='Cannot allocate memory' (errno=12)
There is insufficient memory for the Java Runtime Environment to continue.
Native memory allocation (malloc) failed to allocate 715849728 bytes for committing reserved memory.
An error report file with more information is saved as: /home/datascience/spark-1.5.1-bin-hadoop2.6/hs_err_pid2750.log
Could anyone please point me in the right direction to sort this out?
Upvotes: 1
Views: 8891
Reputation: 1948
You need to set spark.driver.memory (the PySpark shell starts the driver JVM, which is what is failing to allocate memory here) in the conf/spark-defaults.conf file to a value that fits your machine. For example:
usr1@host:~/spark-1.6.1$ cp conf/spark-defaults.conf.template conf/spark-defaults.conf
usr1@host:~/spark-1.6.1$ nano conf/spark-defaults.conf
and add the line:
spark.driver.memory 512m
For more information, refer to the official documentation: http://spark.apache.org/docs/latest/configuration.html
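If you would rather not edit the config file, the same limit can be passed on the command line when starting the shell; bin/pyspark accepts the same --driver-memory option as spark-submit (the 512m value is just an example, pick whatever fits your VM):
usr1@host:~/spark-1.6.1$ ./bin/pyspark --driver-memory 512m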
Upvotes: 0
Reputation: 21690
Pretty much what the error says: the JVM could not commit the roughly 700 MB (715849728 bytes) it asked for, so the VM does not have that much free memory. Either give the VM more RAM or lower the driver memory as described in the other answer.
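As a quick sanity check inside the guest (a standard Linux command, nothing Spark-specific), you can see how much memory is actually free before launching the shell; if the free figure is well below ~700 MB, the JVM cannot commit the heap it asks for:
usr1@host:~$ free -m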
Upvotes: -1