user3858193

Reputation: 1518

Java memory issue while executing sbt package in Spark

Can you please suggest a solution for the issue below?

hduser@hduser-VirtualBox:/usr/local/spark1/project$ sbt package
OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000a8000000, 1073741824, 0) failed; error='Cannot allocate memory' (errno=12)
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (malloc) failed to allocate 1073741824 bytes for committing reserved memory.
# An error report file with more information is saved as:
# /usr/local/spark-1.1.0-bin-hadoop1/project/hs_err_pid26824.log

hduser@hduser-VirtualBox:/usr/local/spark1/project$ java -version
java version "1.7.0_65"
OpenJDK Runtime Environment (IcedTea 2.5.3) (7u71-2.5.3-0ubuntu0.14.04.1)
OpenJDK 64-Bit Server VM (build 24.65-b04, mixed mode)

Upvotes: 0

Views: 1831

Answers (1)

chiastic-security

Reputation: 20520

Looks like you're trying to run with quite a large Java heap size (1GB). I'd start by reducing that. If you really do need that much, you might be in trouble: it looks as though your machine just doesn't have enough RAM to allocate it for you.
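For example (a minimal sketch, assuming the standard sbt launcher script; the 512 MB figure is just an illustrative value and the exact option handling can vary with how sbt was installed), the heap can be capped through the SBT_OPTS environment variable or a .sbtopts file in the project directory:

    # Cap the sbt JVM heap for a single run (512M is an illustrative value)
    SBT_OPTS="-Xmx512M" sbt package

    # Or persist the setting in a .sbtopts file next to the build;
    # the -J prefix passes the option through to the JVM
    echo "-J-Xmx512M" > .sbtopts
    sbt package

If even a smaller heap can't be committed, the remaining options are to free up RAM on the VirtualBox guest or to give the VM more memory.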

Upvotes: 1
