himanshuIIITian

Reputation: 6095

Apache Spark Shell does not Start with Low Memory

I have been using the Apache Spark shell for quite some time, so I knew that we can start it with options like --driver-memory and --executor-memory to override the default memory settings.
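Both options accept JVM-style size strings such as 512m or 2g. For example, a typical invocation (the values here are only illustrative) looks like this:

$ spark-shell --driver-memory 2g --executor-memory 2g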

So, I started spark-shell with the following command:

$ spark-shell --driver-memory 100M

But I was hit by the following error:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
java.lang.OutOfMemoryError: Java heap space
    at scala.reflect.internal.Names$class.enterChars(Names.scala:70)
    at scala.reflect.internal.Names$class.body$1(Names.scala:116)
    at scala.reflect.internal.Names$class.newTermName(Names.scala:127)
    at scala.reflect.internal.SymbolTable.newTermName(SymbolTable.scala:16)
    at scala.reflect.internal.Names$class.newTermName(Names.scala:135)
    at scala.reflect.internal.SymbolTable.newTermName(SymbolTable.scala:16)
    at scala.reflect.internal.Names$class.newTypeName(Names.scala:139)
    at scala.reflect.internal.SymbolTable.newTypeName(SymbolTable.scala:16)
    at scala.tools.nsc.symtab.SymbolLoaders.enterClass(SymbolLoaders.scala:61)
    at scala.tools.nsc.symtab.SymbolLoaders.enterClassAndModule(SymbolLoaders.scala:119)
    at scala.tools.nsc.symtab.SymbolLoaders.initializeFromClassPath(SymbolLoaders.scala:167)
    at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader$$anonfun$doComplete$1$$anonfun$apply$mcV$sp$1.apply(SymbolLoaders.scala:265)
    at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader$$anonfun$doComplete$1$$anonfun$apply$mcV$sp$1.apply(SymbolLoaders.scala:264)
    at scala.collection.Iterator$class.foreach(Iterator.scala:893)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader$$anonfun$doComplete$1.apply$mcV$sp(SymbolLoaders.scala:264)
    at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader$$anonfun$doComplete$1.apply(SymbolLoaders.scala:260)
    at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader$$anonfun$doComplete$1.apply(SymbolLoaders.scala:260)
    at scala.reflect.internal.SymbolTable.enteringPhase(SymbolTable.scala:235)
    at scala.tools.nsc.symtab.SymbolLoaders$PackageLoader.doComplete(SymbolLoaders.scala:260)
    at scala.tools.nsc.symtab.SymbolLoaders$SymbolLoader.complete(SymbolLoaders.scala:211)
    at scala.tools.nsc.symtab.SymbolLoaders$SymbolLoader.load(SymbolLoaders.scala:227)
    at scala.reflect.internal.Symbols$Symbol.typeParams(Symbols.scala:1733)
    at scala.reflect.internal.Types$class.isRawIfWithoutArgs(Types.scala:3756)
    at scala.reflect.internal.SymbolTable.isRawIfWithoutArgs(SymbolTable.scala:16)
    at scala.reflect.internal.tpe.TypeMaps$$anon$1.apply(TypeMaps.scala:328)
    at scala.reflect.internal.tpe.TypeMaps$$anon$1.apply(TypeMaps.scala:325)
    at scala.reflect.internal.Symbols$Symbol.modifyInfo(Symbols.scala:1542)
    at scala.reflect.internal.Symbols$Symbol.cookJavaRawInfo(Symbols.scala:1688)
    at scala.tools.nsc.typechecker.Infer$Inferencer.checkAccessible(Infer.scala:270)

This error confused me. If we can start spark-shell with any amount of memory, why does it fail with 100M?

Upvotes: 0

Views: 97

Answers (1)

user7277726

Reputation: 36

Nothing strange is happening here. Spark is a complex engine and needs a fair amount of memory: even an idle driver process has a footprint of around 250 MB, and you need considerably more than that for stable operation. Your stack trace shows the OutOfMemoryError being thrown while the Scala REPL's compiler is still loading symbols from the classpath, so a 100M heap is exhausted before the shell even finishes starting.
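As a rough illustration (the exact minimum varies with the Spark version and classpath, so treat the figure below as a sketch rather than a hard limit), giving the driver a more realistic heap lets the shell start normally:

$ spark-shell --driver-memory 1g

Note that 1g is also the default value of spark.driver.memory, so unless you need to change it you can simply omit the option.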

Upvotes: 2
