knt

Reputation: 7593

HBase java.lang.OutOfMemoryError

I'm having the following issue with HBase.

I have a script which starts the HBase shell and inserts many rows into a table with a single column. I have tried inserting 10,000 rows, but after about 1,700 I get the dreaded "java.lang.OutOfMemoryError: unable to create new native thread" error. I have tried increasing the Java heap size from the default 1000 MB to 1800 MB, but this doesn't allow me to insert any more than those ~1,700 rows.

However, I've noticed that I can insert 1000 rows, exit the shell, restart the shell, insert 1000 more into the same table, exit again, and so on and so forth. I don't really understand enough about the JVM to figure out why it's allowing me to do this in several sessions, but not allowing me to batch insert in the same session.

Can someone please explain to me what is going on here, and what I might do about it?

EDIT:

I am now using a 64-bit machine, Red Hat Linux 5, with Java 1.6. I'm giving HBase a heap size of 20 GB (I have ~32 GB of memory total). For stack size, I'm giving 8 MB. The default on 64-bit is 2 MB, I believe; with 2 MB I got this same error, and increasing it to 8 MB did not help at all (I was only able to insert the same number of rows regardless of stack size, ~1,700).

I have read that decreasing the heap size could make this error go away, but that did not help either. Below are the JVM options that I'm setting (everything is default except for stack size).

HBASE_OPTS="$HBASE_OPTS -ea -Xss8M -XX:+HeapDumpOnOutOfMemoryError -XX:+UseConcMarkSweepGC -XX:+CMSIncrementalMode"

Upvotes: 3

Views: 5976

Answers (3)

shal

Reputation: 1

I also encountered the same issue, and as kosii explained, the root cause was not closing the HTableInterface instance obtained from the HTablePool after use.

HTableInterface table = tablePool.getTable(tableName);
try {
    // Do the work
    ....
} finally {
    table.close(); // returns the table to the pool, releasing its resources
}

Upvotes: 0

mutantacule

Reputation: 7063

I encountered this error when I was using an HTablePool instance to get my HTableInterface instances, but after using them I forgot to call the close() method on them.

Upvotes: 0

QuinnG

Reputation: 6424

I encountered this error yesterday. What was happening in my case is that I was creating a lot of instances of HTable, which spawned far too many threads when I called put on a record. (I was using a mapper and creating the HTable inside the map function.)

I'd check whether your connection to HBase is being created repeatedly (inside a loop or a map function). If that is happening, then restructuring the code to instantiate fewer connections to HBase (in my case, fewer HTable instances) may solve the problem. It solved mine.
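The mechanism can be illustrated without a running HBase cluster: each client object owns background threads, so creating one per record leaks threads until the OS refuses to create more (hence "unable to create new native thread" even with a large heap). A minimal plain-Java sketch, where the hypothetical FakeTable stands in for HTable (the real class starts its own worker threads on construction):

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical stand-in for HTable: owns one background thread, like the real client.
class FakeTable implements AutoCloseable {
    static final AtomicInteger liveThreads = new AtomicInteger();
    private final Thread worker;

    FakeTable() {
        worker = new Thread(() -> {
            try { Thread.sleep(Long.MAX_VALUE); } catch (InterruptedException ignored) { }
        });
        worker.setDaemon(true);
        worker.start();
        liveThreads.incrementAndGet();
    }

    void put(String row) { /* pretend to write a row */ }

    @Override
    public void close() {
        worker.interrupt();
        liveThreads.decrementAndGet();
    }
}

public class Main {
    public static void main(String[] args) {
        // Anti-pattern: a new table per record leaks one thread each iteration.
        for (int i = 0; i < 100; i++) {
            FakeTable t = new FakeTable();
            t.put("row" + i); // never closed
        }
        System.out.println("leaked threads: " + FakeTable.liveThreads.get());

        FakeTable.liveThreads.set(0);
        // Fix: create once, reuse for every put, close when done.
        try (FakeTable t = new FakeTable()) {
            for (int i = 0; i < 100; i++) {
                t.put("row" + i);
            }
        }
        System.out.println("threads after reuse+close: " + FakeTable.liveThreads.get());
    }
}
```

In a MapReduce job the same idea means creating the HTable once in the mapper's setup() and closing it in cleanup(), rather than inside map().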

HTH

Upvotes: 4
