Reputation: 267287
I have some code which accesses a SQLite database using JDBC.
I've noticed that every time a query is made, the memory usage increases, and it does not go down even after the connection is closed.
Here's what I'm doing:
1) Closing the PreparedStatement
2) Closing the ResultSet
3) Closing the connection
Here's a screenshot of the heap dump analysis:
It shows a lot of java.lang.ref.Finalizer, PreparedStatement, and ResultSet objects.
Here's the code (it's in Scala but should be easily comparable to Java):
import java.sql.{Connection, DriverManager, ResultSet}

val conn: Connection = DriverManager.getConnection(url)

// Gets strings by a query like SELECT .. WHERE foo = ?
def getStringsByQuery(query: String, param: String, field: String): Seq[String] = {
  val st = conn.prepareStatement(query)
  st.setString(1, param) // value of the ? placeholder
  st.setFetchSize(Integer.MAX_VALUE)
  st.setMaxRows(Integer.MAX_VALUE)
  // Holder of results
  var results = Seq.empty[String]
  val rs: ResultSet = st.executeQuery()
  // add each row's value to the holder
  while (rs.next())
    results :+= rs.getString(field)
  rs.close() // closing ResultSet
  st.close() // closing PreparedStatement
  results
}
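For reference, here's a sketch of the same method written with scala.util.Using (assuming Scala 2.13+; a try/finally does the same on older versions), so the statement and result set are closed even if executeQuery or getString throws:
import scala.util.Using

// Same lookup, but Using.resource closes the statement and result set
// even when an exception is thrown part-way through.
def getStringsByQueryUsing(query: String, param: String, field: String): Seq[String] =
  Using.resource(conn.prepareStatement(query)) { st =>
    st.setString(1, param)
    Using.resource(st.executeQuery()) { rs =>
      val buf = collection.mutable.ArrayBuffer.empty[String]
      while (rs.next()) buf += rs.getString(field)
      buf.toList
    }
  }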
Here's the test I wrote to reproduce this:
test("detect memory leak") {
log.info("Starting in 10 sec")
Thread.sleep(10.seconds.toMillis)
//Calls a method over and over to see if there's a memory leak or not..
(1 to 1000).par.foreach(i => {
val randomWord = getRandomWord() //this produces a random word
val sql = "SELECT foo FROM myTable where bar = ?"
val results = getStringsByQuery(sql, randomWord, "bar")
})
conn.close() //close the connection
log.info("Closed conn, closing in 30 sec")
Thread.sleep(1.minutes.toMillis)
}
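To separate a real leak from objects that simply haven't been collected yet, the test can also log the used heap after requesting a GC at a few points (a rough sketch; System.gc() is only a hint and the numbers are approximate):
// Rough heap check: request a GC and log the used heap.
// System.gc() is only a hint to the JVM, so the figures are approximate.
def logUsedHeap(tag: String): Unit = {
  System.gc()
  Thread.sleep(2000) // give the collector (and the finalizer thread) a moment
  val rt = Runtime.getRuntime
  val usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024)
  log.info(s"$tag: used heap ~ $usedMb MB")
}
Calling it before the loop, after the loop, and after conn.close() shows whether the used heap stays high even after a GC (a real leak) or drops once the collector has run.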
When I run the test, the memory usage steadily increases from 24.6 GB to 33 GB and never goes down (even though the ResultSet and PreparedStatement are being closed). Even at the end, when the conn is closed and the thread sleeps for 1 min, the memory usage still doesn't go down.
Does anyone know what's going on here? I'd appreciate any help.
Upvotes: 0
Views: 1564
Reputation: 1
I closed the connection and the database object, created a new database object with the same parameters and a new connection, and finally called garbage collection.
Repeat the process as often as you need. I use a counter and repeat the process whenever half of my computer's memory is filled.
Now I can run without errors.
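A rough sketch of this approach (openConnection(), memoryUsedFraction(), workBatches and runQueries() are placeholders for your own code):
// Recycle the connection and ask for a GC once half the memory is in use.
var conn = openConnection()
for (batch <- workBatches) {
  runQueries(conn, batch)
  if (memoryUsedFraction() > 0.5) { // half of the machine's memory is filled
    conn.close()            // drop the old connection and its statements
    System.gc()             // ask the JVM to collect them
    conn = openConnection() // reopen with the same parameters
  }
}
conn.close()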
Upvotes: 0
Reputation: 267287
I resolved this issue by porting my SQLite datasets to CQEngine. Couldn't be happier.
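A minimal sketch of what the CQEngine side looks like (the Row class and the "bar" attribute are made up for illustration; this uses CQEngine's Java API from Scala):
import com.googlecode.cqengine.{ConcurrentIndexedCollection, IndexedCollection}
import com.googlecode.cqengine.attribute.SimpleAttribute
import com.googlecode.cqengine.index.hash.HashIndex
import com.googlecode.cqengine.query.QueryFactory.equal
import com.googlecode.cqengine.query.option.QueryOptions

// Hypothetical row type standing in for a table row.
case class Row(foo: String, bar: String)

// Attribute over the "bar" field so it can be indexed and queried.
val BAR: SimpleAttribute[Row, String] = new SimpleAttribute[Row, String]("bar") {
  override def getValue(row: Row, queryOptions: QueryOptions): String = row.bar
}

val rows: IndexedCollection[Row] = new ConcurrentIndexedCollection[Row]()
rows.addIndex(HashIndex.onAttribute(BAR))
rows.add(Row("hello", "world"))

// In-memory equivalent of SELECT foo FROM myTable WHERE bar = ?
val it = rows.retrieve(equal(BAR, "world")).iterator()
while (it.hasNext) println(it.next().foo)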
Upvotes: -3
Reputation: 699
We run sqlite-jdbc in a production environment and we noticed similar behavior when inspecting heap dumps.
These objects are only on the heap because they haven't yet been run through the garbage collector. In the sense of https://blog.nelhage.com/post/three-kinds-of-leaks/ this is a "type 2 memory leak", in that the objects have been allocated and are living a bit longer than you expect.
The problem this caused for us was that machines would retain memory on the Java heap for unexpectedly long periods of time, running into memory pressure during peak request times. We did a few different things to ensure that these objects would be collected more frequently.
They may not match your use case, but they're options you can consider depending on what you're running into.
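One illustrative example of that kind of mitigation (a generic sketch, not necessarily what our setup uses) is to schedule a periodic GC hint so unreachable statements and result sets are collected between peaks instead of accumulating:
import java.util.concurrent.{Executors, TimeUnit}

// Illustrative only: periodically ask the JVM to run a GC so that unreachable
// PreparedStatement/ResultSet objects (and their Finalizer references) are
// collected sooner instead of sitting on the heap until the next full GC.
val gcScheduler = Executors.newSingleThreadScheduledExecutor()
gcScheduler.scheduleAtFixedRate(
  new Runnable { def run(): Unit = System.gc() },
  5, 5, TimeUnit.MINUTES
)
System.gc() is only a request and frequent full GCs have their own cost, so this is a band-aid for heap pressure rather than a fix for the underlying allocation pattern.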
Upvotes: 5