Reputation: 19290
I am getting an ORA-01000 SQL exception, so I have some questions related to it.
Does executing a prepared statement in a loop cause this issue? (Of course, I could have used sqlBatch.) Note: pStmt is closed once the loop is over.
{ // method/try starts
    String sql = "INSERT into TblName (col1, col2) VALUES(?, ?)";
    pStmt = obj.getConnection().prepareStatement(sql);
    pStmt.setLong(1, subscriberID);
    for (String language : additionalLangs) {
        pStmt.setInt(2, Integer.parseInt(language));
        pStmt.execute();
    }
} // method/try ends
{ // finally starts
    pStmt.close();
} // finally ends
What will happen if conn.createStatement() and conn.prepareStatement(sql) are called multiple times on a single connection object?
Edit1: 6. Will the use of weak/soft references to statement objects help in preventing the leak?
Edit2: 1. Is there any way I can find all the missing "statement.close()" calls in my project? I understand it is not a memory leak, but I need to find the statement references (where close() is not performed) that become eligible for garbage collection. Is there any tool available, or do I have to analyze it manually?
Please help me understand it.
Go to the Oracle machine and start sqlplus as sysdba.
[oracle@db01 ~]$ sqlplus / as sysdba
Then run
SELECT A.VALUE,
S.USERNAME,
S.SID,
S.SERIAL#
FROM V$SESSTAT A,
V$STATNAME B,
V$SESSION S
WHERE A.STATISTIC# = B.STATISTIC#
AND S.SID = A.SID
AND B.NAME = 'opened cursors current'
AND USERNAME = 'VELU';
If possible, please read my answer below for a better understanding of my solution.
Upvotes: 133
Views: 462758
Reputation: 12259
I ran into this issue after setting the prepared statement cache size to a large value. Apparently, when prepared statements are kept in cache, the cursor stays open.
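A minimal sketch of the kind of setting involved, assuming Apache Commons DBCP as the pool (the answer does not say which pool was used; the URL and credentials below are hypothetical):
import org.apache.commons.dbcp2.BasicDataSource;

class PoolWithStatementCache {
    static BasicDataSource newDataSource() {
        BasicDataSource ds = new BasicDataSource();
        ds.setUrl("jdbc:oracle:thin:@//db01:1521/ORCL"); // hypothetical connection details
        ds.setUsername("VELU");
        ds.setPassword("secret");
        // Each cached PreparedStatement keeps its cursor open on Oracle, so the
        // per-connection cache size effectively counts against open_cursors.
        ds.setPoolPreparedStatements(true);
        ds.setMaxOpenPreparedStatements(50); // keep well below the DB's open_cursors
        return ds;
    }
}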
Upvotes: 0
Reputation: 234
I faced the same issue because I was querying the DB for more than 1000 iterations. I had used try and finally in my code, but was still getting the error.
To solve this I just logged into the Oracle DB and ran the query below:
ALTER SYSTEM SET open_cursors = 8000 SCOPE=BOTH;
And this solved my problem immediately.
Upvotes: 0
Reputation: 399
I too faced this issue. The exception below used to appear:
java.sql.SQLException: - ORA-01000: maximum open cursors exceeded
I was using the Spring Framework with Spring JDBC for the DAO layer.
My application somehow leaked cursors, and after a few minutes or so it used to give me this exception.
After a lot of thorough debugging and analysis, I found that the problem was with the indexing, primary key and unique constraints in one of the tables used in the query I was executing.
My application was trying to update columns which were mistakenly indexed. So whenever my application hit the update query on the indexed columns, the database tried to re-index based on the updated values, and it leaked cursors.
I was able to solve the problem by properly indexing the columns which were used for searching in the query and applying appropriate constraints wherever required.
Upvotes: 3
Reputation: 176
I faced the same problem (ORA-01000) today. I had a for loop in the try{} to execute a SELECT statement against an Oracle DB many times (each time changing a parameter), and in the finally{} I had my code to close the ResultSet, PreparedStatement and Connection as usual. But as soon as I reached a specific number of loops (1000) I got the Oracle error about too many open cursors.
Based on the post by Andrew Alcock above, I made changes so that inside the loop I closed each result set and each statement after getting the data and before looping again, and that solved the problem.
Additionally, the exact same problem (ORA-01000) occurred in another loop of INSERT statements against another Oracle DB, this time after 300 statements. Again it was solved in the same way, so either the PreparedStatement or the ResultSet or both count as open cursors until they are closed.
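A minimal sketch of that approach, assuming a prepared SELECT that is re-executed with a different bind value each time (the table and column names are made up); closing the ResultSet inside the loop releases its cursor before the next iteration:
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.List;

class LoopWithPerIterationClose {
    static void lookupAll(Connection conn, List<Long> ids) throws SQLException {
        String sql = "SELECT col1 FROM TblName WHERE id = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (Long id : ids) {
                ps.setLong(1, id);
                try (ResultSet rs = ps.executeQuery()) { // closed at the end of every iteration
                    while (rs.next()) {
                        System.out.println(rs.getString(1));
                    }
                }
            }
        }
    }
}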
Upvotes: 2
Reputation: 21
I had this problem with my datasource in WildFly and Tomcat, connecting to an Oracle 10g database.
I found that under certain conditions the statement wasn't closed even when statement.close() was invoked. The problem was with the Oracle driver we were using: ojdbc7.jar. This driver is intended for Oracle 12c and 11g, and it seems to have some issues when used with Oracle 10g, so I downgraded to ojdbc5.jar and now everything is running fine.
Upvotes: 0
Reputation: 19290
I am adding a few more points for understanding.
Log in as sysdba.
In PuTTY (Oracle login):
[oracle@db01 ~]$ sqlplus / as sysdba
In SQL*Plus:
UserName: sys as sysdba
alter session set session_cached_cursors = 0;
select * from v$parameter where name = 'session_cached_cursors';
SELECT max(a.value) AS highest_open_cur, p.value AS max_open_cur
FROM v$sesstat a, v$statname b, v$parameter p
WHERE a.statistic# = b.statistic#
AND b.name = 'opened cursors current'
AND p.name = 'open_cursors'
GROUP BY p.value;
SELECT a.value, s.username, s.sid, s.serial#
FROM v$sesstat a, v$statname b, v$session s
WHERE a.statistic# = b.statistic# AND s.sid=a.sid
AND b.name = 'opened cursors current' AND username = 'SCHEMA_NAME_IN_CAPS';
SELECT oc.sql_text, s.sid
FROM v$open_cursor oc, v$session s
WHERE OC.sid = S.sid
AND s.sid=1604
AND OC.USER_NAME = 'SCHEMA_NAME_IN_CAPS';
Now debug the Code and Enjoy!!! :)
Upvotes: 32
Reputation: 11
This problem mainly happens when you are using connection pooling, because when you close a connection it goes back to the pool, and the cursors associated with it are never closed since the connection to the database is still open. One alternative is to decrease the idle time for connections in the pool, so that whenever a connection sits idle for, say, 10 seconds, the physical connection to the database is closed and a new one is created to put in the pool.
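As a hedged illustration only (the answer does not name a pool; HikariCP and the connection details below are assumptions), lowering the idle timeout could look like this. Note that closing statements and result sets in the application code is still the real fix:
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

class ShortIdlePool {
    static HikariDataSource newDataSource() {
        HikariConfig cfg = new HikariConfig();
        cfg.setJdbcUrl("jdbc:oracle:thin:@//db01:1521/ORCL"); // hypothetical connection details
        cfg.setUsername("VELU");
        cfg.setPassword("secret");
        cfg.setMinimumIdle(0);      // let the pool shrink so idle connections can actually close
        cfg.setIdleTimeout(10_000); // close connections that have been idle for ~10 seconds
        return new HikariDataSource(cfg);
    }
}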
Upvotes: 1
Reputation: 306
In our case, we were using Hibernate and we had many variables referencing the same Hibernate mapped entity. We were creating and saving these references in a loop. Each reference opened a cursor and kept it open.
We discovered this by using a query to check the number of open cursors while running our code, stepping through with a debugger and selectively commenting things out.
As to why each new reference opened another cursor - the entity in question had collections of other entities mapped to it and I think this had something to do with it (perhaps not just this alone but in combination with how we had configured the fetch mode and cache settings). Hibernate itself has had bugs around failing to close open cursors, though it looks like these have been fixed in later versions.
Since we didn't really need to have so many duplicate references to the same entity anyway, the solution was to stop creating and holding onto all those redundant references. Once we did that, the problem went away.
Upvotes: 0
Reputation: 26993
If your application is a Java EE application running on Oracle WebLogic as the application server, a possible cause for this issue is the Statement Cache Size setting in WebLogic.
If the Statement Cache Size setting for a particular data source is about equal to, or greater than, the Oracle database maximum open cursor count setting, then all of the open cursors can be consumed by cached SQL statements that are held open by WebLogic, resulting in the ORA-01000 error.
To address this, reduce the Statement Cache Size setting for each WebLogic datasource that points to the Oracle database to be significantly less than the maximum cursor count setting on the database.
In the WebLogic 10 Admin Console, the Statement Cache Size setting for each data source can be found at Services (left nav) > Data Sources > (individual data source) > Connection Pool tab.
Upvotes: 3
Reputation: 971
A query to find the SQL statements that have cursors open:
SELECT s.machine, oc.user_name, oc.sql_text, count(1)
FROM v$open_cursor oc, v$session s
WHERE oc.sid = s.sid
and S.USERNAME='XXXX'
GROUP BY user_name, sql_text, machine
HAVING COUNT(1) > 2
ORDER BY count(1) DESC;
Upvotes: 1
Reputation: 1
Using batch processing will result in less overhead. See the following link for examples: http://www.tutorialspoint.com/jdbc/jdbc-batch-processing.htm
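For reference, a minimal sketch of what batching the INSERT from the question could look like (conn, subscriberID and additionalLangs are stand-ins for the asker's own objects):
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

class BatchInsertExample {
    static void insertLanguages(Connection conn, long subscriberID,
                                List<String> additionalLangs) throws SQLException {
        String sql = "INSERT INTO TblName (col1, col2) VALUES (?, ?)";
        try (PreparedStatement pStmt = conn.prepareStatement(sql)) {
            pStmt.setLong(1, subscriberID);
            for (String language : additionalLangs) {
                pStmt.setInt(2, Integer.parseInt(language));
                pStmt.addBatch();     // queue the row instead of executing it immediately
            }
            pStmt.executeBatch();     // one round trip, one statement, one cursor
        }
    }
}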
Upvotes: 0
Reputation: 19671
ORA-01000, the maximum-open-cursors error, is an extremely common error in Oracle database development. In the context of Java, it happens when the application attempts to open more ResultSets than there are configured cursors on a database instance.
Common causes are:
Configuration mistake: there are more threads in the application querying the database than there are cursors on the DB. Solution: increase the number of cursors on the database (if resources allow) or decrease the number of threads in the application.
Cursor leak: the application is not closing ResultSets (in JDBC) or cursors (in stored procedures), so cursors accumulate until the limit is hit. Cursor leaks are bugs; increasing the number of cursors on the DB simply delays the inevitable failure.
This section describes some of the theory behind cursors and how JDBC should be used. If you don't need to know the background, you can skip this and go straight to 'Eliminating Leaks'.
A cursor is a resource on the database that holds the state of a query, specifically the position where a reader is in a ResultSet. Each SELECT statement has a cursor, and PL/SQL stored procedures can open and use as many cursors as they require. You can find out more about cursors on Orafaq.
A database instance typically serves several different schemas and many different users, each with multiple sessions. To do this, it has a fixed number of cursors available for all schemas, users and sessions. When all cursors are open (in use) and a request comes in that requires a new cursor, the request fails with an ORA-01000 error.
The number is normally configured by the DBA on installation. The number of cursors currently in use, the maximum number and the configuration can be accessed in the Administrator functions in Oracle SQL Developer. From SQL it can be set with:
ALTER SYSTEM SET OPEN_CURSORS=1337 SID='*' SCOPE=BOTH;
The JDBC objects are tightly coupled to database concepts: a Connection corresponds to a database connection (session), and each ResultSet (together with the Statement that produced it) is backed by a cursor on the database.
JDBC is thread safe: It is quite OK to pass the various JDBC objects between threads.
For example, you can create the connection in one thread; another thread can use this connection to create a PreparedStatement and a third thread can process the result set. The single major restriction is that you cannot have more than one ResultSet open on a single PreparedStatement at any time. See Does Oracle DB support multiple (parallel) operations per connection?
Note that a database commit occurs on a Connection, and so all DML (INSERT, UPDATE and DELETE's) on that connection will commit together. Therefore, if you want to support multiple transactions at the same time, you must have at least one Connection for each concurrent Transaction.
A typical example of executing a ResultSet is:
Statement stmt = conn.createStatement();
try {
ResultSet rs = stmt.executeQuery( "SELECT FULL_NAME FROM EMP" );
try {
while ( rs.next() ) {
System.out.println( "Name: " + rs.getString("FULL_NAME") );
}
} finally {
try { rs.close(); } catch (Exception ignore) { }
}
} finally {
try { stmt.close(); } catch (Exception ignore) { }
}
Note how the finally clause ignores any exception raised by close(): if that exception propagated, it could mask the original exception from the application, and the resource is closed (or as closed as it can be) either way.
In Java 7, Oracle introduced the AutoCloseable interface and try-with-resources, which replace most of the Java 6 boilerplate above with some nice syntactic sugar.
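For illustration, here is the earlier example rewritten with try-with-resources; both resources are closed automatically, in reverse order, even when an exception is thrown:
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

class TryWithResourcesExample {
    static void printNames(Connection conn) throws SQLException {
        try (Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT FULL_NAME FROM EMP")) {
            while (rs.next()) {
                System.out.println("Name: " + rs.getString("FULL_NAME"));
            }
        }
    }
}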
JDBC objects can be safely held in local variables, object instance and class members. It is generally better practice to use local variables for ResultSets (they are obtained, looped over and closed within a single method) and instance or class members for longer-lived objects such as Connections and PreparedStatements that are reused.
There is, however, one exception: if you are using EJBs, or a Servlet/JSP container, you have to follow a strict threading model in which only the application server creates threads (to handle incoming requests) and only the application server provides connections (obtained from its data source).
Eliminating leaks: there are a number of processes and tools available to help detect and eliminate JDBC leaks.
During development - catching bugs early is by far the best approach:
Development practices: good development practices should reduce the number of bugs in your software before it leaves the developer's desk. Specific practices include pair programming (to educate those with less experience), code reviews and unit testing, so the code paths that use JDBC are actually exercised before release.
Static code analysis: use a tool like the excellent FindBugs to perform static code analysis. It picks up many places where close() has not been correctly handled. FindBugs has a plugin for Eclipse, but it also runs standalone for one-offs and has integrations with Jenkins CI and other build tools.
At runtime:
Holdability and commit: if a ResultSet is opened with holdability ResultSet.CLOSE_CURSORS_AT_COMMIT, then the ResultSet (and its cursor) is closed automatically when the transaction is committed.
Logging at runtime: you can add a debugging JDBC driver to your project (for debugging only - don't actually deploy it). One example (I have not used it) is log4jdbc. You then do some simple analysis on the driver's log to see which executes don't have a corresponding close; counting the opens and closes should highlight whether there is a potential problem.
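To make the above concrete, here is a small, hypothetical example of the kind of leak that static analysis or open/close counting in the logs will flag: if anything between the executeQuery and the close() calls throws, neither the ResultSet nor the Statement is ever closed, and each call leaves a cursor open on the database.
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

class LeakyDao {
    static void printNames(Connection conn) throws SQLException {
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery("SELECT FULL_NAME FROM EMP");
        while (rs.next()) {
            System.out.println(rs.getString("FULL_NAME")); // an exception here skips both close() calls
        }
        rs.close();   // never reached on the exception path
        stmt.close(); // never reached on the exception path
    }
}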
Weak and soft references are ways of referencing an object that allow the JVM to garbage collect the referent at any time it deems fit (assuming there are no strong reference chains to that object).
If you pass a ReferenceQueue in the constructor of the soft or weak Reference, the Reference is placed on the ReferenceQueue when the referent is GC'ed (if that ever happens). With this approach, you can hook into the object's finalization and close or finalize the object at that moment.
Phantom references are a bit weirder; their purpose is only to control finalization, but you can never get a reference to the original object, so it's going to be hard to call the close() method on it.
However, it is rarely a good idea to attempt to control when the GC runs (weak, soft and phantom references let you know after the fact that the object has been enqueued for GC). In fact, if the amount of memory in the JVM is large (e.g. -Xmx2000m) you might never GC the object, and you will still experience the ORA-01000. If the JVM memory is small relative to your program's requirements, you may find that the ResultSet and PreparedStatement objects are GCed immediately after creation (before you can read from them), which will most likely break your program.
TL;DR: The weak reference mechanism is not a good way to manage and close Statement and ResultSet objects.
Upvotes: 319
Reputation: 1552
Correct your code like this:
try { // method/try starts
    String sql = "INSERT into TblName (col1, col2) VALUES(?, ?)";
    pStmt = obj.getConnection().prepareStatement(sql);
    pStmt.setLong(1, subscriberID);
    for (String language : additionalLangs) {
        pStmt.setInt(2, Integer.parseInt(language));
        pStmt.execute();
    }
} // method/try ends
finally { // finally starts
    if (pStmt != null) {
        pStmt.close();
    }
} // finally ends
Are you sure that you're really closing your pStatements, connections and results?
To analyze open objects you can implement a delegator pattern, which wraps code around your statement, connection and result objects. So you'll see if an object is successfully closed.
An Example for: pStmt = obj.getConnection().prepareStatement(sql);
class obj {
    public Connection getConnection() {
        // ...here create your real connection object and wrap it in the delegator...
        return new ConnectionDelegator(/* ... */);
    }
}

class ConnectionDelegator implements Connection {
    Connection delegates;

    public ConnectionDelegator(Connection con) {
        this.delegates = con;
    }

    public PreparedStatement prepareStatement(String sql) throws SQLException {
        return delegates.prepareStatement(sql);
    }

    public void close() throws SQLException {
        try {
            delegates.close();
        } finally {
            log.debug(delegates.toString() + " was closed");
        }
    }
}
Upvotes: 5
Reputation: 18904
Did you set autocommit=true? If not, try this:
try { // method/try starts
    String sql = "INSERT into TblName (col1, col2) VALUES(?, ?)";
    Connection conn = obj.getConnection();
    pStmt = conn.prepareStatement(sql);
    for (String language : additionalLangs) {
        pStmt.setLong(1, subscriberID);
        pStmt.setInt(2, Integer.parseInt(language));
        pStmt.execute();
        conn.commit();
    }
} // method/try ends
finally { // finally starts
    pStmt.close();
} // finally ends
Upvotes: 1