Reputation: 5966
I am having memory issues because I am trying to read a huge ResultSet from a Netezza database. Does Netezza support any kind of "streaming" ResultSet like MySQL does? If not, will limiting the fetch size like this work instead?:
stmt.setFetchSize(50);
conn.setAutoCommit(false);
Upvotes: 2
Views: 655
Reputation: 3887
If you want to pull the rows into a file, then your best bet is to use a remote external table.
Here is an example that creates a transient remote external table over JDBC. This invokes the bulk export/load functionality provided with the JDBC driver, and creates a pipe-delimited text file.
create external table 'c:\mytest.txt'
USING (DELIMITER '|' REMOTESOURCE 'JDBC' ) as
select *
from table1;
You can call this using conn.createStatement().execute(), and you will likely have to change the file specification so that the backslash is escaped (c:\\mytest.txt).
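A minimal sketch of that call, building the DDL with the escaped backslash. The class and helper method names are hypothetical, not part of the Netezza driver; only the SQL itself comes from the example above.

```java
// Hypothetical helper: builds the transient external table DDL shown above,
// escaping backslashes in the Windows path so they survive SQL parsing.
public class ExternalTableExport {

    static String buildExportSql(String filePath, String selectQuery) {
        // Each backslash in the path becomes a doubled backslash.
        String escaped = filePath.replace("\\", "\\\\");
        return "create external table '" + escaped + "' "
             + "USING (DELIMITER '|' REMOTESOURCE 'JDBC') as "
             + selectQuery;
    }

    // Executing it over an open java.sql.Connection would look like:
    // try (java.sql.Statement stmt = conn.createStatement()) {
    //     stmt.execute(buildExportSql("c:\\mytest.txt", "select * from table1"));
    // }
}
```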
You can read more about external tables in the documentation here.
You can use setFetchSize, by the way, though I'm not sure it would solve your memory issue.
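For reference, a sketch of cursor-style fetching with setFetchSize. The query and the fetch-size value are taken from the question; whether the Netezza driver actually honors the hint is the open question here, so treat this as the standard JDBC pattern rather than a guaranteed fix.

```java
import java.sql.*;

public class FetchSizeExample {
    static final int FETCH_SIZE = 50;  // value from the question

    static void streamRows(Connection conn) throws SQLException {
        conn.setAutoCommit(false);         // many drivers require this for cursor-based fetching
        try (Statement stmt = conn.createStatement()) {
            stmt.setFetchSize(FETCH_SIZE); // hint: fetch 50 rows per round trip
            try (ResultSet rs = stmt.executeQuery("select * from table1")) {
                while (rs.next()) {
                    // process each row here instead of materializing the full result set
                }
            }
        }
    }
}
```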
Upvotes: 1
Reputation: 1043
The following URL has information about ROWSET_LIMIT; with this setting you can cap query results to fit your requirements and stream them as needed: https://www-01.ibm.com/support/knowledgecenter/SSULQD_7.1.0/com.ibm.nz.adm.doc/c_sysadm_user_rowset_limits.html?lang=en
"You can place a limit on the number of rows a query can return and thus restrict resources for large result sets. Specifying a rowset limit when you create a user or a group automatically limits the rows that are returned so that users do not have to append a limit clause to their SQL queries."
Upvotes: 0