Reputation: 251
I have successfully established the JDBC connection and can execute statements like "use warehouse ...". When I try to run any SELECT statement I get the following error:
net.snowflake.client.jdbc.SnowflakeSQLLoggedException: JDBC driver internal error: Fail to retrieve row count for first arrow chunk: null.
I can see in the Snowflake UI that my query was successful and returned the expected data.
The error occurs on this line: rs = statement.executeQuery("select TOP 1 EVENT_ID from snowflake.account_usage.login_history");
The statement was able to execute queries prior to this line and the result set was as expected. Any insight would be appreciated!
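For reference, here is a minimal sketch of the flow described above; the account, credentials, and warehouse name are placeholders, not my real values:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class SnowflakeRepro {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; substitute your own account and credentials.
        Properties props = new Properties();
        props.put("user", "MY_USER");
        props.put("password", "MY_PASSWORD");
        String url = "jdbc:snowflake://<account>.snowflakecomputing.com/";

        try (Connection connection = DriverManager.getConnection(url, props);
             Statement statement = connection.createStatement()) {

            // Non-SELECT statements like this one succeed.
            statement.execute("use warehouse MY_WAREHOUSE");

            // The first SELECT is where the Arrow chunk error appears.
            ResultSet rs = statement.executeQuery(
                    "select TOP 1 EVENT_ID from snowflake.account_usage.login_history");
            while (rs.next()) {
                System.out.println(rs.getLong("EVENT_ID"));
            }
        }
    }
}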
Upvotes: 25
Views: 29486
Reputation: 6832
This happens on modern JVMs because Apache Arrow wants to do low-level memory allocations. The driver logs to stdout that it wants the JVM to be started with --add-opens=java.base/java.nio=ALL-UNNAMED.
The fixes that work on JDK 17 are described in the Arrow docs as:
# Directly on the command line
java --add-opens=java.base/java.nio=ALL-UNNAMED -jar ...
# Indirectly via environment variables
env _JAVA_OPTIONS="--add-opens=java.base/java.nio=ALL-UNNAMED" java -jar ...
So if you are using IntelliJ or Docker, setting the environment variable _JAVA_OPTIONS to --add-opens=java.base/java.nio=ALL-UNNAMED is probably the best solution.
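If you go the _JAVA_OPTIONS route, a small sanity check can confirm the flag actually reached the JVM your code runs in. This is a sketch with a hypothetical class name, and it assumes the injected option shows up in the runtime MXBean's input arguments, which it normally does:

import java.lang.management.ManagementFactory;

public class CheckJvmFlags {
    public static void main(String[] args) {
        // Look through the JVM's startup arguments for the --add-opens flag;
        // options injected via _JAVA_OPTIONS normally show up here too.
        boolean found = ManagementFactory.getRuntimeMXBean()
                .getInputArguments()
                .stream()
                .anyMatch(arg -> arg.contains("java.base/java.nio=ALL-UNNAMED"));
        System.out.println("--add-opens for java.nio present: " + found);
    }
}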
Upvotes: 1
Reputation: 4578
This could happen for several reasons. There are two known workarounds.
Add the following JVM option when starting the application:
-Djdk.module.illegalAccess=permit
This is a workaround until we get a fix for the following Apache Arrow issue: ARROW-12747
Alternatively, switch the query result format back to JSON:
ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON'
Upvotes: 26
Reputation: 79
You can add the following two settings to this file (macOS): /Applications/DBeaver.app/Contents/Eclipse/dbeaver.ini
-Djdk.module.illegalAccess=permit
--add-opens=java.base/java.nio=ALL-UNNAMED
Information from: https://support.dbvis.com/support/solutions/articles/1000309803-snowflake-fail-to-retrieve-row-count-for-first-arrow-chunk-
Another alternative that worked for me on my Mac M1 is to use JDK 11:
brew install openjdk@11
Edit /Applications/DBeaver.app/Contents/Eclipse/dbeaver.ini and change this line: ../Eclipse/jre/Contents/Home/bin/java to /opt/homebrew/opt/openjdk@11/bin/java
Restart DBeaver.
Upvotes: 4
Reputation: 21
Before executing the actual query, you need to run this:
statement.executeQuery("ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON'");
Upvotes: 2
Reputation: 14022
The official solution from Snowflake is to configure an extra property in your datasource configuration: https://community.snowflake.com/s/article/SAP-BW-Java-lang-NoClassDefFoundError-for-Apache-arrow
You can set this property (jdbc_query_result_format=json) in the datasource properties of the application server, or as a session property in the application, like this:
Statement statement = connection.createStatement();
statement.executeQuery("ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON'");
This uses JSON as the result format instead of Arrow, which avoids the above error.
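If you would rather set it once at connection time instead of running ALTER SESSION yourself, here is a sketch of the datasource-property approach. It assumes the Snowflake JDBC driver accepts JDBC_QUERY_RESULT_FORMAT as a connection property (as the linked article suggests); the account, user, and password values are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class JsonResultFormatDataSource {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("user", "MY_USER");          // placeholder
        props.put("password", "MY_PASSWORD");  // placeholder
        // Session parameter passed as a connection property, so every statement
        // on this connection uses the JSON result format instead of Arrow.
        props.put("JDBC_QUERY_RESULT_FORMAT", "JSON");

        String url = "jdbc:snowflake://<account>.snowflakecomputing.com/"; // placeholder account
        try (Connection connection = DriverManager.getConnection(url, props)) {
            // Run queries as usual; no per-session ALTER SESSION is needed.
        }
    }
}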
Upvotes: 1
Reputation: 151
I was using DBeaver to connect to Snowflake and had the same issue. It is resolved by setting the session parameter in each editor window as follows: ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON';
This can be automated by configuring bootstrap queries under Connection settings -> Initialization. With every new editor window, the session parameter will then be set during initialization.
Upvotes: 15
Reputation: 29958
I hit the same problem and was able to get it working by downgrading to Java 11, with driver version
[net.snowflake/snowflake-jdbc "3.13.8"]
Upvotes: 2