AKs

Reputation: 1745

net.snowflake.client.jdbc.SnowflakeSQLException: SQL compilation error: Operation is not supported in reader account

I am trying to read data from a view created in Snowflake and store it in GCS through PySpark.

SNOWFLAKE_SOURCE_NAME = "net.snowflake.spark.snowflake"

sfOptions = {
  "sfURL" : "XXX.snowflakecomputing.com",
  "sfUser" : "XXX",
  "sfPassword" : "XXX",
  "sfDatabase" : "DB",
  "sfSchema" : "XXX",
  "sfWarehouse": "DWH"
}

df = spark.read.format(SNOWFLAKE_SOURCE_NAME) \
  .options(**sfOptions) \
  .option("query", "SELECT * FROM JOB_v1") \
  .load()

df.show()
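
The write to GCS is roughly the following (the bucket path here is just a placeholder, and it assumes the GCS connector is on Spark's classpath):

# Placeholder write step: persist the DataFrame to a GCS bucket as Parquet.
# "gs://my-bucket/job_v1/" is an example path, not the actual bucket.
df.write.mode("overwrite").parquet("gs://my-bucket/job_v1/")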

I am using the following packages:

net.snowflake:snowflake-jdbc:3.8.0
net.snowflake:spark-snowflake_2.11:2.4.14-spark_2.4
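
For reference, a minimal sketch of how the same coordinates can be attached when building the SparkSession (equivalent to passing --packages on the command line; the app name is a placeholder):

from pyspark.sql import SparkSession

# Pull the Snowflake Spark connector and JDBC driver at startup.
spark = SparkSession.builder \
    .appName("snowflake-to-gcs") \
    .config("spark.jars.packages",
            "net.snowflake:snowflake-jdbc:3.8.0,"
            "net.snowflake:spark-snowflake_2.11:2.4.14-spark_2.4") \
    .getOrCreate()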

I can load the data, count the rows, and print the schema, but when I try to write or show the DataFrame I get the following exception:

Py4JJavaError: An error occurred while calling o43.showString.
: net.snowflake.client.jdbc.SnowflakeSQLException: SQL compilation error:
Operation is not supported in reader account.
    at net.snowflake.client.jdbc.SnowflakeUtil.checkErrorAndThrowExceptionSub(SnowflakeUtil.java:139)
    at net.snowflake.client.jdbc.SnowflakeUtil.checkErrorAndThrowException(SnowflakeUtil.java:64)
    at net.snowflake.client.core.StmtUtil.pollForOutput(StmtUtil.java:491)
    at net.snowflake.client.core.StmtUtil.execute(StmtUtil.java:368)
    at net.snowflake.client.core.SFStatement.executeHelper(SFStatement.java:486)
    at net.snowflake.client.core.SFStatement.executeQueryInternal(SFStatement.java:237)
    at net.snowflake.client.core.SFStatement.executeQuery(SFStatement.java:176)
    at net.snowflake.client.core.SFStatement.execute(SFStatement.java:683)
    at net.snowflake.client.jdbc.SnowflakeStatementV1.executeQueryInternal(SnowflakeStatementV1.java:242)
    at net.snowflake.client.jdbc.SnowflakePreparedStatementV1.executeQuery(SnowflakePreparedStatementV1.java:160)
    at net.snowflake.spark.snowflake.JDBCWrapper$$anonfun$executePreparedQueryInterruptibly$1.apply(SnowflakeJDBCWrapper.scala:256)
    at net.snowflake.spark.snowflake.JDBCWrapper$$anonfun$executePreparedQueryInterruptibly$1.apply(SnowflakeJDBCWrapper.scala:254)
    at net.snowflake.spark.snowflake.JDBCWrapper$$anonfun$1.apply(SnowflakeJDBCWrapper.scala:291)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
    at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Upvotes: 3

Views: 2668

Answers (1)

FKayani

Reputation: 1021

A couple of things to consider here:

  1. A reader account is intended primarily for querying data shared by the provider of the account. Adding new data to the account and/or updating shared data in the account is not supported - Details: https://docs.snowflake.com/en/user-guide/data-sharing-reader-create.html#what-is-restricted-allowed-in-a-reader-account

  2. use_copy_unload - If this is FALSE, Snowflake uses the Arrow data format when SELECTing data. If it is set to TRUE, Snowflake reverts to the old behaviour of using the COPY UNLOAD command to transmit the selected data. The parameter is optional; the default value is FALSE. Because COPY UNLOAD involves writing results to a stage, which a reader account may not be allowed to do, make sure the connector is not falling back to that path (see the sketch after this list).
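
A minimal sketch of passing that option alongside the other read options (the option name is as documented for the Spark connector; I have not verified the behaviour against the exact driver/connector versions above):

# Keep reads on the Arrow result path instead of the COPY UNLOAD path,
# which can require staging data - something a reader account may not permit.
df = spark.read.format(SNOWFLAKE_SOURCE_NAME) \
  .options(**sfOptions) \
  .option("use_copy_unload", "false") \
  .option("query", "SELECT * FROM JOB_v1") \
  .load()

df.show()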

Upvotes: 3
