SamP8844

Reputation: 11

Databricks structured streaming with Snowflake as source?

Is it possible to use a Snowflake table as a source for Spark Structured Streaming in Databricks? When I run the following PySpark code:

    options = dict(sfUrl=our_snowflake_url,
                   sfUser=user,
                   sfPassword=password,
                   sfDatabase=database,
                   sfSchema=schema,
                   sfWarehouse=warehouse)

    df = spark.readStream.format("snowflake") \
              .schema(final_struct) \
              .options(**options) \
              .option("dbtable", "BASIC_DATA_TEST") \
              .load()

I get this error:

java.lang.UnsupportedOperationException: Data source snowflake does not support streamed reading

I haven't been able to find anything in the Spark Structured Streaming Docs that explicitly says Snowflake is supported as a source, but I'd like to make sure I'm not missing anything obvious.

Thanks!

Upvotes: 1

Views: 839

Answers (1)

manuelschipper

Reputation: 137

The Spark Snowflake connector does not currently support the .readStream/.writeStream calls from Spark Structured Streaming, which is why the snowflake source throws the UnsupportedOperationException shown above.
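
For comparison, the connector does support batch reads. Below is a minimal sketch that reuses the connection options and the BASIC_DATA_TEST table name from the question (those names are assumptions carried over from the post, not verified here):

    # Batch read through the Snowflake connector (supported), instead of readStream
    options = dict(sfUrl=our_snowflake_url,
                   sfUser=user,
                   sfPassword=password,
                   sfDatabase=database,
                   sfSchema=schema,
                   sfWarehouse=warehouse)

    # Note: spark.read, not spark.readStream; the schema is inferred from Snowflake
    df = (spark.read.format("snowflake")
               .options(**options)
               .option("dbtable", "BASIC_DATA_TEST")
               .load())

Scheduling a batch read like this is one common way to approximate streaming from a source that has no streaming reader.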

Upvotes: 2
