Reputation: 1723
I want to drop a table using DROP TABLE, but I get a JDBC error on Databricks. If I run a SELECT instead, there is no error. I have two different functions, read_from_synapse and write_to_synapse.
This is my read_from_synapse, which I thought could run DROP TABLE through the query option:
def read_from_synapse(sql, url, tempDir):
    df = (spark.read
          .format("com.databricks.spark.sqldw")
          .option("url", url)
          .option("tempDir", tempDir)
          .option("forwardSparkAzureStorageCredentials", "true")
          .option("query", sql)
          .load()
          )
    return df
And this is my write_to_synapse script, which inserts into a table:
def write_to_synapse(df, table, write_mode, url, tempDir):
    (df.write
       .format("com.databricks.spark.sqldw")
       .option("tableOptions", "CLUSTERED COLUMNSTORE INDEX, DISTRIBUTION = ROUND_ROBIN")  # Added at 20200121
       .option("url", url)
       .option("dbTable", table)
       .option("forwardSparkAzureStorageCredentials", "true")
       .option("tempDir", tempDir)
       .mode(write_mode)
       .save())
Updated
I have tried this link, but it didn't solve my problem; it fails with:
com.databricks.spark.sqldw.SqlDWSideException: Azure Synapse Analytics failed to execute the JDBC query produced by the connector.
Here is the full traceback:
---------------------------------------------------------------------------
Py4JJavaError Traceback (most recent call last)
<command-1485523310718777> in <module>
----> 1 drop_sdf_sales = azSynapse._delete_from_synapse(drop_sql, tempDir=temp_read_dir)
<command-1485523310718391> in _delete_from_synapse(self, sql, url, tempDir)
5 if not tempDir:
6 tempDir = self.azblob._get_blob_path('04-publish', 'sqlDwReadTempDirs')
----> 7 df = UtilAzSynapse.read_from_synapse(sql, url, tempDir)
8 return df
<command-2362013028695578> in read_from_synapse(sql, url, tempDir)
16 .option("tempDir", tempDir)
17 .option("forwardSparkAzureStorageCredentials", "true")
---> 18 .option("query", sql)
19 .load()
20 )
/databricks/spark/python/pyspark/sql/readwriter.py in load(self, path, format, schema, **options)
182 return self._df(self._jreader.load(self._spark._sc._jvm.PythonUtils.toSeq(path)))
183 else:
Upvotes: 1
Views: 1877
Reputation: 12768
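The query option is a read-only path: the connector treats the SQL you pass as a derived table and selects from it, so a DDL statement like DROP TABLE cannot work there. The sketch below is an illustration of that wrapping behavior, not the connector's actual source; the alias name is made up.

```python
# Illustration only (hypothetical, not the connector's real code): a read
# connector typically wraps the "query" option in a subquery before sending
# it over JDBC, so anything that is not a SELECT yields invalid T-SQL.
def wrap_as_subquery(sql: str, alias: str = "spark_connector_query") -> str:
    """Wrap a user-supplied query the way a read path typically does."""
    return f"SELECT * FROM ({sql}) {alias}"

print(wrap_as_subquery("SELECT * FROM Persons"))  # valid T-SQL
print(wrap_as_subquery("DROP TABLE Trip003"))     # invalid: DDL inside a subquery
```

To run DDL or DML against Synapse from this connector, attach it to a write via the postActions option instead, as shown below.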
To delete a row from an Azure Synapse table using Azure Databricks, pass the DELETE statement through the postActions option of a write:
postActionsSQL = "DELETE from Persons WHERE PersonID=3"
df.write.format("com.databricks.spark.sqldw") \
.option("url", "jdbc:sqlserver://synapse.sql.azuresynapse.net:1433;database=master;user=master@synapse;password=XXXXXXX;encrypt=true;trustServerCertificate=true;hostNameInCertificate=*.sql.azuresynapse.net;loginTimeout=30;") \
.option("tempDir", "wasbs://[email protected]/temp") \
.option("forwardSparkAzureStorageCredentials", "true") \
.option("dbTable", "Persons") \
.option("postActions",postActionsSQL) \
.mode("overwrite") \
.save()
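If the DELETE statement is assembled from variables rather than hard-coded, keep in mind that postActions is raw T-SQL. A minimal, hypothetical helper (not part of the connector) for composing it:

```python
# Hypothetical helper: compose the postActions DELETE statement.
# Values are inlined for illustration; in real code, validate or
# whitelist inputs, since postActions executes raw T-SQL on Synapse.
def build_delete_sql(table: str, key_col: str, key_val: int) -> str:
    return f"DELETE FROM {table} WHERE {key_col}={key_val}"

postActionsSQL = build_delete_sql("Persons", "PersonID", 3)
```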
To drop a table from Azure Synapse using Azure Databricks:
postActionsSQL = "DROP TABLE Trip003"
df.write.format("com.databricks.spark.sqldw") \
.option("url", "jdbc:sqlserver://synapse.sql.azuresynapse.net:1433;database=master;user=master@synapse;password=XXXXXXX;encrypt=true;trustServerCertificate=true;hostNameInCertificate=*.sql.azuresynapse.net;loginTimeout=30;") \
.option("tempDir", "wasbs://[email protected]/temp") \
.option("forwardSparkAzureStorageCredentials", "true") \
.option("dbTable", "Trip003") \
.option("postActions",postActionsSQL) \
.mode("overwrite") \
.save()
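Note that postActions statements only run after the write itself succeeds. If the table may not exist, a bare DROP TABLE will fail the post-action; dedicated Synapse SQL pools have historically not supported DROP TABLE IF EXISTS, so the usual workaround is an OBJECT_ID guard (verify this against your pool's T-SQL surface). A hypothetical helper:

```python
# Hypothetical helper: make the postActions DROP idempotent. The
# OBJECT_ID guard is the common pattern where DROP TABLE IF EXISTS
# is unavailable; confirm support on your Synapse SQL pool.
def build_guarded_drop(table: str) -> str:
    return f"IF OBJECT_ID('{table}') IS NOT NULL DROP TABLE {table}"

postActionsSQL = build_guarded_drop("Trip003")
```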
Upvotes: 1