Reputation: 1344
I am trying to export the results of a spark.sql query in Databricks to a folder in Azure Data Lake Store (ADLS). The tables I'm querying are also in ADLS.
I have accessed the files in ADLS from Databricks with the following command:
    base = spark.read.csv("adl://carlslake.azuredatalakestore.net/landing/", inferSchema=True, header=True)
    base.createOrReplaceTempView('basetable')
I am querying the table with the following command:
    try:
        dataframe = spark.sql("select * from basetable where LOAD_ID = 1199")
    except:
        print("Exception occurred 1166")
    else:
        print("Table Load_id 1166")
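(For context on the first issue described below: `spark.sql` is lazily evaluated, so it only builds a query plan and "succeeds" even when the filter matches nothing; no data is scanned until an action such as `.count()` or `.write` runs. A plain-Python analogy using a generator, with hypothetical sample rows, shows the same behaviour:)

```python
# Plain-Python sketch of lazy evaluation (hypothetical data, not Spark):
# defining the "query" does no work, so it cannot notice that nothing
# matches until the result is actually consumed.
rows = [{"LOAD_ID": 1166}, {"LOAD_ID": 1167}]

# Like spark.sql(...): this only *defines* the filter; nothing is scanned yet.
matching = (r for r in rows if r["LOAD_ID"] == 1199)
print("query defined without error")

# Like an action (.count() / .write): the scan happens only now.
print(sum(1 for _ in matching))  # 0 -- the empty result only shows up here
```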
I am then attempting to export the results to the folder in Azure using the following:
    try:
        dataframe.coalesce(1).write.option("header","true").mode("overwrite").csv("adl://carlslake.azuredatalakestore.net/jfolder2/outputfiles/")
        rename_file("adl://carlslake.azuredatalakestore.net/jfolder2/outputfiles", "adl://carlslake.azuredatalakestore.net/landing/RAW", "csv", "Delta_LoyaltyAccount_merged")
    except:
        print("Exception Occurred 1166")
    else:
        print("Delta File Created")
There are two weird issues here:
I have specified a query on LOAD_ID = 1199, and although there is no LOAD_ID = 1199, the query still succeeds.
I would like the second "try" statement to fail if the first "try" fails, but the second "try" statement runs regardless of the first.
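(On the second point: the two `try` blocks are completely independent, so an exception swallowed by the first bare `except:` does nothing to stop the second block from running. A minimal sketch of coupling the two steps, in plain Python with `run_query` and `run_export` as hypothetical stand-ins for the Spark calls:)

```python
# Sketch only: run_query / run_export are hypothetical stand-ins for the
# spark.sql query and the write/rename step from the snippets above.
def run_pipeline(run_query, run_export):
    try:
        dataframe = run_query()
    except Exception as exc:          # catch Exception rather than a bare except:
        print(f"Exception occurred: {exc}")
        return                        # stop here, so step 2 never runs
    try:
        run_export(dataframe)
    except Exception as exc:
        print(f"Exception Occurred: {exc}")
    else:
        print("Delta File Created")

def failing_query():
    raise ValueError("query failed")

run_pipeline(failing_query, lambda df: None)   # prints only the exception
run_pipeline(lambda: "rows", lambda df: None)  # prints "Delta File Created"
```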
Can someone let me know where I'm going wrong?
The table can be viewed here thetable
Upvotes: 1
Views: 5794
Reputation: 1344
Just thought I would share the answer with you:
    try:
        dataframe = spark.sql("select * from basetable where LOAD_ID = 1166")
    except:
        print("Exception occurred 1166")
    if dataframe.count() == 0:
        print("No data rows 1166")
    else:
        dataframe.coalesce(1).write.option("header","true").mode("overwrite").csv("adl://carlslake.azuredatalakestore.net/jfolder2/outputfiles/")
        rename_file("adl://carlslake.azuredatalakestore.net/jfolder2/outputfiles", "adl://carlslake.azuredatalakestore.net/landing/RAW", "csv", "Delta_LoyaltyAccount_merged")
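(One caveat with this pattern: `.count()` is an action, which is what finally forces the lazy query to run and exposes the empty result — but if `spark.sql` itself raises, `dataframe` is never bound, and the `count()` line then fails with a `NameError`. A small guard avoids that; plain-Python sketch with a hypothetical `run_query` stand-in for `spark.sql`:)

```python
dataframe = None  # initialise so a failed query is detectable afterwards

def run_query():
    # Hypothetical stand-in for spark.sql(...) that fails.
    raise ValueError("query failed")

try:
    dataframe = run_query()
except Exception as exc:
    print(f"Exception occurred: {exc}")

if dataframe is None:
    print("Query failed; skipping the export")  # instead of a NameError
```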
I hope it works for you too.
Upvotes: 2