Patterson

Reputation: 2757

Databricks Move Table to a particular Schema in SQL Database

The following PySpark code copies data to the default dbo schema in an Azure SQL Database.

test2.write.mode("overwrite") \
    .format("jdbc") \
    .option("url", jdbcUrl) \
    .option("dbtable", 'UpdatedProducts')\
    .save()

However, we have multiple schemas in the database.

We have a schema called OCC. Can someone modify the code so that it copies the data to the OCC schema instead?

Upvotes: 1

Views: 1910

Answers (1)

Vamsi Bitra

Reputation: 2729

Thank you @Alex Ott. I am posting this as an answer so it will be useful to other community members.

I tried to reproduce this in my environment and was able to write to the default dbo schema successfully.


If you have multiple schemas, build the fully qualified name as schema.table_name and pass it to the dbtable option:

l_schemas=["OCC","dbo","one"]# list for storing your schemas
l_tables=["table1","table2","table3"] # list for storing respective tables in that particular indexed schema
for i in range(0,len(l_schemas)):
    s=l_schemas[i]+"."+l_tables[i] # concatenation and making as schema.table_name
    df.write.mode("overwrite") \
    .format("jdbc") \
    .option("url", jdbcUrl) \
    .option("dbtable", s)\
    .save()
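
If you only need the single OCC schema from the original question, the loop is not required. A minimal sketch (assuming the same test2 DataFrame and jdbcUrl from the question) simply qualifies the table name passed to the dbtable option:

# Write to the OCC schema by fully qualifying the target table name.
# test2 and jdbcUrl are assumed to be defined as in the question.
test2.write.mode("overwrite") \
    .format("jdbc") \
    .option("url", jdbcUrl) \
    .option("dbtable", "OCC.UpdatedProducts") \
    .save()

The dbtable option accepts any table identifier the database can resolve, so a schema-qualified name such as OCC.UpdatedProducts writes to that schema directly, provided the JDBC user has permission on it.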

Upvotes: 2
