Geet

Reputation: 2575

Write a Spark DataFrame from Databricks to Azure Data Lake Store using R

I want to save/write/upload a Spark DataFrame from Databricks to an Azure Data Lake Store folder using R. I found the following Python code.

spark_df.coalesce(1).write.format("com.databricks.spark.csv").option("header", "true").mode("overwrite").save('...path to azure data lake store folder')

Can you suggest a SparkR equivalent of this code?

Upvotes: 1

Views: 319

Answers (1)

user10355350

Reputation: 208

This should be:

library(magrittr)           # For the %>% pipe; SparkR does not export it

spark_df %>%
  coalesce(1L) %>%          # Same as coalesce(1) in Python
  write.df(                 # Generic writer; there is no CSV-specific one
    "...path to azure...",  # Path as before
    source = "csv",         # Since Spark 2.0 you don't need com.databricks
    mode = "overwrite",
    header = "true"         # Remaining ... arguments are passed as options
  )
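If you prefer not to load magrittr, the same call can be written with plain nested SparkR functions. A minimal sketch, keeping the path placeholder from the question (`out` is just an illustrative intermediate variable name):

library(SparkR)

# Reduce to a single partition, then write CSV with a header row;
# mode = "overwrite" replaces any existing output at the path
out <- coalesce(spark_df, 1L)
write.df(out, "...path to azure...",
         source = "csv",
         mode = "overwrite",
         header = "true")

This behaves the same as the piped version above; the pipe is only syntactic convenience.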

Upvotes: 2
