Reputation: 2757
I have imported Dataverse tables from Dynamics 365 into Azure Data Lake using PowerApps.
The tables have been imported into an ADLS container in CSV format. I now want to read the data into Databricks with the regular spark.read.csv("/mnt/lake"). However, the data doesn't show any column headers when it is read in.
Is there something that I'm doing wrong?
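A minimal sketch of the read described above, assuming the container is mounted at /mnt/lake (the exact path is a placeholder):

df = spark.read.csv("/mnt/lake")
df.show(5)
# with the default header=False, Spark names the columns _c0, _c1, ... instead of the real headers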
Upvotes: 0
Views: 276
Reputation: 3230
You can add column headers when you read the CSV file into a dataframe. Refer to the sample code below.
data = "/path/to/file_name.csv"
columns = ["column_name_1", "column_name_2", "column_name_3"]
# Read the headerless CSV, then assign the column names explicitly
df = spark.read.csv(data).toDF(*columns)
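If you also want to control the column types rather than just the names, you can pass an explicit schema instead. A minimal sketch, with placeholder column names and types:

from pyspark.sql.types import StructType, StructField, StringType, IntegerType

# Placeholder schema - replace the names/types with the actual Dataverse columns
schema = StructType([
    StructField("column_name_1", StringType(), True),
    StructField("column_name_2", StringType(), True),
    StructField("column_name_3", IntegerType(), True),
])

# header is left at its default (False) since the exported files contain no header row
df = spark.read.csv(data, schema=schema)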
Refer to this article for more information.
Upvotes: 1