Sophie Dinka

Reputation: 83

Saving Spark output to CSV in Spark 1.6

Spark 1.6, Scala

How do I save a DataFrame's output to a CSV file in Spark 1.6?

I did something like this:

myCleanData.write.mode(SaveMode.Append).csv(path="file:///filepath")

but it throws this error:

cannot resolve symbol csv
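
(For context: the csv() shortcut on DataFrameWriter only exists from Spark 2.0 onward, which would explain the unresolved symbol. In Spark 1.6, CSV output normally goes through the external spark-csv package, addressed by its format name. A minimal sketch, assuming spark-csv is on the classpath and reusing the placeholder path and DataFrame from above:)

    // Sketch only: Spark 1.6's DataFrameWriter has no csv() method (added in 2.0),
    // so CSV output goes through the spark-csv data source by format name.
    import org.apache.spark.sql.SaveMode

    myCleanData.write
      .format("com.databricks.spark.csv")   // requires the spark-csv package on the classpath
      .option("header", "true")
      .mode(SaveMode.Append)
      .save("file:///filepath")             // placeholder path from the question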

I also tried it like this.

For the dependency:

       <!-- https://mvnrepository.com/artifact/com.databricks/spark-csv -->
        <dependency>
            <groupId>com.databricks</groupId>
            <artifactId>spark-csv_2.10</artifactId>
            <version>1.5.0</version>
        </dependency>

    val outputfile = "file:///D:/path/output"

    val myCleanData = sqlContext.sql("""SELECT
                                          col1,
                                          col1,
                                          col1
                                        FROM dataframe
                                        WHERE col1 LIKE "^[a-zA-Z0-9]*$"
                                     """)

    myCleanData.write
      .format("com.databricks.spark.csv")
      .option("header", "true")
      .mode("overwrite")
      .save(outputfile)

But this gives the following error: java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
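
(For what it's worth, a NoClassDefFoundError on scala/collection/GenTraversableOnce$class usually indicates a Scala binary version mismatch, for example a _2.10 artifact like spark-csv_2.10 running against a Spark 1.6 build compiled for Scala 2.11, so the dependency suffix likely needs to match the runtime. A rough way to check, assuming the sqlContext from the snippet above:)

    // Sketch: print the Scala and Spark versions actually on the driver classpath,
    // so the spark-csv artifact suffix (_2.10 vs _2.11) can be matched to the
    // Spark build before retrying the write above.
    println(scala.util.Properties.versionString)   // e.g. "version 2.11.8" -> depend on spark-csv_2.11
    println(sqlContext.sparkContext.version)       // e.g. "1.6.3"

If the suffixes already agree, the same error can also come from some other library on the classpath that was built for a different Scala version.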

Is this possible with Spark 1.6? Please help.

Upvotes: 0

Views: 187

Answers (0)
