Ygg

Reputation: 3870

Hadoop Configuration in Spark

I need to get hold of the current Hadoop org.apache.hadoop.conf.Configuration in my Spark job, for debugging purposes. Specifically, I need to get an org.apache.hadoop.fs.FileSystem for a path with the org.apache.hadoop.fs.Path#getFileSystem(conf: Configuration) method.

Given an org.apache.spark.SparkContext, is there a way to get the Configuration?
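
For context, a minimal sketch of what I'm trying to do; the path is a made-up example and conf is the object I'm missing:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

val conf: Configuration = ??? // the current Hadoop Configuration; how do I get it from Spark?
val fs: FileSystem = new Path("hdfs:///some/path").getFileSystem(conf)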

Upvotes: 4

Views: 8749

Answers (1)

Sahil Desai

Reputation: 3696

You can get the current Hadoop Configuration from the SparkContext via sc.hadoopConfiguration. For example, to set a value on it:

sc.hadoopConfiguration.set("my.mapreduce.setting","someValue")
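
The same handle answers the original question: it returns the live org.apache.hadoop.conf.Configuration, which can be passed to Path#getFileSystem. A minimal sketch, assuming an existing SparkContext named sc and a hypothetical path:

import org.apache.hadoop.fs.{FileSystem, Path}

val conf = sc.hadoopConfiguration // the current Hadoop Configuration
val fs: FileSystem = new Path("hdfs:///some/path").getFileSystem(conf) // hypothetical path
println(fs.getUri) // e.g. hdfs://namenode:8020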

Upvotes: 6
