Joe

Reputation: 13091

How to run HDFS file system commands via Spark?

I can run this command for HDFS:

hadoop fs -ls /user/hive/warehouse/databasename.db/tablename

How do I write a command in Spark to list all the files under a specific folder in HDFS?

Thanks.

Upvotes: 1

Views: 1655

Answers (1)

Shawn.X

Reputation: 1353

OK, the Scala code below gives you a function that prints all the HDFS paths under a parent path. You can adapt it to your needs.

    import org.apache.hadoop.fs.{FileSystem, Path}

    // Prints the path of every file and directory directly under parentPath
    def getAllPaths(parentPath: String, fs: FileSystem): Unit = {
      val fileStatus = fs.listStatus(new Path(parentPath))
      for (file <- fileStatus) {
        println(file.getPath.toString)
      }
    }
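To call it from Spark, you can get a FileSystem handle from the Hadoop configuration that Spark already carries. A minimal sketch, assuming `spark` is your SparkSession (as in spark-shell) and reusing the warehouse path from the question:

    import org.apache.hadoop.fs.FileSystem

    // Build a FileSystem from Spark's own Hadoop configuration
    val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)
    getAllPaths("/user/hive/warehouse/databasename.db/tablename", fs)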

Upvotes: 1
