Ravinder Karra

Reputation: 307

Spark Scala How to use replace function in RDD

I have a tweet file:

396124436845178880,"When's 12.4k gonna roll around",Matty_T_03
396124437168537600,"I really wish I didn't give up everything I did for you.     I'm so mad at my self for even letting it get as far as it did.",savava143
396124436958412800,"I really need to double check who I'm sending my     snapchats to before sending it 😩😭",juliannpham
396124437218885632,"@Darrin_myers30 I feel you man, gotta stay prayed up.     Year is important",Ful_of_Ambition
396124437558611968,"tell me what I did in my life to deserve this.",_ItsNotBragging
396124437499502592,"Too many fine men out here...see me drooling",LolaofLife
396124437722198016,"@jaiclynclausen will do",I_harley99

I am trying to replace all special characters after reading the file into an RDD:

    val fileReadRdd = sc.textFile(fileInput)
    val fileReadRdd2 = fileReadRdd.map(x => x.map(_.replace(","," ")))
    val fileFlat = fileReadRdd.flatMap(rec => rec.split(" "))

I am getting the following error:

Error:(41, 57) value replace is not a member of Char
    val fileReadRdd2 = fileReadRdd.map(x => x.map(_.replace(",","")))

Upvotes: 5

Views: 14450

Answers (2)

Yordan Georgiev

Reputation: 5460

The Perl oneliner perl -pi -e 's/\s+//' $file on a regular file system would look as follows in Spark Scala, on any Spark-supported file system (feel free to adjust the regex):

// read the file into an RDD of strings
import org.apache.spark.rdd.RDD
val rdd: RDD[String] = spark.sparkContext.textFile(uri)

// for each line in rdd apply pattern and save to file
rdd
  .map(line => line.replaceAll("^\\s+", ""))
  .saveAsTextFile(uri + ".tmp")
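The same replaceAll approach can be tried without Spark at all, since it is just the String API. A minimal plain-Scala sketch applied to the question's goal of stripping special characters; the regex here is an assumption, so adjust it to whatever "special" means for your data:

```scala
object StripSpecial {
  def main(args: Array[String]): Unit = {
    val tweet = "tell me what I did in my life to deserve this."
    // keep only letters, digits and spaces; drop everything else
    val stripped = tweet.replaceAll("[^A-Za-z0-9 ]", "")
    println(stripped) // -> tell me what I did in my life to deserve this
  }
}
```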

Upvotes: 0

Brian Agnew

Reputation: 272417

I suspect:

x => x.map(_.replace(",",""))

is treating your string as a sequence of characters, and you actually want

x => x.replace(",", "")

(i.e. you don't need to map over the 'sequence' of chars)
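A minimal plain-Scala sketch of the point (no Spark required; the sample line is taken from the question's data): mapping over a String iterates over its Chars, and Char has no replace method, which is exactly the compile error above. The String-level call, or a per-Char swap, both work:

```scala
object ReplaceDemo {
  def main(args: Array[String]): Unit = {
    val line = "396124437722198016,\"@jaiclynclausen will do\",I_harley99"

    // String#replace operates on the whole line, as suggested above
    val cleaned = line.replace(",", " ")

    // The closest per-Char equivalent maps each character individually
    val perChar = line.map(c => if (c == ',') ' ' else c)

    println(cleaned == perChar) // both produce the same cleaned line
  }
}
```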

Upvotes: 4
