GameOfThrows

Reputation: 4510

scala-spark Array mapping

I have a question about mapping an Array in Scala. I have the following Array:

Array[(scala.collection.immutable.Set[String], com.trends.City, com.trends.State)]

Basically, I want to map the Array so that each String in the Set gets the com.trends.City and com.trends.State attached to it. The result should look something like:

Array[(String, com.trends.City, com.trends.State)] 

This is like flatMap, but I want the City and State from com.trends carried along with each String.

I could also convert the Array into an RDD if needed and use flatMapValues, but I am concerned about efficiency. Could someone tell me the best way to go? A rough sketch of the RDD version I have in mind is below.
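
For reference, this is roughly what I mean by the flatMapValues approach (City and State here are simplified stand-ins for the com.trends classes, and the sample data is made up):

import org.apache.spark.{SparkConf, SparkContext}

case class City(name: String)
case class State(name: String)

val sc = new SparkContext(new SparkConf().setAppName("example").setMaster("local[*]"))

val array: Array[(Set[String], City, State)] =
  Array((Set("alpha", "beta"), City("Austin"), State("Texas")))

val result = sc.parallelize(array)
  .map { case (names, city, state) => ((city, state), names) } // pair RDD keyed by (City, State)
  .flatMapValues(identity)                                     // one record per String in the Set
  .map { case ((city, state), name) => (name, city, state) }   // back to (String, City, State)
// result: RDD[(String, City, State)]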

Upvotes: 1

Views: 2206

Answers (1)

abalcerek

Reputation: 1819

You can use flatMap on a Scala Array like this:

// City and State stand in for com.trends.City / com.trends.State
class City
class State
val array: Array[(scala.collection.immutable.Set[String], City, State)] = Array()
// Expand each Set so every String gets its own (String, City, State) tuple
array.flatMap { case (names, city, state) => names.map(name => (name, city, state)) }
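
If you do convert to an RDD, the same flatMap works there as well. A minimal sketch, assuming sc is an existing SparkContext and array is the value above:

// Same transformation on a Spark RDD
val rdd = sc.parallelize(array)
val flattened = rdd.flatMap { case (names, city, state) =>
  names.map(name => (name, city, state))
}
// flattened: RDD[(String, City, State)]

Efficiency-wise, flatMap on the plain Array is a single in-memory pass and is fine unless the data is too large for one machine; converting to an RDD only pays off when you actually need distributed processing.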

Upvotes: 3
