am1212

Reputation: 553

case map will not work in spark shell

I have code like this:

val pop = sc.textFile("population.csv")
  .filter(line => !line.startsWith("Country"))
  .map(line => line.split(","))
  .map { case Array(CountryName, CountryCode, Year, Value) => (CountryName, CountryCode, Year, Value) }

The file looks like this.

Country Name,Country Code,Year,Value
Arab World,ARB,1960,93485943
Arab World,ARB,1961,96058179
Arab World,ARB,1962,98728995
Arab World,ARB,1963,101496308
Arab World,ARB,1964,104359772
Arab World,ARB,1965,107318159
Arab World,ARB,1966,110379639
Arab World,ARB,1967,113543760
Arab World,ARB,1968,116787194

Up until the .map { case ... }, I can print the result with pop.take(10) and get an Array[Array[String]]. But once the case clause is added, I get

error: not found: value (all columns)

"all columns" meaning four different errors, one for each of CountryName, CountryCode, Year, and Value. I'm not sure what I'm doing wrong. The data is clean.

Upvotes: 0

Views: 148

Answers (1)

Aivean

Reputation: 10882

You need to use lowercase variable names in pattern matching, i.e.:

.map { case Array(countryName, countryCode, year, value) => (countryName, countryCode, year, value) }

In Scala's pattern matching, identifiers that are capitalized, as well as identifiers enclosed in backticks (`), are resolved from the enclosing scope and matched as constants. Lowercase identifiers, by contrast, introduce fresh variables that bind to whatever value is in that position. Here is an example to illustrate:

Array("a") match {
  case Array(a) => a
}

will match an array containing any single string, while:

val A = "a"
Array("a") match {
  case Array(A) => 
}

will match only the literal "a". Or, equivalently:

val a = "a"
Array("a") match {
  case Array(`a`) => 
}

will also match only the literal "a".
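Putting it together, here is a minimal sketch of the corrected pipeline from the question, assuming the same four-column CSV and the spark-shell's built-in sc (SparkContext):

// Lowercase pattern variables bind new names instead of being
// resolved as constants from the enclosing scope.
val pop = sc.textFile("population.csv")
  .filter(line => !line.startsWith("Country"))   // drop the header row
  .map(line => line.split(","))
  .map { case Array(countryName, countryCode, year, value) =>
    (countryName, countryCode, year, value)
  }

pop.take(10).foreach(println)

Note that this pattern only matches rows that split into exactly four fields; any other row would throw a scala.MatchError at runtime. On messier data you could use RDD.collect { case Array(...) => ... } (the partial-function overload) to skip non-matching rows instead.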

Upvotes: 1
