Reputation: 761
I have
val colNames = data.schema.fieldNames
.filter(colName => colName.split("-")(0) == "20003" || colName == "eid")
which I then use to select a subset of a DataFrame:
var medData = data.select(colNames.map(c => col(c)): _*).rdd
but I get the following error:
cannot resolve '`20003-0.0`' given input columns:
[20003-0.0, 20003-0.1, 20003-0.2, 20003-0.3];;
What is going on?
Upvotes: 0
Views: 6045
Reputation: 761
I had to wrap each column name in backticks, like this:
var medData = data.select(colNames.map(c => col(s"`$c`")): _*).rdd
Spark isn't really adding the backticks; its analyzer treats the dot in a name like 20003-0.0 as a nested-field accessor, so any column name containing dots (or other special characters) has to be escaped with backticks when you refer to it by name.
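For reference, here is a minimal self-contained sketch of the fix. The SparkSession setup and the sample data are placeholders I made up to reproduce the column-name pattern; only the backtick-escaping line is the actual fix.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

// Placeholder session and data with dot/hyphen column names (assumption,
// not the original dataset).
val spark = SparkSession.builder().master("local[*]").appName("backticks").getOrCreate()

val data = spark.createDataFrame(Seq(
  (1L, 10.0, 20.0),
  (2L, 30.0, 40.0)
)).toDF("eid", "20003-0.0", "20003-0.1")

val colNames = data.schema.fieldNames
  .filter(colName => colName.split("-")(0) == "20003" || colName == "eid")

// Escaping each name in backticks stops the analyzer from parsing the dot
// as a struct-field accessor, so "20003-0.0" resolves as a single column.
val medData = data.select(colNames.map(c => col(s"`$c`")): _*).rdd

The backticks are only needed when you reference such a column by name in an expression; an alternative, if you prefer, is to rename the columns up front (e.g. replacing dots with underscores) so you never have to escape them.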
Upvotes: 4