Reputation: 861
I got a feature vector using the Spark ML TF-IDF algorithm. Now I want to extract the Vector from the "idfFeatures" column.
My code is:
val vectors = allDF.select("idfFeatures").map {
  case Row(vector: Vector) =>
    vector
}
vectors.foreach(println(_))
There is an error in the console:
Error:(38, 24) type Vector takes type parameters
case Row(vector: Vector) =>
^
If I change Vector to String, there is another error:
scala.MatchError: [(262144,[622,4200,7303,8501......,2.1972245773362196,1.2809338454620642])] (of class org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema)
at scala.TFIDFTest2$$anonfun$1.apply(TFIDFTest2.scala:37)
How can I get the Vector?
Upvotes: 0
Views: 3259
Reputation:
Spark 1.x:
import org.apache.spark.mllib.linalg.Vector
Spark 2.0:
import org.apache.spark.ml.linalg.Vector
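The original error appears because, without one of these imports, Vector resolves to scala.collection.immutable.Vector, which takes a type parameter. A minimal sketch of the fix applied to the snippet from the question (assuming Spark 2.x; allDF and idfFeatures are the DataFrame and column named in the question, and going through .rdd avoids needing an Encoder for Vector):
import org.apache.spark.ml.linalg.Vector
import org.apache.spark.sql.Row
// Match on the Spark ML Vector type, not Scala's Vector[T]
val vectors = allDF.select("idfFeatures").rdd.map {
  case Row(vector: Vector) => vector
}
vectors.foreach(println(_))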
Example:
// https://spark.apache.org/docs/latest/ml-features.html#tf-idf
import org.apache.spark.ml.feature.{HashingTF, IDF, Tokenizer}
val sentenceData = spark.createDataFrame(Seq(
(0, "Hi I heard about Spark"),
(0, "I wish Java could use case classes"),
(1, "Logistic regression models are neat")
)).toDF("label", "sentence")
val tokenizer = new Tokenizer().setInputCol("sentence").setOutputCol("words")
val wordsData = tokenizer.transform(sentenceData)
val hashingTF = new HashingTF()
.setInputCol("words").setOutputCol("rawFeatures").setNumFeatures(20)
val featurizedData = hashingTF.transform(wordsData)
val idf = new IDF().setInputCol("rawFeatures").setOutputCol("features")
val idfModel = idf.fit(featurizedData)
val rescaledData = idfModel.transform(featurizedData)
import org.apache.spark.ml.linalg.Vector
import org.apache.spark.sql.Row
rescaledData.select("features").rdd.map { case Row(v: Vector) => v }.first
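If you prefer not to pattern-match on Row, the same value can be read by field name (a sketch of an alternative, still assuming the org.apache.spark.ml.linalg.Vector import above):
rescaledData.select("features").rdd
  .map(row => row.getAs[Vector]("features"))
  .first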
Upvotes: 2