pratyush

Reputation: 36

Using custom Spark transformers in PySpark

How can I use a custom transformer written in Scala in a PySpark pipeline?

import org.apache.spark.ml.UnaryTransformer
import org.apache.spark.ml.util.Identifiable
import org.apache.spark.sql.types.{DataType, StringType}

// Uppercases the value of a single string column.
class UpperTransformer(override val uid: String)
    extends UnaryTransformer[String, String, UpperTransformer] {

  def this() = this(Identifiable.randomUID("upper"))

  // Reject any input column that is not a string.
  override protected def validateInputType(inputType: DataType): Unit = {
    require(inputType == StringType)
  }

  protected def createTransformFunc: String => String = {
    _.toUpperCase
  }

  protected def outputDataType: DataType = StringType
}

How do I use this transformer in a PySpark pipeline?
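
For reference, one common approach (a sketch, not part of the original post) is to compile the Scala class into a JAR, put that JAR on the driver and executor classpath (for example via spark-submit --jars), and then wrap the JVM class with pyspark.ml.wrapper.JavaTransformer. The package name com.example and the wrapper name PyUpperTransformer below are assumptions for illustration.

from pyspark import keyword_only
from pyspark.ml.param.shared import HasInputCol, HasOutputCol
from pyspark.ml.wrapper import JavaTransformer

class PyUpperTransformer(JavaTransformer, HasInputCol, HasOutputCol):
    # Python-side wrapper for the Scala UpperTransformer. Assumes the
    # (hypothetical) fully qualified name com.example.UpperTransformer
    # and that its JAR is already on the Spark classpath.
    @keyword_only
    def __init__(self, inputCol=None, outputCol=None):
        super(PyUpperTransformer, self).__init__()
        # Create the JVM-side transformer instance via py4j.
        self._java_obj = self._new_java_obj(
            "com.example.UpperTransformer", self.uid)
        kwargs = self._input_kwargs
        self._set(**kwargs)

The wrapper then behaves like a native PySpark stage, e.g. Pipeline(stages=[PyUpperTransformer(inputCol="text", outputCol="upper_text")]); the inputCol/outputCol values set in Python are copied to the JVM object when the pipeline runs, because the Scala UnaryTransformer exposes params with the same names.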

Upvotes: 2

Views: 371

Answers (0)
