user2405703

Reputation: 87

riak spark connector doesn't work

My Riak-Spark connector doesn't work.

I can launch Spark with:

    /opt/spark/bin/spark-shell \
    --jars com.fasterxml.jackson.module_jackson-module-scala_2.11-2.4.4.jar \
    --conf spark.riak.connection.host=127.0.0.1:8087 \
    --packages com.basho.riak:spark-riak-connector_2.11:1.6.3

But when I run:

    import com.basho.riak.spark._
    val data = Array(1, 2, 3, 4, 5)
    val testRDD = sc.parallelize(data)

I get this error:

    scala> import com.basho.riak.spark._

    scala> val data = Array(1, 2, 3, 4, 5)
    data: Array[Int] = Array(1, 2, 3, 4, 5)

    scala> val testRDD = sc.parallelize(data)
    java.lang.VerifyError: class com.fasterxml.jackson.module.scala.ser.ScalaIteratorSerializer overrides final method withResolved.(Lcom/fasterxml/jackson/databind/BeanProperty;Lcom/fasterxml/jackson/databind/jsontype...

Can anyone help with installing and using this connector?

Many thanks in advance, J

Upvotes: 1

Views: 32

Answers (1)

user2405703

Reputation: 87

I think I managed to clean up my jars, and it now seems to work past this point. Thanks anyway. J
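For anyone hitting the same `VerifyError`: it typically means two incompatible Jackson versions ended up on the classpath, so a class compiled against one `jackson-databind` tries to override a method that is `final` in the other. Here, `jackson-module-scala` 2.4.4 was pinned manually via `--jars` while `--packages` resolved the connector's own Jackson dependencies. A minimal sketch of "cleaning the jars" is simply dropping the manual jar and letting `--packages` resolve a consistent set (the host and connector version are taken from the question):

```shell
# Let --packages resolve the Riak connector together with a consistent
# set of Jackson dependencies, instead of pinning jackson-module-scala
# 2.4.4 via --jars (the likely source of the VerifyError).
/opt/spark/bin/spark-shell \
  --conf spark.riak.connection.host=127.0.0.1:8087 \
  --packages com.basho.riak:spark-riak-connector_2.11:1.6.3
```

If a specific `jackson-module-scala` really is needed, its version should match the `jackson-databind` version already on Spark's classpath (same major.minor), since mixing them is exactly what triggers the `final method withResolved` verification failure.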

Upvotes: 0
