S.D.

Reputation: 1201

DataStax - Spark shell launch error

I have enabled Spark on a couple of nodes using DataStax Enterprise. Once enabled, I restarted the DSE services; the following is my ring configuration:

dsetool ring output:

[user@server ~]$ dsetool ring
Address          DC     Rack    Workload        Graph  Status  State    Load             Owns     VNodes    Health [0,1]
192.168.1.130    dc1    rack1   Analytics(SM)   no     Up      Normal   666.47 MiB       ?        128        0.00
192.168.1.131    dc1    rack1   Analytics(SW)   no     Up      Normal   672.09 MiB       ?        128        0.00
192.168.1.132    dc1    rack1   Search          no     Up      Normal   658.48 MiB       ?        128        0.90

When I try to launch the Spark shell, I get the following error:

The log file is at /root/.spark-shell.log
WARN  2017-05-09 14:09:15,215 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 1 seconds...
WARN  2017-05-09 14:09:18,459 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 2 seconds...
WARN  2017-05-09 14:09:22,698 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 4 seconds...
WARN  2017-05-09 14:09:28,941 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 8 seconds...
WARN  2017-05-09 14:09:39,234 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 16 seconds...
ERROR 2017-05-09 14:09:57,476 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application
WARN  2017-05-09 14:09:59,869 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 1 seconds...
WARN  2017-05-09 14:10:03,099 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 2 seconds...
WARN  2017-05-09 14:10:07,346 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 4 seconds...
WARN  2017-05-09 14:10:13,678 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 8 seconds...
WARN  2017-05-09 14:10:23,913 org.apache.spark.deploy.SparkNodeConfiguration: Failed to fetch dynamic configuration from DSE, retrying in 16 seconds...
ERROR 2017-05-09 14:10:42,247 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to cancel delegation token

Exception from the log file:

2017-05-09 16:10:49 [main] ERROR o.a.s.d.DseSparkSubmitBootstrapper - Failed to start or submit Spark application
java.io.IOException: Failed to fetch dynamic configuration from DSE
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:86) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:84) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:84) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:84) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:84) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:84) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkNodeConfiguration$.apply(SparkNodeConfiguration.scala:43) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.x$4$lzycompute(SparkConfigurator.scala:85) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.x$4(SparkConfigurator.scala:71) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.sparkNodeConfiguration$lzycompute(SparkConfigurator.scala:71) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.sparkNodeConfiguration(SparkConfigurator.scala:71) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.dseDriverProps$lzycompute(SparkConfigurator.scala:180) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.dseDriverProps(SparkConfigurator.scala:149) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries$lzycompute(SparkConfigurator.scala:124) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries(SparkConfigurator.scala:124) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs$lzycompute(DseSparkArgsPreprocessor.scala:79) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs(DseSparkArgsPreprocessor.scala:68) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:67) ~[dse-spark-5.1.0.jar:5.1.0]
        at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala) [dse-spark-5.1.0.jar:5.1.0]
Caused by: java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.132}:9042
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:168) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111) ~[spark-cassandra-connector-unshaded_2.11-2.0.1.jar:2.0.1]
        at org.apache.spark.deploy.SparkNodeConfiguration$.fetch(SparkNodeConfiguration.scala:51) ~[dse-spark-5.1.0.jar:5.1.0]
        ... 18 common frames omitted
Caused by: com.datastax.driver.core.exceptions.AuthenticationException: Authentication error on host /192.168.1.132:9042: Host /192.168.1.132:9042 requires authentication, but no authenticator found in Cluster configuration
        at com.datastax.driver.core.AuthProvider$1.newAuthenticator(AuthProvider.java:31) ~[dse-java-driver-core-1.2.2.jar:na]
        at com.datastax.driver.core.Connection$5.apply(Connection.java:248) ~[dse-java-driver-core-1.2.2.jar:na]
        at com.datastax.driver.core.Connection$5.apply(Connection.java:233) ~[dse-java-driver-core-1.2.2.jar:na]
        at com.google.common.util.concurrent.Futures$ChainingListenableFuture.run(Futures.java:906) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.Futures$1$1.run(Futures.java:635) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.MoreExecutors$DirectExecutorService.execute(MoreExecutors.java:299) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.Futures$1.run(Futures.java:632) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:457) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.ExecutionList.executeListener(ExecutionList.java:156) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.ExecutionList.execute(ExecutionList.java:145) ~[guava-18.0.jar:na]
        at com.google.common.util.concurrent.AbstractFuture.set(AbstractFuture.java:185) ~[guava-18.0.jar:na]
        at com.datastax.driver.core.Connection$Future.onSet(Connection.java:1293) ~[dse-java-driver-core-1.2.2.jar:na]
        at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:1074) ~[dse-java-driver-core-1.2.2.jar:na]
        at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:991) ~[dse-java-driver-core-1.2.2.jar:na]
        at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:346) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:367) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:353) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:934) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:405) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:310) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) ~[netty-all-4.0.42.Final.jar:4.0.42.Final]
        at java.lang.Thread.run(Unknown Source) ~[na:1.8.0_121]

Upvotes: 2

Views: 3252

Answers (2)

S.D.

Reputation: 1201

Appreciate all your responses. I figured this out; I'm able to start the DSE Spark shell by issuing the following command:

sudo dse -u <cassandra_username> -p <cassandra_password> spark

This is because I have internal authentication enabled on the Cassandra cluster.

Note: My setup uses the DataStax Enterprise binaries. It could be different if you have installed Apache Cassandra and Apache Spark separately.
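As an aside, passing `-p <cassandra_password>` on the command line makes the password visible in `ps` output. DSE tools can also read credentials from a `~/.dserc` file; the sketch below assumes your DSE version supports that file (check the DataStax docs for your release before relying on it):

```shell
# Hypothetical sketch: store the credentials in ~/.dserc so they don't
# appear on the command line. (Assumption: your DSE version reads this file.)
cat > "$HOME/.dserc" <<'EOF'
username=my_user
password=my_pass
EOF

# Restrict permissions so the password is not readable by other users.
chmod 600 "$HOME/.dserc"

# With the file in place you would then simply run:
#   dse spark
```

The placeholder credentials (`my_user`, `my_pass`) are illustrative; substitute your own.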

Upvotes: 2

RussS

Reputation: 16576

There is a communication error with DSE. This is probably because your DC has mixed workloads, which is unsupported. Either set all nodes to Analytics, all nodes to Search, or all nodes to both Search and Analytics.
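One quick way to spot this is to count the distinct workloads reported per DC in the `dsetool ring` output. A minimal sketch, using hard-coded sample output in place of a live cluster (in practice you would pipe `dsetool ring` in directly):

```shell
# Sample of the Workload column from the question's `dsetool ring` output.
# (Assumption: column 4 holds the workload, as in the output above.)
ring_output='192.168.1.130 dc1 rack1 Analytics(SM)
192.168.1.131 dc1 rack1 Analytics(SW)
192.168.1.132 dc1 rack1 Search'

# Strip the (SM)/(SW) master/worker suffix and keep the distinct workloads.
workloads=$(printf '%s\n' "$ring_output" | awk '{print $4}' | sed 's/(.*)//' | sort -u)
count=$(printf '%s\n' "$workloads" | wc -l)

if [ "$count" -gt 1 ]; then
  echo "Mixed workloads in DC:"
  echo "$workloads"
else
  echo "Workloads consistent: $workloads"
fi
```

For the sample above this reports a mix of Analytics and Search, which matches the unsupported layout described in the question.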

Upvotes: 1
