Reputation: 593
I'm following the Amazon doc on submitting Spark jobs to an EMR cluster: https://aws.amazon.com/premiumsupport/knowledge-center/emr-submit-spark-job-remote-cluster/
After following the instructions, along with the usual troubleshooting, it fails with an unresolved-address error similar to:
ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: java.net.UnknownHostException: ip-172-32-1-231.us-east-2.compute.internal
at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:310)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
Since the IP it was trying to resolve was the master node's, I used sed to replace it with the public one in the configuration files (the ones obtained from the /etc/hadoop/conf directory on the master node). But then the error moved to connecting to the datanodes:
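For reference, the replacement was along these lines (the private DNS name comes from the error above; the public IP shown is a placeholder for my cluster's actual value):

```shell
# Placeholders: substitute your cluster's values.
MASTER_PRIVATE_DNS="ip-172-32-1-231.us-east-2.compute.internal"
MASTER_PUBLIC_IP="3.17.0.10"   # hypothetical public IP of the master node

# Rewrite every Hadoop config file copied from the master in place.
sed -i "s/${MASTER_PRIVATE_DNS}/${MASTER_PUBLIC_IP}/g" ./conf/*.xml
```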
INFO hdfs.DFSClient: Exception in createBlockOutputStream
org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=/172.32.1.41:50010]
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:533)
at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587)
19/02/08 13:54:58 INFO hdfs.DFSClient: Abandoning BP-1960505320-172.32.1.231-1549632479324:blk_1073741907_1086
Finally, I tried the same solution as in this question: Spark HDFS Exception in createBlockOutputStream while uploading resource file,
which was to add the following to the hdfs-site.xml file:
<property>
  <name>dfs.client.use.datanode.hostname</name>
  <value>true</value>
</property>
But the unresolved-address exception persists:
19/02/08 13:58:06 WARN hdfs.DFSClient: DataStreamer Exception
java.nio.channels.UnresolvedAddressException
at sun.nio.ch.Net.checkAddress(Net.java:101)
at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:622)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357)
at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587)
Could somebody help me set up Spark on my local machine so I can spark-submit to a remote EMR cluster?
Upvotes: 0
Views: 1200
Reputation: 48
Besides following the answer on the linked question, you should also map each worker node's private DNS name to its public IP in your local /etc/hosts file, so the private hostnames the namenode hands back become resolvable and reachable from outside the VPC.
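For example, entries along these lines on the machine you run spark-submit from (the IPs and hostnames below are illustrative; use your cluster's actual values from the EMR console):

```
# /etc/hosts on the local machine
# <public IP of core node>   <private EMR DNS name>
3.17.25.41   ip-172-32-1-41.us-east-2.compute.internal
3.17.25.42   ip-172-32-1-42.us-east-2.compute.internal
```

Remember that the worker nodes' security group must also allow inbound traffic from your machine on the datanode port (50010 in your logs), or the connection will still time out.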
Upvotes: 2