Reputation: 2210
I am trying to set up hadoop-connectors on my local Ubuntu 20.04 machine and am running the test command hadoop fs -ls gs://my-bucket, but I keep getting errors like the following:
$ hadoop fs -ls gs://my-bucket
2020-08-22 03:29:06,976 WARN fs.FileSystem: Cannot load filesystem: java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem Unable to get public no-arg constructor
2020-08-22 03:29:06,977 WARN fs.FileSystem: java.lang.NoClassDefFoundError: com/google/api/client/http/HttpRequestInitializer
2020-08-22 03:29:06,977 WARN fs.FileSystem: java.lang.ClassNotFoundException: com.google.api.client.http.HttpRequestInitializer
ls: No FileSystem for scheme "gs"
Note that I can access the bucket using gsutil ls gs://my-bucket.
I have downloaded gcs-connector-hadoop3-latest.jar from here and placed it inside /usr/local/hadoop/share/hadoop/common/lib. Is this the right place for the jar file?
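In shell terms, roughly what I did (assuming the download URL from the hadoop-connectors docs; it may differ from the link above):
# Fetch the shaded connector jar and place it in Hadoop's common lib directory
wget https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-hadoop3-latest.jar
sudo mv gcs-connector-hadoop3-latest.jar /usr/local/hadoop/share/hadoop/common/lib/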
I've configured core-site.xml with the properties listed here and also set GOOGLE_APPLICATION_CREDENTIALS to my service account key file. In hadoop-env.sh I've exported:
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64/
export HADOOP_CLASSPATH+="$HADOOP_CLASSPATH:$HADOOP_HOME/share/hadoop/common/lib/*.jar:$HADOOP_HOME/lib/*.jar"
I'm not sure whether I've set HADOOP_CLASSPATH correctly, or whether hadoop actually picks up the jar files inside /usr/local/hadoop/share/hadoop/common/lib. And what is the difference between that directory and /usr/local/hadoop/lib?
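From what I understand, a plain Java wildcard classpath entry has the form dir/* (dir/*.jar is taken literally by the JVM and matches nothing), and the += above also expands $HADOOP_CLASSPATH a second time inside the quotes. So perhaps the cleaner form would be (just a sketch of what I believe is the conventional setting):
# Sketch: single expansion, plain directory wildcard (dir/*), per JVM classpath rules
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$HADOOP_HOME/share/hadoop/common/lib/*"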
Here is the relevant content of core-site.xml:
<configuration>
  <property>
    <name>fs.AbstractFileSystem.gs.impl</name>
    <value>com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS</value>
    <description>The AbstractFileSystem for gs: uris.</description>
  </property>
  <property>
    <name>fs.gs.project.id</name>
    <value>my-project-id</value>
    <description>
      Optional. Google Cloud Project ID with access to GCS buckets.
      Required only for list buckets and create bucket operations.
    </description>
  </property>
  <property>
    <name>google.cloud.auth.service.account.enable</name>
    <value>true</value>
    <description>
      Whether to use a service account for GCS authorization.
      Setting this property to `false` will disable use of service accounts for
      authentication.
    </description>
  </property>
  <property>
    <name>google.cloud.auth.service.account.json.keyfile</name>
    <value>/path/to/service-account.json</value>
    <description>
      The JSON key file of the service account used for GCS
      access when google.cloud.auth.service.account.enable is true.
    </description>
  </property>
</configuration>
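For completeness: the sample configuration in the connector docs also registers the FileSystem implementation for gs: uris (the class named in the warning above) via fs.gs.impl; if that property matters here, it would look like this:
  <property>
    <name>fs.gs.impl</name>
    <value>com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem</value>
    <description>The FileSystem for gs: (GCS) uris.</description>
  </property>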
$ java --version
openjdk 11.0.8 2020-07-14
OpenJDK Runtime Environment (build 11.0.8+10-post-Ubuntu-0ubuntu120.04)
OpenJDK 64-Bit Server VM (build 11.0.8+10-post-Ubuntu-0ubuntu120.04, mixed mode, sharing)
$ hadoop version
Hadoop 3.3.0
Source code repository https://gitbox.apache.org/repos/asf/hadoop.git -r aa96f1871bfd858f9bac59cf2a81ec470da649af
Compiled by brahma on 2020-07-06T18:44Z
Compiled with protoc 3.7.1
From source with checksum 5dc29b802d6ccd77b262ef9d04d19c4
This command was run using /usr/local/hadoop/share/hadoop/common/hadoop-common-3.3.0.jar
Relevant part of my .bashrc:
...
export PDSH_RCMD_TYPE=ssh
export HADOOP_HOME="/usr/local/hadoop"
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export HADOOP_MAPRED_HOME=${HADOOP_HOME}
export HADOOP_COMMON_HOME=${HADOOP_HOME}
export HADOOP_HDFS_HOME=${HADOOP_HOME}
export YARN_HOME=${HADOOP_HOME}
Upvotes: 1
Views: 1573
Reputation: 2210
It seems that rebooting solved the issue. After a reboot, the command hadoop fs -ls gs://my-bucket works and lists the contents of the bucket as expected.
Thanks to @IgorDvorzhak for providing the command hadoop classpath --glob to check whether gcs-connector-hadoop3-latest.jar can be found. I used:
hadoop classpath --glob | grep gcs-connector
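Since hadoop classpath --glob prints one long colon-separated line, splitting it first makes the match easier to read; with the jar placed as above, the output should look something like this (illustrative):
hadoop classpath --glob | tr ':' '\n' | grep gcs-connector
# /usr/local/hadoop/share/hadoop/common/lib/gcs-connector-hadoop3-latest.jar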
Upvotes: 1