Anuj jain

Reputation: 513

Flink reading an S3 file causing a Jackson dependency issue

I am reading a config YAML file in my Flink application. I want to keep this config file on the S3 filesystem, but when I add the aws-sdk to my pom and try to read the file, I get the error below. I know this is due to a conflicting Jackson dependency, but I have not been able to resolve it. Please help me resolve it.

    java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.enable([Lcom/fasterxml/jackson/core/JsonParser$Feature;)Lcom/fasterxml/jackson/databind/ObjectMapper;
        at com.amazonaws.partitions.PartitionsLoader.<init>(PartitionsLoader.java:54)
        at com.amazonaws.regions.RegionMetadataFactory.create(RegionMetadataFactory.java:30)
        at com.amazonaws.regions.RegionUtils.initialize(RegionUtils.java:65)
        at com.amazonaws.regions.RegionUtils.getRegionMetadata(RegionUtils.java:53)
        at com.amazonaws.regions.RegionUtils.getRegion(RegionUtils.java:107)
        at com.amazonaws.services.s3.AmazonS3Client.createSigner(AmazonS3Client.java:4016)
        at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4913)
        at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4872)
        at com.amazonaws.services.s3.AmazonS3Client.getObject(AmazonS3Client.java:1472)
        at com.bounce.processor.EventProcessor.main(EventProcessor.java:71)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:576)

This is the code that I am using to read the file:

        AmazonS3 amazonS3Client = new AmazonS3Client(credentials);

        S3Object object = amazonS3Client.getObject(new GetObjectRequest(S3_PROD_BUCKET, para.get("topology")));
        InputStream awsinputStream = object.getObjectContent();

This is my pom.xml:

        <!-- Flink dependencies -->

        <dependency>
            <groupId>io.confluent</groupId>
            <artifactId>kafka-avro-serializer</artifactId>
            <version>5.3.0</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-filesystem_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>commons-httpclient</groupId>
                    <artifactId>commons-httpclient</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.httpcomponents</groupId>
                    <artifactId>httpclient</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.httpcomponents</groupId>
                    <artifactId>httpcore</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.parquet</groupId>
            <artifactId>parquet-avro</artifactId>
            <version>${flink.format.parquet.version}</version>
        </dependency>

        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.7</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
            <version>1.2.17</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-parquet_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>joda-time</groupId>
            <artifactId>joda-time</artifactId>
            <version>2.10.5</version>
        </dependency>

        <dependency>
            <groupId>com.fasterxml.jackson.dataformat</groupId>
            <artifactId>jackson-dataformat-yaml</artifactId>
            <version>2.10.2</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.esotericsoftware.yamlbeans/yamlbeans -->
        <dependency>
            <groupId>com.esotericsoftware.yamlbeans</groupId>
            <artifactId>yamlbeans</artifactId>
            <version>1.13</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.uber/h3 -->
        <dependency>
            <groupId>com.uber</groupId>
            <artifactId>h3</artifactId>
            <version>3.6.3</version>
        </dependency>

        <dependency>
            <groupId>com.github.davidmoten</groupId>
            <artifactId>geo</artifactId>
            <version>0.7.7</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-connector-elasticsearch6 -->
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-elasticsearch7_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>tech.allegro.schema.json2avro</groupId>
            <artifactId>converter</artifactId>
            <version>0.2.9</version>
        </dependency>


        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-statebackend-rocksdb_2.12</artifactId>
            <version>1.10.0</version>
            <scope>provided</scope>
        </dependency>

        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.13</version>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-avro-confluent-registry</artifactId>
            <version>${flink.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-connector-elasticsearch7_2.11</artifactId>
            <version>${flink.version}</version>
        </dependency>


        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk-bundle</artifactId>
            <version>1.11.756</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind -->
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.10.0</version>
        </dependency>

Upvotes: 1

Views: 732

Answers (1)

Dominik Wosiński

Reputation: 3864

Pinpointing the conflict from the POM alone is highly unlikely to work. Instead, You should use the Maven dependency plugin by invoking the following command:

mvn dependency:tree

This will print all of the dependencies along with their transitive dependencies. That way You will be able to locate which of the libraries You are importing brings in the transitive dependency on Jackson, and You will be able to mark it as excluded.
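If the full tree is too noisy, the plugin can also filter or annotate it. Two optional variants, assuming the standard maven-dependency-plugin (check the plugin docs for the exact filter syntax):

mvn dependency:tree -Dincludes=com.fasterxml.jackson*

mvn dependency:tree -Dverbose

The verbose form also prints the artifacts Maven omitted because of version conflicts, which is usually exactly where the clashing Jackson version is hiding.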

Note: what You are really looking for in the dependency tree are Jackson artifacts pulled in with different versions, so You don't need to exclude them all.
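Once the offending artifact is known, an exclusion in the POM keeps the older Jackson off the classpath. A sketch only; kafka-avro-serializer is used here purely for illustration, so substitute whatever artifact the tree actually points to:

        <dependency>
            <groupId>io.confluent</groupId>
            <artifactId>kafka-avro-serializer</artifactId>
            <version>5.3.0</version>
            <exclusions>
                <!-- illustrative only: exclude the transitive Jackson version that conflicts -->
                <exclusion>
                    <groupId>com.fasterxml.jackson.core</groupId>
                    <artifactId>jackson-databind</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

Alternatively, declaring the Jackson version once under dependencyManagement forces every module onto the same version, which is often simpler than excluding each offender individually.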

Upvotes: 1
