Error: Exception in thread "main" java.lang.ClassCastException: sun.nio.fs.UnixPath cannot be cast to org.apache.parquet.io.OutputFile

I am trying to convert an XML file to Avro and then to the Parquet file format without using big data tools. I am able to convert it up to Avro, but I get the following error after that:

Exception in thread "main" java.lang.ClassCastException: sun.nio.fs.UnixPath cannot be cast to org.apache.parquet.io.OutputFile at Parquet.ConversionToParquet.main(ConversionToParquet.java:65)

Below is my code:

import java.io.ByteArrayOutputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.nio.file.Path;
import java.nio.file.Paths;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.hadoop.conf.Configuration;
import org.apache.parquet.avro.AvroParquetWriter;
import org.apache.parquet.hadoop.ParquetWriter;
import org.apache.parquet.hadoop.metadata.CompressionCodecName;
import org.apache.parquet.io.OutputFile;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import javax.xml.XMLConstants;
import javax.xml.bind.JAXBContext;
import javax.xml.validation.SchemaFactory;


public class ConversionToParquet {

    private static final Logger LOGGER = LoggerFactory.getLogger(ConversionToParquet.class);

    private static final Path inputPath = Paths.get("/home/lucky/output.avro");
    private static final Path outputPath = Paths.get("/home/lucky/Desktop/sample.parquet");

    public static void main(String[] args) throws Exception {

        JAXBContext jaxbContext = JAXBContext.newInstance(ObjectFactory.class);
        javax.xml.validation.Schema newSchema = SchemaFactory
                .newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                .newSchema(ConversionToParquet.class.getResource("/question.xsd"));
        XmlSerializer<Question> xmlSerializer = new XmlSerializer<>(
                jaxbContext, newSchema);
        InputStream questionStream = ConversionToParquet.class.getResourceAsStream(
                "/question.xml");
        Question question = xmlSerializer.readFromXml(questionStream);

        AvroSchemaGenerator schemaGenerator = new AvroSchemaGenerator();
        Schema questionSchema = schemaGenerator.generateSchema(Question.class);

        AvroSerializer<Question> avroSerializer = schemaGenerator.createAvroSerializer(questionSchema);
        ByteArrayOutputStream avroByteStream = new ByteArrayOutputStream();

        avroSerializer.writeToAvro(avroByteStream, question);
        byte[] avroBytes = avroByteStream.toByteArray();

        avroSerializer.writeToAvro(new FileOutputStream("/home/lucky/output.avro"), question);

        System.out.println("File Converted to Avro");


        try (ParquetWriter writer = AvroParquetWriter
                .builder((OutputFile) outputPath)
                .withSchema(questionSchema)
                .withConf(new Configuration())
                .withCompressionCodec(CompressionCodecName.SNAPPY)
                .build()){

            for (Path record : inputPath) {
                writer.write(record);
            }
        }
        System.out.println("File Convereted Successfully");
    }
}

Upvotes: 0

Views: 1496

Answers (1)

J&#246;rn Horstmann
J&#246;rn Horstmann

Reputation: 34034

The error is in this line:

.builder((OutputFile) outputPath)

The easiest solution would be to use the deprecated builder method taking an org.apache.hadoop.fs.Path parameter, which is a different class from the java.nio.file.Path you are currently importing (and which cannot be cast to OutputFile, hence the ClassCastException). Alternatively, you can stay on the non-deprecated builder(OutputFile) overload by wrapping a Hadoop Path in a HadoopOutputFile.
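Here is a minimal sketch of both routes, reusing questionSchema from your code; the GenericRecord is a placeholder for whatever record you actually write (note that writer.write expects Avro records matching the schema, not the java.nio.file.Path segments your current loop produces):

import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;  // Hadoop's Path, not java.nio.file.Path
import org.apache.parquet.avro.AvroParquetWriter;
import org.apache.parquet.hadoop.ParquetWriter;
import org.apache.parquet.hadoop.metadata.CompressionCodecName;
import org.apache.parquet.hadoop.util.HadoopOutputFile;

Path parquetPath = new Path("/home/lucky/Desktop/sample.parquet");

// Placeholder record; populate its fields before writing
GenericRecord record = new GenericData.Record(questionSchema);

// Option 1: the deprecated builder(org.apache.hadoop.fs.Path) overload
try (ParquetWriter<GenericRecord> writer = AvroParquetWriter
        .<GenericRecord>builder(parquetPath)
        .withSchema(questionSchema)
        .withConf(new Configuration())
        .withCompressionCodec(CompressionCodecName.SNAPPY)
        .build()) {
    writer.write(record);
}

// Option 2: the non-deprecated builder(OutputFile) overload, wrapping the
// Hadoop Path in a HadoopOutputFile instead of casting
Configuration conf = new Configuration();
try (ParquetWriter<GenericRecord> writer = AvroParquetWriter
        .<GenericRecord>builder(HadoopOutputFile.fromPath(parquetPath, conf))
        .withSchema(questionSchema)
        .withConf(conf)
        .withCompressionCodec(CompressionCodecName.SNAPPY)
        .build()) {
    writer.write(record);
}

Option 2 avoids the deprecation warning and is the direction the parquet-mr API has been moving in.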

Upvotes: 0
