Justin Wilson

Reputation: 340

Java Read Parquet File to JSON Output

Reading the parquet file works, but I'm getting an indented debug format instead of the desired JSON output. Any ideas? I was thinking I may need to change GroupRecordConverter, but I wasn't able to find much documentation on it. If you can point me to that, it would also be helpful. Thanks very much for the help.

long num = numLines;
try {
  ParquetMetadata readFooter = ParquetFileReader.readFooter(conf, path, ParquetMetadataConverter.NO_FILTER);
  MessageType schema = readFooter.getFileMetaData().getSchema();
  ParquetFileReader r = new ParquetFileReader(conf, path, readFooter);

  PageReadStore pages;
  try {
    while (null != (pages = r.readNextRowGroup())) {
      final long rows = pages.getRowCount();
      System.out.println("Number of rows: " + rows);

      final MessageColumnIO columnIO = new ColumnIOFactory().getColumnIO(schema);
      final RecordReader<Group> recordReader = columnIO.getRecordReader(pages, new GroupRecordConverter(schema));
      for (int i = 0; i < rows && num-- > 0; i++) {
        // Group.toString() produces the indented debug format shown below, not JSON
        System.out.println(recordReader.read().toString());
      }
    }
  } finally {
    r.close();
  }
} catch (IOException e) {
  e.printStackTrace();
}

Current indented output:

data1: value1
data2: value2
models
  map
    key: data3
    value
      array: value3
  map
    key: data4
    value
      array: value4
data5: value5
...

Desired JSON output:

"data1": "value1",
"data2": "value2",
"models": {
    "data3": [
        "value3"
    ],
    "data4": [
        "value4"
    ]
},
"data5": "value5"
...

Upvotes: 6

Views: 9847

Answers (3)

Samarth Singhal

Reputation: 9

By using SimpleRecordMaterializer as the RecordMaterializer and then formatting each record with JsonRecordFormatter.JsonGroupFormatter, we can get the output in JSON form.

Here is a sample snippet that achieves this:

List<String> data = new ArrayList<>();
ObjectMapper objectMapper = new ObjectMapper();

ParquetFileReader reader = ParquetFileReader.open(HadoopInputFile.fromPath(new Path(filePath), new Configuration()));
MessageType schema = reader.getFooter().getFileMetaData().getSchema();
JsonRecordFormatter.JsonGroupFormatter formatter = JsonRecordFormatter.fromSchema(schema);

PageReadStore pages;
while ((pages = reader.readNextRowGroup()) != null) {
  long rows = pages.getRowCount();
  MessageColumnIO columnIO = new ColumnIOFactory().getColumnIO(schema);
  // SimpleRecordMaterializer assembles each record as a SimpleRecord instead of a Group
  RecordReader recordReader = columnIO.getRecordReader(pages, new SimpleRecordMaterializer(schema));

  for (int i = 0; i < rows; i++) {
    SimpleRecord simpleRecord = (SimpleRecord) recordReader.read();
    // formatRecord produces the JSON string; the ObjectMapper round-trip only pretty-prints it
    String record = formatter.formatRecord(simpleRecord);
    String recordPretty = objectMapper.writerWithDefaultPrettyPrinter().writeValueAsString(objectMapper.readTree(record));
    data.add(recordPretty);
  }
}
reader.close();
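
Note: SimpleRecord, SimpleRecordMaterializer and JsonRecordFormatter come from the parquet-tools module (org.apache.parquet.tools.read and org.apache.parquet.tools.json), so that artifact needs to be on the classpath in addition to parquet-hadoop. The ObjectMapper is Jackson and is only needed for the optional pretty-printing step.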

Upvotes: 0

tracy

Reputation: 1

I modified the source code of SimpleRecord's toJsonObject method:

protected Object toJsonObject() {
  Map<String, Object> result = Maps.newLinkedHashMap();

  if (arrayElement()) {
    return handleArrayElement(result);
  }

  for (NameValue value : values) {
    result.put(value.getName(), toJsonValue(value.getValue()));
  }

  return result;
}
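
Since this changes a class inside parquet-tools itself, it requires building and using a patched copy of that module rather than the released jar.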

Upvotes: 0

Jörg

Reputation: 2494

The Java parquet lib's cat command tool code might serve you as an example; it contains the line:

org.apache.parquet.tools.json.JsonRecordFormatter.JsonGroupFormatter formatter = JsonRecordFormatter.fromSchema(metadata.getFileMetaData().getSchema());

See here for full source.
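
For reference, here is a minimal sketch modeled on that cat tool source. It assumes the parquet-tools classes SimpleReadSupport and SimpleRecord (org.apache.parquet.tools.read) are on the classpath; this is a sketch of the approach, not the tool's exact code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.format.converter.ParquetMetadataConverter;
import org.apache.parquet.hadoop.ParquetFileReader;
import org.apache.parquet.hadoop.ParquetReader;
import org.apache.parquet.hadoop.metadata.ParquetMetadata;
import org.apache.parquet.tools.json.JsonRecordFormatter;
import org.apache.parquet.tools.read.SimpleReadSupport;
import org.apache.parquet.tools.read.SimpleRecord;

public class ParquetToJson {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Path path = new Path(args[0]);

    // Build the JSON formatter from the file schema, as in the line quoted above
    ParquetMetadata metadata = ParquetFileReader.readFooter(conf, path, ParquetMetadataConverter.NO_FILTER);
    JsonRecordFormatter.JsonGroupFormatter formatter =
        JsonRecordFormatter.fromSchema(metadata.getFileMetaData().getSchema());

    // Read each record as a SimpleRecord and print it as a JSON line
    try (ParquetReader<SimpleRecord> reader = ParquetReader.builder(new SimpleReadSupport(), path).build()) {
      for (SimpleRecord record = reader.read(); record != null; record = reader.read()) {
        System.out.println(formatter.formatRecord(record));
      }
    }
  }
}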

Upvotes: 1
