Reputation: 1040
I have a Stream of JsonNode that I want to convert to a Stream<String[]> shaped like a CSV, where the first row holds the headers and the following rows hold the values. TextDataUtils.sample expects exactly that shape. I'm currently using Stream.concat to achieve this, but the first row is always [] because I only extract the headers once the value stream starts being processed.
GoogleAdsGetValuesResponse getValuesResponse = client.getValues(getValuesRequest);
AtomicLong consumedRows = new AtomicLong();
List<String> headers = new ArrayList<>();
Stream<String[]> valueStream =
    getValuesResponse
        .getValuesStream()
        .limit(request.getLimit())
        .map(
            rowNode -> {
              if (consumedRows.get() == 0) {
                extractAllKeys(rowNode, "", headers);
                headers.removeIf(header -> !header.contains("."));
              }
              String[] row =
                  headers.stream()
                      .map(header -> rowNode.at(getJsonPointerString(header)).asText())
                      .toArray(String[]::new);
              consumedRows.getAndIncrement();
              return row;
            });
Stream<String[]> headerStream = Stream.of(headers.toArray(String[]::new), new String[] {});
Stream<String[]> dataStream = Stream.concat(headerStream, valueStream);
return TextDataUtils.sample(dataStream, Collections.emptyMap(), request.getLimit(), false);
How do I achieve this, i.e. derive the first element of the Stream from the same Stream? Any suggestions are appreciated.
Upvotes: 1
Views: 76
Reputation: 324
As written, this cannot work: headers.toArray(String[]::new) is evaluated eagerly, at the moment the header stream is built, before the value stream has processed a single row, so the header row is captured while headers is still empty.
I would suggest first processing the headers with limit(1), and then concatenating that with the remaining items processed using skip(1).
Roughly:
Stream<String[]> headerStream = newResponseStream().limit(1).getHeaders();
Stream<String[]> valueStream = newResponseStream().skip(1).getRows();
Stream<String[]> dataStream = Stream.concat(headerStream, valueStream);
return TextDataUtils.sample(dataStream, Collections.emptyMap(), request.getLimit(), false);
Note that you cannot reuse the starting response stream, so you need to obtain it twice. This will still be efficient if the underlying stream fetches rows lazily.
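If obtaining the response twice is undesirable, a single-pass alternative is to pull the first element off the stream through its Iterator, derive the headers from it, and then re-emit the headers followed by every row. A minimal sketch, with plain maps standing in for your JsonNode rows (withHeaderRow and the sample data are made up for illustration):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Spliterator;
import java.util.Spliterators;
import java.util.stream.Stream;
import java.util.stream.StreamSupport;

public class HeaderFirstStream {

  // Consumes the source stream exactly once: pulls the first row through the
  // stream's Iterator, derives the header row from its keys, then re-emits
  // the headers followed by every row (the first row included).
  static Stream<String[]> withHeaderRow(Stream<Map<String, String>> rows) {
    Iterator<Map<String, String>> it = rows.iterator();
    if (!it.hasNext()) {
      return Stream.empty();
    }
    Map<String, String> first = it.next();
    List<String> headers = new ArrayList<>(first.keySet());
    // Re-attach the already-consumed first row in front of the rest.
    Stream<Map<String, String>> all =
        Stream.concat(
            Stream.of(first),
            StreamSupport.stream(
                Spliterators.spliteratorUnknownSize(it, Spliterator.ORDERED), false));
    Stream<String[]> values =
        all.map(row -> headers.stream().map(row::get).toArray(String[]::new));
    // The explicit type witness forces the single-element overload of Stream.of,
    // so the header array becomes one Stream<String[]> element, not a Stream<String>.
    return Stream.concat(Stream.<String[]>of(headers.toArray(new String[0])), values);
  }

  public static void main(String[] args) {
    Map<String, String> r1 = new LinkedHashMap<>();
    r1.put("name", "ada");
    r1.put("age", "36");
    Map<String, String> r2 = new LinkedHashMap<>();
    r2.put("name", "alan");
    r2.put("age", "41");

    withHeaderRow(Stream.of(r1, r2))
        .forEach(row -> System.out.println(String.join(",", row)));
    // prints:
    // name,age
    // ada,36
    // alan,41
  }
}
```

The trade-off is that the headers must be extracted before the returned stream is built, so the very first row is fetched eagerly; everything after that stays lazy.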
As a side note, I would avoid map functions with side effects (such as mutating the shared headers list); they make the pipeline harder to reason about.
Upvotes: 1