How can I read from a Volatile Layer using a batch pipeline?
public IntermediateData compileInFn(Pair<Key, Meta> in, LogContext logContext) {
    String partitionID = in.getKey().partition().toString();
    try {
        if (!partitionID.isEmpty()) {
            // Retrieve the partition payload.
            Payload payload = retriever.getPayload(in.getKey(), in.getValue(), logContext);
        }
    } catch (Exception e) {
        // At minimum, log the failure here instead of silently swallowing it.
    }
    return new IntermediateData(in.getKey(), testResults);
}
Upvotes: 1
Views: 157
Well, reading from a volatile layer is actually no different from reading from a versioned catalog. However, this probably isn't what you actually want to do. A scheduled batch pipeline running a DPL compiler only triggers when a new catalog version is published. With volatile layers, pushing new data does not necessarily publish metadata, and it is the metadata publication that bumps the version. So unless the data provider explicitly updates metadata every time they push to the volatile layer, the version may never change and the batch pipeline may never be triggered. As a workaround, you can manually start a batch pipeline job with the processing type set to "reprocess", which will read the entire catalog, but this runs the compiler only once.
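To make the trigger behavior concrete, here is a minimal, self-contained sketch (all names hypothetical, not part of the HERE SDK) of the version-gating logic described above: a scheduled batch run fires only when the catalog's published version has advanced past the last version it processed, which is exactly why a volatile-layer push without a metadata publication leaves the pipeline idle.

```java
// Hypothetical illustration of version-gated batch triggering.
public class VersionGate {
    // A batch run is due only when a newer catalog version has been published.
    static boolean shouldTrigger(long latestPublishedVersion, long lastProcessedVersion) {
        return latestPublishedVersion > lastProcessedVersion;
    }

    public static void main(String[] args) {
        // Versioned catalog: each publish bumps the version, so the pipeline runs.
        System.out.println(shouldTrigger(5, 4)); // true
        // Volatile layer: data was pushed but no metadata (version) was published,
        // so the scheduled pipeline never fires.
        System.out.println(shouldTrigger(4, 4)); // false
    }
}
```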
Upvotes: 3