Reputation: 1047
Flink version 1.6.1
In the following example, I want to connect two unkeyed streams, but it seems the two streams can't share state correctly. I don't know the right way to achieve this.
Code:
import org.apache.flink.api.common.state.ListState;
import org.apache.flink.api.common.state.ListStateDescriptor;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.runtime.state.FunctionInitializationContext;
import org.apache.flink.runtime.state.FunctionSnapshotContext;
import org.apache.flink.streaming.api.checkpoint.CheckpointedFunction;
import org.apache.flink.streaming.api.datastream.ConnectedStreams;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.co.CoProcessFunction;
import org.apache.flink.util.Collector;

public class TransactionJob {

    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<String> stream1 = env.fromElements("1", "2");
        DataStream<Integer> stream2 = env.fromElements(3, 4, 5);

        ConnectedStreams<String, Integer> connectedStreams = stream1.connect(stream2);
        DataStream<String> resultStream = connectedStreams.process(new StringIntegerCoProcessFunction());
        resultStream.print().setParallelism(1);

        env.execute();
    }

    private static class StringIntegerCoProcessFunction
            extends CoProcessFunction<String, Integer, String>
            implements CheckpointedFunction {

        private transient ListState<String> state1;
        private transient ListState<Integer> state2;

        @Override
        public void processElement1(String value, Context ctx, Collector<String> out) throws Exception {
            state1.add(value);
            print(value);
        }

        @Override
        public void processElement2(Integer value, Context ctx, Collector<String> out) throws Exception {
            state2.add(value);
            print(value.toString());
        }

        // Dumps the current contents of both states for debugging.
        private void print(String value) throws Exception {
            StringBuilder builder = new StringBuilder();
            builder.append("input value is " + value + ".");
            builder.append("state1 has ");
            for (String str : state1.get()) {
                builder.append(str + ",");
            }
            builder.append("state2 has ");
            for (Integer integer : state2.get()) {
                builder.append(integer.toString() + ",");
            }
            System.out.println(builder.toString());
        }

        @Override
        public void snapshotState(FunctionSnapshotContext context) throws Exception {
            // nothing to do here; elements are added to the list states directly in processElement1/2
        }

        @Override
        public void initializeState(FunctionInitializationContext context) throws Exception {
            // Register one operator ListState per input.
            ListStateDescriptor<String> descriptor1 =
                    new ListStateDescriptor<>("state1", TypeInformation.of(new TypeHint<String>() {}));
            ListStateDescriptor<Integer> descriptor2 =
                    new ListStateDescriptor<>("state2", TypeInformation.of(new TypeHint<Integer>() {}));
            state1 = context.getOperatorStateStore().getListState(descriptor1);
            state2 = context.getOperatorStateStore().getListState(descriptor2);
        }
    }
}
Output:
input value is 4.state1 has state2 has 4,
input value is 2.state1 has 2,state2 has 4,
input value is 3.state1 has state2 has 3,
input value is 1.state1 has 1,state2 has 3,
input value is 5.state1 has state2 has 5,
I expect the last piece of output to be
input value is XX.state1 has 1,2,state2 has 3,4,5,
But the actual output looks as if the input items are partitioned: 4 and 2 end up in one partition, and 3 and 1 in another. I want to access all the data stored in state1 and state2 from both processElement1 and processElement2.
Upvotes: 1
Views: 321
Reputation: 43707
You should modify the beginning of your job, like this:
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.setParallelism(1);
...
This will cause the whole job to run with a parallelism of 1. You do have
resultStream.print().setParallelism(1);
which sets the print sink's parallelism to 1, but the rest of the job runs with the default parallelism, which is clearly greater than 1.
Alternatively, you could key both streams by the same constant key, and then use keyed state.
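For illustration, here is a minimal sketch of that second approach, assuming the same element types as your example. The class names KeyedTransactionJob and SharedStateCoProcessFunction, the constant key "shared", and the output strings are placeholders I chose, not from your code. Because every record of both streams maps to the same key, all elements are routed to the same parallel instance, and the keyed ListState registered in open() is visible from both processElement1 and processElement2:
import org.apache.flink.api.common.state.ListState;
import org.apache.flink.api.common.state.ListStateDescriptor;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.co.CoProcessFunction;
import org.apache.flink.util.Collector;

public class KeyedTransactionJob {

    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<String> stream1 = env.fromElements("1", "2");
        DataStream<Integer> stream2 = env.fromElements(3, 4, 5);

        // Key both streams by the same constant key, so every element of both
        // streams goes to the same parallel instance and shares keyed state.
        stream1.keyBy(new KeySelector<String, String>() {
                    @Override
                    public String getKey(String value) {
                        return "shared";
                    }
                })
                .connect(stream2.keyBy(new KeySelector<Integer, String>() {
                    @Override
                    public String getKey(Integer value) {
                        return "shared";
                    }
                }))
                .process(new SharedStateCoProcessFunction())
                .print();

        env.execute();
    }

    private static class SharedStateCoProcessFunction extends CoProcessFunction<String, Integer, String> {

        private transient ListState<String> state1;
        private transient ListState<Integer> state2;

        @Override
        public void open(Configuration parameters) {
            // Keyed ListState: scoped to the current key ("shared" for every record),
            // so both processElement1 and processElement2 see the same contents.
            state1 = getRuntimeContext().getListState(
                    new ListStateDescriptor<>("state1", TypeInformation.of(new TypeHint<String>() {})));
            state2 = getRuntimeContext().getListState(
                    new ListStateDescriptor<>("state2", TypeInformation.of(new TypeHint<Integer>() {})));
        }

        @Override
        public void processElement1(String value, Context ctx, Collector<String> out) throws Exception {
            state1.add(value);
            out.collect("state1 got " + value);
        }

        @Override
        public void processElement2(Integer value, Context ctx, Collector<String> out) throws Exception {
            state2.add(value);
            out.collect("state2 got " + value);
        }
    }
}
Note that with a single constant key all records funnel through one parallel instance for that key, so this variant trades away parallelism in exchange for making the shared-state semantics explicit.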
Upvotes: 3