randomuser1

Reputation: 2803

How can I skip the limit(number) call with a stream when the number equals 0?

I have some Java code that maps and filters objects from items, limiting the result based on maxNumber:

items.stream()
     .map(this::myMapper)
     .filter(item -> item != null)
     .limit(maxNumber)
     .collect(Collectors.toList());

It works properly, but the question is: is there a way to skip the limiting when maxNumber == 0?

I know I could do this:

if (maxNumber == 0) {
    items.stream()
         .map(this::myMapper)
         .filter(item -> item != null)
         .collect(Collectors.toList());
} else {
    items.stream()
         .map(this::myMapper)
         .filter(item -> item != null)
         .limit(maxNumber)
         .collect(Collectors.toList());
}

But perhaps there's a better way; does anything come to mind?

Upvotes: 22

Views: 1698

Answers (2)

Kayaman

Reputation: 73548

No, a stream pipeline doesn't allow you to skip any part of it. You're forced either to use conditional logic inside the steps, always including limit() in the pipeline, or to build the stream in parts, which is (IMHO) a bit more legible than the if/else in the question:

Stream<Item> s = items.stream()
         .map(this::myMapper)
         .filter(Objects::nonNull);

if(maxNumber > 0) {
    s = s.limit(maxNumber);
}

List<Item> l = s.collect(Collectors.toList());

In a simple case like this it doesn't make much difference, but in regular code you often see collections being passed through methods, converted to streams and then back to collections. In such cases it can be a better idea to keep working with streams until you really need to collect().
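To illustrate the "build the stream in parts" idea, here's a minimal sketch. The Item type is simplified to String, and myMapper is a made-up stand-in (it trims input and maps blanks to null so the filter step has something to remove); only the split between the stream-building method and the conditional limit() reflects the answer:

```java
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class StreamParts {

    // Hypothetical mapper: trims strings, turning blanks into null.
    static String myMapper(String s) {
        String t = s.trim();
        return t.isEmpty() ? null : t;
    }

    // The mapping/filtering part of the pipeline, without the terminal step.
    static Stream<String> mappedItems(List<String> items) {
        return items.stream()
                    .map(StreamParts::myMapper)
                    .filter(Objects::nonNull);
    }

    // The caller decides whether to apply limit() before collecting.
    static List<String> collect(List<String> items, long maxNumber) {
        Stream<String> s = mappedItems(items);
        if (maxNumber > 0) {
            s = s.limit(maxNumber);
        }
        return s.collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> items = List.of("a", " ", "b", "c");
        System.out.println(collect(items, 2)); // [a, b]
        System.out.println(collect(items, 0)); // [a, b, c]
    }
}
```

Because mappedItems returns a Stream rather than a collected List, the caller pays nothing extra for the conditional: no intermediate collection is built either way.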

Upvotes: 21

Jean-Baptiste Yunès

Reputation: 36401

I suppose that

.limit(maxNumber == 0 ? Long.MAX_VALUE : maxNumber)

will do the trick, as it's highly improbable that you'll ever tackle a stream with more than 2^63-1 elements...
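Applied to the pipeline from the question, the trick looks like this (a sketch with a made-up String mapper in place of myMapper; the only point is the ternary inside limit()):

```java
import java.util.List;
import java.util.stream.Collectors;

public class LimitTrick {

    static List<String> collect(List<String> items, long maxNumber) {
        return items.stream()
                    .map(String::trim)          // stand-in for myMapper
                    .filter(s -> !s.isEmpty())
                    // 0 means "no limit": Long.MAX_VALUE keeps every element
                    .limit(maxNumber == 0 ? Long.MAX_VALUE : maxNumber)
                    .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> items = List.of("a", " ", "b", "c");
        System.out.println(collect(items, 2)); // [a, b]
        System.out.println(collect(items, 0)); // [a, b, c]
    }
}
```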

Be careful with parallel streams here, though. A note in the API docs says:

API Note: While limit() is generally a cheap operation on sequential stream pipelines, it can be quite expensive on ordered parallel pipelines, especially for large values of maxSize...

Upvotes: 18
