Reputation: 357
What is the best way to find the maximum number of consecutive repeated nan in a numpy array?
Examples:
from numpy import nan
Input 1: [nan, nan, nan, 0.16, 1, 0.16, 0.9999, 0.0001, 0.16, 0.101, nan, 0.16]
Output 1: 3
Input 2: [nan, nan, 2, 1, 1, nan, nan, nan, nan, 0.101, nan, 0.16]
Output 2: 4
Upvotes: 8
Views: 2222
Reputation: 5031
This can be done quite efficiently in NumPy without using any loops.
If we call the sequence x, we can find the maximum number of consecutive nans with:

np.max(np.diff(np.concatenate(([-1], np.where(~np.isnan(x))[0], [len(x)]))) - 1)

(~ negates the boolean mask; the unary - that older NumPy accepted here raises a TypeError in current versions.)
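Unpacked step by step, the expression looks like this (a sketch on one of the question's inputs; variable names are made up for illustration):

```python
import numpy as np

x = np.array([np.nan, np.nan, 2, 1, 1, np.nan, np.nan, np.nan, np.nan, 0.101, np.nan, 0.16])

# indices of the non-NaN entries, with sentinels just before and just after the array
non_nan = np.where(~np.isnan(x))[0]                 # [ 2  3  4  9 11]
padded = np.concatenate(([-1], non_nan, [len(x)]))  # [-1  2  3  4  9 11 12]

# the gap between consecutive non-NaN indices, minus 1, is a NaN run length
runs = np.diff(padded) - 1                          # [2 0 0 4 1 0]
print(runs.max())                                   # 4
```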
Upvotes: 0
Reputation: 18628
A performance improvement is possible, especially when long NaN sequences exist: in that case there is no need to test every value.
Using @MSeifert's approach and notation, the array can be swept in steps of max_ instead of one, whenever a hole appears in a max_-length block:
import math
import numba as nb

@nb.njit
def max_consecutive_nan2(arr):
    max_ = 0
    idx = 0
    while idx < arr.size:
        while idx < arr.size and math.isnan(arr[idx]):  # "amelioration": extend the record
            max_ += 1
            idx += 1
        while idx < arr.size - max_:
            idx2 = idx + max_
            while idx2 > idx and math.isnan(arr[idx2]):
                idx2 -= 1
            if idx2 == idx:  # current record equalled
                idx = idx + max_ + 1
                break  # back to the "amelioration" loop
            idx = idx2  # skip useless tests
        else:
            return max_
    return max_  # case: the record run is at the end
Results:
arr = np.random.rand(10000)
arr[np.random.choice(range(len(arr)),size=4000,replace=0)] = np.nan
In [25]: max_consecutive_nan(arr)
Out[25]: 14
In [26]: max_consecutive_nan2(arr)
Out[26]: 14
And performance:
In [27]: %timeit max_consecutive_nan2(arr)
100000 loops, best of 3: 3.29 µs per loop
In [28]: %timeit max_consecutive_nan(arr) # MSeifert
10000 loops, best of 3: 68.5 µs per loop
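For anyone without numba, a plain-Python reference implementation (hypothetical name, same O(n) counting idea) is handy for sanity-checking both versions:

```python
import math

def max_nan_run_reference(values):
    # straightforward single pass: count each consecutive-NaN run, keep the longest
    best = current = 0
    for v in values:
        current = current + 1 if math.isnan(v) else 0
        best = max(best, current)
    return best

nan = float('nan')
print(max_nan_run_reference([nan, nan, 2, 1, 1, nan, nan, nan, nan, 0.101, nan, 0.16]))  # 4
```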
Upvotes: 1
Reputation: 18628
Another easy-to-read way is to build a string from the NaN mask and use str.split:

array2 = [nan, nan, 2, 1, 1, nan, nan, nan, nan, 0.101, nan, 0.16]

thestring = np.isnan(array2).tobytes().decode()
# '\x01\x01\x00\x00\x00\x01\x01\x01\x01\x00\x01\x00'
m = max(len(c) for c in thestring.split('\x00'))
# 4
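The decode step can be skipped by splitting the raw bytes directly (same idea, sketched on the same input):

```python
import numpy as np

nan = float('nan')
array2 = [nan, nan, 2, 1, 1, nan, nan, nan, nan, 0.101, nan, 0.16]

# np.isnan yields one byte per element (\x01 for NaN, \x00 otherwise);
# splitting on \x00 leaves one chunk per NaN run
runs = np.isnan(array2).tobytes().split(b'\x00')
print(max(len(run) for run in runs))  # 4
```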
Upvotes: 0
Reputation: 152647
I don't know if you have numba, but it's very handy (and fast) for problems like this:
import math
import numba as nb
import numpy as np

@nb.njit  # also works without it, but then it's several orders of magnitude slower
def max_consecutive_nan(arr):
    max_ = 0
    current = 0
    idx = 0
    while idx < arr.size:
        while idx < arr.size and math.isnan(arr[idx]):
            current += 1
            idx += 1
        if current > max_:
            max_ = current
        current = 0
        idx += 1
    return max_
For your examples:
>>> from numpy import nan
>>> max_consecutive_nan(np.array([nan, nan, 2, 1, 1, nan, nan, nan, nan, 0.101, nan, 0.16]))
4
>>> max_consecutive_nan(np.array([nan, nan, nan, 0.16, 1, 0.16, 0.9999, 0.0001, 0.16, 0.101, nan, 0.16]))
3
>>> max_consecutive_nan(np.array([0.16, 0.16, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]))
22
Using the benchmark proposed by @Divakar and ordering by performance (the complete code for the benchmarks can be found in this gist):
arr = np.random.rand(10000)
arr[np.random.choice(range(len(arr)),size=1000,replace=0)] = np.nan
%timeit mine(arr) # 10000 loops, best of 3: 67.7 µs per loop
%timeit Divakar_v2(arr) # 1000 loops, best of 3: 196 µs per loop
%timeit Divakar(arr) # 1000 loops, best of 3: 252 µs per loop
%timeit Tagc(arr) # 100 loops, best of 3: 6.92 ms per loop
%timeit Kasramvd(arr) # 10 loops, best of 3: 38.2 ms per loop
%timeit pltrdy(arr) # 10 loops, best of 3: 70.9 ms per loop
Upvotes: 4
Reputation: 221534
Here's one approach -
def max_repeatedNaNs(a):
    # Mask of NaNs
    mask = np.concatenate(([False], np.isnan(a), [False]))
    if ~mask.any():
        return 0
    else:
        # Count of NaNs in each NaN group; the max count is the output
        c = np.flatnonzero(mask[1:] < mask[:-1]) - \
            np.flatnonzero(mask[1:] > mask[:-1])
        return c.max()
Here's an improved version -
def max_repeatedNaNs_v2(a):
    mask = np.concatenate(([False], np.isnan(a), [False]))
    if ~mask.any():
        return 0
    else:
        idx = np.nonzero(mask[1:] != mask[:-1])[0]
        return (idx[1::2] - idx[::2]).max()
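To see what the v2 version computes, here is its edge-detection step on a small made-up input:

```python
import numpy as np

a = np.array([np.nan, np.nan, 2.0, 1.0, np.nan, np.nan, np.nan, 5.0])

# pad the NaN mask with False so runs touching the ends still produce two edges
mask = np.concatenate(([False], np.isnan(a), [False]))

# a True/False change marks a run boundary; edges come in (start, stop) pairs
idx = np.nonzero(mask[1:] != mask[:-1])[0]
print(idx)                           # [0 2 4 7]
print((idx[1::2] - idx[::2]).max())  # 3
```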
Benchmarking in response to @pltrdy's comment -
In [77]: a = np.random.rand(10000)
In [78]: a[np.random.choice(range(len(a)),size=1000,replace=0)] = np.nan
In [79]: %timeit contiguous_NaN(a) #@pltrdy's solution
100 loops, best of 3: 15.8 ms per loop
In [80]: %timeit max_repeatedNaNs(a)
10000 loops, best of 3: 103 µs per loop
In [81]: %timeit max_repeatedNaNs_v2(a)
10000 loops, best of 3: 86.4 µs per loop
Upvotes: 6
Reputation: 2109
Here's my solution. Computational complexity is O(n) with n = len(arr); space is O(1):
import numpy as np

def contiguous_NaN(arr):
    count, max_count = 0, 0
    for e in arr:
        if np.isnan(e):
            count += 1
            max_count = max(max_count, count)
        else:
            count = 0
    return max_count
Upvotes: 0
Reputation: 9076
I posted another answer based on itertools, but I believe this one is better:
from itertools import groupby
from numpy import nan

def longest_nan_run(sequence):
    return max((sum(1 for _ in group)
                for key, group in groupby(sequence) if key is nan), default=0)

if __name__ == '__main__':
    array1 = [nan, nan, nan, 0.16, 1, 0.16, 0.9999, 0.0001, 0.16, 0.101, nan, 0.16]
    array2 = [nan, nan, 2, 1, 1, nan, nan, nan, nan, 0.101, nan, 0.16]

    print(longest_nan_run(array1))  # 3
    print(longest_nan_run(array2))  # 4
    print(longest_nan_run([]))      # 0
    print(longest_nan_run([1, 2]))  # 0
Edit: This now handles the case where no nan values are present (thanks to MSeifert for pointing it out).
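One caveat worth noting: groupby compares consecutive keys with ==, and nan != nan; the snippet above still works because the list repeats the very same nan object and CPython's comparison takes an identity shortcut. A sketch that groups on "is this NaN?" instead (hypothetical name) stays correct even when the NaNs are distinct float objects:

```python
import math
from itertools import groupby

def longest_nan_run_keyed(sequence):
    # group on a boolean "is NaN?" key rather than on the raw values
    groups = groupby(sequence, key=lambda v: isinstance(v, float) and math.isnan(v))
    return max((sum(1 for _ in g) for is_nan, g in groups if is_nan), default=0)

# two distinct nan objects still count as one run of length 2
print(longest_nan_run_keyed([float('nan'), float('nan'), 1.0]))  # 2
```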
Upvotes: 1