Reputation: 59519
I stumbled upon this odd behavior when trying to check whether a DataFrame has values above a certain date, while that DataFrame may also contain pd.NaT. Comparisons of scalar values behave as expected:
import pandas as pd
pd.NaT > pd.to_datetime('2018-10-15')
# False
Comparisons with a Series also behave as expected:
s = pd.Series([pd.NaT, pd.to_datetime('2018-10-16')])
s > pd.to_datetime('2018-10-15')
#0 False
#1 True
#dtype: bool
But the DataFrame comparison isn't correct:
s.to_frame() > pd.to_datetime('2018-10-15')
# 0
#0 True
#1 True
It seems to me the issue is that the comparison initially returns NaN, which is (at some point?) coerced to True, given the behavior of:
df = pd.DataFrame([[pd.NaT, pd.to_datetime('2018-10-16')],
[pd.to_datetime('2018-10-16'), pd.NaT]])
df >= pd.to_datetime('2018-10-15')
# 0 1
#0 True True
#1 True True
df.ge(pd.to_datetime('2018-10-15'))
# 0 1
#0 NaN 1.0
#1 1.0 NaN
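One plausible mechanism (a guess at the internals, not confirmed): NaN is truthy in Python, so if pandas casts the NaN intermediate result to bool, every NaT slot would come out True:

```python
# float('nan') is a non-zero float, so Python treats it as truthy.
# A naive bool() cast of the NaN comparison result therefore
# produces True rather than False:
print(bool(float('nan')))  # True
```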
So can we really not use the > < >= <= operators when comparing against a DataFrame, and do we instead need to rely on .lt .gt .le .ge followed by a .fillna(0)?
df.ge(pd.to_datetime('2018-10-15')).fillna(0)
# 0 1
#0 0.0 1.0
#1 1.0 0.0
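As a side note, if a proper boolean result is wanted rather than the 0.0/1.0 floats that .fillna(0) leaves behind, chaining .astype(bool) restores the dtype (a minor variant of the workaround above, not a fix for the underlying bug):

```python
import pandas as pd

ts = pd.to_datetime('2018-10-15')
df = pd.DataFrame([[pd.NaT, pd.to_datetime('2018-10-16')],
                   [pd.to_datetime('2018-10-16'), pd.NaT]])

# Fill the NaN slots left by NaT comparisons, then cast back to bool
# so downstream boolean indexing works as usual.
result = df.ge(ts).fillna(0).astype(bool)
```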
Upvotes: 4
Views: 4239
Reputation: 33793
This was a bug that will be fixed in the next release of pandas (0.24.0):
In [1]: import pandas as pd; pd.__version__
Out[1]: '0.24.0.dev0+1504.g9642fea9c'
In [2]: s = pd.Series([pd.NaT, pd.to_datetime('2018-10-16')])
In [3]: s > pd.to_datetime('2018-10-15')
Out[3]:
0 False
1 True
dtype: bool
In [4]: s.to_frame() > pd.to_datetime('2018-10-15')
Out[4]:
0
0 False
1 True
In [5]: df = pd.DataFrame([[pd.NaT, pd.to_datetime('2018-10-16')],
...: [pd.to_datetime('2018-10-16'), pd.NaT]])
...:
In [6]: df >= pd.to_datetime('2018-10-15')
Out[6]:
0 1
0 False True
1 True False
In [7]: df.ge(pd.to_datetime('2018-10-15'))
Out[7]:
0 1
0 False True
1 True False
For the corresponding GitHub issue, see: https://github.com/pandas-dev/pandas/issues/22242
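For completeness, the fix can be sanity-checked on an installed version with a short script (nothing here is version-specific beyond requiring pandas 0.24.0 or later):

```python
import pandas as pd

ts = pd.to_datetime('2018-10-15')
s = pd.Series([pd.NaT, pd.to_datetime('2018-10-16')])

# On pandas >= 0.24.0 the Series and single-column DataFrame
# comparisons agree, and NaT compares False:
series_result = s > ts
frame_result = (s.to_frame() > ts)[0]
assert series_result.equals(frame_result)
assert series_result.tolist() == [False, True]
```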
Upvotes: 4