Reputation: 457
Can anybody reproduce this behavior of xarray when saving times with large values? I'm a bit at a loss as to what is happening here.
Edit: It seems xarray is doing something wrong once the numeric value of "time" exceeds a certain threshold. Note that this only occurs for "days since" and not, e.g., for "seconds since"; a quick check of the "seconds since" case follows the ncdump output below.
I am using Python 3 and xarray version 0.10.7.
import numpy as np
import xarray as xr
# print('xarray version: {}'.format(xr.__version__))

# Build a time coordinate from large numeric offsets relative to 1800-01-01.
ds = xr.Dataset(coords={'time': (
    'time',
    np.arange(106300.5, 106665.5 + 5*365, 365),
    {'units': 'days since 1800-01-01 00:00:00'})})
# print(ds.time)

# Decode the offsets into datetime64 timestamps, write the dataset to disk,
# and read the raw (undecoded) values back.
ds = xr.decode_cf(ds)
# print(ds.time)
ds.to_netcdf('./test.nc')
ds = xr.open_dataset('./test.nc', decode_cf=False)
print(ds.time)
Out:
<xarray.DataArray 'time' (time: 6)>
array([ 106300.5 , 106665.5 , -106473.482335, -106108.482335,
-105743.482335, -105378.482335])
Coordinates:
* time (time) float64 1.063e+05 1.067e+05 -1.065e+05 -1.061e+05 ...
Attributes:
_FillValue: nan
units: days since 1800-01-01
calendar: proleptic_gregorian
Edit: Here is the file content according to ncdump:
netcdf test {
dimensions:
time = 6 ;
variables:
double time(time) ;
time:_FillValue = NaN ;
time:units = "days since 1800-01-01" ;
time:calendar = "proleptic_gregorian" ;
// global attributes:
:_NCProperties = "version=1|netcdflibversion=4.4.1.1|hdf5libversion=1.10.1" ;
data:
time = 106300.5, 106665.5, -106473.482334601, -106108.482334601,
-105743.482334601, -105378.482334601 ;
}
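For reference, here is a minimal check of the "seconds since" case (same numeric values, different units; the filename is just illustrative). The decoded dates land only about 1.2 days past 1800-01-01, so the values survive the round trip unchanged:

import numpy as np
import xarray as xr

# Same offsets as above, but interpreted as seconds rather than days.
ds = xr.Dataset(coords={'time': (
    'time',
    np.arange(106300.5, 106665.5 + 5*365, 365),
    {'units': 'seconds since 1800-01-01 00:00:00'})})
ds = xr.decode_cf(ds)
ds.to_netcdf('./test_seconds.nc')
print(xr.open_dataset('./test_seconds.nc', decode_cf=False).time.values)
# Expected: [106300.5 106665.5 107030.5 107395.5 107760.5 108125.5]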
Upvotes: 1
Views: 1422
Reputation: 2097
Yes, I can reproduce this. It could be considered a bug in xarray; it might be worth raising an issue on GitHub.
When saving the file, under the hood xarray takes the decoded dates and converts them to timedeltas since a reference date. The issue is that the dates in your example dataset straddle the point roughly 292 years after the reference date provided (1800-01-01); the first two fall inside that span and encode correctly, while the rest overflow.
In [1]: import numpy as np
In [2]: import xarray as xr
In [3]: ds = xr.Dataset(coords={'time': (
...: 'time',
...: np.arange(106300.5, 106665.5+5*365, 365),
...: {'units': 'days since 1800-01-01 00:00:00'})})
...:
In [4]: ds = xr.decode_cf(ds)
In [5]: ds.time
Out[5]:
<xarray.DataArray 'time' (time: 6)>
array(['2091-01-15T12:00:00.000000000', '2092-01-15T12:00:00.000000000',
'2093-01-14T12:00:00.000000000', '2094-01-14T12:00:00.000000000',
'2095-01-14T12:00:00.000000000', '2096-01-14T12:00:00.000000000'],
dtype='datetime64[ns]')
Coordinates:
* time (time) datetime64[ns] 2091-01-15T12:00:00 2092-01-15T12:00:00 ...
In [6]: ds.to_netcdf('so.nc')
In [7]: xr.open_dataset('so.nc', decode_times=False).time
Out[7]:
<xarray.DataArray 'time' (time: 6)>
array([ 106300.5 , 106665.5 , -106473.482335, -106108.482335,
-105743.482335, -105378.482335])
Coordinates:
* time (time) float64 1.063e+05 1.067e+05 -1.065e+05 -1.061e+05 ...
Attributes:
units: days since 1800-01-01
calendar: proleptic_gregorian
292 years is roughly the maximum length of time a np.timedelta64 object with nanosecond precision can represent (see the NumPy datetime64 documentation); any more than that and you run into integer overflow, which is the cause of the negative values.
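You can see the overflow directly with plain NumPy arithmetic; here is a minimal sketch using dates from the decoded output above (on most NumPy versions the subtraction wraps silently rather than raising):

import numpy as np

ref = np.datetime64('1800-01-01', 'ns')
ok = np.datetime64('2092-01-15T12:00', 'ns')   # within ~292 years of ref
bad = np.datetime64('2093-01-14T12:00', 'ns')  # just past the limit

# Both datetimes are representable, but the second difference exceeds the
# signed 64-bit nanosecond range and wraps around to a negative timedelta.
print((ok - ref) / np.timedelta64(1, 'D'))   # 106665.5
print((bad - ref) / np.timedelta64(1, 'D'))  # roughly -106473.48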
A workaround you could use is to overwrite the units encoding associated with the times in your dataset with a new value:
In [8]: ds.time.encoding['units'] = 'days since 1970-01-01'
In [9]: ds.to_netcdf('so-workaround.nc')
In [10]: xr.open_dataset('so-workaround.nc', decode_times=False).time
Out[10]:
<xarray.DataArray 'time' (time: 6)>
array([44209.5, 44574.5, 44939.5, 45304.5, 45669.5, 46034.5])
Coordinates:
* time (time) float64 4.421e+04 4.457e+04 4.494e+04 4.53e+04 4.567e+04 ...
Attributes:
units: days since 1970-01-01
calendar: proleptic_gregorian
Here I have chosen 'days since 1970-01-01' intentionally, since this is the epoch that np.datetime64 objects are centered around in NumPy.
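As a quick illustration of why 1970 is a safe choice, the datetime64[ns] epoch and its representable range can be checked directly (a small sketch):

import numpy as np

# The datetime64[ns] epoch is 1970-01-01; the int64 nanosecond counter
# covers roughly 1677 to 2262, i.e. about +/- 292 years around the epoch.
print(np.datetime64(0, 'ns'))                           # 1970-01-01T00:00:00
print(np.datetime64(np.iinfo(np.int64).min + 1, 'ns'))  # 1677-09-21T00:12:43...
print(np.datetime64(np.iinfo(np.int64).max, 'ns'))      # 2262-04-11T23:47:16...

Any reference date close to the data keeps the encoded offsets well inside that window.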
Upvotes: 2