Inconsistent behaviour in Timestamp.round #22591
Timestamp.round

This is in pandas/pandas/_libs/tslibs/timestamps.pyx, line 60 (at 3285bdc).

By inspecting the code I see further problems: pandas/pandas/_libs/tslibs/timestamps.pyx, lines 83 to 86 (at 3285bdc) assume that unit is a divisor of 1_000_000. Therefore

>>> a.round('27ns')
Timestamp('2000-01-01 00:00:00.124999351')

is wrong. In fact I would expect […]. Of course the semantics of […]
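A minimal sketch (pure Python, not pandas code) of what exact rounding to a 27 ns multiple looks like with integer-only arithmetic; the nanosecond value below is assumed for illustration, not taken from the report:

```python
def round_to_multiple(value: int, unit: int) -> int:
    """Round an int64 nanosecond count to the nearest multiple of `unit`,
    using only integer arithmetic (no float conversion, so no precision loss).
    Ties round away from the lower multiple for simplicity."""
    r = value % unit
    if 2 * r >= unit:
        return value + (unit - r)  # go up to the next multiple
    return value - r               # go down to the previous multiple

# Illustrative value: nanoseconds since the epoch near
# 2000-01-01 00:00:00.124999351 (assumed for this example).
t = 946_684_800_000_000_000 + 124_999_351
rounded = round_to_multiple(t, 27)
assert rounded % 27 == 0            # the result is an exact multiple of 27
assert 2 * abs(rounded - t) <= 27   # and it is the nearest such multiple
```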
I've just noticed that one costly bit of integer arithmetic is already present: pandas/pandas/_libs/tslibs/timestamps.pyx, line 78 (at 3285bdc). In fact, once you compute values % unit, floor and ceil are trivial:

floor = lambda values, unit: values - ( values % unit)
ceil  = lambda values, unit: values + (-values % unit)

plus some code for handling under/overflow. Rounding is a little more complicated, mainly because we first need to define the rounding semantics exactly. If the maintainers confirm that the present behaviour is a bug (and not a feature), I can submit a PR.
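The two lambdas above extend naturally to a half-up rounding mode built on the same `values % unit` remainder; a sketch assuming plain Python ints (overflow handling near the int64 limits, which the comment mentions, is deliberately omitted here):

```python
def floor_to(value: int, unit: int) -> int:
    # Largest multiple of unit <= value (Python's % keeps the
    # remainder non-negative for positive unit, even for negative value).
    return value - value % unit

def ceil_to(value: int, unit: int) -> int:
    # -value % unit is the distance up to the next multiple (0 if exact).
    return value + (-value % unit)

def round_half_up(value: int, unit: int) -> int:
    # Nearest multiple of unit; exact ties go toward +infinity.
    r = value % unit
    return value + (unit - r) if 2 * r >= unit else value - r
```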
The same problem exists in pandas/pandas/_libs/tslibs/timedeltas.pyx, lines 1161 to 1167 (at 3285bdc), but of course precision loss is very unlikely in real-world applications.
This issue is exactly the same as #19206, only showing up in a different edge case. Taking into account that the current code also patches #21262, I would suggest an entirely new implementation based on […]. A similar rounding bug due to floating-point arithmetic is numpy/numpy#11881.
I have a proof-of-concept implementation in miccoli/pandas@1adfd41. Before issuing a PR I would like to receive some comments on the following aspects: […]. The current code seems to me badly broken, but things concerning rounding are really complex, therefore a review of the proposed new semantics is necessary.
so does your soln break any tests? seems a lot simpler though. pls don't change actual rounding semantics; whether using numpy functions or not doesn't matter
Yes, a few tests are broken, e.g.

_____________ TestTimestampUnaryOps.test_round_frequencies[20130104 12:00:00-D-20130105] _____________

timestamp = '20130104 12:00:00', freq = 'D', expected = Timestamp('2013-01-05 00:00:00')

    @pytest.mark.parametrize('timestamp, freq, expected', [
        ('20130101 09:10:11', 'D', '20130101'),
        ('20130101 19:10:11', 'D', '20130102'),
        ('20130201 12:00:00', 'D', '20130202'),
        ('20130104 12:00:00', 'D', '20130105'),
        ('2000-01-05 05:09:15.13', 'D', '2000-01-05 00:00:00'),
        ('2000-01-05 05:09:15.13', 'H', '2000-01-05 05:00:00'),
        ('2000-01-05 05:09:15.13', 'S', '2000-01-05 05:09:15')
    ])
    def test_round_frequencies(self, timestamp, freq, expected):
        dt = Timestamp(timestamp)
        result = dt.round(freq)
        expected = Timestamp(expected)
>       assert result == expected
E       AssertionError: assert Timestamp('2013-01-04 00:00:00') == Timestamp('2013-01-05 00:00:00')

pandas/tests/scalar/timestamp/test_unary_ops.py:35: AssertionError

The reason for dropping the default […] rounding mode is illustrated by comparing the released version with my branch:
>>> pd.__version__
'0.23.4'
>>> pd.to_datetime('2018-01-01 12:00:00').round('1d')
Timestamp('2018-01-01 00:00:00')
>>> pd.to_datetime('2018-01-02 12:00:00').round('1d')
Timestamp('2018-01-03 00:00:00')
>>> pd.to_datetime('2018-01-03 12:00:00').round('1d')
Timestamp('2018-01-03 00:00:00')
>>> pd.to_datetime('2019-01-01 12:00:00').round('1d')
Timestamp('2019-01-02 00:00:00')
>>> pd.to_datetime('2019-01-02 12:00:00').round('1d')
Timestamp('2019-01-02 00:00:00')
>>> pd.to_datetime('2019-01-03 12:00:00').round('1d')
Timestamp('2019-01-04 00:00:00')
>>> d = pd.DatetimeIndex(start='2018-01-01 00:00:00', end='2018-12-31 23:59:59.999999999', freq='1h')
>>> (d.round('1d') - d).values.sum() / np.timedelta64(1, 'D')
-0.5
>>> pd.__version__
'0.24.0.dev0+564.g1adfd4155'
>>> pd.to_datetime('2018-01-01 12:00:00').round('1d')
Timestamp('2018-01-01 00:00:00')
>>> pd.to_datetime('2018-01-02 12:00:00').round('1d')
Timestamp('2018-01-02 00:00:00')
>>> pd.to_datetime('2018-01-03 12:00:00').round('1d')
Timestamp('2018-01-03 00:00:00')
>>> pd.to_datetime('2019-01-01 12:00:00').round('1d')
Timestamp('2019-01-01 00:00:00')
>>> pd.to_datetime('2019-01-02 12:00:00').round('1d')
Timestamp('2019-01-02 00:00:00')
>>> pd.to_datetime('2019-01-03 12:00:00').round('1d')
Timestamp('2019-01-03 00:00:00')
>>> d = pd.DatetimeIndex(start='2018-01-01 00:00:00', end='2018-12-31 23:59:59.999999999', freq='1h')
>>> (d.round('1d') - d).values.sum() / np.timedelta64(1, 'D')
-182.5

Here you see that noon is rounded up or down to the midnight of the same or the following day depending on whether the day number (counted starting from the Unix epoch, '1970-01-01') is odd or even. On the contrary, with my proposed implementation noon is always rounded down to midnight of the same day, which is more intuitive. Of course a rigorous assessment of the merits or demerits of both rounding modes requires an analysis of all the possible settings in which […]. I understand the need for not breaking compatibility with the previous behaviour, so I will also implement the […]
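The alternating behaviour in the first transcript is what round-half-to-even (banker's rounding) looks like when applied to whole-day counts; a self-contained sketch (not pandas code) of that mode on nanosecond values:

```python
NS_PER_DAY = 86_400_000_000_000  # nanoseconds in one day

def round_half_to_even(value: int, unit: int) -> int:
    # Round to the nearest multiple of unit; on an exact tie,
    # pick the multiple with the even quotient (banker's rounding).
    q, r = divmod(value, unit)
    if 2 * r > unit or (2 * r == unit and q % 2 == 1):
        q += 1
    return q * unit

# Noon is always an exact tie between two midnights, so the result
# alternates with the parity of the day number:
noon_day0 = NS_PER_DAY // 2                # 1970-01-01 12:00
noon_day1 = NS_PER_DAY + NS_PER_DAY // 2   # 1970-01-02 12:00
assert round_half_to_even(noon_day0, NS_PER_DAY) == 0                # down to day 0
assert round_half_to_even(noon_day1, NS_PER_DAY) == 2 * NS_PER_DAY   # up to day 2
```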
I have miccoli/pandas@0416b06 almost ready for a PR, implementing a rounding mode compatible with the current flawed implementation (which is based on floating-point rounding semantics), and I have just one point I'm unable to resolve: I'm using […] (line 27 in 4612a82).

Can I leave the faster […]
Code Sample
Problem description
According to the standard definition of rounding to a multiple, b.round('us') in the example above gives the wrong answer. I suspect that there is a loss of precision due to a floating-point conversion.
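The suspected precision loss is easy to reproduce in isolation: a nanosecond-resolution epoch value does not survive a round trip through float64. The value below is assumed for illustration, not taken from the report's (missing) code sample:

```python
# Nanoseconds since the epoch, roughly 2000-01-01 00:00:00.124999351
# (an illustrative value).
t = 946_684_800_124_999_351

# float64 has a 53-bit significand, so integers this large are not
# exactly representable: the round trip loses the low-order digits.
assert t > 2**53
assert int(float(t)) != t
```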
Expected Output
Output of
pd.show_versions()
INSTALLED VERSIONS
commit: None
python: 3.7.0.final.0
python-bits: 64
OS: Darwin
OS-release: 17.7.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
pandas: 0.23.4
pytest: None
pip: 18.0
setuptools: 39.0.1
Cython: None
numpy: 1.15.1
scipy: None
pyarrow: None
xarray: None
IPython: None
sphinx: None
patsy: None
dateutil: 2.7.3
pytz: 2018.5
blosc: None
bottleneck: None
tables: None
numexpr: None
feather: None
matplotlib: None
openpyxl: None
xlrd: None
xlwt: None
xlsxwriter: None
lxml: None
bs4: None
html5lib: None
sqlalchemy: None
pymysql: None
psycopg2: None
jinja2: None
s3fs: None
fastparquet: None
pandas_gbq: None
pandas_datareader: None