dt.total_seconds() stores float with appending 0.00000000000001 #34290
I can't reproduce this on OSX (py37) or Ubuntu (py36). @sjvdm, can you get this in any other Python versions?
OK, so I reproduced this on two other versions as well. The result of total_seconds() seems to be stored as 120.00000000000001.
This may also be related to floating-point precision, as a colleague pointed out: https://www.python.org/dev/peps/pep-0485/
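For reference, PEP 485's math.isclose() is the standard way to compare floats with a tolerance; a small sketch using the number reported above:

```python
import math

reported = 120.00000000000001   # value reported by dt.total_seconds()
expected = 120.0

# Exact equality fails because the stored float is one ulp above 120.0 ...
print(reported == expected)              # False

# ... but PEP 485's isclose() treats the two as equal under its default
# relative tolerance of 1e-09.
print(math.isclose(reported, expected))  # True
```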
I can "reproduce" this (on Linux). Using a simpler example: with a Timedelta scalar it doesn't give the issue, but with a TimedeltaArray it does:
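A minimal sketch of the comparison being described (the two-minute value is illustrative, chosen to match the 120.00000000000001 reported above):

```python
import pandas as pd

# Scalar Timedelta: total_seconds() comes out exact.
td = pd.Timedelta(minutes=2)
print(td.total_seconds())              # 120.0

# The same duration held in a Series (backed by a TimedeltaArray)
# shows the tiny offset on affected versions.
s = pd.Series([pd.Timedelta(minutes=2)])
print(s.dt.total_seconds().iloc[0])    # 120.00000000000001 on affected versions
```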
But it's quite probably indeed a floating-point precision issue that can be ignored.
Looks like I just hit the same issue with pandas 1.4.3 under OSX 12.4 and Anaconda. Below is a MWE adapted from the case that got me to realize this. Looks like this issue is still relevant! Sample code:
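A hedged reconstruction of such a MWE (the timestamps and the 15-second gap are illustrative, chosen to line up with the 15.0 mentioned in the next comment):

```python
import pandas as pd

start = pd.Series(pd.to_datetime(["2022-07-01 12:00:00"]))
end = pd.Series(pd.to_datetime(["2022-07-01 12:00:15"]))

# On affected versions this prints something like 15.000000000000002
# instead of the exact 15.0.
elapsed = (end - start).dt.total_seconds()
print(elapsed.iloc[0])
```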
I get 15.0 in main. IIRC there was a PR last week that touched DatetimeArray.total_seconds that might have fixed this.
I guess we could use a unit test to confirm.
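A sketch of the kind of check meant here (not the actual test that ended up in pandas, just an illustration using pd.testing.assert_series_equal):

```python
import pandas as pd


def test_dt_total_seconds_is_exact():
    # A whole number of seconds should survive total_seconds() exactly.
    start = pd.Series(pd.to_datetime(["2022-07-01 12:00:00"]))
    end = pd.Series(pd.to_datetime(["2022-07-01 12:00:15"]))

    result = (end - start).dt.total_seconds()
    expected = pd.Series([15.0])

    pd.testing.assert_series_equal(result, expected, check_exact=True)
```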
Simple test to show that if I have two datetime columns and use dt.total_seconds() to calc the difference, values are stored with an offset of 0.00000000000001.
IPython code and output:
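An illustrative reconstruction of the report (column names and timestamps are made up; the 120.00000000000001 figure is the one quoted in the comments above):

```python
import pandas as pd

df = pd.DataFrame(
    {
        "start": pd.to_datetime(["2020-05-20 10:00:00"]),
        "end": pd.to_datetime(["2020-05-20 10:02:00"]),
    }
)

# Difference between the two datetime columns, converted to seconds.
df["delta_seconds"] = (df["end"] - df["start"]).dt.total_seconds()

print(df["delta_seconds"].iloc[0])
# On affected versions: 120.00000000000001 (instead of 120.0)
```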