Losing Nanosecond precision upon conversion to DatetimeIndex #2252
Comments
Joe, I think your point is obscured by the markup behavior on < and >. I see the same puzzling behavior: the last two timestamps are not the same.
Looks like a bug. Will have to wait for 0.9.2, as I just cut the 0.9.1 release; we'll fix it in the dev version as soon as we can.
Kajdogg, thanks for fixing. Wes, thanks for the quick response! I'll keep an eye out for it.
Not sure how big your series is, but getting the raw epoch nanoseconds out of the Timestamps works, if you have time for the extraction. (The inline example was garbled by the markup; it ended with Timestamp: 2012-11-14 23:06:30.001001001.)
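A minimal sketch of that workaround, assuming the comment refers to Timestamp.value (the raw int64 epoch-nanosecond representation); values are illustrative:

```python
import pandas as pd

ts = pd.Timestamp('2012-11-14 23:06:30.001001001')

# Timestamp.value exposes the raw epoch nanoseconds as an int64,
# so nothing is lost even if a DatetimeIndex conversion truncates.
ns = ts.value
print(ns)                # 1352934390001001001

# Rebuilding a Timestamp from the raw nanoseconds keeps full precision.
print(pd.Timestamp(ns))  # Timestamp('2012-11-14 23:06:30.001001001')
```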
This should be fixed on master now. Do you want to give it a shot? |
Works great now. Thanks for the quick fix!! |
sweet. thanks for the report |
All,
I am experiencing a problem when converting from a TimeSeries to a DatetimeIndex. In the code below, I create a TimeSeries of nanosecond-precision Timestamps. I then convert it to a DatetimeIndex and lose the nanosecond precision (but keep the microseconds). This appears to be a bug, as the resulting elements of the DatetimeIndex are of type Timestamp and can handle nanosecond precision. Please see the snippet below.
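The original snippet did not survive the issue's markup; below is a minimal reconstruction of the reported behavior as described, with illustrative values (on versions with the fix, the nanoseconds are preserved):

```python
import pandas as pd

# A Series of nanosecond-precision Timestamps
ts = pd.Timestamp('2012-11-14 23:06:30.001001001')
s = pd.Series([ts, ts + pd.Timedelta(1, unit='ns')])

# Converting to a DatetimeIndex; on pandas 0.9.1 this reportedly
# truncated the timestamps to microsecond precision.
idx = pd.DatetimeIndex(s)

print(s.iloc[1])  # 2012-11-14 23:06:30.001001002
print(idx[1])     # fixed versions keep the nanoseconds; 0.9.1 dropped them
```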
Any help/resolution would be greatly appreciated.
Thanks,
Joe