Integers coerced to float on Series construction from dictionary. #8211
Comments
For a Series to be performant, it needs to contain data with a homogeneous type, so this is indeed working as intended. If you really want to faithfully preserve type distinctions like float/int, you can pass in a numpy.ndarray with dtype=object.
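A minimal sketch of that workaround (the original comment was truncated; dtype=object is the assumed completion):

```python
import numpy as np
import pandas as pd

# An object-dtype array keeps each element's original Python type,
# so nothing is upcast to float64 during Series construction.
arr = np.array([1, 2.5], dtype=object)
s = pd.Series(arr, index=['a', 'b'])
print(s.dtype)        # object
print(type(s['a']))   # int   -- preserved, not coerced to float
print(type(s['b']))   # float
```

The trade-off is the one noted above: arithmetic on an object-dtype Series is much slower than on a homogeneous float64 one.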
I tend to think this question is rather about the fact that in Python 2.x:

In [3]: type(sys.maxint)
Out[3]: int

In [4]: type(sys.maxint + 1)
Out[4]: long

and the fact that the two cases are inconsistent with each other.
Sorry, I should have put this in the post, but the truncation behavior occurs even when I fully expect it. I also realize that mathematical operations on a heterogeneously-typed Series will be slower. The values I'm loading here are coming out of a database, and the particular values being truncated are integer representations of Timestamps, which will eventually become a DatetimeIndex on a DataFrame built from a sequence of these Series objects.
This is related to the maxint rollover on OS X (only), see here: #3922. Not sure what, if anything, can be done about this. OS X is just weird here and doesn't behave properly (in its Python implementation).
When constructing a Series from a dictionary, if the dictionary contains floats, all integer values in the dictionary are coerced to float, unless the dictionary also contains an integer value > sys.maxint.

Minimal repro case:
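The original repro code was not preserved in this capture; the following is a minimal sketch of the reported behavior, assuming Python 2 and pandas 0.14.x (sys.maxint does not exist on Python 3):

```python
import sys

import pandas as pd

# Mixed int/float dict: the integer is silently upcast to float64.
coerced = pd.Series({'a': 1, 'b': 2.5})
print(coerced.dtype)
print(coerced['a'])

# Per the report, an integer larger than sys.maxint suppresses the
# coercion: the Series falls back to object dtype and the original
# int/float distinction is preserved.
preserved = pd.Series({'a': 1, 'b': 2.5, 'c': sys.maxint + 1})
print(preserved.dtype)
print(type(preserved['a']))
```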
Output:
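Under the assumptions of the sketch above, the output would look roughly like:

```
float64
1.0
object
<type 'int'>
```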
Stepping through the code, the coercion looks like it's happening in the call to lib.fast_multiget at https://github.com/pydata/pandas/blob/v0.14.1/pandas/core/series.py#L191.
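The upcasting itself mirrors numpy's ordinary dtype inference for mixed int/float input; a sketch of the analogy (this is an illustration, not the actual fast_multiget code path):

```python
import numpy as np

# numpy unifies mixed int/float input to a common float64 dtype,
# the same kind of coercion the Series constructor performs here.
print(np.array([1, 2.5]).dtype)  # float64
```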