BUG: Inconsistent behavior while constructing a Series with large integers in an int64 masked array #56566
Labels: Bug, Constructors (Series/DataFrame/Index/pd.array Constructors), Dtype Conversions (Unexpected or buggy dtype conversions), NA - MaskedArrays (Related to pd.NA and nullable extension arrays)
Pandas version checks
I have checked that this issue has not already been reported.
I have confirmed this bug exists on the latest version of pandas.
I have confirmed this bug exists on the main branch of pandas.
Reproducible Example
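The code for this section is not present in this copy of the report. Below is a minimal sketch of the kind of construction being described, assuming `mx` is a NumPy int64 masked array of large values; the variable name `mx` is taken from the Expected Behavior section, and the specific values are illustrative, not the reporter's original data.

```python
import numpy as np
import pandas as pd

# Illustrative values only: large int64 entries, still below the Int64 maximum.
mx = np.ma.masked_array(
    data=np.array([2**62 + 1, 2**62 + 2, 2**62 + 3], dtype=np.int64),
    mask=[False, True, False],
)

s = pd.Series(mx, dtype='Int64')
print(s)
# Per the report, the unmasked values come back changed, i.e. mx.data - s
# is not all zeros as it should be.
```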
Issue Description
While creating a Series object from a masked array of large-ish integers (less than the Int64 maximum), the output doesn't match the input. This probably has something to do with a float downcast/upcast happening somewhere. Possibly related: #30268, #50757
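As a hedged aside (an assumption, not something confirmed in the report): if a float64 round-trip is indeed involved, large int64 values are exactly the inputs that would expose it, since integers above 2**53 cannot be represented exactly in float64.

```python
import numpy as np

# Illustration of the suspected mechanism: int64 values above 2**53
# lose precision when passed through float64.
x = np.int64(2**62 + 1)
roundtrip = np.int64(np.float64(x))
print(roundtrip - x)  # non-zero, so a float round-trip changes the stored value
```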
Expected Behavior
mx.data - pd.Series(mx, dtype='Int64') should be all zeros, with <NA> for the masked entries.

Installed Versions
INSTALLED VERSIONS
commit : a671b5a
python : 3.11.6.final.0
python-bits : 64
OS : Darwin
OS-release : 23.2.0
Version : Darwin Kernel Version 23.2.0: Wed Nov 15 21:53:18 PST 2023; root:xnu-10002.61.3~2/RELEASE_ARM64_T6000
machine : arm64
processor : arm
byteorder : little
LC_ALL : None
LANG : en_US.UTF-8
LOCALE : en_US.UTF-8
pandas : 2.1.4
numpy : 1.26.1
pytz : 2023.3.post1
dateutil : 2.8.2
setuptools : 68.2.2
pip : 23.3.1
Cython : 3.0.5
pytest : 7.4.3
hypothesis : 6.88.3
sphinx : None
blosc : None
feather : None
xlsxwriter : None
lxml.etree : None
html5lib : 1.1
pymysql : None
psycopg2 : None
jinja2 : 3.1.2
IPython : 8.17.2
pandas_datareader : None
bs4 : 4.12.2
bottleneck : None
dataframe-api-compat: None
fastparquet : None
fsspec : 2023.12.2
gcsfs : None
matplotlib : None
numba : None
numexpr : None
odfpy : None
openpyxl : None
pandas_gbq : None
pyarrow : 14.0.1
pyreadstat : None
pyxlsb : None
s3fs : None
scipy : None
sqlalchemy : None
tables : None
tabulate : None
xarray : None
xlrd : None
zstandard : None
tzdata : 2023.3
qtpy : None
pyqt5 : None