Commit 871a10e

auderson authored and im-vinicius committed
use is_integer in df.insert (pandas-dev#53194)
1 parent b2da826 commit 871a10e

3 files changed: +10 -2 lines changed


doc/source/whatsnew/v2.1.0.rst (+1)

@@ -351,6 +351,7 @@ Conversion
 - Bug in :meth:`ArrowDtype.numpy_dtype` returning nanosecond units for non-nanosecond ``pyarrow.timestamp`` and ``pyarrow.duration`` types (:issue:`51800`)
 - Bug in :meth:`DataFrame.__repr__` incorrectly raising a ``TypeError`` when the dtype of a column is ``np.record`` (:issue:`48526`)
 - Bug in :meth:`DataFrame.info` raising ``ValueError`` when ``use_numba`` is set (:issue:`51922`)
+- Bug in :meth:`DataFrame.insert` raising ``TypeError`` if ``loc`` is ``np.int64`` (:issue:`53193`)
 -

 Strings

pandas/core/frame.py (+3 -2)

@@ -4800,9 +4800,10 @@ def insert(
         if not allow_duplicates and column in self.columns:
             # Should this be a different kind of error??
             raise ValueError(f"cannot insert {column}, already exists")
-        if not isinstance(loc, int):
+        if not is_integer(loc):
             raise TypeError("loc must be int")
-
+        # convert non stdlib ints to satisfy typing checks
+        loc = int(loc)
         if isinstance(value, DataFrame) and len(value.columns) > 1:
             raise ValueError(
                 f"Expected a one-dimensional object, got a DataFrame with "

pandas/tests/frame/indexing/test_insert.py (+6)

@@ -110,3 +110,9 @@ def test_insert_frame(self):
         )
         with pytest.raises(ValueError, match=msg):
             df.insert(1, "newcol", df)
+
+    def test_insert_int64_loc(self):
+        # GH#53193
+        df = DataFrame({"a": [1, 2]})
+        df.insert(np.int64(0), "b", 0)
+        tm.assert_frame_equal(df, DataFrame({"b": [0, 0], "a": [1, 2]}))
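The new test mirrors the user-facing behaviour reported in pandas-dev#53193. A minimal reproduction sketch: on a pandas build without this commit the call raises the TypeError from the check above, while a patched build inserts the column at position 0.

import numpy as np
import pandas as pd

df = pd.DataFrame({"a": [1, 2]})

# Without the fix this raised "TypeError: loc must be int";
# with is_integer the numpy integer position is accepted.
df.insert(np.int64(0), "b", 0)
print(df)
#    b  a
# 0  0  1
# 1  0  2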
