
Commit 8004eff

thismakess authored and jorisvandenbossche committed
DOC: Update pandas.Series.copy docstring (#20261)
1 parent cdda1bb commit 8004eff

File tree

1 file changed: +95 -9 lines changed


pandas/core/generic.py  (+95 -9)
@@ -5005,22 +5005,108 @@ def astype(self, dtype, copy=True, errors='raise', **kwargs):
 
     def copy(self, deep=True):
         """
-        Make a copy of this objects data.
+        Make a copy of this object's indices and data.
+
+        When ``deep=True`` (default), a new object will be created with a
+        copy of the calling object's data and indices. Modifications to
+        the data or indices of the copy will not be reflected in the
+        original object (see notes below).
+
+        When ``deep=False``, a new object will be created without copying
+        the calling object's data or index (only references to the data
+        and index are copied). Any changes to the data of the original
+        will be reflected in the shallow copy (and vice versa).
 
         Parameters
         ----------
-        deep : boolean or string, default True
+        deep : bool, default True
             Make a deep copy, including a copy of the data and the indices.
-            With ``deep=False`` neither the indices or the data are copied.
-
-            Note that when ``deep=True`` data is copied, actual python objects
-            will not be copied recursively, only the reference to the object.
-            This is in contrast to ``copy.deepcopy`` in the Standard Library,
-            which recursively copies object data.
+            With ``deep=False`` neither the indices nor the data are copied.
 
         Returns
         -------
-        copy : type of caller
+        copy : Series, DataFrame or Panel
+            Object type matches caller.
+
+        Notes
+        -----
+        When ``deep=True``, data is copied but actual Python objects
+        will not be copied recursively, only the reference to the object.
+        This is in contrast to `copy.deepcopy` in the Standard Library,
+        which recursively copies object data (see examples below).
+
+        While ``Index`` objects are copied when ``deep=True``, the underlying
+        numpy array is not copied for performance reasons. Since ``Index`` is
+        immutable, the underlying data can be safely shared and a copy
+        is not needed.
+
+        Examples
+        --------
+        >>> s = pd.Series([1, 2], index=["a", "b"])
+        >>> s
+        a    1
+        b    2
+        dtype: int64
+
+        >>> s_copy = s.copy()
+        >>> s_copy
+        a    1
+        b    2
+        dtype: int64
+
+        **Shallow copy versus default (deep) copy:**
+
+        >>> s = pd.Series([1, 2], index=["a", "b"])
+        >>> deep = s.copy()
+        >>> shallow = s.copy(deep=False)
+
+        Shallow copy shares data and index with original.
+
+        >>> s is shallow
+        False
+        >>> s.values is shallow.values and s.index is shallow.index
+        True
+
+        Deep copy has own copy of data and index.
+
+        >>> s is deep
+        False
+        >>> s.values is deep.values or s.index is deep.index
+        False
+
+        Updates to the data shared by shallow copy and original are reflected
+        in both; the deep copy remains unchanged.
+
+        >>> s[0] = 3
+        >>> shallow[1] = 4
+        >>> s
+        a    3
+        b    4
+        dtype: int64
+        >>> shallow
+        a    3
+        b    4
+        dtype: int64
+        >>> deep
+        a    1
+        b    2
+        dtype: int64
+
+        Note that when copying an object containing Python objects, a deep copy
+        will copy the data, but will not do so recursively. Updating a nested
+        data object will be reflected in the deep copy.
+
+        >>> s = pd.Series([[1, 2], [3, 4]])
+        >>> deep = s.copy()
+        >>> s[0][0] = 10
+        >>> s
+        0    [10, 2]
+        1     [3, 4]
+        dtype: object
+        >>> deep
+        0    [10, 2]
+        1     [3, 4]
+        dtype: object
         """
         data = self._data.copy(deep=deep)
         return self._constructor(data).__finalize__(self)
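The behavior the new docstring documents can also be replayed outside the doctest runner. A minimal sketch (assuming only that pandas is importable; the variable names are mine, not the commit's) covering the deep-copy, nested-object, and `copy.deepcopy` cases:

```python
import copy

import pandas as pd

# A deep copy (the default) decouples scalar data from the original.
s = pd.Series([1, 2], index=["a", "b"])
deep = s.copy()
s.iloc[0] = 3
print(deep.iloc[0])  # 1: the deep copy is unaffected by the assignment

# For object dtype, copy() duplicates the array of references, not the
# objects those references point to, so mutating a nested list in place
# is visible through the deep copy as well.
nested = pd.Series([[1, 2], [3, 4]])
nested_deep = nested.copy()
nested.iloc[0][0] = 10
print(nested_deep.iloc[0])  # [10, 2]: same inner list as the original

# copy.deepcopy, by contrast, recurses into plain Python containers.
plain = [[1, 2], [3, 4]]
dup = copy.deepcopy(plain)
plain[0][0] = 10
print(dup[0])  # [1, 2]: the cloned inner list is unaffected
```

One caveat: applying `copy.deepcopy` to a Series itself goes through pandas' own `__deepcopy__` hook (which calls `copy(deep=True)`), so the recursion shown in the last stanza is the behavior of plain containers, not of pandas objects.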

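The performance note about ``Index`` can likewise be sanity-checked: a deep copy creates a new ``Index`` object, but because indexes are immutable there is no way to mutate one copy and corrupt the other. A sketch assuming current pandas behavior:

```python
import pandas as pd

s = pd.Series([1, 2], index=["a", "b"])
c = s.copy()  # deep copy: new Index object; the buffer may be shared

print(c.index is s.index)  # False: the Index wrapper itself is copied

# Immutability is what makes sharing the underlying buffer safe.
try:
    c.index[0] = "z"
except TypeError as exc:
    print(type(exc).__name__)  # TypeError: Index rejects item assignment
```

This is why the docstring can promise that skipping the numpy-array copy is safe: no code path can modify the shared index data in place.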