
BUG: Fix issue with inserting duplicate columns in a dataframe (#14291) #14431


Closed
wants to merge 1 commit

Conversation

@paul-mannino (Contributor) commented Oct 15, 2016

Resubmitting PR (@jorisvandenbossche)
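
For context, the linked issue (#14291) is about DataFrame.insert failing with allow_duplicates=True when the frame already contains duplicate column labels. A minimal sketch of the kind of call that triggered the bug (the column names and values here are illustrative, not taken from the issue):

```python
import pandas as pd

# A frame that already carries a duplicated column label.
df = pd.DataFrame([[1, 2], [3, 4]], columns=["a", "a"])

# Before this fix, inserting yet another duplicate raised an error even
# though allow_duplicates=True explicitly permits it.
df.insert(0, "a", [5, 6], allow_duplicates=True)
print(df)
```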

@codecov-io commented Oct 16, 2016

Current coverage is 85.25% (diff: 100%)

Merging #14431 into master will not change coverage

@@             master     #14431   diff @@
==========================================
  Files           140        140          
  Lines         50655      50655          
  Methods           0          0          
  Messages          0          0          
  Branches          0          0          
==========================================
  Hits          43185      43185          
  Misses         7470       7470          
  Partials          0          0          

Powered by Codecov. Last update 233d51d...2698005

Parameters
----------
keys : object
value : int, Series, or array-like

Contributor:

int -> scalar

keys : object
value : int, Series, or array-like
broadcast : bool
Indicates whether all columns with the given key should be

Contributor:

It's not obvious what you mean here; please reword. (Also, this is not typically what broadcast means.)

Parameters
----------
keys : object
value : int, Series, or array-like

Contributor:

same as above


Notes
-----
The "broadcast" parameter was added to match the method signature of

Contributor:

then don't add it, just use **kwargs

@paul-mannino (Contributor, Author) commented Oct 17, 2016

Just to be clear, you want **kwargs for SparseDataFrame._sanitize_column and an optional parameter for DataFrame._sanitize_column? Should I add *args too so there's no risk of error if someone calls _sanitize_column without specifying the optional argument name?

key isn't used in SparseDataFrame._sanitize_column either. Do you want me to get rid of that, too?
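
As a general illustration of the **kwargs suggestion (the classes below are a hypothetical sketch, not the actual pandas internals): the subclass override accepts **kwargs so that callers can pass keywords, such as broadcast=, that only the base implementation cares about.

```python
class Base:
    # The base implementation accepts the extra keyword explicitly.
    def _sanitize_column(self, key, value, broadcast=True):
        return value

class Sparse(Base):
    # The override takes **kwargs, so a caller passing broadcast=False
    # does not break it even though the argument is unused here.
    def _sanitize_column(self, key, value, **kwargs):
        return value

Sparse()._sanitize_column("a", [1, 2], broadcast=False)  # no TypeError
```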

@jreback added the Bug and Reshaping (Concat, Merge/Join, Stack/Unstack, Explode) labels Oct 16, 2016

Parameters
----------
key : object
value : scalar, Series, or array-like
propagate : bool, default True

Contributor:

No, broadcast is the correct name. I just wanted the explanation to be what you have now.
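
A possible illustration of the distinction the docstring seems to be drawing, using only public API (whether _sanitize_column is reached on these exact paths is an internal detail): assignment through __setitem__ applies the value to every column sharing the label, while insert adds a single new column at the given position.

```python
import pandas as pd

df = pd.DataFrame([[1, 2], [3, 4]], columns=["a", "a"])

# __setitem__ "broadcasts": every column labelled "a" is set to 0.
df["a"] = 0

# insert adds exactly one new column at position 0; with this fix the
# existing duplicates are left untouched.
df.insert(0, "a", [9, 9], allow_duplicates=True)
print(df)
```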

@jorisvandenbossche (Member)

This looks good to me (I restarted the failing Mac build on Travis; it should be unrelated).

@jorisvandenbossche (Member)

@paul-mannino Can you rebase? (There is a merge conflict, probably in the whatsnew file.)

@jreback (Contributor) commented Oct 24, 2016

thanks!

jorisvandenbossche pushed a commit to jorisvandenbossche/pandas that referenced this pull request Nov 2, 2016
Labels
Bug, Reshaping (Concat, Merge/Join, Stack/Unstack, Explode)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

BUG: DataFrame.insert with allow_duplicates=True fails when already duplicates present
4 participants