Added capture_stderr decorator to test_validate_docstrings #22543


Merged: 3 commits merged into pandas-dev:master from the capture-stdout branch on Sep 4, 2018

Conversation

@WillAyd (Member) commented Aug 30, 2018

I originally thought it would be nice to apply this as a class decorator, but that conflicts with the capsys fixture used by test_bad_examples, so I had to pick and choose where to apply it. Additionally, the decorator doesn't work out of the box for wrapping classes, so that approach would take a fair amount more code.
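
For context, a minimal sketch of the pattern being discussed: a capture_stderr-style decorator applied to individual test functions, leaving tests that rely on pytest's capsys fixture (such as test_bad_examples) undecorated. The decorator body is an illustration only, not the actual pandas.util.testing implementation, and test_good_docstring is a hypothetical test name.

import sys
from functools import wraps
from io import StringIO


def capture_stderr(f):
    # Redirect sys.stderr into an in-memory buffer while the wrapped
    # test runs, then restore the real stderr afterwards.
    @wraps(f)
    def wrapper(*args, **kwargs):
        sys.stderr = StringIO()
        try:
            return f(*args, **kwargs)
        finally:
            sys.stderr = sys.__stderr__
    return wrapper


# Applied per test function rather than on the test class, so tests
# that use the capsys fixture are left untouched.
@capture_stderr
def test_good_docstring():
    pass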

cc @gfyoung - this will present minor merge conflicts with your work in #22531.

@WillAyd added the Testing (pandas testing functions or related to the test suite) and Docs labels on Aug 30, 2018
@codecov (bot) commented Aug 30, 2018

Codecov Report

Merging #22543 into master will not change coverage.
The diff coverage is 100%.


@@           Coverage Diff           @@
##           master   #22543   +/-   ##
=======================================
  Coverage   92.04%   92.04%           
=======================================
  Files         169      169           
  Lines       50787    50787           
=======================================
  Hits        46745    46745           
  Misses       4042     4042
Flag                     Coverage Δ
#multiple                90.45% <100%> (ø) ⬆️
#single                  42.29% <100%> (ø) ⬆️

Impacted Files           Coverage Δ
pandas/util/testing.py   85.75% <100%> (ø) ⬆️


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 100906e...bf3b929

pandas/util/testing.py
@@ -673,7 +673,7 @@ def capture_stderr(f):
     AssertionError: assert 'foo\n' == 'bar\n'
     """

-    @wraps(f)
+    @compat.wraps(f)
@WillAyd (Member, Author) commented on this diff Aug 30, 2018


It looks like functools.wraps wasn't preserving the function signature in Python 2.7, which would cause this to choke when combined with a parametrized function. I simply switched to the wraps function from compat for now - noting it here as a reference in case anyone runs into a similar issue in the future.
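
For anyone who hits this later: under Python 2, functools.wraps does not attach a __wrapped__ attribute (that behaviour was added in Python 3.2), so tools that introspect the signature - e.g. pytest when collecting a parametrized test - only see the wrapper's (*args, **kwargs). The sketch below shows one way a compat shim can restore that information; it is a hypothetical illustration, not the actual pandas.compat.wraps implementation.

import functools
import sys

PY2 = sys.version_info[0] == 2


def wraps(wrapped):
    # Behaves like functools.wraps, but on Python 2 also records the
    # original function so signature introspection can follow
    # __wrapped__ back to it (Python 3 does this automatically).
    def decorator(wrapper):
        wrapper = functools.wraps(wrapped)(wrapper)
        if PY2:
            wrapper.__wrapped__ = wrapped
        return wrapper
    return decorator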

@jreback added this to the 0.24.0 milestone on Aug 31, 2018
@jreback (Contributor) commented Aug 31, 2018

Looks fine. Where are they printing to stderr now?

@WillAyd (Member, Author) commented Aug 31, 2018

@jorisvandenbossche can you point me to where you were seeing this in Travis before? It's worth double-checking there that this has the intended effect.

@jorisvandenbossche (Member) left a review comment


Once rebased, this is good to merge for me.

See e.g. this build (on master) for where the output is currently displayed: https://travis-ci.org/pandas-dev/pandas/jobs/423640681. I can confirm this is fixed now.

@WillAyd (Member, Author) commented Sep 4, 2018

@jorisvandenbossche all green after the rebase.

@gfyoung (Member) commented Sep 4, 2018

Looks clean after rebase. I think we have enough approvals to merge this. 😄

@gfyoung merged commit a5fe9cf into pandas-dev:master on Sep 4, 2018
Sup3rGeo pushed a commit to Sup3rGeo/pandas that referenced this pull request Oct 1, 2018
@WillAyd deleted the capture-stdout branch on January 16, 2020
Labels: Docs; Testing (pandas testing functions or related to the test suite)
Development

Successfully merging this pull request may close these issues.

Capture STDOUT For Validations Script Tests