@network decorator can mask failed tests #5816

Closed
ghost opened this issue Jan 1, 2014 · 15 comments · Fixed by #6111
Labels: CI (Continuous Integration), Testing (pandas testing functions or related to the test suite)

Comments

@ghost

ghost commented Jan 1, 2014

See discussion #5812

The skip behaviour was motivated by travis failing tests due to
wonky net connectivity, but it can mask actual problems such as broken URLs.

Perhaps make the behavior conditional on environment (travis can be detected via
environment vars)
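
For illustration only, a minimal sketch of how the skip behaviour could be made conditional on the CI environment (the helper name is hypothetical; TRAVIS/CI are the variables Travis actually sets):

    import os

    # Travis CI exports TRAVIS=true (and CI=true) in its build environment.
    RUNNING_ON_TRAVIS = os.environ.get("TRAVIS") == "true"

    def should_skip_on_network_error():
        # Only swallow network errors on CI; locally, let the test fail loudly
        # so broken URLs and similar bugs are not masked.
        return RUNNING_ON_TRAVIS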

@jtratner
Contributor

jtratner commented Jan 1, 2014

I made a decorator for that, with_connectivity_check, which would handle this fine, at the cost of an extra network connection if the test fails with an IOError.
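
Roughly, the mechanism being described could look like this (a simplified sketch, not the actual pandas.util.testing implementation; the check URL, timeout, and use of urllib2/nose here are assumptions):

    import nose
    from functools import wraps
    from urllib2 import urlopen

    def with_connectivity_check(check_url="http://www.google.com"):
        def decorate(f):
            @wraps(f)
            def wrapper(*args, **kwargs):
                try:
                    return f(*args, **kwargs)
                except IOError:
                    # The extra request: only skip when there is genuinely no
                    # connectivity; otherwise the IOError is a real failure
                    # (e.g. a broken or changed URL) and should bubble up.
                    try:
                        urlopen(check_url, timeout=5)
                    except IOError:
                        raise nose.SkipTest("no network connectivity")
                    raise
            return wrapper
        return decorate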

@ghost
Author

ghost commented Jan 3, 2014

That's close but not quite it; I can't think of a case where it's useful, given the
qualification in the docstring:

  In comparison to ``network``, this assumes an added contract to your test:
    you must assert that, under normal conditions, your test will ONLY fail if
    it does not have network connectivity.

Isn't that another way of saying "if your test is a connectivity check", since I can't
test+fail for anything else?

What's needed is to catch only network-related exceptions and qualify them with a connectivity check.
That would tie the functionality to the specific exceptions raised by urllib (vs., say, requests or urllib2).

OTOH, we could do a connectivity check prior to any network test, perhaps via a singleton,
so we don't launch 800 requests against google every time travis runs.
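
A hypothetical sketch of such a one-shot check (the module-level cache and the probe URL are assumptions, not an existing pandas helper):

    from urllib2 import urlopen

    _connectivity_cache = {}  # module-level, so the probe runs at most once per process

    def have_connectivity(url="http://www.google.com", timeout=5):
        # First caller pays for the request; everyone else reuses the cached answer.
        if "ok" not in _connectivity_cache:
            try:
                urlopen(url, timeout=timeout)
                _connectivity_cache["ok"] = True
            except IOError:
                _connectivity_cache["ok"] = False
        return _connectivity_cache["ok"]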

#5812 just exposed the problem; we either need to apply a fix everywhere or let it go.

thoughts?

@jtratner
Contributor

jtratner commented Jan 3, 2014

I wrote the qualification poorly. The point is that, if the test fails, it
bubbles up the failure if it is able to connect to the URL (meaning that it
has Internet connectivity).

@jreback
Contributor

jreback commented Jan 15, 2014

@y-p @jtratner what's the status of this?

@ghost
Author

ghost commented Jan 15, 2014

if @jtratner's clarification is true, I say we just fold the check into
the network decorator by default.

@jreback
Contributor

jreback commented Jan 17, 2014

@jtratner you have this one?

@jtratner
Contributor

Let me make sure this actually works the way I described. Sorry, I've been snowed under with work right now and have gone a bit incommunicado; catching up a bit tonight.

@jtratner
Contributor

btw, @y-p, do you have an example where this failed inappropriately? If so, I'll try and fix it; otherwise I'll clarify the docstring.

@jreback
Contributor

jreback commented Jan 18, 2014

This should fail with a non-existent URL, e.g. if yahoo changes it.

@jtratner
Contributor

I see what you mean. Let me put together a PR that removes the current @network and replaces it with @with_connectivity_check instead.

@ghost
Author

ghost commented Jan 18, 2014

Lost you; doesn't #5812 count as failing inappropriately?

@ghost
Author

ghost commented Jan 18, 2014

That would mess up the nosetests selector functionality.
Also, why rename the decorator on 200 separate tests when you just
always want different behavior from it? @network needs to be changed.

@jtratner
Contributor

@y-p I think we're saying the same thing: you okay if I remove @network and rename @with_connectivity_check to @network?

@ghost
Author

ghost commented Jan 18, 2014

yep.

@ghost
Author

ghost commented Jan 23, 2014

bumped to 0.14.0

This issue was closed.