
TST: XFAIL Travis read_html tests #30544


Merged (7 commits, Dec 31, 2019)

Conversation

@alimcmaster1 (Member) commented Dec 29, 2019

@alimcmaster1 alimcmaster1 added the Testing pandas testing functions or related to the test suite label Dec 29, 2019
@jbrockmendel (Member)

can't make it strict?

# https://github.com/pandas-dev/pandas/issues/29622
if self.read_html.keywords.get("flavor") == "bs4":
pytest.xfail("fails for bs4 version >= 4.8.0")

@jbrockmendel (Member):

Did you change this from the decorator in response to my comment? If so, that's not what I meant. The decorator is preferred, but without the strict=False.
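A minimal sketch of the decorator form being suggested (the test name and body here are illustrative, not the PR's actual test): an xfail mark with strict=True turns an unexpected pass (XPASS) into a build failure, rather than letting it slip by silently.

```python
import pytest

# Illustrative sketch only: a strict xfail mark fails the run if the
# test unexpectedly passes, so the xfail cannot go stale unnoticed.
@pytest.mark.xfail(reason="GH#29622: read_html fails for bs4 >= 4.8.0", strict=True)
def test_read_html_bs4_sketch():
    # Placeholder for the behaviour that breaks under bs4 >= 4.8.0.
    raise AssertionError("simulated bs4 >= 4.8.0 failure")
```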

@alimcmaster1 (Member, Author):

Yeah, I just want to xfail one particular parameterization (bs4), and the parameterization is at class scope. Agree the decorator is usually preferred.

Updated now. I think it's easiest to do this in the function itself for this case, unless you know a more elegant way?

@alimcmaster1 (Member, Author) commented Dec 29, 2019:

Just saw that I can potentially use the same approach as you did in https://github.com/pandas-dev/pandas/pull/30521/files. Will update unless you think otherwise?

@jbrockmendel (Member):

If the test doesn't use the class fixture(s), could it just be moved out of the class?

If not then the approach in 30521 would be my preference, yes.

@alimcmaster1 (Member, Author):

Some class fixtures are used, so it's easier to keep it where it is. Updated, thanks!

@jbrockmendel (Member)

Looks like there was an xpass on the old_np build. Possibly related: I think bs4 had a release yesterday.

@alimcmaster1 (Member, Author)

> looks like there was an xpass on the old_np build. possibly related: i think bs4 had a release yesterday

That's due to the fact that we pin bs4 to 4.6.0 in the py36_locale_slow_old_np build. I've unpinned it; let me know if this is not appropriate.

The other option is to make this not strict?

@jbrockmendel (Member)

> I've unpinned - let me know if this is not appropriate?

@datapythonista preference for this pin?

> Other option is to make this not strict?

If all else fails, yah. I appreciate you putting in the effort to avoid this last resort.

@alimcmaster1 (Member, Author)

Can confirm travis-36-slow build is now passing - https://travis-ci.org/pandas-dev/pandas/jobs/631033419?utm_medium=notification&utm_source=github_status

We can remove this from allowed failures in .travis.yml if desired

@datapythonista (Member)

> @datapythonista preference for this pin?

We've also got bs4 pinned at 4.6.0 in the minimum versions build. Ideally we should have a minimum versions build for the slow tests too; if we had it, I guess it'd fail like the old np build. So unpinning will avoid the failure, but won't really fix the problem.

Do we want to increase the minimum required version instead?

@alimcmaster1 (Member, Author)

> We've got bs4 also pinned at 4.6.0 in the minimum versions build. I think ideally we should have a minimum versions build for the slow tests too. I guess if we had it, it'd fail like the old np build. So I guess unpinning will avoid the failure, but won't really fix the problem.

Thanks for the info!

Should the minimum versions build just also run the slow tests, as opposed to a separate build?

> Do we want to increase the minimum required version instead?

Okay, shall we bump the minimum to beautifulsoup4==4.8.0 (in azure-36-minimum_versions.yml and azure-36-locale_slow.yaml)? Then this xfail can be strict across all builds.

@datapythonista (Member)

> Should the minimum version build just also run the slow tests opposed to a sep build?

My understanding is that a build with both the "normal" and the slow tests would be too slow. That's why we have the slow marker, so they are called in separate builds.

> Okay shall we bump to min beautifulsoup4 == 4.8.0 ?

That release is from just a few months ago. I think we usually stay compatible with dependencies that are at least a couple of years old.

@@ -13,7 +13,7 @@ dependencies:
   - pytest-azurepipelines

   # pandas dependencies
-  - beautifulsoup4==4.6.0
+  - beautifulsoup4=4.6.0
@alimcmaster1 (Member, Author):

This should be a single =.

@alimcmaster1 (Member, Author)

> Should the minimum version build just also run the slow tests opposed to a sep build?

> My understanding is that a build with both the "normal" and the slow tests would be too slow. That's why we have the slow marker, so they are called in separate builds.

> Okay shall we bump to min beautifulsoup4 == 4.8.0 ?

> This is from just few months ago. I think we're usually compatible with dependencies that are at least couple of years old.

Thanks @datapythonista for the info!

Okay, so I've adjusted the xfail to take into account the version of bs4 installed, and removed the travis 36 slow build from the allowed failures.
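The version-aware gating can be sketched roughly like this (the helper names and parsing are illustrative assumptions, not the PR's exact code): the xfail only fires when the installed bs4 is new enough to exhibit the regression, so the pinned 4.6.0 builds keep running the test normally.

```python
# Illustrative sketch of gating an xfail on the installed bs4 version.
# _version_tuple and should_xfail are hypothetical helpers, not pandas API.
def _version_tuple(version: str) -> tuple:
    """Parse a dotted version string like '4.8.0' into a comparable tuple."""
    return tuple(int(part) for part in version.split(".")[:3])

def should_xfail(flavor: str, bs4_version: str) -> bool:
    """True when the known GH#29622 regression applies to this run."""
    return flavor == "bs4" and _version_tuple(bs4_version) >= _version_tuple("4.8.0")
```

With a check like this, the test would call pytest.xfail(...) only when should_xfail(...) is True, which lets the xfail stay strict on builds where the failure is actually expected.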

Happy to create a min versions build for the slow tests too, as you suggested (separate PR), if that's useful for us.

@datapythonista (Member)

> Happy to create a min version build for the slow tests too as you suggested (sep PR). If that's a useful for us?

I think it makes more sense to discuss and add this as part of #29685. I don't think we really understand well what's being tested. Creating builds from zero in an organized way is better than adding this individually. I'm on it.

@alimcmaster1 (Member, Author)

All green - @jbrockmendel / @datapythonista mind reviewing?

@datapythonista (Member) left a review comment:

Didn't check the details, but if we can't fix the error properly, this looks good.

@jbrockmendel jbrockmendel merged commit 6322b8f into pandas-dev:master Dec 31, 2019
@jbrockmendel (Member)

Thanks @alimcmaster1

hweecat pushed a commit to hweecat/pandas that referenced this pull request Jan 1, 2020
* XFAIL Tests

* XFAIL Tests

* Update as per brock suggestion

* unpin beautiful soup version

* Dont xfail for bs4 < 4.8.0

* Remove slow tests from allowed failures

* black pandas