Manual Backport PR #45946 on branch 1.4.x (CI: Debug min build timing out) #45974

Merged
3 changes: 2 additions & 1 deletion .github/workflows/datamanger.yml
@@ -18,6 +18,7 @@ jobs:
data_manager:
name: Test experimental data manager
runs-on: ubuntu-latest
+ timeout-minutes: 120
services:
moto:
image: motoserver/moto
@@ -43,7 +44,7 @@ jobs:
- name: Run tests
env:
PANDAS_DATA_MANAGER: array
PATTERN: "not network and not clipboard"
PATTERN: "not network and not clipboard and not single_cpu"
PYTEST_WORKERS: "auto"
PYTEST_TARGET: pandas
run: |
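
Note: the `PATTERN` values above are passed to pytest's `-m` option, so any test carrying the new `single_cpu` mark is deselected from these parallel (xdist) runs. A minimal sketch of how such a mark fits together, with illustrative names only (pandas registers its markers elsewhere, outside this diff):

```python
# conftest.py -- illustrative sketch, not part of this backport
def pytest_configure(config):
    # Register the mark so pytest does not warn about an unknown marker.
    config.addinivalue_line(
        "markers", "single_cpu: test should be run on a single CPU only"
    )

# A marked test is then deselected by the CI pattern, e.g.:
#   pytest pandas -n auto -m "not network and not clipboard and not single_cpu"
# and can be run separately without xdist workers, e.g.:
#   pytest pandas -n 0 -m "single_cpu"
```
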
19 changes: 10 additions & 9 deletions .github/workflows/posix.yml
@@ -22,18 +22,19 @@ jobs:
defaults:
run:
shell: bash -l {0}
+ timeout-minutes: 120
strategy:
matrix:
settings: [
- [actions-38-downstream_compat.yaml, "not slow and not network", "", "", "", "", ""],
- [actions-38-minimum_versions.yaml, "", "", "", "", "", ""],
- [actions-38.yaml, "not slow and not network", "language-pack-it", "it_IT.utf8", "it_IT.utf8", "", ""],
- [actions-38.yaml, "not slow and not network", "language-pack-zh-hans", "zh_CN.utf8", "zh_CN.utf8", "", ""],
- [actions-38.yaml, "", "", "", "", "", ""],
- [actions-pypy-38.yaml, "not slow", "", "", "", "", "--max-worker-restart 0"],
- [actions-39.yaml, "", "", "", "", "", ""],
- [actions-310-numpydev.yaml, "not slow and not network", "", "", "", "deprecate", "-W error"],
- [actions-310.yaml, "", "", "", "", "", ""],
+ [actions-38-downstream_compat.yaml, "not slow and not network and not single_cpu", "", "", "", "", ""],
+ [actions-38-minimum_versions.yaml, "not single_cpu", "", "", "", "", ""],
+ [actions-38.yaml, "not slow and not network and not single_cpu", "language-pack-it", "it_IT.utf8", "it_IT.utf8", "", ""],
+ [actions-38.yaml, "not slow and not network and not single_cpu", "language-pack-zh-hans", "zh_CN.utf8", "zh_CN.utf8", "", ""],
+ [actions-38.yaml, "not single_cpu", "", "", "", "", ""],
+ [actions-pypy-38.yaml, "not slow and not single_cpu", "", "", "", "", "--max-worker-restart 0"],
+ [actions-39.yaml, "not single_cpu", "", "", "", "", ""],
+ [actions-310-numpydev.yaml, "not slow and not network and not single_cpu", "", "", "", "deprecate", "-W error"],
+ [actions-310.yaml, "not single_cpu", "", "", "", "", ""],
]
fail-fast: false
env:
2 changes: 1 addition & 1 deletion .github/workflows/python-dev.yml
@@ -23,7 +23,7 @@ on:
env:
PYTEST_WORKERS: "auto"
PANDAS_CI: 1
PATTERN: "not slow and not network and not clipboard"
PATTERN: "not slow and not network and not clipboard and not single_cpu"
COVERAGE: true
PYTEST_TARGET: pandas

2 changes: 1 addition & 1 deletion azure-pipelines.yml
@@ -18,7 +18,7 @@ pr:
variables:
PYTEST_WORKERS: auto
PYTEST_TARGET: pandas
PATTERN: "not slow and not high_memory and not db and not network"
PATTERN: "not slow and not high_memory and not db and not network and not single_cpu"
PANDAS_CI: 1

jobs:
12 changes: 6 additions & 6 deletions ci/azure/posix.yml
@@ -11,32 +11,32 @@ jobs:
py38_macos_1:
ENV_FILE: ci/deps/azure-macos-38.yaml
CONDA_PY: "38"
PATTERN: "not slow"
PATTERN: "not slow and not single_cpu"
PYTEST_TARGET: "pandas/tests/[a-h]*"
py38_macos_2:
ENV_FILE: ci/deps/azure-macos-38.yaml
CONDA_PY: "38"
PATTERN: "not slow"
PATTERN: "not slow and not single_cpu"
PYTEST_TARGET: "pandas/tests/[i-z]*"
py39_macos_1:
ENV_FILE: ci/deps/azure-macos-39.yaml
CONDA_PY: "39"
PATTERN: "not slow"
PATTERN: "not slow and not single_cpu"
PYTEST_TARGET: "pandas/tests/[a-h]*"
py39_macos_2:
ENV_FILE: ci/deps/azure-macos-39.yaml
CONDA_PY: "39"
PATTERN: "not slow"
PATTERN: "not slow and not single_cpu"
PYTEST_TARGET: "pandas/tests/[i-z]*"
py310_macos_1:
ENV_FILE: ci/deps/azure-macos-310.yaml
CONDA_PY: "310"
PATTERN: "not slow"
PATTERN: "not slow and not single_cpu"
PYTEST_TARGET: "pandas/tests/[a-h]*"
py310_macos_2:
ENV_FILE: ci/deps/azure-macos-310.yaml
CONDA_PY: "310"
PATTERN: "not slow"
PATTERN: "not slow and not single_cpu"
PYTEST_TARGET: "pandas/tests/[i-z]*"

steps:
2 changes: 2 additions & 0 deletions pandas/compat/pyarrow.py
@@ -13,10 +13,12 @@
pa_version_under4p0 = _palv < Version("4.0.0")
pa_version_under5p0 = _palv < Version("5.0.0")
pa_version_under6p0 = _palv < Version("6.0.0")
+ pa_version_under7p0 = _palv < Version("7.0.0")
except ImportError:
pa_version_under1p01 = True
pa_version_under2p0 = True
pa_version_under3p0 = True
pa_version_under4p0 = True
pa_version_under5p0 = True
pa_version_under6p0 = True
+ pa_version_under7p0 = True
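
For context, a compat flag such as `pa_version_under7p0` is typically consumed as a skip or xfail condition in the test suite; a minimal sketch (illustrative only, not taken from this diff):

```python
import pytest

from pandas.compat.pyarrow import pa_version_under7p0


@pytest.mark.skipif(pa_version_under7p0, reason="requires pyarrow>=7.0")
def test_behaviour_fixed_in_pyarrow_7():
    # The skipif above guarantees pyarrow>=7.0 is installed, so this import is safe.
    import pyarrow as pa

    table = pa.table({"a": [1, 2, 3]})
    assert table.num_rows == 3
```
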
2 changes: 1 addition & 1 deletion pandas/tests/base/test_unique.py
@@ -104,7 +104,7 @@ def test_nunique_null(null_obj, index_or_series_obj):
assert obj.nunique(dropna=False) == max(0, num_unique_values)


- @pytest.mark.single
+ @pytest.mark.single_cpu
@pytest.mark.xfail(
reason="Flaky in the CI. Remove once CI has a single build: GH 44584", strict=False
)
2 changes: 1 addition & 1 deletion pandas/tests/frame/methods/test_rank.py
@@ -327,7 +327,7 @@ def test_rank_pct_true(self, method, exp):
expected = DataFrame(exp)
tm.assert_frame_equal(result, expected)

- @pytest.mark.single
+ @pytest.mark.single_cpu
@pytest.mark.high_memory
def test_pct_max_many_rows(self):
# GH 18271
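
A test stacking several marks, like the one above, is deselected by any matching `-m` expression (`not single_cpu`, `not high_memory`, or both); a rough sketch with hypothetical test content:

```python
import pytest


@pytest.mark.single_cpu
@pytest.mark.high_memory
def test_serial_and_memory_hungry():
    big = list(range(10**6))  # stand-in for a memory-heavy computation
    assert len(big) == 10**6
```
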
2 changes: 2 additions & 0 deletions pandas/tests/io/excel/test_readers.py
@@ -775,6 +775,7 @@ def test_read_from_http_url(self, read_ext):
tm.assert_frame_equal(url_table, local_table)

@td.skip_if_not_us_locale
+ @pytest.mark.single_cpu
def test_read_from_s3_url(self, read_ext, s3_resource, s3so):
# Bucket "pandas-test" created in tests/io/conftest.py
with open("test1" + read_ext, "rb") as f:
@@ -786,6 +787,7 @@ def test_read_from_s3_url(self, read_ext, s3_resource, s3so):
local_table = pd.read_excel("test1" + read_ext)
tm.assert_frame_equal(url_table, local_table)

+ @pytest.mark.single_cpu
def test_read_from_s3_object(self, read_ext, s3_resource, s3so):
# GH 38788
# Bucket "pandas-test" created in tests/io/conftest.py
1 change: 1 addition & 0 deletions pandas/tests/io/json/test_compression.py
@@ -38,6 +38,7 @@ def test_read_zipped_json(datapath):


@td.skip_if_not_us_locale
+ @pytest.mark.single_cpu
def test_with_s3_url(compression, s3_resource, s3so):
# Bucket "pandas-test" created in tests/io/conftest.py

2 changes: 2 additions & 0 deletions pandas/tests/io/json/test_pandas.py
@@ -1222,6 +1222,7 @@ def test_read_inline_jsonl(self):
expected = DataFrame([[1, 2], [1, 2]], columns=["a", "b"])
tm.assert_frame_equal(result, expected)

+ @pytest.mark.single_cpu
@td.skip_if_not_us_locale
def test_read_s3_jsonl(self, s3_resource, s3so):
# GH17200
@@ -1749,6 +1750,7 @@ def test_json_multiindex(self, dataframe, expected):
result = series.to_json(orient="index")
assert result == expected

+ @pytest.mark.single_cpu
def test_to_s3(self, s3_resource, s3so):
import time

2 changes: 1 addition & 1 deletion pandas/tests/io/parser/conftest.py
@@ -110,7 +110,7 @@ def all_parsers(request):
pytest.importorskip("pyarrow", VERSIONS["pyarrow"])
# Try setting num cpus to 1 to avoid hangs on Azure MacOS/Windows builds
# or better yet find a way to disable threads
- # TODO(GH#44584) pytest.mark.single these tests
+ # TODO(GH#44584) pytest.mark.single_cpu these tests
import pyarrow

pyarrow.set_cpu_count(1)
3 changes: 3 additions & 0 deletions pandas/tests/io/parser/test_network.py
@@ -66,6 +66,7 @@ def tips_df(datapath):
return read_csv(datapath("io", "data", "csv", "tips.csv"))


+ @pytest.mark.single_cpu
@pytest.mark.usefixtures("s3_resource")
@td.skip_if_not_us_locale()
class TestS3:
@@ -242,6 +243,7 @@ def test_write_s3_parquet_fails(self, tips_df, s3so):
storage_options=s3so,
)

+ @pytest.mark.single_cpu
def test_read_csv_handles_boto_s3_object(self, s3_resource, tips_file):
# see gh-16135

@@ -257,6 +259,7 @@ def test_read_csv_handles_boto_s3_object(self, s3_resource, tips_file):
expected = read_csv(tips_file)
tm.assert_frame_equal(result, expected)

+ @pytest.mark.single_cpu
@pytest.mark.skipif(
is_ci_environment(),
reason="This test can hang in our CI min_versions build "
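
Marking the `TestS3` class (rather than each method) applies `single_cpu` to every test it contains; the general pattern, with placeholder names:

```python
import pytest


@pytest.mark.single_cpu
@pytest.mark.usefixtures("tmp_path")  # placeholder fixture; the real class uses "s3_resource"
class TestS3Like:
    def test_one(self):
        assert True

    def test_two(self):
        assert True
```
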
10 changes: 7 additions & 3 deletions pandas/tests/io/parser/test_parse_dates.py
@@ -18,7 +18,10 @@

from pandas._libs.tslibs import parsing
from pandas._libs.tslibs.parsing import parse_datetime_string
- from pandas.compat.pyarrow import pa_version_under6p0
+ from pandas.compat.pyarrow import (
+     pa_version_under6p0,
+     pa_version_under7p0,
+ )

import pandas as pd
from pandas import (
@@ -948,10 +951,11 @@ def test_parse_dates_custom_euro_format(all_parsers, kwargs):
)


- @xfail_pyarrow
- def test_parse_tz_aware(all_parsers):
+ def test_parse_tz_aware(all_parsers, request):
# See gh-1693
parser = all_parsers
if parser.engine == "pyarrow" and pa_version_under7p0:
request.node.add_marker(pytest.mark.xfail(reason="Fails for pyarrow < 7.0"))
data = "Date,x\n2012-06-13T01:39:00Z,0.5"

result = parser.read_csv(StringIO(data), index_col=0, parse_dates=True)
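
The change above swaps the blanket `@xfail_pyarrow` decorator for a runtime xfail that only fires for the pyarrow engine on pyarrow < 7.0. The underlying pattern, sketched with hypothetical names:

```python
import pytest


@pytest.fixture(params=["c", "python", "pyarrow"])
def engine(request):
    return request.param


def test_example(engine, request):
    # Attach the xfail mark only once the runtime condition is known.
    if engine == "pyarrow":
        request.node.add_marker(pytest.mark.xfail(reason="not supported yet"))
    assert isinstance(engine, str)
```
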
2 changes: 1 addition & 1 deletion pandas/tests/io/pytables/test_append.py
@@ -24,7 +24,7 @@
ensure_clean_store,
)

- pytestmark = pytest.mark.single
+ pytestmark = pytest.mark.single_cpu


@pytest.mark.filterwarnings("ignore:object name:tables.exceptions.NaturalNameWarning")
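
For reference, a module-level `pytestmark` applies the mark to every test collected from the file, which is why this rename touches each pytables module once; schematically:

```python
import pytest

pytestmark = pytest.mark.single_cpu  # every test in this module carries the mark


def test_collected_with_mark():
    assert True


def test_also_collected_with_mark():
    assert True
```
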
2 changes: 1 addition & 1 deletion pandas/tests/io/pytables/test_categorical.py
@@ -16,7 +16,7 @@
)

pytestmark = [
- pytest.mark.single,
+ pytest.mark.single_cpu,
# pytables https://github.com/PyTables/PyTables/issues/822
pytest.mark.filterwarnings(
"ignore:a closed node found in the registry:UserWarning"
2 changes: 1 addition & 1 deletion pandas/tests/io/pytables/test_errors.py
@@ -25,7 +25,7 @@
_maybe_adjust_name,
)

- pytestmark = pytest.mark.single
+ pytestmark = pytest.mark.single_cpu


def test_pass_spec_to_storer(setup_path):
2 changes: 1 addition & 1 deletion pandas/tests/io/pytables/test_file_handling.py
@@ -26,7 +26,7 @@
Term,
)

- pytestmark = pytest.mark.single
+ pytestmark = pytest.mark.single_cpu


@pytest.mark.parametrize("mode", ["r", "r+", "a", "w"])
2 changes: 1 addition & 1 deletion pandas/tests/io/pytables/test_keys.py
@@ -11,7 +11,7 @@
tables,
)

- pytestmark = pytest.mark.single
+ pytestmark = pytest.mark.single_cpu


def test_keys(setup_path):
2 changes: 1 addition & 1 deletion pandas/tests/io/pytables/test_put.py
@@ -29,7 +29,7 @@
)
from pandas.util import _test_decorators as td

- pytestmark = pytest.mark.single
+ pytestmark = pytest.mark.single_cpu


def test_format_type(setup_path):
2 changes: 1 addition & 1 deletion pandas/tests/io/pytables/test_read.py
@@ -25,7 +25,7 @@

from pandas.io.pytables import TableIterator

- pytestmark = pytest.mark.single
+ pytestmark = pytest.mark.single_cpu


def test_read_missing_key_close_store(setup_path):
2 changes: 1 addition & 1 deletion pandas/tests/io/pytables/test_retain_attributes.py
@@ -17,7 +17,7 @@
ensure_clean_store,
)

- pytestmark = pytest.mark.single
+ pytestmark = pytest.mark.single_cpu


def test_retain_index_attributes(setup_path):
2 changes: 1 addition & 1 deletion pandas/tests/io/pytables/test_round_trip.py
@@ -30,7 +30,7 @@
_default_compressor = "blosc"


- pytestmark = pytest.mark.single
+ pytestmark = pytest.mark.single_cpu


def test_conv_read_write(setup_path):
2 changes: 1 addition & 1 deletion pandas/tests/io/pytables/test_select.py
@@ -27,7 +27,7 @@

from pandas.io.pytables import Term

- pytestmark = pytest.mark.single
+ pytestmark = pytest.mark.single_cpu


def test_select_columns_in_where(setup_path):
2 changes: 1 addition & 1 deletion pandas/tests/io/pytables/test_store.py
@@ -40,7 +40,7 @@
read_hdf,
)

- pytestmark = pytest.mark.single
+ pytestmark = pytest.mark.single_cpu


def test_context(setup_path):
2 changes: 1 addition & 1 deletion pandas/tests/io/pytables/test_time_series.py
@@ -10,7 +10,7 @@
)
from pandas.tests.io.pytables.common import ensure_clean_store

- pytestmark = pytest.mark.single
+ pytestmark = pytest.mark.single_cpu


def test_store_datetime_fractional_secs(setup_path):
2 changes: 1 addition & 1 deletion pandas/tests/io/test_clipboard.py
@@ -148,7 +148,7 @@ def test_mock_clipboard(mock_clipboard):
assert result == "abc"


- @pytest.mark.single
+ @pytest.mark.single_cpu
@pytest.mark.clipboard
@pytest.mark.usefixtures("mock_clipboard")
class TestClipboard:
2 changes: 1 addition & 1 deletion pandas/tests/io/test_feather.py
@@ -14,7 +14,7 @@


@filter_sparse
- @pytest.mark.single
+ @pytest.mark.single_cpu
@pytest.mark.filterwarnings("ignore:CategoricalBlock is deprecated:DeprecationWarning")
class TestFeather:
def check_error_on_write(self, df, exc, err_msg):
3 changes: 3 additions & 0 deletions pandas/tests/io/test_fsspec.py
@@ -199,6 +199,7 @@ def test_fastparquet_options(fsspectest):
assert fsspectest.test[0] == "parquet_read"


+ @pytest.mark.single_cpu
@td.skip_if_no("s3fs")
def test_from_s3_csv(s3_resource, tips_file, s3so):
tm.assert_equal(
@@ -215,6 +216,7 @@ def test_from_s3_csv(s3_resource, tips_file, s3so):
)


+ @pytest.mark.single_cpu
@pytest.mark.parametrize("protocol", ["s3", "s3a", "s3n"])
@td.skip_if_no("s3fs")
def test_s3_protocols(s3_resource, tips_file, protocol, s3so):
@@ -224,6 +226,7 @@ def test_s3_protocols(s3_resource, tips_file, protocol, s3so):
)


+ @pytest.mark.single_cpu
@td.skip_array_manager_not_yet_implemented # TODO(ArrayManager) fastparquet
@td.skip_if_no("s3fs")
@td.skip_if_no("fastparquet")