CI: Fix Flakey GBQ Tests #30630


Merged
14 commits merged on Jan 3, 2020
14 changes: 7 additions & 7 deletions pandas/tests/io/test_gbq.py
@@ -68,6 +68,10 @@ def _get_client():
return bigquery.Client(project=project_id, credentials=credentials)


def generate_rand_str(length: int = 10) -> str:
return "".join(random.choices(string.ascii_lowercase, k=length))


def make_mixed_dataframe_v2(test_size):
# create df to test for all BQ datatypes except RECORD
bools = np.random.randint(2, size=(1, test_size)).astype(bool)
@@ -153,19 +157,15 @@ def gbq_dataset(self):
_skip_if_no_project_id()
_skip_if_no_private_key_path()

dataset_id = "pydata_pandas_bq_testing_py31"
dataset_id = "pydata_pandas_bq_testing_" + generate_rand_str()

self.client = _get_client()
self.dataset = self.client.dataset(dataset_id)
try:
Contributor


hmm, are we supposed to clean up these datasets? @tswast

Member Author


We now just do this here:

self.client.delete_dataset(self.dataset, delete_contents=True)

# Clean-up previous test runs.
self.client.delete_dataset(self.dataset, delete_contents=True)
except api_exceptions.NotFound:
pass # It's OK if the dataset doesn't already exist.

# Create the dataset
self.client.create_dataset(bigquery.Dataset(self.dataset))

-table_name = "".join(random.choices(string.ascii_lowercase, k=10))
+table_name = generate_rand_str()
destination_table = f"{dataset_id}.{table_name}"
yield destination_table

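The review thread above asks whether these test datasets get cleaned up; the author's answer is the delete_dataset(..., delete_contents=True) call at the top of the fixture. For context, here is a minimal sketch of how the whole fixture could fit together, assuming the standard pytest and google-cloud-bigquery APIs. It is not the PR's exact code, and the post-yield teardown is an assumption, since that part of the file is collapsed in this excerpt.

import random
import string

import pytest
from google.api_core import exceptions as api_exceptions
from google.cloud import bigquery


def generate_rand_str(length: int = 10) -> str:
    # Random lowercase suffix so concurrent CI runs don't collide on dataset names.
    return "".join(random.choices(string.ascii_lowercase, k=length))


@pytest.fixture
def gbq_dataset():
    # Assumes project and credentials are picked up from the environment.
    client = bigquery.Client()
    dataset_id = "pydata_pandas_bq_testing_" + generate_rand_str()
    dataset_ref = client.dataset(dataset_id)

    try:
        # Clean up anything left behind by a previous, interrupted run.
        client.delete_dataset(dataset_ref, delete_contents=True)
    except api_exceptions.NotFound:
        pass  # It's OK if the dataset doesn't already exist.

    # Create a fresh, uniquely named dataset for this run.
    client.create_dataset(bigquery.Dataset(dataset_ref))

    table_name = generate_rand_str()
    yield f"{dataset_id}.{table_name}"

    # Teardown (assumed): drop the dataset and any tables the test created.
    client.delete_dataset(dataset_ref, delete_contents=True)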