to_gbq fails when trying to save a Pandas datetime64[ns] to a BQ DATE field #362


Closed
swt2c opened this issue Apr 16, 2021 · 4 comments · Fixed by #420
Labels
- api: bigquery — Issues related to the googleapis/python-bigquery-pandas API.
- priority: p2 — Moderately-important priority. Fix may not be included in next release.
- 🚨 This issue needs some love.
- type: bug — Error or flaw in code with unintended results or allowing sub-optimal usage patterns.

Comments

swt2c commented Apr 16, 2021

```
Traceback (most recent call last):
  File "/tmp/test-bq_20210416_36390b15/test_bq.py", line 8, in <module>
    df.to_gbq(
  File "/opt/conda/default/lib/python3.8/site-packages/pandas/core/frame.py", line 1710, in to_gbq
    gbq.to_gbq(
  File "/opt/conda/default/lib/python3.8/site-packages/pandas/io/gbq.py", line 211, in to_gbq
    pandas_gbq.to_gbq(
  File "/opt/conda/default/lib/python3.8/site-packages/pandas_gbq/gbq.py", line 1093, in to_gbq
    connector.load_data(
  File "/opt/conda/default/lib/python3.8/site-packages/pandas_gbq/gbq.py", line 580, in load_data
    self.process_http_error(ex)
  File "/opt/conda/default/lib/python3.8/site-packages/pandas_gbq/gbq.py", line 380, in process_http_error
    raise GenericGBQException("Reason: {0}".format(ex))
pandas_gbq.gbq.GenericGBQException: Reason: 400 Error while reading data, error message: Could not parse '2021-04-17 00:00:00.000000' as DATE for field for_date (position 1) starting at location 0  with message 'Unable to parse'
```

I can work around this by converting the datetime64[ns] column to dates with .dt.date, but it seems like pandas-gbq should be able to figure this out on its own when I supply a schema that declares the column as 'DATE'.
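A minimal sketch of that workaround (the column name matches the traceback; the data values are illustrative, not from the original report):

```python
import pandas as pd

# A datetime64[ns] column serializes as a full timestamp string
# ("2021-04-17 00:00:00.000000"), which BigQuery rejects for a DATE field.
df = pd.DataFrame({"for_date": pd.to_datetime(["2021-04-17"]), "value": [1]})
assert str(df["for_date"].dtype) == "datetime64[ns]"

# Workaround: convert to plain Python date objects before calling to_gbq.
df["for_date"] = df["for_date"].dt.date
print(df["for_date"].iloc[0])  # 2021-04-17
```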


tebesfinwo commented Apr 28, 2021

I encountered the same problem. I also tried stating the schema explicitly, as follows, but to no avail.

```python
pandas_gbq.to_gbq(df, "table", "project_id", table_schema=[{"name": "foo", "field": "DATE"}])
```
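As an aside, pandas-gbq's table_schema entries use the key "type" (not "field") for the BigQuery type, though the original reporter hit the bug even with a correctly supplied schema. A corrected call would look like this (table and project names are placeholders):

```python
# table_schema maps each column name to a BigQuery type via
# "name" and "type" keys.
table_schema = [{"name": "foo", "type": "DATE"}]

# The network call is commented out since it needs real credentials:
# pandas_gbq.to_gbq(df, "dataset.table", "project_id", table_schema=table_schema)
print(table_schema[0]["type"])  # DATE
```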

@RomikimoR

I am facing the exact same situation with a byte field.

@parthea parthea added type: bug Error or flaw in code with unintended results or allowing sub-optimal usage patterns. priority: p2 Moderately-important priority. Fix may not be included in next release. labels Jul 17, 2021
@product-auto-label product-auto-label bot added the api: bigquery Issues related to the googleapis/python-bigquery-pandas API. label Jul 17, 2021
@yoshi-automation yoshi-automation added the 🚨 This issue needs some love. label Nov 3, 2021

tswast commented Nov 10, 2021

I recently did some work with the db-dtypes package, which has a well-defined serialization for parquet (now the default upload format). I'll have to do some investigation to see what it does with CSV serialization.

I agree that when the table_schema is provided, we could use that as a hint when serializing the data to write to BigQuery.
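A hypothetical sketch of that hint logic in plain pandas (the function name and structure are invented for illustration; this is not pandas-gbq's actual implementation):

```python
import pandas as pd
from pandas.api.types import is_datetime64_ns_dtype

def apply_schema_hints(df, table_schema):
    """Cast datetime64[ns] columns that the user-supplied schema declares
    as DATE down to plain dates before serialization."""
    declared = {field["name"]: field.get("type", "").upper() for field in table_schema}
    out = df.copy()
    for col in out.columns:
        if declared.get(col) == "DATE" and is_datetime64_ns_dtype(out[col]):
            out[col] = out[col].dt.date
    return out

df = pd.DataFrame({"for_date": pd.to_datetime(["2021-04-17"])})
hinted = apply_schema_hints(df, [{"name": "for_date", "type": "DATE"}])
print(hinted["for_date"].iloc[0])  # 2021-04-17
```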

@tswast tswast self-assigned this Nov 10, 2021
@tswast tswast removed the 🚨 This issue needs some love. label Nov 10, 2021
@yoshi-automation yoshi-automation added the 🚨 This issue needs some love. label Nov 10, 2021

tswast commented Nov 10, 2021

Looks like this was fixed by #413, but I'll keep this open while I add some system tests for this issue.


6 participants