Feature Request: Add support for 'Allow Large Results' to BigQuery connector #15
xref pandas-dev/pandas#10474
gbq.py currently returns an error if the result of a query is what Google considers to be 'large'. The Google BigQuery API allows query jobs to be submitted with a flag that permits large results. It would be very beneficial to provide this as an option in the BigQuery connector.
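A hedged sketch of that flag using the current google-cloud-python BigQuery client (the project, dataset, and table names are placeholders, not from this thread): legacy SQL honors allowLargeResults only when a destination table is also supplied.

```python
from google.cloud import bigquery

client = bigquery.Client(project='my-project')  # placeholder project ID

# For legacy SQL, allowLargeResults must be paired with an explicit
# destination table that will hold the oversized result set.
job_config = bigquery.QueryJobConfig(
    allow_large_results=True,
    use_legacy_sql=True,
    destination=bigquery.TableReference.from_string(
        'my-project.my_dataset.my_temp_table'  # placeholder table
    ),
)
rows = client.query(
    'SELECT * FROM [my-project:my_dataset.my_table]',
    job_config=job_config,
).result()
```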
People have posted some pretty elaborate workarounds on Stack Overflow: https://stackoverflow.com/questions/34201923/python-bigquery-allowlargeresults-with-pandas-io-gbq/34203369
This can now be done in open PR #25 by passing it via a configuration setting. This uses the new google-cloud-python API.
@jasonqng you also have to add a destinationTable.
Actually, I think the current API does support this, even without #25. The read_gbq function accepts a configuration argument whose contents are passed through as the BigQuery query job configuration.
Standard SQL:
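A minimal sketch, assuming the configuration keyword described above; the project and table names are placeholders:

```python
import pandas as pd

# With standard SQL, large results are always allowed, so selecting the
# standard dialect in the job configuration is all that is needed.
df = pd.read_gbq(
    'SELECT * FROM `my-project.my_dataset.my_table`',  # placeholder table
    project_id='my-project',                           # placeholder project
    configuration={'query': {'useLegacySql': False}},
)
```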
Legacy SQL:
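And the legacy SQL equivalent, a sketch under the same placeholder names, where allowLargeResults must be paired with a destinationTable:

```python
import pandas as pd

# Legacy SQL permits large results only when allowLargeResults is set
# and a destination table is supplied to hold the query output.
configuration = {
    'query': {
        'useLegacySql': True,
        'allowLargeResults': True,
        'destinationTable': {
            'projectId': 'my-project',   # placeholder project
            'datasetId': 'my_dataset',   # placeholder dataset
            'tableId': 'my_temp_table',  # placeholder table
        },
    }
}
df = pd.read_gbq(
    'SELECT * FROM [my-project:my_dataset.my_table]',
    project_id='my-project',
    configuration=configuration,
)
```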
Admittedly this is a bit onerous to do. We may wish to provide a friendlier interface for options such as these.
The updated answer on Stack Overflow suggests simply using dialect='standard', as tswast did, and notes that per the allowLargeResults documentation: "For standard SQL queries, this flag is ignored and large results are always allowed." This worked for me, but maybe it is not generic.
I'm glad that worked for you. I believe there may be some size threshold where a destination table is required, even with standard SQL, but perhaps the threshold is larger than it was for legacy SQL.
Closing, as this can be passed in via the configuration argument.