Please add async/await support #176
Thanks for raising this issue. I agree it would be useful. This is a nontrivial project though, namely because we use urllib3 for our HTTP handling and urllib3 doesn't support async. In the short term you can work around this, e.g. by dispatching blocking queries to worker threads.

I'm self-assigning this issue for now. I'd like to see more signal from other users so we can gauge the demand for this feature. Single-threaded usage accounts for the majority of customer feature requests for this connector (the connector is 18 months old and this request has only come up once before). The more people ask for it, the easier it will be to prioritise.
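For readers looking for that interim thread-based workaround, here is a minimal sketch. The `run_query` function below is a stand-in for a blocking call such as a cursor execute; any resemblance to the connector's real API is assumed, not shown.

```python
import asyncio
import time

def run_query(sql: str) -> str:
    # Stand-in for a blocking driver call (e.g. cursor.execute);
    # the real databricks-sql call is not shown here.
    time.sleep(0.1)
    return f"result of {sql!r}"

async def main() -> list[str]:
    # asyncio.to_thread moves each blocking call onto a worker thread,
    # so the event loop stays free while queries run concurrently.
    return await asyncio.gather(
        asyncio.to_thread(run_query, "SELECT 1"),
        asyncio.to_thread(run_query, "SELECT 2"),
    )

results = asyncio.run(main())
```

`asyncio.gather` preserves argument order, so `results` lines up with the queries as submitted. This keeps the driver itself synchronous; it only prevents one slow query from blocking the event loop.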
Similar request here. The key concern is handling the timeout when the cluster boots up.
I need to run 30 async queries in parallel to check whether they parse in Unity Catalog, treating a query as parseable if it completes within a 1-minute timeout. I'm using
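That use case (N parallel checks, each bounded by a timeout) can be sketched with `asyncio.wait_for`. The parse check below is simulated with a sleep; in practice it would be a round-trip to the warehouse, which is an assumption, not the connector's API.

```python
import asyncio

async def check_parses(sql: str) -> bool:
    # Returns True if the (simulated) parse check finishes in time.
    async def fake_parse_check() -> None:
        await asyncio.sleep(0.01)  # stand-in for a parse/EXPLAIN round-trip

    try:
        # Treat a query as parseable only if the check beats the timeout.
        await asyncio.wait_for(fake_parse_check(), timeout=60)
        return True
    except asyncio.TimeoutError:
        return False

async def main() -> list[bool]:
    queries = [f"SELECT {i}" for i in range(30)]
    # Launch all 30 checks concurrently; results keep submission order.
    return await asyncio.gather(*(check_parses(q) for q in queries))

flags = asyncio.run(main())
```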
@vhrichfernandez that error is pretty straightforward; there's a related StackOverflow answer covering it. As for the topic of this issue: we're actively planning the introduction of both async/await and a blocking-but-async execution method for this connector.
Adding the following code to my module:

```python
import pickle, copyreg, ssl

def save_sslcontext(obj):
    return obj.__class__, (obj.protocol,)

copyreg.pickle(ssl.SSLContext, save_sslcontext)

context = ssl.create_default_context()
```

results in the following error:
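For context on that reducer: `ssl.SSLContext` is not picklable by default, and registering the reducer above makes pickling succeed but discards everything except the protocol (loaded CA certificates, ciphers, and options on the original context do not survive the round trip). A minimal sketch of that behaviour; the error the commenter actually hit is not reproduced here:

```python
import pickle, copyreg, ssl

def save_sslcontext(obj):
    # Reducer: rebuild the context from its protocol alone. Loaded CA
    # certs, ciphers, and option flags are NOT carried across.
    return obj.__class__, (obj.protocol,)

copyreg.pickle(ssl.SSLContext, save_sslcontext)

ctx = ssl.create_default_context()
restored = pickle.loads(pickle.dumps(ctx))
```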
I also would like to have async support
Same here
Also interested in async/await support.
Our team is also interested
Thanks for the signal, everyone. This feature is now being developed. I'll post updates on this issue as we move closer to release. Pull requests implementing this behaviour should begin to pop up in the next couple of weeks.
Hello @susodapop, could you please share any updates on this?
Hello @susodapop, an InMobi customer and MSFT have been following up on this. Could you please let me know whether this issue has been fixed?
Echoing the interest here. This would be a huge add! |
I have a requirement for this: I'm building an LLM web app with 400+ concurrent users.
Just chiming in here that I'd also really like async support. Using AnyIO would automatically give you support for both async frameworks (asyncio and Trio) and has the additional benefit of providing a clean structured-concurrency (SC) API, which should be easier to develop against.
@susodapop checking in to see if you can share the latest status/ETA? |
Hi there! Any updates on this? |
Adding async support is not trivial. We need to prioritize and do the design. There is no ETA to provide at this time. |
Also interested in async/await support 🙏🏻 |
Hi @yunbodeng-db, I understand async support is nontrivial, but is this even being considered at this time?
Not the async APIs at this moment, but it's possible to expose an async handler for the client to poll the status of a long-running query. I cannot provide an ETA yet.
@thetadweller We have added support for the async handler in v3.7.0 cc @deeksha-db |
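The async-handler flow described above (submit a query, poll its state, then fetch the result) follows a generic polling pattern. The helper below sketches that pattern with an injected status function; the state names and the connector's actual v3.7.0 method names are assumptions, not quotes from its API.

```python
import time
from typing import Callable

def poll_until_done(get_state: Callable[[], str],
                    interval: float = 0.01,
                    timeout: float = 5.0) -> str:
    # Call get_state() until it reports a terminal state or the deadline
    # passes. The terminal-state names here are illustrative.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state = get_state()
        if state in ("SUCCEEDED", "FAILED", "CANCELED"):
            return state
        time.sleep(interval)
    raise TimeoutError("query did not reach a terminal state")

# Usage with a fake status source that succeeds on the third poll:
states = iter(["PENDING", "RUNNING", "SUCCEEDED"])
final = poll_until_done(lambda: next(states))
```

This keeps the waiting loop in the caller's control, which is what makes a "blocking but async execution" handler useful even before true async/await lands.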
I don't think this should have been closed, as #485 is easily reproducible. The async methods don't appear to work.
I am writing a web app that needs to run multiple concurrent queries against a Databricks SQL Warehouse. Because the existing library is synchronous, my processes get blocked for the duration of each SQL query, so subsequent calls from other clients end up queued. As a result, I'm forced to run multiple Python processes to handle concurrent calls, even though they are all I/O-bound and could have been handled by a handful of processes had I been able to write queries using async/await.
I tried to find a workaround using SQLAlchemy and async I/O wrappers, but it returned a message that the connection is not asynchronous:

```
InvalidRequestError: The asyncio extension requires an async driver to be used. The loaded 'databricks-sql-python' is not async.
```