Code sample to reproduce problem

```python
from influxdb_client.client.query_api import QueryOptions
from influxdb_client.client.influxdb_client_async import InfluxDBClientAsync
import asyncio


async def main():
    client = InfluxDBClientAsync(
        url=INFLUX_SERVER,
        token=INFLUX_TOKEN,
        org=INFLUX_ORG,
        verify_ssl=False,
    )

    def callback(records):
        print(records.values)

    api = client.query_api(
        QueryOptions(
            profilers=["operator", "query"],
            profiler_callback=callback,
        )
    )

    flux = """
        import "json"
        from(bucket: "my_bucket")
            |> range(start: -1y)
            |> keep(columns: ["_time"])
            |> limit(n: 1)
    """

    async for line in await api.query_stream(flux):
        print(line)


if __name__ == "__main__":
    asyncio.run(main())
```
Expected behavior

Be able to use the query records and the profiling information without triggering an exception.
Actual behavior

Records and profiling information were printed to stdout, then the client raised:

```
Traceback (most recent call last):
  File "/tmp/minimal-influx-bug/bug.py", line 44, in <module>
    asyncio.run(main())
  File "/home/tim/.pyenv/versions/3.10.6/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/home/tim/.pyenv/versions/3.10.6/lib/python3.10/asyncio/base_events.py", line 646, in run_until_complete
    return future.result()
  File "/tmp/minimal-influx-bug/bug.py", line 39, in main
    async for line in await api.query_stream(flux):
  File "/home/tim/.pyenv/versions/newbackend-poc/lib/python3.10/site-packages/influxdb_client/client/flux_csv_parser.py", line 140, in _parse_flux_response_async
    for val in self._parse_flux_response_row(metadata, csv[0]):
  File "/home/tim/.pyenv/versions/newbackend-poc/lib/python3.10/site-packages/influxdb_client/client/flux_csv_parser.py", line 228, in _parse_flux_response_row
    flux_record = self.parse_record(metadata.table_index - 1, metadata.table, csv)
  File "/home/tim/.pyenv/versions/newbackend-poc/lib/python3.10/site-packages/influxdb_client/client/flux_csv_parser.py", line 264, in parse_record
    str_val = csv[fluxColumn.index + 1]
IndexError: list index out of range
```
Additional info

The sync client seems to be working fine.
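For reference, a minimal sketch of the synchronous equivalent (an illustration only, reusing the `INFLUX_*` settings and the `flux` query from the sample above):

```python
# Synchronous variant of the repro above; per this report it streams the
# records and fires the profiler callback without raising.
from influxdb_client import InfluxDBClient
from influxdb_client.client.query_api import QueryOptions

with InfluxDBClient(url=INFLUX_SERVER, token=INFLUX_TOKEN,
                    org=INFLUX_ORG, verify_ssl=False) as client:
    api = client.query_api(
        QueryOptions(
            profilers=["operator", "query"],
            profiler_callback=lambda records: print(records.values),
        )
    )
    for record in api.query_stream(flux):
        print(record)
```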
Hi @tlebrize,

thanks for using our client.

The problem is caused by parsing the profiler response into CSV:
```
#datatype,string,long,string,long,long,long,long,long,long,long,long,long,string,long,string,long
#group,false,false,true,false,false,false,false,false,false,false,false,false,false,false,false,false
#default,_profiler,,,,,,,,,,,,,,,
,result,table,_measurement,TotalDuration,CompileDuration,QueueDuration,PlanDuration,RequeueDuration,ExecuteDuration,Concurrency,MaxAllocated,TotalAllocated,RuntimeErrors,influxdb/scanned-values,flux/query-plan,influxdb/scanned-bytes
,,0,profiler/query,7889958,393500,70250,0,0,7352083,0,0,0,,0,"digraph { ""ReadRange2"" } ",0
```
The relevant code is in influxdb-client-python/influxdb_client/client/flux_csv_parser.py, line 133 at 940d5d3.
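To make the failure mode concrete, here is an illustrative sketch (not the client's actual code) of how a column index taken from table metadata can run past the end of a shorter row, matching the `IndexError` in the traceback:

```python
# Illustration only: parse_record reads each value as csv[fluxColumn.index + 1].
# If the current row has fewer cells than the column metadata declares
# (the kind of mismatch the profiler table appears to trigger), the lookup
# overruns the row.
row = ["", "_profiler", "0"]       # a row with fewer cells than expected
column_index = 15                  # index coming from the table metadata

try:
    value = row[column_index + 1]  # mirrors `str_val = csv[fluxColumn.index + 1]`
except IndexError as exc:
    print(exc)                     # "list index out of range"
```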
As a workaround you can use `await api.query_raw(flux)` for the profiling query.
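For example (a minimal sketch, assuming `query_raw` returns the raw annotated-CSV response as text, so the record parsing that fails above is bypassed):

```python
# Fetch the raw annotated CSV instead of streaming parsed FluxRecords.
raw_csv = await api.query_raw(flux)
print(raw_csv)  # includes both the result table and the profiler tables
```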
Regards
I have prepared a fixed version within #497. If you would like to use this fixed version before the regular release, please install the client via:

```
pip install git+https://github.com/influxdata/influxdb-client-python.git@fix-async-query
```
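To double-check which build ended up installed, an optional sanity check using only the standard library:

```python
# Print the installed distribution version; the branch build should report
# a dev/pre-release version rather than the latest PyPI release.
from importlib.metadata import version

print(version("influxdb-client"))
```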
Hello @bednar, thanks for the quick fix!