Python access to InfluxDB parameters via JSON file #444
Hi @rslippert, thanks for using our client. How is your JSON file generated? Regards
Did not know about influx auth create, I will check that out. The point is to provide a simple point of entry for access to InfluxDB data:
with InfluxDBClient(access='stock_data_rslippert.json') as client:
As an alternative:
with InfluxDBClient(access=token) as client:
I think a JSON file would be better, because you could add things to it without the user needing to know about them.
Yes, the JSON file would be an alternative to the creation of a token.
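A minimal sketch of the idea discussed above, assuming a hypothetical access= entry point implemented outside the library as a small factory function (the open_client helper, the access parameter, and the influx_* JSON keys are illustrative, not part of the client API):

import json
import os

from influxdb_client import InfluxDBClient


def open_client(access):
    # Hypothetical helper: accept either a raw token string or a path to a
    # JSON file that carries url/org/token, as discussed in this thread.
    if os.path.isfile(access):
        with open(access) as jfile:
            params = json.load(jfile)
        return InfluxDBClient(url=params['influx_url'],
                              token=params['influx_token'],
                              org=params['influx_org'])
    # Otherwise treat the argument as a token; url and org are assumed here.
    return InfluxDBClient(url='http://localhost:8086', token=access, org='my-org')


with open_client('stock_data_rslippert.json') as client:
    ...  # (more code here)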
Iceberg/Sagent is Artificial General Intelligence (AGI).
with InfluxDBClient(context='stock_data_rslippert') as client:
Sorry for so many words; I am writing the AGI "Sagent Standard" documentation: all processes will soon operate using smart agents (Sagents) that collect and control data within the process.
@rslippert thanks for the explanation. Currently the client supports initialisation from a config file:
with InfluxDBClient.from_config_file('./config.ini') as client:
    ...
The initialisation from a JSON context file could be a next option, something like:
with InfluxDBClient.from_json_file('./stock_data_rslippert.json') as client:
    ...
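Until something like from_json_file exists in the client, the proposal can be emulated with a thin wrapper. This is only a sketch; the from_json_file name and the influx_* JSON keys are assumptions taken from this thread, not current library API:

import json

from influxdb_client import InfluxDBClient


class JsonConfigClient(InfluxDBClient):
    @classmethod
    def from_json_file(cls, path):
        # Load the proposed JSON context file and map its keys onto the
        # constructor arguments the client already understands.
        with open(path) as jfile:
            cfg = json.load(jfile)
        return cls(url=cfg['influx_url'],
                   token=cfg['influx_token'],
                   org=cfg['influx_org'])


with JsonConfigClient.from_json_file('./stock_data_rslippert.json') as client:
    ...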
@bednar thanks. The important issue is automatic generation of this access data from the InfluxDB point of access itself. JSON is more suited to processes (API) and external access; a config file is more suited to an application (APP) and internal access. You should also add "timezone" to that config.ini to allow simple pandas date formats.
It looks like it already supports .ini and .toml formats; just add .json to that.
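For reference, the .ini file that from_config_file reads uses an [influx2] section like the sketch below; the timezone key is the addition suggested above and is not read by the current client, so it is shown purely as an illustration:

[influx2]
url=http://localhost:8086
org=my-org
token=my-token
timeout=6000
; suggested addition from this thread, not currently read by the client:
timezone=EST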
The generation of the access configuration is already present in the influx CLI. The format of influx config --json is:
{
"url": "http://localhost:8086",
"token": "my-token",
"org": "my-org",
"active": true
}
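As a sketch of how that CLI output could be consumed without pasting magic numbers, the snippet below shells out to influx config --json and feeds the result to the Python client; the bucket still has to come from elsewhere, and the command and flag are taken from the comment above rather than verified here:

import json
import subprocess

from influxdb_client import InfluxDBClient

# Read the active CLI configuration (url/token/org) as JSON, as shown above.
raw = subprocess.run(['influx', 'config', '--json'],
                     check=True, capture_output=True, text=True).stdout
cfg = json.loads(raw)

with InfluxDBClient(url=cfg['url'], token=cfg['token'], org=cfg['org']) as client:
    ...  # query or write using the client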
Proposal:
Access to InfluxDB parameters (url, token, org, bucket) is unnecessarily difficult: they have to be looked up and used as magic numbers.
InfluxDB could save all of those parameters to a JSON file, which would allow multiple use cases for multiple datasets.
Access to the correct parameters for a given dataset would then come from dictionary items.
Current behavior:
url = "http://localhost:8086" # must find an these paste magic numbers into my code here
token = "my-token" #token generated online and pasted here
bucket = "my-bucket"
org = "my-org"
with InfluxDBClient( url=url, token=token, org=org) as client:
(more code here)
Desired behavior:
import json

from influxdb_client import InfluxDBClient

with open('stock_data_rslippert.json') as jfile:
    my = json.load(jfile)

with InfluxDBClient(url=my['influx_url'], token=my['influx_token'], org=my['influx_org']) as client:
    ...  # (more code here)
Alternatives considered:
There would need to be some other way to automate collection of the parameters without magic numbers.
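One such alternative that already exists in the client is environment-variable initialisation; a minimal sketch, assuming the standard INFLUXDB_V2_* variables have been exported in the shell:

from influxdb_client import InfluxDBClient

# Reads INFLUXDB_V2_URL, INFLUXDB_V2_ORG and INFLUXDB_V2_TOKEN from the
# environment, so no magic numbers appear in the source code.
with InfluxDBClient.from_env_properties() as client:
    ...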
Use case:
Rather than searching for and copy-pasting parameters as magic numbers for many different use cases,
use a method similar to generating a token, but write all needed parameters to an easy-to-use JSON file.
This simplifies access to the correct parameters, i.e. it automates access for many different use cases.
This is useful because engineers will be working on many different datasets, but the code should stay the same:
rather than pasting in magic numbers, access to the data is automatic; just provide a JSON filename as a reference.
The JSON file would look like this:
{
"influx_bucket": "rslippert's Bucket",
"influx_org": "d9ae8eef6f1bd6da",
"influx_token": "E3zGsMqgMrDYthxjfT921FXhipsOfGOIOxpcfvn61SnBE4D1mdgYKF5aYefaCbroy0UpBIsOEEMZr2XM96dtFg==",
"influx_url": "https://us-east-1-1.aws.cloud2.influxdata.com",
"timezone": "EST",
"influxdb_version": "1.0"
}
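As a usage sketch of such a file, including the timezone field suggested earlier for pandas date handling, the example below loads the JSON, runs a query into a pandas DataFrame, and converts the timestamps; the Flux query string is illustrative only, and a query returning several tables may yield a list of DataFrames instead:

import json

from influxdb_client import InfluxDBClient

with open('stock_data_rslippert.json') as jfile:
    my = json.load(jfile)

with InfluxDBClient(url=my['influx_url'], token=my['influx_token'], org=my['influx_org']) as client:
    # Query the bucket named in the JSON file into a pandas DataFrame.
    query = f'from(bucket: "{my["influx_bucket"]}") |> range(start: -1d)'
    df = client.query_api().query_data_frame(query)
    # InfluxDB timestamps come back in UTC; convert them to the configured timezone.
    df['_time'] = df['_time'].dt.tz_convert(my['timezone'])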