We have created a Jupyter notebook that lets a user run several common data-fetching methods, including getting sensors as CSV and getting datapoints as CSV or JSON.
Install Jupyter with conda:

```
conda install -c conda-forge notebook
```

or with pip:

```
pip install notebook
```
Download the file geostreams_jupyter.ipynb, move it to a directory of your choice, and from a terminal in that directory run `jupyter notebook`.
task | curl | inputs | instructions | returns |
---|---|---|---|---|
get all sensors | `curl -X GET --compressed https://greatlakestogulf.org/geostreams/api/sensors` | | | |
authenticate | `curl -X POST -H 'Content-Type: application/json' -d '{"password": "****", "identifier": "email"}' --compressed -i https://greatlakestogulf.org/geostreams/api/authenticate` | | The response headers include `X-Auth-Token:` followed by the alphanumeric security token | X-Auth-Token |
get all datapoints for a single sensor | `curl -X GET -H 'Content-Type: application/json' -H 'x-auth-token: token' --compressed 'https://greatlakestogulf.org/geostreams/api/datapoints?sensor_id=22&since=2018-06-01'` | | Use the X-Auth-Token from the authenticate step | JSON |
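The authenticate-then-fetch flow in the table can be sketched in Python using only the standard library. The endpoint paths, headers, and request bodies are taken from the curl examples above; the helper function names are illustrative, not part of the Geostreams API:

```python
import json
import urllib.parse
import urllib.request

BASE = "https://greatlakestogulf.org/geostreams/api"

def get_sensors():
    """GET all sensors (no token required, per the first row of the table)."""
    with urllib.request.urlopen(f"{BASE}/sensors") as resp:
        return json.load(resp)

def authenticate(identifier: str, password: str) -> str:
    """POST credentials; the security token comes back in the X-Auth-Token header."""
    body = json.dumps({"identifier": identifier, "password": password}).encode()
    req = urllib.request.Request(
        f"{BASE}/authenticate",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.headers["X-Auth-Token"]

def datapoints_url(sensor_id: int, since: str) -> str:
    """Build the datapoints query URL used in the curl example."""
    query = urllib.parse.urlencode({"sensor_id": sensor_id, "since": since})
    return f"{BASE}/datapoints?{query}"

def get_datapoints(token: str, sensor_id: int, since: str):
    """GET all datapoints for one sensor, parsed from the JSON response."""
    req = urllib.request.Request(
        datapoints_url(sensor_id, since),
        headers={"x-auth-token": token},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example (needs valid credentials; makes live network calls):
# token = authenticate("email", "****")
# points = get_datapoints(token, sensor_id=22, since="2018-06-01")
```

The token returned by `authenticate` can be reused across requests in the same session, which is how the notebook avoids re-sending credentials for each datapoint query.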