As of January 2020, GLTG has 32,122,836 datapoints. Please don't fetch all of them at once.

You can acquire the data from the API using the curl command or the Python library (an example is provided in the attached Jupyter notebook).
Currently, pulling sensors does not require authentication.
| Inputs | Output type | Example request |
|---|---|---|
| url | JSON | `curl -X GET --compressed https://greatlakestogulf.org/geostreams/api/sensors` |
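The same sensors call can be made from Python. A minimal sketch using the `requests` library (the helper name and the injectable `session` parameter are our own additions for illustration, not part of an official client):

```python
import requests

# Base URL of the Great Lakes to Gulf Geostreams API
API = "https://greatlakestogulf.org/geostreams/api"

def get_sensors(session=None):
    """Fetch the full sensor list as parsed JSON; no authentication required."""
    s = session or requests          # allow injecting a session (e.g. for testing)
    resp = s.get(API + "/sensors")   # the response is compressed; requests decodes it
    resp.raise_for_status()          # fail loudly on HTTP errors
    return resp.json()
```

Calling `get_sensors()` returns the same JSON array the curl command above prints.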
| task | curl | inputs | instructions | returns |
|---|---|---|---|---|
| authenticate | `curl -X POST -H 'Content-Type: application/json' -d '{"password": "****", "identifier": "email"}' --compressed -i https://greatlakestogulf.org/geostreams/api/authenticate` | email, password | The response includes an `X-Auth-Token:` header followed by the alphanumeric security token | X-Auth-Token |
| get all datapoints for a single sensor | `curl -X GET -H 'Content-Type: application/json' -H 'x-auth-token: token' --compressed 'https://greatlakestogulf.org/geostreams/api/datapoints?sensor_id=22&since=2018-06-01'` | sensor_id, since | Use the `X-Auth-Token` from authentication | JSON |
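The two calls in the table above can be sketched in Python with the `requests` library. This is a minimal illustration, not an official client: the function names and the injectable `session` parameter are our own, and sensor 22 with the 2018-06-01 date are the example values from the curl command.

```python
import requests

# Base URL of the Great Lakes to Gulf Geostreams API
API = "https://greatlakestogulf.org/geostreams/api"

def authenticate(email, password, session=None):
    """POST credentials; the token comes back in the X-Auth-Token response header."""
    s = session or requests
    resp = s.post(API + "/authenticate",
                  json={"identifier": email, "password": password})
    resp.raise_for_status()
    return resp.headers["X-Auth-Token"]

def get_datapoints(token, sensor_id, since=None, session=None):
    """GET all datapoints for one sensor, optionally starting at `since`."""
    s = session or requests
    params = {"sensor_id": sensor_id}
    if since:
        params["since"] = since
    resp = s.get(API + "/datapoints",
                 params=params,
                 headers={"x-auth-token": token})
    resp.raise_for_status()
    return resp.json()

# Usage, matching the curl examples above:
# token = authenticate("email", "****")
# points = get_datapoints(token, 22, since="2018-06-01")
```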
The examples above can also be run through pyGeotemporal, the Python client for the Geostreams API; it can typically be installed with pip (e.g. `pip install pygeotemporal`), but check the pyGeotemporal project page for current installation instructions.
We have created a Jupyter notebook that lets a user run several common data-fetching methods, including getting sensors as CSV and getting datapoints as CSV or JSON. The notebook can be downloaded here: <link>
To run it, install Jupyter Notebook with conda:

```sh
conda install -c conda-forge notebook
```

or with pip:

```sh
pip install notebook
```

Download the file `geostreams_jupyter.ipynb`, move it to a directory of your choice, and start the notebook server from a terminal in that directory:

```sh
jupyter notebook
```