As of January 2020, GLTG has 32,122,836 datapoints.
WARNING: Please don't fetch all of them at once.
Step 1: Create an Account
- go to: https://greatlakestogulf.org/geostreams
- Click "Sign Up"
- Fill out form
- You can ignore the "check your email" prompt
- You're done
Step 2: Acquire Data from API
You can acquire data from the API using the `curl` command or the Python library (an example Jupyter notebook is attached).
Using CURL
Get all Sensors in JSON format
Currently, pulling sensors does not require authentication.
Inputs | Output type | Example |
---|---|---|
url | JSON | `curl -X GET --compressed https://greatlakestogulf.org/geostreams/api/sensors` |
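The same request can be made from Python with only the standard library. This is a sketch mirroring the curl example above; the base URL comes from that example, while the function names are illustrative, not part of any official client.

```python
import json
import urllib.request

# Base URL taken from the curl example above.
BASE_URL = "https://greatlakestogulf.org/geostreams/api"

def sensors_url(base_url=BASE_URL):
    """Build the sensors endpoint URL (the same one the curl example hits)."""
    return f"{base_url}/sensors"

def get_sensors(base_url=BASE_URL):
    """Fetch all sensors and decode the JSON response into Python objects."""
    with urllib.request.urlopen(sensors_url(base_url)) as resp:
        return json.load(resp)

# Example usage (makes a live request, so it may take a moment):
# sensors = get_sensors()
# print(len(sensors))
```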
task | curl | instructions | returns |
---|---|---|---|
authenticate | `curl -X POST -H 'Content-Type: application/json' -d '{"password": "****", "identifier": "email"}' --compressed -i https://greatlakestogulf.org/geostreams/api/authenticate` | The response includes an `X-Auth-Token:` header followed by the alphanumeric security token | `X-Auth-Token` |
get all datapoints for a single sensor | `curl -X GET -H 'Content-Encoding: application/json' -H 'x-auth-token:token' --compressed 'https://greatlakestogulf.org/geostreams/api/datapoints?sensor_id=22&since=2018-06-01'` | Use the `X-Auth-Token` from authentication as the `x-auth-token` header | JSON |
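The authenticate-then-fetch flow in the table can be sketched in Python with only the standard library. The endpoints, headers, and query parameters come from the curl commands above; the helper names are illustrative assumptions, not an official client API.

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://greatlakestogulf.org/geostreams/api"

def datapoints_url(sensor_id, since, base_url=BASE_URL):
    """Build the datapoints query URL used in the curl example."""
    query = urllib.parse.urlencode({"sensor_id": sensor_id, "since": since})
    return f"{base_url}/datapoints?{query}"

def authenticate(identifier, password, base_url=BASE_URL):
    """POST credentials and return the X-Auth-Token response header."""
    body = json.dumps({"identifier": identifier, "password": password}).encode()
    req = urllib.request.Request(
        f"{base_url}/authenticate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.headers["X-Auth-Token"]

def get_datapoints(token, sensor_id, since, base_url=BASE_URL):
    """Fetch all datapoints for one sensor, authenticated via x-auth-token."""
    req = urllib.request.Request(
        datapoints_url(sensor_id, since, base_url),
        headers={"x-auth-token": token},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example usage (replace with your real credentials):
# token = authenticate("email", "****")
# points = get_datapoints(token, sensor_id=22, since="2018-06-01")
# print(len(points))
```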
Using Python Library (pyGeotemporal)
<< insert code example here >>
<< need instructions on how to get pyGeotemporal >>
The Jupyter notebook example can be downloaded here: <link>
Jupyter Notebook
We have created a Jupyter notebook that lets a user run several common data-fetching methods, including getting sensors as CSV and getting datapoints as CSV or JSON.
Install Jupyter
conda install -c conda-forge notebook
Or
pip install notebook
Download and Run Notebook
Download the file geostreams_jupyter.ipynb, move it to a directory of your choice, and then run the following from a terminal in that directory:
jupyter notebook