As of January 2020, GLTG has 32,122,836 datapoints. Please don't fetch all of them at once.
Step 1: Create an Account
- Go to: https://greatlakestogulf.org/geostreams
- Click "Sign Up"
- Fill out the form
- You can ignore the confirmation email
- You're done
Step 2: Acquire Data from the API Using curl
You can acquire data from the API with the curl command or with the Python library (an example Jupyter notebook is attached).
Using curl
Get all Sensors in JSON format
Currently, pulling sensors does not require authentication.
Output type: JSON

Example output (collapsed in the original page):

```json
{
  "sensors": [
    {
      "id": 1445,
      "name": "03254520",
      "created": "2018-03-23T15:48:32Z",
      "geoType": "Feature",
      "geometry": {
        "type": "Point",
        "coordinates": [-84.44799549, 38.9203417, 0]
      },
      "properties": {
        "name": "03254520",
        "huc": {
          "huc8": {"code": "05100101"},
          "huc2": {"code": "05"},
          "huc4": {"code": "0510"},
          "huc6": {"code": "051001"},
          "huc_name": "Licking"
        },
        "region": "0510",
        "location": "LICKING RIVER AT HWY 536 NEAR ALEXANDRIA, KY",
        "type": {
          "title": "United States Geological Survey",
          "network": "NWIS",
          "id": "usgs"
        },
        "popupContent": "03254520",
        "online_status": "online",
        "id": 1445
      },
      "min_start_time": "2007-10-01T06:00:00Z",
      "max_end_time": "2020-02-05T12:30:00Z",
      "parameters": [
        "discharge-ft3s",
        "discharge-ft3s-qc",
        "dissolved-oxygen-mgl",
        "dissolved-oxygen-mgl-qc",
        "nitrate-nitrite-as-n-mgl",
        "nitrate-nitrite-as-n-mgl-qc",
        "pH",
        "pH-qc",
        "specific-conductance-uScm",
        "specific-conductance-uScm-qc",
        "turbidity-fnu",
        "turbidity-fnu-qc",
        "water-temperature-c",
        "water-temperature-c-qc"
      ]
    },
    ...
  ]
}
```
|
```shell
curl -X GET --compressed https://greatlakestogulf.org/geostreams/api/sensors
```
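The sensors response above can be parsed with Python's standard library. A minimal sketch, where the sample record mirrors (a truncated version of) the example output rather than a live response, and the helper name is my own:

```python
import json

# Sample mirroring the /api/sensors example above, truncated to one sensor.
SAMPLE = json.loads("""
{"sensors": [
  {"id": 1445,
   "name": "03254520",
   "min_start_time": "2007-10-01T06:00:00Z",
   "max_end_time": "2020-02-05T12:30:00Z",
   "parameters": ["discharge-ft3s", "water-temperature-c"]}
]}
""")

def sensor_index(response):
    """Map sensor id -> parameter list for a /api/sensors response."""
    return {s["id"]: s["parameters"] for s in response["sensors"]}

print(sensor_index(SAMPLE))  # {1445: ['discharge-ft3s', 'water-temperature-c']}
```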
Authenticate
Send your account email and password; the response includes an X-Auth-Token header. Use this token when fetching datapoints.

```shell
curl -X POST -H 'Content-Type: application/json' -d '{"password": "****", "identifier": "email"}' --compressed -i https://greatlakestogulf.org/geostreams/api/authenticate
```
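The same authentication call can be made from Python's standard library. A minimal sketch, assuming the endpoint and headers documented above; the helper names are my own, and the live call is commented out because it needs a real account:

```python
import json
import urllib.request

API = "https://greatlakestogulf.org/geostreams/api"

def auth_request(email, password):
    """Build the POST request for /api/authenticate."""
    body = json.dumps({"identifier": email, "password": password}).encode()
    return urllib.request.Request(
        API + "/authenticate",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def token_from_headers(headers):
    """Pull X-Auth-Token (case-insensitive) out of response headers."""
    for name, value in headers.items():
        if name.lower() == "x-auth-token":
            return value
    return None

# Live call (requires a real account):
# with urllib.request.urlopen(auth_request("you@example.com", "****")) as resp:
#     token = token_from_headers(dict(resp.headers))
```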
Get All Datapoints for a Single Sensor
We request that users not pull all datapoints concurrently. Instead, pull datapoints in series, one sensor id at a time.
Output type: JSON. Include the X-Auth-Token header obtained from authentication.

Example output (collapsed in the original page):

```json
[
  {
    "id": 96556536,
    "created": "2019-09-27T20:45:42Z",
    "start_time": "2018-06-25T00:00:00Z",
    "end_time": "2018-06-25T00:00:00Z",
    "properties": {
      "nitrate-nitrite-inorganic-total-as-n-mgl": "4.16"
    },
    "type": "Feature",
    "geometry": {
      "type": "Point",
      "coordinates": [-90.645, 42.5408333, 0]
    },
    "stream_id": "28",
    "sensor_id": "22",
    "sensor_name": "IL_EPA_WQX-M-13"
  },
  ...
]
```
Get datapoints for a single sensor:

```shell
curl -X GET -H 'Content-Encoding: application/json' -H 'x-auth-token:token' --compressed 'https://greatlakestogulf.org/geostreams/api/datapoints?sensor_id=22&since=...'
```
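The serial pulling requested above can be sketched in Python. A minimal sketch, assuming the `/api/datapoints` endpoint and `sensor_id`/`since` parameters documented above; the function names, the pause length, and the injectable `fetch` callable are my own illustration:

```python
import time
from urllib.parse import urlencode

API = "https://greatlakestogulf.org/geostreams/api"

def datapoints_url(sensor_id, since=None):
    """Build the /api/datapoints URL for one sensor."""
    params = {"sensor_id": sensor_id}
    if since is not None:
        params["since"] = since
    return API + "/datapoints?" + urlencode(params)

def fetch_serially(sensor_ids, fetch, pause=1.0):
    """Pull sensors one at a time (never concurrently), pausing between calls."""
    results = {}
    for sid in sensor_ids:
        results[sid] = fetch(datapoints_url(sid))
        time.sleep(pause)
    return results

print(datapoints_url(22, "2018-06-25"))
# https://greatlakestogulf.org/geostreams/api/datapoints?sensor_id=22&since=2018-06-25
```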
Using Python Library (pyGeotemporal)
<< insert code example here>>
<< need to have instruction how to get pyGeotemporal >>
The Jupyter notebook example can be downloaded here: <link>
Jupyter Notebook
We have created a Jupyter notebook that lets a user run several common data-fetching methods, including getting sensors as CSV and getting datapoints as CSV or JSON.
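The "sensors as CSV" step from the notebook can be sketched with the standard `csv` module. A minimal sketch, assuming the `id`, `name`, and `properties.location` fields shown in the sensors example above; the function name and column selection are my own:

```python
import csv
import io

def sensors_to_csv(sensors):
    """Flatten sensor records into CSV text with id, name, and location."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["id", "name", "location"])
    for s in sensors:
        writer.writerow([s["id"], s["name"], s["properties"]["location"]])
    return buf.getvalue()

sample = [{"id": 1445, "name": "03254520",
           "properties": {"location": "LICKING RIVER AT HWY 536 NEAR ALEXANDRIA, KY"}}]
print(sensors_to_csv(sample))
```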
Install Jupyter
```shell
conda install -c conda-forge notebook
```
Or
```shell
pip install notebook
```
Download and Run Notebook
Download geostreams_jupyter.ipynb, move it to a directory of your choice, then run the following from a terminal in that directory:

```shell
jupyter notebook
```