As of January 2020, GLTG has 32,122,836 datapoints.

Please do not fetch all of them at once.


Step 1: Create an Account 

Step 2: Acquire Data from the API

You can acquire data from the API using the curl command or the Python library; an example Jupyter notebook is attached.

Using CURL

Get all Sensors in JSON format

Currently, pulling sensors does not require authentication.

Inputs:
  • url
Output type: JSON



curl -X GET --compressed https://greatlakestogulf.org/geostreams/api/sensors
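
The same request can be made from Python. The sketch below is a minimal example that assumes the requests package (any HTTP client will do) and simply fetches the sensor list as JSON.

# Minimal sketch: fetch the full sensor list, assuming the requests package is installed.
import requests

SENSORS_URL = "https://greatlakestogulf.org/geostreams/api/sensors"

response = requests.get(SENSORS_URL)
response.raise_for_status()   # fail loudly on HTTP errors
sensors = response.json()     # list of sensor objects
print(len(sensors), "sensors returned")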


Authenticate

Inputs:
  • url
  • email
  • password
Output: X-Auth-Token
Details: Use the token for fetching datapoints.




curl -X POST -H 'Content-Type: application/json' -d '{"password": "****", "identifier": "email"}' --compressed -i https://greatlakestogulf.org/geostreams/api/authenticate
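
A Python equivalent is sketched below, again assuming the requests package. It also assumes the token comes back in the X-Auth-Token response header (which is why the curl example above uses -i to print headers); the credentials shown are placeholders.

# Minimal sketch: authenticate and capture the X-Auth-Token for later requests.
import requests

AUTH_URL = "https://greatlakestogulf.org/geostreams/api/authenticate"

credentials = {
    "identifier": "you@example.com",   # placeholder: the email address for your account
    "password": "****",                # placeholder: your password
}

response = requests.post(AUTH_URL, json=credentials)
response.raise_for_status()
# Assumption: the API returns the token in the X-Auth-Token response header.
token = response.headers.get("X-Auth-Token")
print("Got token:", bool(token))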


Get all Datapoints for a Single Sensor

Please do not pull all datapoints concurrently. Pull datapoints in series, one sensor id at a time (a Python sketch of this pattern follows the curl example below).

Inputs:
  • token
  • sensor_id
  • since
Output type: JSON
Details: Use the X-Auth-Token from authentication when fetching datapoints.
Example return:

[
   {
      "id":96556536,
      "created":"2019-09-27T20:45:42Z",
      "start_time":"2018-06-25T00:00:00Z",
      "end_time":"2018-06-25T00:00:00Z",
      "properties":{
         "nitrate-nitrite-inorganic-total-as-n-mgl":"4.16"
      },
      "type":"Feature",
      "geometry":{
         "type":"Point",
         "coordinates":[ -90.645, 42.5408333, 0 ]
      },
      "stream_id":"28",
      "sensor_id":"22",
      "sensor_name":"IL_EPA_WQX-M-13"
   },
   ...
]


curl -X GET -H 'Accept: application/json' -H 'x-auth-token:token' --compressed 'https://greatlakestogulf.org/geostreams/api/datapoints?sensor_id=22&since=2018-06-01'
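
To honor the request above, the sketch below pulls datapoints one sensor at a time rather than concurrently. It assumes the requests package, reuses the token captured during authentication, and uses illustrative sensor ids and an illustrative since date.

# Minimal sketch: pull datapoints sensor by sensor, in series, never concurrently.
import requests

DATAPOINTS_URL = "https://greatlakestogulf.org/geostreams/api/datapoints"
token = "..."                 # the X-Auth-Token obtained from the authenticate step
sensor_ids = [22, 23, 24]     # illustrative ids; take real ids from the /sensors endpoint

all_datapoints = {}
for sensor_id in sensor_ids:  # one sensor at a time, as requested above
    response = requests.get(
        DATAPOINTS_URL,
        params={"sensor_id": sensor_id, "since": "2018-06-01"},
        headers={"x-auth-token": token},
    )
    response.raise_for_status()
    all_datapoints[sensor_id] = response.json()
    print("sensor", sensor_id, ":", len(all_datapoints[sensor_id]), "datapoints")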





Using Python Library (pyGeotemporal)

<< insert code example here>>

<< need to add instructions on how to get pyGeotemporal >>




Jupyter Notebook 

The Jupyter notebook example can be downloaded here: geostreams_jupyter.ipynb