The Globus Python SDK offers a number of useful operations. At the moment, however, it does not allow the direct download of files from a remote endpoint. At a minimum, we will need a running Globus Connect Personal client to which the transfer can be initiated. Fortunately, a Docker image is already available that runs this client in a headless configuration.
Here is an initial suggestion for how to architect support for Globus transfers in Workbench.
- Workbench UI has a button to launch Globus Connect Personal.
- Generate a name for this endpoint (a GUID, or perhaps just 'nds-workbench'?).
- Create the Globus endpoint by calling CreateEndpoint and store the resulting setup key.
- The resulting key and name must be provided to the Workbench UI.
- The API server launches an instance of docker-gcp with the provided key. It passes in the user's home directory as a mounted volume and tells GCP to use that directory to store incoming transfers.
- User requests a transfer
- They specify the endpoint name and path to the file
- Optionally, a relative path and filename for the destination
- A Python task is kicked off in the cluster? Or do we need to write a REST client in Go? It would be great to get back the task_id for monitoring.
- Poll the status of the task and show progress
- If successful, the file will be sitting in the user's home directory in Gluster.
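The endpoint-creation and container-launch steps above could be sketched roughly as follows. This is only an illustration: the function names (`create_workbench_endpoint`, `build_gcp_command`), the Docker image name, the container paths, and the `GLOBUS_SETUP_KEY` environment variable are all assumptions, and the exact Transfer API call for creating a Globus Connect Personal endpoint may differ by SDK version.

```python
# Sketch only: function names, image name, mount paths, and env-var name
# below are hypothetical, not confirmed against docker-gcp or the SDK.

def build_gcp_command(setup_key, home_dir, image="ndslabs/docker-gcp"):
    """Build a `docker run` invocation for a headless Globus Connect
    Personal container, mounting the user's home directory as the
    destination for incoming transfers."""
    return [
        "docker", "run", "-d",
        "-v", f"{home_dir}:/home/globus/data",   # user's home dir (assumed container path)
        "-e", f"GLOBUS_SETUP_KEY={setup_key}",   # setup key from endpoint creation (assumed var name)
        image,
    ]

def create_workbench_endpoint(transfer_client, name="nds-workbench"):
    """Create a Globus Connect Personal endpoint via the Transfer API and
    return its id and setup key. `transfer_client` is assumed to be an
    authenticated globus_sdk.TransferClient; the method name and document
    fields may vary across SDK versions."""
    doc = transfer_client.create_endpoint({
        "DATA_TYPE": "endpoint",
        "display_name": name,
        "is_globus_connect": True,
    })
    return doc["id"], doc["globus_connect_setup_key"]
```

The API server would call `create_workbench_endpoint` once, persist the setup key, and then launch the container with `subprocess.run(build_gcp_command(key, home_dir))`.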
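The transfer-and-poll steps could look something like the sketch below. The `submit_workbench_transfer` and `poll_task` names are ours, and the `globus_sdk` call signatures shown (`TransferData`, `submit_transfer`, `get_task`) should be checked against the SDK version in use; the polling loop itself is SDK-agnostic.

```python
# Sketch of submitting a transfer and polling it to completion.
# globus_sdk is imported lazily so the module loads without the SDK installed.
import time

def submit_workbench_transfer(tc, src_endpoint, src_path, dest_endpoint, dest_path):
    """Submit a single-file transfer and return the task_id for monitoring.
    `tc` is assumed to be an authenticated globus_sdk.TransferClient."""
    import globus_sdk

    tdata = globus_sdk.TransferData(tc, src_endpoint, dest_endpoint,
                                    label="workbench transfer")
    tdata.add_item(src_path, dest_path)
    return tc.submit_transfer(tdata)["task_id"]

def poll_task(get_status, poll_interval=5, timeout=600):
    """Poll a transfer task until it reaches a terminal state.

    `get_status` is any callable returning the current status string,
    e.g. `lambda: tc.get_task(task_id)["status"]`, which reports
    ACTIVE, SUCCEEDED, or FAILED."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = get_status()
        if status in ("SUCCEEDED", "FAILED"):
            return status
        time.sleep(poll_interval)
    return "TIMEOUT"
```

Taking a status callable rather than a client keeps the polling loop independent of the SDK, so the same loop would work if the API server ended up calling the Transfer REST API from Go instead.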