The Globus Python SDK offers a number of useful operations, but it does not currently support directly downloading files from a remote endpoint. At a minimum, we will need a Globus Connect Personal client running to which the transfer can be initiated. Fortunately, a Docker image is already available that runs this client in a headless configuration.

Here is an initial suggestion for how to architect support for Globus transfers in Workbench.

  1. Workbench UI has a button to launch Globus Connect Personal.
    1. Generate a name for this endpoint (a GUID, or maybe just 'nds-workbench'?)
    2. Create the Globus endpoint by calling CreateEndpoint and store the resulting setup key
    3. The resulting key and name must be provided to the Workbench UI
    4. The API server launches an instance of docker-gcp with the provided key. It passes in the user's home directory as a mounted volume and tells GCP to use that directory to store incoming transfers
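Steps 1.2 through 1.4 could be sketched roughly as below. This is a minimal sketch, not a working implementation: the endpoint-creation payload shape (`is_globus_connect`) is assumed from the Globus Transfer API conventions and should be checked against the current API, and the `docker-gcp` image name and its `GCP_SETUP_KEY` / `GCP_DATA_DIR` environment variables are placeholders for whatever the headless image actually expects.

```python
def build_endpoint_payload(display_name):
    """Request body for the Transfer API endpoint-creation call (step 1.2).

    `is_globus_connect: True` asks the service for a Globus Connect
    Personal endpoint, whose response includes a setup key.  Payload
    shape is an assumption; verify against the current Transfer API.
    """
    return {
        "DATA_TYPE": "endpoint",
        "display_name": display_name,
        "is_globus_connect": True,
    }


def build_gcp_docker_command(setup_key, home_dir, image="docker-gcp"):
    """Assemble the `docker run` invocation for step 1.4.

    The image name and environment variable names are placeholders;
    substitute the real ones used by the headless GCP image.
    """
    return [
        "docker", "run", "-d",
        "-v", f"{home_dir}:/data",           # mount user's home dir for incoming transfers
        "-e", f"GCP_SETUP_KEY={setup_key}",  # setup key returned by endpoint creation
        "-e", "GCP_DATA_DIR=/data",          # tell GCP where to store transfers
        image,
    ]
```

The API server would call `build_gcp_docker_command` with the setup key from step 1.2 and the user's home directory, then launch the container.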
  2. User requests a transfer
    1. They specify the endpoint name and the path to the file
    2. Optionally, a relative path and filename for the destination
    3. A Python task is kicked off in the cluster? Or do we need to write a REST client in Go? It would be great to get back the task_id for monitoring...
    4. Poll the status of the task and show progress
    5. On success, the file will be sitting in the user's home directory in Gluster.
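The submit-and-poll flow in step 2 could look roughly like this. The `StubTransferClient` below is a stand-in so the control flow is self-contained and runnable; in the real implementation it would be replaced by calls through the Globus SDK, and the method names, field names, and status strings shown here are assumptions, not the SDK's actual API.

```python
import time


class StubTransferClient:
    """Stand-in for a real transfer client; returns canned responses."""

    def __init__(self):
        self._polls = 0

    def submit_transfer(self, src_endpoint, src_path, dst_endpoint, dst_path):
        # A real submission would return a task_id from the Transfer service.
        return {"task_id": "task-0001"}

    def get_task_status(self, task_id):
        # Pretend the task is ACTIVE on the first poll, then SUCCEEDED.
        self._polls += 1
        return "SUCCEEDED" if self._polls > 1 else "ACTIVE"


def run_transfer(client, src_endpoint, src_path, dst_path, poll_interval=0.0):
    """Submit a transfer to the user's GCP endpoint and poll until it finishes."""
    # Step 2.3: kick off the transfer and capture the task_id for monitoring.
    task_id = client.submit_transfer(src_endpoint, src_path,
                                     "nds-workbench", dst_path)["task_id"]
    # Step 2.4: poll the task status until it reaches a terminal state.
    while True:
        status = client.get_task_status(task_id)
        if status in ("SUCCEEDED", "FAILED"):
            return task_id, status
        time.sleep(poll_interval)


task_id, status = run_transfer(StubTransferClient(),
                               "remote#endpoint", "/data/file.txt",
                               "incoming/file.txt")
```

Whether this loop runs as a Python task in the cluster or is reimplemented as a Go REST client, the key point is that the submission returns a task_id that the UI can use to report progress.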