- V2: Max fixed some bugs.
- Downloading still needs some work, perhaps because multiple instances were running. With FastAPI, we can provide a signature (model) to an endpoint and it will handle the JSON conversion for you. Tested the front end in Docker Compose - it seemed to be working!
- Adding notes to the README.
- We are using FastAPI, which provides the Swagger docs.
- We need to create datasets, and this has been tested.
- Use 2.0-Alpha.1 as a temporary name for Alpha 1, to distinguish between V1 and V2.
- Mike L notes that the PR does not have good support for downloading file blobs - the generated client assumes that we are working with JSON or text and does not return binary data. Options are to look at a different generator or to use non-generated code for now.
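One stop-gap for the blob issue above is to bypass the generated client for downloads and fetch raw bytes directly. A sketch using only the standard library (the function name and any endpoint URL you pass in are hypothetical, not the actual Clowder API):

```python
import urllib.request

def download_blob(url: str, dest_path: str) -> int:
    """Fetch a file as raw bytes, sidestepping the generated client's
    JSON/text assumption. Returns the number of bytes written.
    The URL is supplied by the caller; no API path is assumed here."""
    with urllib.request.urlopen(url) as resp:
        data = resp.read()  # raw bytes, no JSON decoding attempted
    with open(dest_path, "wb") as f:
        f.write(data)
    return len(data)
```

This keeps the generated client for everything JSON-shaped while handling binary downloads with plain HTTP, until a generator with proper binary support is found.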
- The issue is with using a generated client to return binary blobs.
- Working on how to keep the client in sync: put together a GitHub Action that checks out the current FastAPI code, runs the latest Docker image, and generates the client code.
- Can we watch other GitHub Actions? For extractors, a workflow runs on a schedule (once a day, once an hour, etc.) to check for merges, or an access token can be used to commit to another repository. However, if you're not careful, the secret token could be compromised. It would be best to run a cron job in the front end; GitHub Actions can run cron jobs: https://github.com/clowder-framework/extractors-core/blob/master/.github/workflows/pyclowder.yml. If you push a complex job on Friday night, it will be updated over the weekend, and on Monday there will be a completely updated version.
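A scheduled client-regeneration workflow of the kind discussed above could be sketched as follows. The schedule, file paths, and generator invocation are placeholders, not the actual setup (the linked pyclowder workflow is the real reference):

```yaml
name: sync-generated-client

on:
  schedule:
    - cron: "0 2 * * *"   # once a day at 02:00 UTC
  workflow_dispatch: {}    # also allow manual runs

jobs:
  regenerate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Hypothetical step: regenerate the client from the OpenAPI spec
      # using the openapi-generator Docker image.
      - name: Generate client
        run: |
          docker run --rm -v "$PWD:/local" \
            openapitools/openapi-generator-cli generate \
            -i /local/openapi.json -g python -o /local/client
```

Using a schedule in the client repository itself avoids granting a cross-repository access token, which addresses the secret-leak concern raised above.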
- Todd is going to be 25% on Clowder after the new year. The PI should fill in the form themselves, including the CFOP, but the developer should help estimate how much space will be needed in Clowder, Radiant usage, etc.
- Releasing V2 as 2.0-Alpha.1 this afternoon. Congrats to all! There will be differences in the backend.
- Mike L.: WIP GitHub Action to sync OpenAPI spec changes with the generated client code (still testing / trying to think of better patterns). Rob says to look into turning this into a cron job Action.
- Pull requests open:
  - edit and increment file
  - upload multiple files