


Discussion items


Started putting videos up on YouTube
Clowder All Paws from PEARC 2019 is done


Maintain DOI (Luigi): Someone published data to the SEAD web application.
Looking at an extractor's ability to publish data to the SEAD project. Suggested approach: an extractor, similar to the CKAN extractor, that publishes the data.

Elastic Search (Max): Treat everything as a string. Reindexing improvement planned for a future release, to avoid downtime.
Will merge the current Elasticsearch pull request.
Deletion of datasets, as it relates to Elasticsearch indexing, requires changes to the Clowder GUI.
Budgets for the RDA project: the new XML format will require new functionality to upload files in the new format back.
Fixed the swapping bugs; there is a pull request for the fix.

Clowder Videos and Documents (Shannon): All Clowder videos are available on YouTube. All presentation materials are ready; need to decide where to put them. (Luigi suggested, e.g., putting webinar demos and materials on the wiki page, or in Google Docs/Slides.)

Clowder Catalog (Mark): The Clowder catalog has errors that may be due to dependency issues. Need to find a way to populate the tool tables; that is the right way to fix the errors. (Luigi suggested it is better to fix the dependency issues, if there are any. Rob suggested looking at the dependency packages/libraries.)

Clowder User Profile Page Improvement (Mike): Trying to add datasets/collections to the user profile page. Needs to handle the alignment of the user API key and user pictures. (Suggestion: add spaces, datasets, and collections tabs to the user profile page.) With the right access permissions, one user can see the datasets under another user's name (already implemented). Use the `sbt` command to run Clowder.

Geo/Pycsw Extraction (Bing): Bash script to find all geo/pycsw files in the database and submit them to the geo/pycsw servers on the Industry dev cluster.
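As a rough sketch of what that script does. The file extensions, record shape, and submission URL below are all assumptions for illustration; the notes only say the script finds geo/pycsw files in the database and submits them:

```python
# Hypothetical filter/submit loop mirroring the described bash script.
GEO_EXTENSIONS = {".shp", ".tif", ".tiff", ".nc", ".geojson"}

def is_geo_file(filename):
    """Select geo/pycsw candidate files by extension (assumed heuristic)."""
    import os.path
    return os.path.splitext(filename.lower())[1] in GEO_EXTENSIONS

def submit_all(records, server_url):
    """records: (file_id, filename) pairs pulled from the database listing.
    The submission endpoint is a placeholder, not a documented pycsw API."""
    import requests  # lazy import; only needed for the actual submission
    for file_id, name in records:
        if is_geo_file(name):
            requests.post(f"{server_url}/records",
                          json={"id": file_id, "filename": name})
```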

Simple Extraction Test (Bing): Small changes based on Max's script; the script can now be run from a local machine against a remote Clowder instance. It takes several environment variables: the Clowder hostname, the test file path, and a Slack token. It uploads the test file to the remote Clowder instance, triggers a specific extraction on the file, checks the extraction status (DONE, Processing) via Clowder endpoints, and reports the test result to a Slack channel. (Rob suggested it would be better to integrate this with the current bd-test framework.)
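The workflow described above could be sketched roughly as follows. The endpoint paths, status payload fields, extractor name, and Slack channel are assumptions for illustration, not the verified Clowder or Slack APIs:

```python
import os
import time

def extraction_done(status_payload):
    """True when the status payload reports a finished extraction.
    The 'Status'/'Done' field names are assumptions based on the notes."""
    return status_payload.get("Status") == "Done"

def run_test(host, key, test_file, extractor, poll_seconds=5, timeout=300):
    """Upload test_file, trigger one extractor, and poll until done or timeout.
    Endpoint paths are illustrative, not verified against the Clowder API."""
    import requests  # lazy import so the status helper stays dependency-free
    with open(test_file, "rb") as f:
        r = requests.post(f"{host}/api/extractions/upload_file",
                          params={"key": key, "extract": extractor},
                          files={"File": f})
    r.raise_for_status()
    file_id = r.json()["id"]
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = requests.get(f"{host}/api/extractions/{file_id}/status",
                              params={"key": key}).json()
        if extraction_done(status):
            return "DONE"
        time.sleep(poll_seconds)
    return "TIMEOUT"

def report_to_slack(token, channel, text):
    """Post the result via Slack's chat.postMessage endpoint."""
    import requests
    requests.post("https://slack.com/api/chat.postMessage",
                  headers={"Authorization": f"Bearer {token}"},
                  json={"channel": channel, "text": text})

if __name__ == "__main__":
    # The three environment variables named in the notes.
    result = run_test(os.environ["CLOWDER_HOST"],
                      os.environ.get("CLOWDER_KEY", ""),
                      os.environ["TEST_FILE"],
                      "ncsa.file.digest")  # hypothetical extractor name
    report_to_slack(os.environ["SLACK_TOKEN"], "#extraction-tests",
                    f"Extraction test result: {result}")
```

Integrating with bd-test, as Rob suggested, would mainly replace the Slack reporting step with the framework's own result collection.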