Clowder Spec

Latest Stable Tag: 0.9.2

See https://github.com/nds-org/ndslabs-specs/tree/master/clowder

Required Docker Images

The following images will be automatically pulled from Docker Hub if they are not present on your machine:
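If you want to check which of these images are already present before starting, you can filter the output of `docker images`. A minimal sketch (the `has_image` helper and the sample listing are illustrative; it assumes the default table format of `docker images`, with the repository name in the first column):

```shell
# Illustrative helper: does a given repository appear in `docker images` output?
has_image() {
  # $1 = listing text, $2 = repository name
  echo "$1" | awk '{print $1}' | grep -qx "$2"
}

# Sample listing for demonstration; on a real node use: listing=$(docker images)
listing="REPOSITORY             TAG      IMAGE ID       CREATED       SIZE
ndslabs/system-shell   latest   abc123def456   2 weeks ago   450MB"

has_image "$listing" ndslabs/system-shell && echo "ndslabs/system-shell present"
```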

Docker Image Build

If you would rather build your own images from source instead of pulling from Docker Hub:

  • Clone the https://github.com/nds-org/ndslabs-clowder repository.
  • Change directories to dockerfiles and run the ./build.sh command.
  • You should see the images start building from the Dockerfiles present.
  • Coming Soon: Core Extractors!
    • audio-preview
    • audio-speech2text
    • image-metadata
    • pdf-preview

 

WARNING: plantcv may take 10 to 25 minutes to complete its build. Plan accordingly.


NDSLabs Test Cases

Prerequisites

Start Tool Server

  • Start up a new toolserver using the following command:
    • Save the Public IP of the node where this is running.

      Code Block
      languagebash
      docker run -d -p 8082:8082 --name toolserver -v /var/run/docker.sock:/var/run/docker.sock ndslabs/toolserver:terra toolserver


Start NDSLabs

  1. Start up NDSLabs as described below:

    • Run the NDSLabs System Shell:

      Code Block
      languagebash
      docker run  -it --volume=/:/rootfs:ro --volume=/sys:/sys:ro --volume=/var/lib/docker/:/var/lib/docker:rw --volume=/var/lib/kubelet/:/var/lib/kubelet:rw --volume=/var/run:/var/run:rw --net=host --pid=host --privileged=true -v /var/run/docker.sock:/var/run/docker.sock ndslabs/system-shell 
      Unable to find image 'ndslabs/system-shell:latest' locally
      latest: Pulling from ndslabs/system-shell
      ff12aecbe22a: Already exists
      287750ad6625: Already exists
      ca98bdf222fa: Already exists
      a3ed95caeb02: Already exists
      97ef68d67ea6: Pull complete
      8c53c989a967: Pull complete
      79d911a06f41: Pull complete
      807cecd8f466: Pull complete
      7f887f3746f8: Pull complete
      0cadab32de06: Pull complete
      aff97fd2a6c1: Pull complete
      Digest: sha256:4128fff8a0234ee6cc25d077b7f607358e681370e5a483b6c89fe1a3dfc3e77e
      Status: Downloaded newer image for ndslabs/system-shell:latest
      [root@default NDSLabsSystem ] / # 
  2. From the NDSLabsSystem shell:
    • Start Kubernetes by running kube-up.sh
      • This will pull the necessary Kubernetes images and start up a single-node (development) Kubernetes cluster.
    • Start NDSLabs by running ndslabs-up.sh
      • This will start the API Server and GUI in Kubernetes
      • NOTE: You may need to wait 30 seconds or more for the GUI server to start while npm and bower download the GUI's dependencies
    • Check the API server logs and verify that the specs were loaded correctly:
      • Code Block
        languagebash
        Cloning into '/specs'...
        I0325 01:52:58.670716      15 server.go:127] Starting NDS Labs API server (0.1alpha 2016-03-24 14:03)
        I0325 01:52:58.671129      15 server.go:128] etcd PRIVATE_IP:4001
        I0325 01:52:58.671145      15 server.go:129] kube-apiserver https://PRIVATE_IP:6443
        I0325 01:52:58.671159      15 server.go:130] volume dir /volumes
        I0325 01:52:58.671170      15 server.go:131] specs dir /specs
        I0325 01:52:58.671181      15 server.go:132] host PUBLIC_IP
        I0325 01:52:58.671192      15 server.go:133] port 8083
        I0325 01:52:58.671205      15 server.go:134] V1
        I0325 01:52:58.671216      15 server.go:135] V2
        I0325 01:52:58.671227      15 server.go:136] V3
        I0325 01:52:58.671237      15 server.go:137] V4
        I0325 01:52:58.671257      15 server.go:675] GetEtcdClient PRIVATE_IP:4001
        I0325 01:52:58.672692      15 server.go:165] Using local storage
        I0325 01:52:58.672746      15 server.go:175] CORS origin http://PUBLIC_IP:30000
        I0325 01:52:58.673061      15 server.go:276] Loading service specs from /specs
        I0325 01:52:58.673910      15 server.go:1875] Adding /specs/clowder/clowder.json
        I0325 01:52:58.674621      15 server.go:1875] Adding /specs/clowder/elastic.json
        I0325 01:52:58.675383      15 server.go:1875] Adding /specs/clowder/extractors/image-preview.json
        I0325 01:52:58.675999      15 server.go:1875] Adding /specs/clowder/extractors/plantcv.json
        I0325 01:52:58.676513      15 server.go:1875] Adding /specs/clowder/extractors/video-preview.json
        I0325 01:52:58.677121      15 server.go:1875] Adding /specs/clowder/mongo.json
        I0325 01:52:58.677516      15 server.go:1875] Adding /specs/clowder/rabbitmq.json
        I0325 01:52:58.678111      15 server.go:1875] Adding /specs/dataverse/dataverse.json
        I0325 01:52:58.679353      15 server.go:1875] Adding /specs/dataverse/postgres.json
        I0325 01:52:58.679768      15 server.go:1875] Adding /specs/dataverse/rserve.json
        I0325 01:52:58.680253      15 server.go:1875] Adding /specs/dataverse/solr.json
        I0325 01:52:58.682642      15 server.go:1875] Adding /specs/dataverse/tworavens.json
        I0325 01:52:58.684168      15 server.go:1875] Adding /specs/elk/elastic.json
        I0325 01:52:58.684700      15 server.go:1875] Adding /specs/elk/kibana.json
        I0325 01:52:58.685192      15 server.go:1875] Adding /specs/elk/logspout.json
        I0325 01:52:58.686697      15 server.go:1875] Adding /specs/elk/logstash.json
        I0325 01:52:58.689697      15 server.go:1875] Adding /specs/irods/cloudbrowser.json
        I0325 01:52:58.691582      15 server.go:1875] Adding /specs/irods/cloudbrowserui.json
        I0325 01:52:58.692906      15 server.go:1875] Adding /specs/irods/icat.json
        I0325 01:52:58.693939      15 server.go:283] Listening on PUBLIC_IP:8083
  3. You should now be able to reach the GUI by navigating to http://CLUSTER_IP:30000
  4. Create an account on the NDSLabs GUI as described below:
    1. GUI Test Cases: Create an Account
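If the spec list is long, a quick grep over the API server log can confirm how many specs were registered. A rough sketch; the sample `log` below is just two lines from the output above (in practice, pipe in the real log):

```shell
# Count how many service specs the API server reported loading.
count_specs() {
  echo "$1" | grep -c 'Adding /specs/'
}

# Two sample lines from the API server log, for demonstration:
log="I0325 01:52:58.673910      15 server.go:1875] Adding /specs/clowder/clowder.json
I0325 01:52:58.674621      15 server.go:1875] Adding /specs/clowder/elastic.json"

count_specs "$log"
```

For the sample above this prints 2; against the full log it should match the number of spec files in the repository.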


Test Cases

TERRA Clowder Configuration 

Account Registration

  • Start Clowder (as described above) and navigate to its endpoint link
  • At the top right of the page, click Login and then choose Sign Up on the bottom of the panel.
  • Enter your e-mail address in the box and press Submit.
    • You should receive an e-mail with a link to confirm your account registration
  • Click the link in the e-mail to be brought back to Clowder.
  • Enter your First/Last name, enter/confirm your desired password, then click Submit.
  • You should now be able to log in with the credentials that you have entered (email / password).

Testing Extractor(s): Upload a File

  • Once Clowder starts, register for an account (see above).
  • Verify that the extractors are present by navigating to http://YOUR_OPENSTACK_IP:30291/api/status
    • You should see rabbitmq: connected listed under the "plugins" section.
    • You should see the extractors you specified listed at the bottom.
  • Create a new dataset by choosing Datasets > Create from the navbar at the top of the page.
  • Upload a plantcv test file to this new dataset and watch the extractors work:
    • Check the logs of the mongo container; you should see the uploaded files being added to the database.
    • View http://CLOWDER_IP/admin/extractions in your browser to verify that the extractors are working.
    • You should also be able to see per-image extraction events listed under each image in a dataset.
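The status-page checks above can also be scripted. A hedged sketch, assuming /api/status renders plugin entries as `name: connected` (or `name: true` in older builds), as shown in this document; the `plugin_ok` helper is illustrative:

```shell
# Illustrative check: is a plugin reported as enabled on the status page?
# On a live system: status=$(curl -s http://YOUR_OPENSTACK_IP:30291/api/status)
plugin_ok() {
  echo "$1" | grep -Eq "$2: (true|connected)"
}

# Sample status text for demonstration:
status="mongodb: connected
rabbitmq: connected"

plugin_ok "$status" rabbitmq && echo "rabbitmq OK"
```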

PlantCV Extractor

  • From the NDSLabs Dashboard, click "View Logs" next to the PlantCV Extractor.
    • The logs should show the extractor reading the image and attempting to attach metadata:




    • Code Block
      languagebash
      2016-03-27 00:40:30,299 INFO    : pika.adapters.base_connection - Connecting to 10.0.0.56:5672
      2016-03-27 00:40:30,303 INFO    : pika.adapters.blocking_connection - Created channel=1
      2016-03-27 00:40:30,320 INFO    : pyclowder.extractors - Waiting for messages. To exit press CTRL+C
      2016-03-27 06:03:47,478 DEBUG   : pyclowder.extractors - [56f777c3e4b0eb6623c4c197] : Started processing file
      2016-03-27 06:03:47,479 INFO    : pyclowder.extractors - Starting a New Thread for Process File
      2016-03-27 06:03:47,479 DEBUG   : pyclowder.extractors - [56f777c3e4b0eb6623c4c197] : Downloading file.
      2016-03-27 06:03:47,553 INFO    : urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:03:47,644 INFO    : terra.plantcv - PARAMETERS: {'channel': <pika.adapters.blocking_connection.BlockingChannel object at 0x7ffb3416b950>,
       u'datasetId': u'56f777a3e4b0eb6623c4c192',
       u'fileSize': u'241928',
       'fileid': u'56f777c3e4b0eb6623c4c197',
       u'filename': u'VIS_SV_0_z500_389257.jpg',
       u'flags': u'',
       'header': <BasicProperties(['content_type=application\\json', 'correlation_id=3f98e98f-19db-46e7-bc57-cd3d02b10e85', 'reply_to=amq.gen-e62nBqNswj-FuX-paer3rg'])>,
       u'host': u'http://141.142.209.154:32408',
       u'id': u'56f777c3e4b0eb6623c4c197',
       'inputfile': u'/tmp/tmpq5B8w7.jpg',
       u'intermediateId': u'56f777c3e4b0eb6623c4c197',
       u'secretKey': u'r1ek3rs'}
      2016-03-27 06:03:47,644 INFO    : terra.plantcv - inputfile=/tmp/tmpq5B8w7.jpg filename=VIS_SV_0_z500_389257.jpg fileid=56f777c3e4b0eb6623c4c197
      2016-03-27 06:03:47,645 INFO    : terra.plantcv - EX-CMD: /home/clowder/extractors-plantcv/bin/extract.sh /tmp/tmpq5B8w7.jpg VIS_SV_0_z500_389257.jpg 56f777c3e4b0eb6623c4c197 /home/clowder/plantcv-output
      2016-03-27 06:04:02,158 INFO    : urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:04:02,218 DEBUG   : pyclowder.extractors - preview id = [56f777d2e4b0eb6623c4c1a8]
      2016-03-27 06:04:02,219 INFO    : urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:04:02,234 INFO    : urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:04:02,269 DEBUG   : pyclowder.extractors - preview id = [56f777d2e4b0eb6623c4c1ab]
      2016-03-27 06:04:02,270 INFO    : urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:04:02,283 INFO    : urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:04:02,334 DEBUG   : pyclowder.extractors - preview id = [56f777d2e4b0eb6623c4c1ad]
      2016-03-27 06:04:02,335 INFO    : urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:04:02,346 INFO    : urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:04:02,400 DEBUG   : pyclowder.extractors - preview id = [56f777d2e4b0eb6623c4c1b1]
      2016-03-27 06:04:02,401 INFO    : urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:04:02,424 INFO    : urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:04:02,443 DEBUG   : pyclowder.extractors - preview id = [56f777d2e4b0eb6623c4c1b4]
      2016-03-27 06:04:02,444 INFO    : urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:04:02,462 DEBUG   : pyclowder.extractors - [56f777c3e4b0eb6623c4c197] : Uploading file metadata.
      2016-03-27 06:04:02,463 INFO    : urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:04:02,764 DEBUG   : pyclowder.extractors - [56f777c3e4b0eb6623c4c197] : Uploading file tags.
      2016-03-27 06:04:02,766 INFO    : urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:04:02,847 DEBUG   : pyclowder.extractors - [56f777c3e4b0eb6623c4c197] : Done
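Rather than eyeballing the whole log, you can filter for the interesting events. A small sketch, run here over two sample lines taken from the log above:

```shell
# Extract the preview ids an extractor reported (lines like "preview id = [...]").
preview_ids() {
  echo "$1" | sed -n 's/.*preview id = \[\([0-9a-f]*\)\].*/\1/p'
}

# Two sample lines from the PlantCV extractor log above:
log="2016-03-27 06:04:02,218 DEBUG   : pyclowder.extractors - preview id = [56f777d2e4b0eb6623c4c1a8]
2016-03-27 06:04:02,847 DEBUG   : pyclowder.extractors - [56f777c3e4b0eb6623c4c197] : Done"

preview_ids "$log"
```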

Image Preview Extractor

  • From the NDSLabs Dashboard, click "View Logs" next to the Image Preview Extractor.
    • The logs should show the extractor reading the image and attempting to create a preview thumbnail from it:
    • Code Block
      languagebash
      2016-03-27 00:40:30,299 INFO    : pika.adapters.base_connection - Connecting to 10.0.0.56:5672
      2016-03-27 00:40:30,303 INFO    : pika.adapters.blocking_connection - Created channel=1
      2016-03-27 00:40:30,320 INFO    : pyclowder.extractors - Waiting for messages. To exit press CTRL+C
      2016-03-27 06:03:47,478 DEBUG   : pyclowder.extractors - [56f777c3e4b0eb6623c4c197] : Started processing file
      2016-03-27 06:03:47,479 INFO    : pyclowder.extractors - Starting a New Thread for Process File
      2016-03-27 06:03:47,479 DEBUG   : pyclowder.extractors - [56f777c3e4b0eb6623c4c197] : Downloading file.
      2016-03-27 06:03:47,543 INFO    : requests.packages.urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:03:48,467 INFO    : requests.packages.urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:03:48,606 DEBUG   : pyclowder.extractors - preview id = [56f777c4e4b0eb6623c4c1a0]
      2016-03-27 06:03:48,607 INFO    : requests.packages.urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:03:49,144 INFO    : requests.packages.urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:03:49,309 DEBUG   : pyclowder.extractors - preview id = [56f777c5e4b0eb6623c4c1a3]
      2016-03-27 06:03:49,310 INFO    : requests.packages.urllib3.connectionpool - Starting new HTTP connection (1): 141.142.209.154
      2016-03-27 06:03:49,372 DEBUG   : pyclowder.extractors - [56f777c3e4b0eb6623c4c197] : Done

Video Preview Extractor

  • From the NDSLabs Dashboard, click "View Logs" next to the Video Preview Extractor.
    • The logs will show that this extractor has ignored this file upload, since it is not a video file:
    • Code Block
      languagebash
      2016-03-27 00:40:30,299 INFO    : pika.adapters.base_connection - Connecting to 10.0.0.56:5672
      2016-03-27 00:40:30,303 INFO    : pika.adapters.blocking_connection - Created channel=1
      2016-03-27 00:40:30,320 INFO    : pyclowder.extractors - Waiting for messages. To exit press CTRL+C

Testing Text-Based Search (ElasticSearch)

  • Verify that elasticsearch is enabled by navigating to Clowder's endpoint
    • You should see elasticsearch: connected listed under the "plugins" section of http://YOUR_OPENSTACK_IP:30291/api/status
    • You should see a "Search" box at the top-right of the Clowder UI. This indicates that elasticsearch is enabled.
  • After uploading a file (as described above), attempt to search for the file extensions, such as "jpg" or "png".
    • You should see any matching file(s) that you have uploaded listed under the results of the search.

Testing the Tool Server

  • Navigate to the Dataset that you created above.
  • On the right side of the page, you should see the Tool Manager section.
  • Choose a tool (Jupyter / Rstudio) from the drop-down and press "Launch"
  • Once the image downloads and the container starts (this may take several minutes):
    • Rstudio:
      • Navigate to and log into the Rstudio instance
        • username: rstudio
        • password: rstudio
      • You should see the Dataset that you uploaded listed here
    • Jupyter:
      • Navigate to the Jupyter instance
      • You should see the Dataset that you uploaded listed here

Archived Test Cases

These test cases are kept for historical purposes, but can be used to run and test the Clowder stack in raw Kubernetes (without NDSLabs).

Getting Started

Basic Clowder Startup 

  • Run . ./start-clowder.sh with no arguments to spin up a vanilla Clowder, with only a MongoDB instance attached.
  • Navigate your browser to http://YOUR_OPENSTACK_IP:30291. You should see the Clowder homepage.
  • Verify MongoDB attachment by navigating to http://YOUR_OPENSTACK_IP:30291/api/status.
    • You should see mongodb: true listed under the "plugins" section.

Account Registration

  • Start Clowder (as described above)
  • At the top right of the page, click Login and then choose Sign Up on the bottom of the panel.
  • Enter your e-mail address in the box and press Submit.
    • You should receive an e-mail with a link to confirm your account registration
  • Click the link in the e-mail to be brought back to Clowder.
  • Enter your First/Last name, enter/confirm your desired password, then click Submit.
  • You should now be able to log in with the credentials that you have entered (email / password).

Create a Dataset / Upload a File

  • After registering for an account (see above), create a new dataset by choosing Datasets > Create from the navbar at the top of the page.
  • Choose a picture file to upload to this dataset. The contents of the picture do not matter.
  • After choosing Start Upload, check the logs of the mongo container and you should see the uploaded files being added to the database.

Extractor(s)

Now that you've seen the basic setup, let's try something a little more complex:

  • Stop any running Clowder / plugin instances: . ./stop-clowder.sh -m
  • Restart Clowder with some extractors: . ./start-clowder.sh -w image-preview plantcv video-preview
    • The script should automatically start RabbitMQ for you as well, since you have specified that you would like to utilize extractors.
  • Wait for everything to finish starting up (this may take up to ~1 minute)
  • Once Clowder starts, verify that the extractors are present by navigating to http://YOUR_OPENSTACK_IP:30291/api/status
    • You should see rabbitmq: true listed under the "plugins" section.
    • You should see the extractors you specified listed at the bottom
  • Create a Dataset and upload a file as described above.
    • View http://CLOWDER_IP/admin/extractions in your browser to verify that the extractors are working.
    • If anything strange appears on the UI, check the log(s) of each extractor and you should see it doing work on the file(s) you chose to upload

Text-Based Search (ElasticSearch)

  • Stop any running Clowder / plugin instances: . ./stop-clowder.sh -m
  • Restart Clowder with elasticsearch enabled: . ./start-clowder.sh elasticsearch
  • Once Clowder starts, verify that elasticsearch is enabled by navigating to http://YOUR_OPENSTACK_IP:30291/api/status
    • You should see elasticsearch: true listed under the "plugins" section.
    • You should see a "Search" box at the top-right of the Clowder UI. This indicates that elasticsearch is enabled.
  • After uploading a file (as described above), attempt to search for the file extensions, such as "jpg" or "png".
    • You should see the file that you uploaded listed under the results of the search.

 
