
Here's to a great week, everyone!

 

Who | Planned - Monday | Accomplished - Friday

Enable pyclowder to get the queue name from env (see the sketch below).

Work on Tools Catalog support for stopping and removing services.
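A rough sketch of the queue-name-from-env idea above, purely illustrative: it assumes the name is exposed through an environment variable called RABBITMQ_QUEUE with a hard-coded fallback, which may not match what pyclowder ends up doing.

```python
import os

# Illustrative only: prefer an env-provided queue name, fall back to a default.
# The variable name RABBITMQ_QUEUE and the default below are assumptions,
# not pyclowder's confirmed configuration interface.
def get_queue_name(default_queue="clowder.extractor"):
    return os.environ.get("RABBITMQ_QUEUE", default_queue)

if __name__ == "__main__":
    print("Consuming from queue:", get_queue_name())
```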

 

Learn about Terraform and how to use it to deploy Kubernetes on OpenStack.

Learn NCSA best practices for provisioning servers to run Kubernetes.

Determine UX for binding queues to exchanges in Brown Dog

Help with autoscaling work in Brown Dog

Got Terraform to set up hosts on OpenStack. Ran into some issues with floating IPs. Working with the project committers to shape a PR.

Submitted Brown Dog PR for stopping deployed services.

Created a Confluence page to discuss ideas on how to make the BD Tools Catalog a global resource, along with editing queue bindings for RabbitMQ.
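For the queue/exchange binding work above, a minimal sketch using pika; the exchange, queue, and routing-key names are placeholders rather than Brown Dog's actual RabbitMQ configuration.

```python
import pika

# Placeholder connection: localhost with default credentials.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

# Declare a topic exchange and a durable queue, then bind them with a routing key.
channel.exchange_declare(exchange="clowder", exchange_type="topic", durable=True)
channel.queue_declare(queue="extractor.example", durable=True)
channel.queue_bind(queue="extractor.example", exchange="clowder",
                   routing_key="*.file.image.#")

connection.close()
```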

  •  Cover Crop
    • Status meeting
    • code review
    • check status of running experimental field
  • Ergo/KISTI
    • Prepare for KISTI visit
  • IN-Core
    • v2 code review
    • Work with Yong Wook on ingesting results to data repo
    • Finish v2 building damage example
    • Discuss wrapping up v1 for release
    • Rebuild incore-datawolf VM using the VM script for installing Puppet/Kerberos/etc.
  • Other
    • Release datawolf 4.1
 
  • Whole Tale
    • Continued work on shared home/work directories for WT via OpenStack
    • Take over Terraform for the WT production deploy
  • TERRA
    • Finalized migration of Workbench
    • Nebula recovery?
    • Ongoing reprocessing
  • NDS
    • Get TERRA stitching case working on ROGER
  • Continued Nebula recovery and diagnosis across multiple projects
  • TERRA
    • Primarily Nebula recovery and troubleshooting scalability under re-processing
  • NDS
    • First pass at TERRA stitching case on ROGER
    • Planning/roadmap
    • Requested access to Comet
    • Set up campus AWS account
  • Whole Tale
    • Terraform design


  
Gregory Jansen
  • Set up Gatling.io tests again, driven this time by the Elasticsearch inventory of files (not Mongo).
  • Get Gatling output into logstash pipeline.
  • Begin work on UMD fence install
  • Test REL7-based Docker Swarm on the UMD DivIT test machine (before the physical host upgrade).
  • Compare BD coverage with CI-BER formats

Added a format/file-extension index to the local Elasticsearch, including a listing of BD extractors/converters, file counts, and byte counts.

Produced a JSON report of the most frequent formats that have no extractors/converters in BD. Made a recommended list of high-value converters/extractors for upcoming student efforts.
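A sketch of how a report like this could be pulled from Elasticsearch with elasticsearch-py; the index name, field names, and the assumption that documents carry an extractors field are illustrative, not the actual CI-BER schema.

```python
import json
from elasticsearch import Elasticsearch

es = Elasticsearch()  # assumes the local Elasticsearch on the default port

# Count files per extension, restricted to documents with no associated BD extractors.
resp = es.search(
    index="ciber-files",  # assumed index name
    body={
        "size": 0,
        "query": {"bool": {"must_not": {"exists": {"field": "extractors"}}}},
        "aggs": {"by_extension": {"terms": {"field": "extension", "size": 50}}},
    },
)

report = [
    {"extension": b["key"], "file_count": b["doc_count"]}
    for b in resp["aggregations"]["by_extension"]["buckets"]
]
print(json.dumps(report, indent=2))
```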

Working on REL7 upgrade.

Did some initial Gatling testbed refactoring

Quarterly report

  • Write unit tests for EventSubscriber
  • EventSubscriber tests are up and running.
  • All OCS component tests are written, debugged, and consistently passing.
  
  •  Review other people's pull requests
  • BD-1823 View number of scripts on ToolsCatalog
  • work with reviewers to get my pull request approved
  • (if enough time) BD-1843 Remove Deployment Message Page from Tools Catalog
  •  Keep testing for Data_Cleanup_Pipeline
  • Other support for dev testing and integration
  • Added new logic to general_clustering_pipeline in data_cleanup_pipeline
  • Updated README for data_cleanup_pipeline
  • Worked with external users on setting up their environments.
  • Created a special image for samples_clustering_pipeline by removing the quantile normalization matrix from the calculation (see the sketch after this list)
  •  Hazard conference in North Dakota
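For reference on the quantile normalization step removed from the samples_clustering_pipeline image above, a generic pandas sketch of the technique; this is the textbook procedure, not the pipeline's actual implementation.

```python
import pandas as pd

def quantile_normalize(df: pd.DataFrame) -> pd.DataFrame:
    """Force every column (sample) to share the same empirical distribution."""
    # Rank values within each column, then replace each rank with the mean of
    # the values holding that rank across all columns.
    ranks = df.rank(method="first").astype(int)
    rank_means = df.stack().groupby(ranks.stack()).mean()
    return ranks.stack().map(rank_means).unstack()

if __name__ == "__main__":
    demo = pd.DataFrame({"s1": [5.0, 2.0, 3.0], "s2": [4.0, 1.0, 6.0]})
    print(quantile_normalize(demo))
```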
 
  • Clowder/NDS cost models
  • SSA Newsletter
  • Brown Dog NIST engagement followup
  • NDS SC17 panel preparations
  • HR
  • SSA Newsletter
  • Brown Dog NIST engagement followup
  • NDS SC17 panel preparations
  • HR
  •  BD
    • sprint
    • authentication in Clowder/Fence
  • Agri
    • Clowder new instance setup
  • Clowder V2 planning
  • Earthcube report
  • GLM V3 pull requests and deployments
 
  • BD sprint planning review
  • Earthcube report
  • GLM V3 pull requests and deployments
  •  VBD
    • run/understand the Fortran model
  • GLTG
    • move production Postgres to a separate VM
 
  • finish plant_height plot update
  • bin2tif pipeline monitoring & query optimization
  • queue flir2tif for cleaning
  • run weather extractors 
  • MDF
    1. datasets ingestion to the MDF/NIST database
    2. endpoint traffic for the last 2 months
    3. work on Forge code strings
  • KISTI
    • learn OpenSEES, examples
    • continue reading papers
  • MDF
    1. finish all the important datasets
    2. VM on Nebula is in an Error state; contacted Chris
    3. meeting with Melissa
    4. keywords from datasets done and shared
  • KISTI
    • installed on Mac, compiling on Ubuntu VM
  • NDS
    • Workbench beta maintenance and upkeep
  • KnowEnG
    • Finish review of open PRs
    • Consult sprint board/Matt for what to work on next
  • Crops in Silico
    • Determine what can/can't be committed/pushed externally, how to work around this, etc.
    • Continue trying to get a working container for the leaf model
  • NDS
      • Preliminary testing uncovered a bug which we must address before we can perform the upgrade
  • KnowEnG
  • Crops in Silico
    • Forked NoFlo UI and started learning a bit more about it
    • Containerized LeM example, but still not quite sure if it's working
  • General
    • Pull Requests
    • Lightning Talk this week
  • GLM
    • Priority:
    • If Time/Paused:
  • GLTG 
    • Priority:
    • If Time/Paused:
  • IMLCZO
    • Priority:
      • Re-run Parser for Flux Tower
      • Re-run Parser for Allerton non-Decagon
      • Prepare for Meetings Next Week
    • If Time/Paused:
  • General
    • Pull Requests
    • Prepared for Lightning Talk
  • GLM
    • Have a working code example to discuss
  • GLTG
  • IMLCZO
    • Prepared for Meetings Next Week
    • Re-ran Parser for Flux Tower
      • will need to run again after IMLCZO-198 is approved
    • Re-ran Parser for Allerton non-Decagon



  • Setup Virtual Machines
    • Windows Machine
    • Vocabulary Server
  • Vocabulary Training Session
  • Schema Service
  • VM Setup
    • Provisioned and set up new Windows Machine
    • Fixed Vocabulary Server
  • IRB Ethics Training Refresher Course
  • Python Client Update
  • Fragility Service Update
  • KnowEnG
    • Billing API - AWS cost breakdown
    • KnowEnG Platform Ticketing System on HUBZero
    • Auto-scale 1.1 - Enhancements
  • KnowEnG
    • Billing API - AWS cost breakdown (W.I.P.; see the sketch after this list)
    • KnowEnG Platform Ticketing System on HUBZero (W.I.P.)
    • Auto-scale 1.1 - Enhancements (Dev + Testing)
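A minimal sketch of the kind of per-service AWS cost breakdown the Billing API items above refer to, using boto3's Cost Explorer client; the date range and grouping are illustrative, and this is not necessarily how the KnowEnG Billing API is implemented.

```python
import boto3

# Cost Explorer is served out of us-east-1 regardless of where workloads run.
ce = boto3.client("ce", region_name="us-east-1")

# Unblended cost for one month, broken down by AWS service.
resp = ce.get_cost_and_usage(
    TimePeriod={"Start": "2017-10-01", "End": "2017-11-01"},  # example dates
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for result in resp["ResultsByTime"]:
    for group in result["Groups"]:
        amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
        print("{}: ${:.2f}".format(group["Keys"][0], amount))
```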
  
  • BD
    • Make sure that ArcGIS extractors are up and running
    • Work on the task to set up LDAP support in Brown Dog Fence
  • DEBOD
    • Submit final report
  • IARP
    • Submit quarterly report
    • Make final modifications and issue pull request for advanced search improvements
    • Start with getting the IARP Clowder instance cleaned up and populated with more data and metadata
  • CCROP
    • Work on parsing JSON result from DataWolf
    • Work on adding results for both with and without cover crop
  • MobileMe&You Conference
    • Prepare for demo and presentation
  •  BD
    • Fixed production extractor issues
  • DEBOD
    • Submitted final report sections
  • IARP
    • Submitted quarterly report sections
  • CCROP
    • Wrote code for parsing JSON
    • Integrated both workflows (with and without Cover crop) with the web application
  • MobileMe&You Conference
    • Presented Clowder
  • Catch up on email
  • Send IN-Core Report
  • Prepare for Brown Dog Quarterly Report
  • HR duties/tasks
  • Brown Dog - task review/Sprint overview to catch me up
  • lots of stuff I probably don't even know about yet as I catch up from a week of vacation (smile)
  • Email filed
  • In-core reports for meeting done - still waiting on BETA report
  • Brown Dog Quarterly report reminders sent
  • Lots of HR stuff - check
  • Follow up on Nebula Machine for In-Core
  • Trainings - I now have 2 left to complete
  • Team Sprint Planning and reviews
  • Seating / organization
  •  BD
    • fix pecan
  • GLM
    • test for geodashboard v3
 
  • Finish the connection between the data repository and DataWolf results
  • Create the VM for the data repository
  • Create a data dump process to migrate the data from WebDAV to the data repository (see the sketch below)
  • Updated data repository APIs to connect to DataWolf results.
  • Tested and created a datawolf VM using Rob's script
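For the WebDAV-to-repository dump above, a rough sketch of listing and downloading files over WebDAV with plain requests; the server URL, credentials, and destination directory are placeholders, and the real migration would need error handling and recursion into subcollections.

```python
import os
import xml.etree.ElementTree as ET
import requests

WEBDAV_URL = "https://example.org/webdav/project/"  # placeholder server
AUTH = ("user", "password")                          # placeholder credentials
DEST = "dump"

# PROPFIND with Depth: 1 lists the immediate children of the collection.
resp = requests.request("PROPFIND", WEBDAV_URL, auth=AUTH, headers={"Depth": "1"})
resp.raise_for_status()

ns = {"d": "DAV:"}
hrefs = [el.text for el in ET.fromstring(resp.content).findall(".//d:href", ns)]

os.makedirs(DEST, exist_ok=True)
for href in hrefs:
    if href.endswith("/"):  # skip collections, keep only files
        continue
    url = requests.compat.urljoin(WEBDAV_URL, href)
    with requests.get(url, auth=AUTH, stream=True) as r:
        r.raise_for_status()
        with open(os.path.join(DEST, os.path.basename(href)), "wb") as out:
            for chunk in r.iter_content(chunk_size=1 << 20):
                out.write(chunk)
```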

 








 
