
Who | Planned - Monday | Accomplished - Friday
  
Planned - Monday:
  • BD
    • Work on issues for DataWolf 3.1 release
    • Work on 36 month review slides
  • NIST
    • Version 2 work
  • Other
    • Purge datasets from the rapid VM; send instructions on how results can be purged after runs
    • Jury duty - uncertain availability each day
Accomplished - Friday:
  • BD
    • BD-1376 - worked on provenance slides
    • Worked on DataWolf 3.1 issues, including a display bug found while working on the slides
  • NIST
    • Worked on storing files by user in Clowder. DataWolf was storing the datasets by user, but since the files are stored by a separate DAO, changes need to be made to the FileStorage interface. Suggested a fix in WOLF-194
    • Fixed a DataWolf bug for deleting datasets with the /datasets/purge API
  • Other
    • Worked on purging data on the rapid VM and sent the graduate student instructions for doing this after future runs finish (see the sketch after this list)
    • Jury Duty - half day Tuesday
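For reference, a minimal Python sketch of the kind of purge call the instructions above describe. The base URL, the DELETE method, and the absence of authentication are assumptions for illustration, not the documented DataWolf API.

```python
# Hypothetical usage sketch: ask DataWolf to purge deleted datasets after a run.
# DATAWOLF_URL and the HTTP method are assumptions; check the actual service first.
import requests

DATAWOLF_URL = "http://localhost:8888/datawolf"  # hypothetical deployment URL

def purge_datasets():
    """Call the /datasets/purge endpoint mentioned above and fail loudly on errors."""
    resp = requests.delete(f"{DATAWOLF_URL}/datasets/purge")
    resp.raise_for_status()
    return resp.status_code

if __name__ == "__main__":
    print("purge returned HTTP", purge_datasets())
```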
  
  
  
  • In-CORE
    • Continue working on the UI for the Bridge Analysis
    • Update the states for the Bridge Analysis
    • Start connecting the UI with the DataWolf workflow
    • Start looking into the Railway analysis for V1
  • GLM
    • Ingest Zooplankton data into the Dev environment
    • New Search page updates
 
  
Planned - Monday:
  • Meet with the AWS solution architect and campus IT to discuss how to adopt AWS for our pipeline
  • Update data_cleanup_pipeline based on new requirements, including source code and test code
  • Performance investigation for gene_prioritization_pipeline
Accomplished - Friday:
  • Met with the AWS solution architect and campus IT and settled on the solution of using the Auto Scaling group service instead of the Lambda service
  • Standardized data_cleanup_pipeline to use the same main-function structure as the other pipelines (see the sketch after this list)
  • Investigated the performance of gene_prioritization_pipeline and improved the total running time from 9 minutes to around 1 minute
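As an illustration of the standardization mentioned above, here is a minimal sketch of a shared main-function shape. The YAML run file, the SELECT dispatch dict, and the function names are assumptions for illustration, not the actual data_cleanup_pipeline code.

```python
# Hypothetical sketch of a standardized pipeline entry point.
# Each pipeline would read a run-parameters file and dispatch on a 'method' key.
import sys
import yaml  # assumed run-file format

def run_data_cleanup(run_parameters):
    # Placeholder for the pipeline-specific work.
    print("cleaning with:", run_parameters)

SELECT = {
    "data_cleanup": run_data_cleanup,  # hypothetical method name
}

def main():
    # Shared shape across pipelines: read the run file, then dispatch.
    with open(sys.argv[1]) as f:
        run_parameters = yaml.safe_load(f)
    SELECT[run_parameters["method"]](run_parameters)

if __name__ == "__main__":
    main()
```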
  
Sara Lambert  
  
  
  

 

  • Move TERRA Globus monitor logging from flat files to a Postgres database (see the sketch after this list)
  • Implement better log monitoring for the long-lived Globus service
  • Begin converting TERRA extractors to PyClowder2 for error tracking
  • GitHub audit meeting
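A minimal sketch of what moving the monitor log from flat files to Postgres could look like. The connection settings and the monitor_log table and columns are assumptions for illustration, not the actual TERRA monitor schema.

```python
# Hypothetical sketch: insert one monitor event into Postgres instead of
# appending a line to a flat log file. Table name and credentials are assumptions.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="globus_monitor",
                        user="monitor", password="secret")  # placeholder credentials

def log_event(task_id, status, message):
    """Write a single log record; the transaction commits when the block exits."""
    with conn, conn.cursor() as cur:
        cur.execute(
            "INSERT INTO monitor_log (task_id, status, message, logged_at) "
            "VALUES (%s, %s, %s, now())",
            (task_id, status, message),
        )

log_event("abc123", "SUCCEEDED", "transfer completed")
```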
 
  • v2 Sandbox
  • v2 Sandbox (UI)
  • v2 Sandbox (Fragility Server)
  • CyberGIS Workshop
  • Civil Engineering Meeting
  • Workflow Diagrams
  • PEcAn
    • THREDDS
  • BD
    • converters in Bamboo
    • look at Kubernetes
  • LSST
    • finish sprint, plan new sprint
 
David Raila  
  
Planned - Monday:
  • HR interview follow-up - resume review and 1st interview scheduling
  • Brown Dog management - rollout
  • Brown Dog - Dec 5 planning
  • T-Shirts follow up
  • BD Quarterly Report
  • Review GLTG Milestones with Jong
  • Michelle Pitcel - Onboarding

In-Core Reports - status

Accomplished - Friday:
  • additional first interviews on hold except for 1 candidate
  • Rollout doc 1st version complete - in review
  • Brown Dog planning for Dec 5 - ongoing
  • T-Shirts - have just about everyone
  • BD Quarterly Report - done! ready to upload today
  • GLTG Milestones - reviewed with team - still need to review with Jong
  • Michelle onboarding - going better, I think, than when I onboarded Jing, Bing, and Htut Khine
  • In-Core Reports - holding pattern
  
 
  • BD
    • tweak signup page
    • Shiny app for the PEcAn use case: connect to bdapi and get the result
    • bd.r as a library on GitHub
  • GLM
    • date picker for the search page - almost done
    • review Indira's PR
Planned - Monday:
  • React / Redux test for the INCORE v2 mock-up
  • Meet with Civil Engineering and start work on the script
  • CyberGIS Geospatial Python workshop
Accomplished - Friday:
  • Tested React with various JavaScript libraries for the INCORE version 2 UI
  • CyberGIS Geospatial Python workshop
  • Met with Civil Engineering people and started work on a Python script for the network builder