You don't have to be great to start

 

Who | Planned - Monday | Accomplished - Friday
  

Complete Kubespray Kubernetes Pull Request

Write up marketing info for NDS Workbench

Get Brown Dog Clusterman App in Bamboo with automated testing, code coverage, and deployment to Dev server


  • Cover Crop
    • Status meeting
    • Review required web-app changes to run experimental field
    • Open issues to reach Dec. 2nd functionality target
  • Ergo/KISTI
    • Discuss next steps, review requirements to release Ergo 4.0.0
  • IN-CORE
    • Continue working on v2 and set development priorities
    • Review the earthquake hazard model to decouple the specific earthquake from the mathematical model
  • Cover Crop
    • Helped collaborators compile DSSAT 4.7 and compiled/installed 4.7 on the cover crop VM
    • Updated the VM with the latest weather converter tool and DSSAT model, ran a test field run with the updated model/weather data, and prepared inputs and added them to the VM
  • IN-CORE
    • Updated building damage python script with latest pyincore changes
    • Provided text for IN-CORE paper
    • Travel expense report
  • Other - DataWolf demo, lightning talk, DataWolf 4.1 release build finished
  •  NDS (Sprint 35)
    • SC17 demo script/video
    • Website updates
    • Auth design
  • Whole Tale
    • Set up monitoring via OMD
  • TERRA-REF
    • Finalize Workbench cluster configuration
    • Pipeline support
  •  NDS
    • Drafted demo/video
    • Website updates pending PA approval
    • First round auth design
  • Whole Tale
    • Preliminary monitoring config
  • TERRA-REF
    • Fixed MTU issue (fingers crossed!)
    • Pipeline running...




  
  •  GLM
    • Dissolved Oxygen
    • Fix issues in IE11
  • IN-CORE
  •  GLM
    • Fixed IE11 issues: markers now display and their links work
    • Fixed the lines around radio buttons in IE11
    • Helped Michelle implement a single fetch for the region trends data
  • IN-CORE
    • Analysis UI

  • Finish BD-1820 and merge it with develop
  • BD-1860


 Both tasks finished, PR created and merged.


  • Document performance testing results for the Uber data (Docker and a normal physical machine)
  • Study Elasticsearch and build a CaseOLAP instance for KnowEnG
  • CGC testing
  • Ran one set of performance tests on the Uber data and documented the results in a Google spreadsheet
  • Completed CGC testing, documented my feedback in a Google document, and reported back to the person in charge of the project
  • Started reading the Elasticsearch documentation (a minimal indexing sketch follows this list)
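Since the Elasticsearch/CaseOLAP work is just getting started, here is a minimal index-and-query round trip of the kind a CaseOLAP prototype would build on. The index name, document fields, and localhost node are assumptions, not the project's actual setup; the calls follow the current elasticsearch-py client.

```python
# Hedged sketch: index one document and query it back on a local dev node.
# Index name and fields are hypothetical; CaseOLAP would bulk-index corpora.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local dev node

es.index(index="caseolap-test", id=1,
         document={"title": "sample case", "body": "text to analyze"})
es.indices.refresh(index="caseolap-test")    # make the doc searchable now

resp = es.search(index="caseolap-test",
                 query={"match": {"body": "analyze"}})
print(resp["hits"]["total"])                 # e.g. {'value': 1, 'relation': 'eq'}
```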
  
  • Brown Dog online materials review/cleanup
  • SC panel prep
  • HR
  • NIH National Cancer Institute
  • SSA Newsletter
  • SC panel prep
  • HR
  • GLM release and preparation for EPA visit
  • BD bug fixes, authorization in fence/clowder
  • Clowder release
  • PSP data ingestion
  • IMLCZO lab analysis space and presentation 
  • GLM release and preparation for EPA visit
  • PSP data ingestion
  • GLTG
    • Stand up a dev version of the distributed system with the Illinois instance started
  • VBD
    • Decide on a technology stack for the app and start working on it
  • GLTG
    • Stood up the dev distributed system with gltg-dev as an NGINX proxy
  • VBD
    • Decided on technology
      • Flask, OpenLayers (minimal server-side sketch below)
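A minimal sketch of what the server side of that Flask + OpenLayers choice could look like: one route serving GeoJSON for an OpenLayers vector layer. The /api/sites route, fields, and coordinates are hypothetical placeholders, not the actual VBD app.

```python
# Hedged sketch: a Flask route returning GeoJSON that an OpenLayers
# ol.source.Vector (with a GeoJSON format reader) could load by URL.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/sites")
def sites():
    # Placeholder data; the real app would query a database or data service.
    return jsonify({
        "type": "FeatureCollection",
        "features": [{
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [-88.2434, 40.1164]},
            "properties": {"name": "sample site"},
        }],
    })

if __name__ == "__main__":
    app.run(debug=True)
```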
  • PlantCV testing and runs on two new experiments
  • hyperspectral run on 2016-2017
  • generate soil mask on 04-01 hyperspectral test dataset
  • ongoing FLIR + fullfield progress evaluation
  • heightmap laser3d scanner running
  • PlantCV pipeline deployed
  • hyperspectral postponed due to calibration issues
  • scripts for dataset count summaries
  • MDF
    • search functions, search_key, author etc.
    • work on Sphinx docstrings
    • continue with Globus SDK transfer
  • KISTI
    • OpenSees, complex examples
    • papers
  • MDF
    • Search functions done; pull requests issued
    • Sphinx docstrings not done yet
    • Globus transfer script is running (see the SDK sketch after this list)
  • KISTI
    • OpenSees, complex example
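The MDF transfer script itself is not in these notes; as a reference point, this is a generic transfer using the Globus SDK's documented native-app flow. The client ID, endpoint UUIDs, and paths are placeholders.

```python
# Hedged sketch of a Globus transfer via globus_sdk; all IDs are placeholders.
import globus_sdk

CLIENT_ID = "NATIVE-APP-CLIENT-ID"  # placeholder
SRC = "SOURCE-ENDPOINT-UUID"        # placeholder
DST = "DEST-ENDPOINT-UUID"          # placeholder

# Interactive native-app login to obtain a transfer token.
auth = globus_sdk.NativeAppAuthClient(CLIENT_ID)
auth.oauth2_start_flow()
print("Log in at:", auth.oauth2_get_authorize_url())
tokens = auth.oauth2_exchange_code_for_tokens(input("Auth code: "))
transfer_token = tokens.by_resource_server["transfer.api.globus.org"]["access_token"]

tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer(transfer_token))

# Queue a recursive directory transfer and submit it as one task.
tdata = globus_sdk.TransferData(tc, SRC, DST, label="example transfer")
tdata.add_item("/source/path/", "/dest/path/", recursive=True)
task = tc.submit_transfer(tdata)
print("Task ID:", task["task_id"])
```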
  • NCSA faculty
    • help with metadata definition
  • NDS
    • Review open tickets
  • KnowEnG
    • Hopefully assist with SSViz tasks as they become available
  • Crops in Silico
    • Extend the REST API to allow a user to run multiple models
    • Include the real models in the API container so that those can be run as well
    • Try to think of methods for error-handling, since cisrun currently pipes all output/errors to stdout/stderr
  • NDS
    • Reviewed open tickets
  • KnowEnG
  • Crops in Silico
    • Extended the REST API to allow a user to run multiple models (hedged endpoint sketch below)
    • Included the real models in the API container so that those can be run as well
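To make the multi-model item concrete, here is a hedged sketch of what such an endpoint could look like, given the note above that cisrun pipes all output and errors to stdout/stderr. The route, payload shape, and the `cisrun <model>` invocation are assumptions based on these notes, not the actual Crops in Silico API.

```python
# Hedged sketch: accept a list of model names and run each via cisrun,
# capturing stdout/stderr per model since that is the only error channel.
import subprocess
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/runs", methods=["POST"])
def run_models():
    models = request.get_json(force=True).get("models", [])
    results = []
    for model in models:
        # Assumed invocation; the exit code serves as a crude success signal.
        proc = subprocess.run(["cisrun", model], capture_output=True, text=True)
        results.append({
            "model": model,
            "ok": proc.returncode == 0,
            "stdout": proc.stdout,
            "stderr": proc.stderr,
        })
    return jsonify(results)
```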



  • Priority
    • Check email
    • Update Pull Requests
  • Afterwards: Set project tasks for the remainder of the week
  • Checked email
  • Updated Pull Requests
  • Scheduled and attended various meetings
  •  Out Monday - Thursday
  • Catch up
  • Expense Report
  • Glossary Change 

 KnowEnG

  • Explore enhancements and investigate some isolated issues with Mesos cluster performance due to auto-scaling in AWS
  • Domain transfer from knoweng.org → knoweng.illinois.edu
  • Infrastructure Support
 

 KnowEnG

  • Explore enhancements and investigate some isolated issues with Mesos cluster performance due to auto-scaling in AWS (W.I.P.)
  • Infrastructure Support
  
  •  BD
    • Identify important tasks for the current sprint and work on them
  • IARP
    • Continue work on populating the Clowder demo instance
    • Issue a pull request for the advanced search feature improvement
  • CCROP
    • UI modifications related to bug fixes and a new feature for flexible termination dates
    • Identify the next priority tasks and start working on them
 
  • Update Seating in Hoteling Conference Room
  • Actually file my travel report
  • Nebula Nodes for TERRA
  • HR Resumes / Interviews
  • Follow up on Training ideas
  • Prep Agile outline and presentation
  • Brown Dog Report
  • Sprint Management
  •  New seating name tags - still need to update "official" list
  • TEM done
  • Nodes ordered
  • HR tasks - check
  • More contributions to training / Onboarding page
  • BD Report - last section should be in by Monday
  • Sprint audits and things
  • JIRA Workflow/Sprint page changes started
  • Google Analytics / GLTG
  •  BD
    • Jupyter login
  • GLM
    • hackathon
 
  •  BD
    • Jupyter login
    • Try Jupyter with the R kernel
  • GLM
    • hackathon
  • Create a method in pyincore to automatically upload datasets to GeoServer
  • Add automatic dataset upload to GeoServer in the data repository service
  • Process soil data and upload it to GeoServer
  • Wrote Python automation code to upload all of the datasets in WebDAV to GeoServer
  • Created a data service API that automatically uploads shapefiles to GeoServer when a new dataset is ingested (REST upload sketch below)
  • Cropped the soil data and uploaded it to GeoServer
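For reference, the core of such an upload is a single call against GeoServer's documented REST API: PUT a zipped shapefile to the datastore's file endpoint, which creates or updates the store. The URL, workspace, store, credentials, and file name below are placeholders, not the actual pyincore or data service code.

```python
# Hedged sketch: upload a zipped shapefile to GeoServer over its REST API.
import requests

GEOSERVER = "http://localhost:8080/geoserver"  # placeholder instance URL
AUTH = ("admin", "geoserver")                  # GeoServer's default credentials
workspace, store = "incore", "soil"            # hypothetical names

# The zip must contain the .shp/.dbf/.shx (and ideally .prj) file set.
with open("soil.zip", "rb") as f:
    r = requests.put(
        f"{GEOSERVER}/rest/workspaces/{workspace}/datastores/{store}/file.shp",
        data=f,
        auth=AUTH,
        headers={"Content-Type": "application/zip"},
    )
r.raise_for_status()  # expect 201 Created when the store is (re)created
```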