Who | Planned - Monday | Accomplished - Friday
  • GLM
    • Begin looking at GLENDA data
  • GLTG
  • IMLCZO
    • Add versioning to footer
    • Move flux tower parser to production
  • GEOD
    • Work on React + Alt demo
 
  • Vacation
  • Vacation
  
  • Blue Waters workshop
  • Brown Dog sprint 1 tasks
  • CC* proposals (coming together)
  • Revised DIBBs/DataNet position paper and sent back
  • Distributed NDSC draft report to TAC
  • Blue Waters workshop
  • Brown Dog sprint 1 tasks and followup
  • CC* proposals (coming together)
  • Revised DIBBs/DataNet position paper and sent back
  • Distributed NDSC draft report to TAC
  • BD
    • Continue working on DataWolf 3.1 issues
  • NIST
    • Rebranding
  • Other
    • Vacation August 1 - 3, out early Friday August 5th at 2pm

  • BD
    • Worked on 3.1 issue - WOLF-164 - allowing users to edit Java tools
  • Other
    • Caught up on email
    • Vacation Aug 1 - 3
  • Vacation
  • Vacation
  • SEAD - bug fixes; SEAD-943 - adding file thumbnails to "Edit Metadata" page in Staging Area (in progress)
  • MDF - 2016-08-01 Kickoff
  • MWRD - gate, DO, elevation scripts, populate database, e-mail Joe with missing data
  • Vacation
  • Vacation
  • BD
    • Sprint 1 tasks
  • DEBOD
    • Continue with debugging / improving cell segmentation
  • BD
    • Worked on sprint tasks
  • DEBOD
    • Completed development on improving fine rotation of documents (a subtask of improving cell segmentation; see the deskew sketch below)
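
The fine-rotation item above is essentially a document deskew problem. As an illustration only (not the DEBOD code), here is a minimal sketch of one common approach, assuming OpenCV: estimate the page's skew angle from the minimum-area rectangle around the ink pixels, then rotate to correct it. Function and file names are hypothetical.

```python
# Illustrative deskew sketch (not the DEBOD implementation), assuming OpenCV.
import cv2
import numpy as np

def deskew(image):
    """Estimate a small skew angle and rotate the page to correct it."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # Invert and threshold so text and table lines become foreground pixels.
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # Fit a minimum-area rectangle around all foreground pixel coordinates.
    coords = np.column_stack(np.where(binary > 0)).astype(np.float32)
    angle = cv2.minAreaRect(coords)[-1]
    # Angle conventions vary across OpenCV versions; this follows the classic
    # [-90, 0) convention and converts it to a signed correction angle.
    if angle < -45:
        angle = -(90 + angle)
    else:
        angle = -angle
    h, w = image.shape[:2]
    matrix = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    return cv2.warpAffine(image, matrix, (w, h), flags=cv2.INTER_CUBIC,
                          borderMode=cv2.BORDER_REPLICATE)

page = cv2.imread("scanned_page.png")            # hypothetical input file
cv2.imwrite("deskewed_page.png", deskew(page))
```
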
  • Assisting with new employee onboarding - Bing Zhang
  • Brown Dog - 1 week sprint
  • Brown Dog Annual Report
  • MWRD proposal review - what is the status?
  • ISDA PPT
  • Onboarding
  • Brown Dog sprint work and annual report
  • GLTG Epic Prep - target May 2017
  • TAMU team visited 8/2 and 8/3
  • ISDA and TV PPT
  • Ordered T-shirts
  • Vacation
  • Vacation
  • GLTG
    • Tennessee data
  • GLM
    • standardize time values returned by GET and POST
  • BD
    • add bdcli to bd.m
  • GLTG
    • Tennessee data ingested
    • GREON aggregated file posting working again after refactoring storage on prod and rsyncing data with dev
    • refactored GREON parser to use new pyGeodashboard (bug with greon-06 data)
  • GLM
    • set up local environment
    • made stream date formats consistent (see the timestamp sketch after this list)
    • tested datapoint data formats (appear OK)
  • BD
    • looked at bd.m for adding bd-cli usage - lots still to do
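
Both GLM time items above come down to timestamp consistency. Here is a minimal, hypothetical sketch of that kind of normalization: try a handful of known input formats and emit ISO 8601 UTC. The format list and function name are assumptions, not taken from the GLM codebase.

```python
# Hypothetical sketch of normalizing mixed sensor timestamps to ISO 8601 UTC,
# the kind of cleanup "make stream date formats consistent" refers to.
from datetime import datetime, timezone

KNOWN_FORMATS = [
    "%Y-%m-%dT%H:%M:%S%z",   # 2016-08-01T14:30:00+0000
    "%Y-%m-%d %H:%M:%S",     # 2016-08-01 14:30:00 (assumed UTC)
    "%m/%d/%Y %H:%M",        # 08/01/2016 14:30    (assumed UTC)
]

def normalize_timestamp(raw):
    """Parse a timestamp in any known format and return ISO 8601 UTC."""
    for fmt in KNOWN_FORMATS:
        try:
            dt = datetime.strptime(raw, fmt)
        except ValueError:
            continue
        if dt.tzinfo is None:               # naive timestamp: assume UTC
            dt = dt.replace(tzinfo=timezone.utc)
        return dt.astimezone(timezone.utc).isoformat()
    raise ValueError("unrecognized timestamp format: %r" % raw)

print(normalize_timestamp("08/01/2016 14:30"))
# -> 2016-08-01T14:30:00+00:00
```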

SEAD

GLM

  • New Search Page

CyberSEES

  • Setup new workflow

SEAD

GLM

  • In geodashboard-search-react, the filters now use the same template instead of different ones. Only one filter shows up on load, and three more can be added with the + button

CyberSEES

  • Some work on trying to run the GIConverter on WSSI (still failing, but most of the configuration has been done)

Work on Water Network Analysis

  • create usage pattern dataset from the EPANET input file (see the sketch after this list)
  • Vacation
  • Made water usage pattern dataset conversion for water network analysis
  • Finished node and link file conversion for water network analysis
  • Texas A&M Visitors (Tuesday)
  • Metadata
  • Water Network Damage Analysis
  • Texas A&M Metadata Presentation
  • Water Network Damage Analysis
  • Out Thursday/Friday
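
For the usage pattern conversion above, here is a minimal sketch assuming the standard EPANET .inp layout, where demand patterns live in a [PATTERNS] section as a pattern ID followed by multipliers (a pattern may continue over several lines). The file names and CSV layout are illustrative, not the actual conversion script.

```python
# Hypothetical sketch: extract demand patterns from an EPANET .inp file
# and write them out as a flat usage-pattern dataset.
import csv

def read_patterns(inp_path):
    """Return {pattern_id: [multipliers]} from the [PATTERNS] section."""
    patterns = {}
    in_section = False
    with open(inp_path) as f:
        for line in f:
            line = line.split(";")[0].strip()   # drop EPANET comments
            if not line:
                continue
            if line.startswith("["):
                in_section = line.upper() == "[PATTERNS]"
                continue
            if in_section:
                parts = line.split()
                pid, values = parts[0], [float(v) for v in parts[1:]]
                # A pattern may continue over several lines with the same ID.
                patterns.setdefault(pid, []).extend(values)
    return patterns

patterns = read_patterns("network.inp")          # hypothetical input file
with open("usage_patterns.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["pattern_id", "step", "multiplier"])
    for pid, values in patterns.items():
        for step, value in enumerate(values):
            writer.writerow([pid, step, value])
```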

TERRA

  • meetings w/ standards committee
  • evaluate pipeline w/ new Clowder version
  • extractor support
  • multi-criteria search initial implementation done
  • extractor dev support
    • rewrite aspects of Globus pipeline to be more generic for new project sites
    • prepare for Danforth pipeline activation - call on Monday
  • meetings
  
  • SEAD
    • use and check the website
  • MSC
    • read some references
  • SEAD
    • use and check the website
  • MSC
    • read some references
  • Package Python code and change dependencies accordingly
  • Performance testing on output format (binary or pure text file) and Python library (pandas, pickle, and built-in IO)
  • Finished packaging one package; the other is waiting for upstream code to complete
  • Finished performance testing on output format (see the benchmark sketch below)
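
As an illustration of the output-format test above, here is a minimal benchmark sketch timing the three candidates named in the report: built-in IO writing plain text, pickle writing binary, and pandas writing CSV. The array size and file names are arbitrary.

```python
# Hypothetical sketch of the output-format performance test: write the same
# array three ways and report wall-clock time for each.
import pickle
import time

import numpy as np
import pandas as pd

data = np.random.rand(1000000)

def timed(label, fn):
    start = time.time()
    fn()
    print("%-12s %.3fs" % (label, time.time() - start))

def write_text():
    with open("out.txt", "w") as f:           # built-in IO, pure text
        f.writelines("%f\n" % x for x in data)

def write_pickle():
    with open("out.pkl", "wb") as f:          # binary
        pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)

def write_pandas():
    pd.Series(data).to_csv("out.csv", header=False)  # pandas, text

timed("built-in IO", write_text)
timed("pickle", write_pickle)
timed("pandas CSV", write_pandas)
```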
Htut Khine Htay Win 
  • My highlights of this week are
    1. I got my LSST project running on my computers in Nebula.
    2. I wrote code to listen to the status of the machines and store it in Redis.
  • My low for this week is
    • I got stuck on sending messages with RabbitMQ. It finally worked when I opened a bunch of security groups while creating machine instances on the server; I am still not 100% sure why those settings matter. Thank you everyone for helping out. (A sketch of such a listener follows below.)
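
For reference, here is a hypothetical sketch of a listener like the one described: consume machine status messages from RabbitMQ with pika and keep the latest state per machine in Redis. The queue name, message layout, and hosts are assumptions; the AMQP port (5672) is exactly the kind of traffic those security-group openings allow.

```python
# Hypothetical machine-status listener: RabbitMQ (pika) -> Redis.
import json

import pika
import redis

store = redis.Redis(host="localhost", port=6379)

def on_message(channel, method, properties, body):
    # Assumed message layout: {"host": "worker-1", "state": "up"}
    status = json.loads(body)
    # Keep the latest state per machine in one Redis hash.
    store.hset("machine_status", status["host"], status["state"])
    channel.basic_ack(delivery_tag=method.delivery_tag)

# The broker (AMQP port 5672) must be reachable; the security-group
# openings mentioned above are what let this connection through.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="machine.status", durable=True)
channel.basic_consume(queue="machine.status", on_message_callback=on_message)
channel.start_consuming()
```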
  • New Employee Orientation
  • Begin learning Brown Dog
  • Installed Polyglot and Clowder in development environments (Linux), and installed and tested the converters and extractors.
  • For the Clowder instance on the local machine, I ran the DBpedia extractor and the wordcount extractor.
  • For Polyglot, I am able to run the existing tools and the tool I developed for the interview.