Who | Monday - Planned | Friday - Accomplished
  • Brown Dog monthly report
  • Monitor Brown Dog progress
  • Update sprints w/Luigi
  • Several once-a-month meetings this week

  • Monthly report went out
  • 63 remaining BD items
  • Reviewed NIST proposal
  • Prepared for next round of 1-1 meetings
  • Finish up river extractor
  • Re-start work on handwritten numbers extractor
  • Possibly update extractors (will wait for JSON-LD sprint tasks to be over for this one)
  • Sent email to Qina, Kenton, Jong about the current state of the extractor with current sample results
  • Looked into new samples for the handwritten number extractor - still need to find a good collection for testing it

  • Fix deployment of dev machines
  • OpenStack presentation
  • Add Brown Dog to PEcAn
  • Dev machines now running tests; most tests passing
  • Presented OpenStack; next step is quotes
  • Created extractors for audio, image, video, and PDF. Each extractor will create a thumbnail image, a preview image, and a preview version of the file.
  • Started work on integrating Brown Dog into PEcAn.
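The audio/image/video/PDF extractors above each emit the same three artifacts. A minimal sketch of that dispatch, assuming a hypothetical helper name (`preview_plan`) and made-up preview formats; the real extractors invoke external tools (e.g. ffmpeg, ImageMagick) to actually render the files:

```python
def preview_plan(filename: str, mime_type: str) -> list:
    """Sketch: list the artifacts an extractor would emit for one upload.

    Hypothetical helper; the real extractors shell out to conversion
    tools rather than just naming outputs.
    """
    # Each supported type gets a thumbnail, a preview image, and a
    # preview version of the file itself in a web-friendly format.
    preview_ext = {"audio": "mp3", "video": "mp4", "image": "png", "pdf": "pdf"}
    kind = "pdf" if mime_type == "application/pdf" else mime_type.split("/")[0]
    if kind not in preview_ext:
        return []  # unsupported type: no previews generated
    base = filename.rsplit(".", 1)[0]
    return [
        f"{base}_thumb.png",                    # thumbnail image
        f"{base}_preview.png",                  # preview image
        f"{base}_preview.{preview_ext[kind]}",  # preview version of the file
    ]
```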
  • Work on new ideas for the MSC project
  • Continue working on DFDL schema for MSC spreadsheets
  • Work with new PEcAn script and develop its Brown Dog wrapper.
  • New report with all subjects for the MSC was produced
    • The report was sent to Melisa for distribution
    • Waiting for comments
  • Gaining familiarity with DFDL
    • Beginning to think it will be hard to accomplish this using DFDL
  • Some old Jira tasks were closed.
  • Complete fixing merge conflicts with branch MMDB-1455
  • Setup person detector on extractor-0014 VM
  • Fix video display issue in Gordon
  • Fix extractor crashing issue in Gordon (continuation from previous week)
  • Complete work on integrating person tracking and person detection
  • Will complete fixing merge conflicts in branch MMDB-1455 today
  • Installed required software and completed setting up person detector on extractor-0014 VM
  • Worked on fixing video display issue in Gordon. Created a ticket for the Gordon admins to update the ffmpeg installation.
  • Completed integrating person tracking and person detection and created a person tracking extractor. That's a yay!
  • BD 18 Month Review script
  • Finish tools catalog deployment
  • bd-wget
  • Started wiki for demo scripts (followed up with others in regards to getting info needed)
  • Bug fixes/features to Tools Catalog
  • Populated Tools Catalog
  • Made YouTube videos of usage of Tools Catalog and added to Blog
  • JSON-LD sprint
  • Use-case service
  • Chrome-extension UI
  • Created the ability to add/query/remove contexts for JSON-LD metadata as part of the JSON-LD sprint. This includes case classes, a declaration context service, a MongoDB service implementation, API controllers, unit tests, and integration tests.
  • Installed necessary software for writing DAP service for CZ use case and started writing the service
  • To make the Chrome-extension UI persistent, tried using a background page instead of an event page, but it is still not working. Need to look into the details of how communication between the background page and popup.html works.
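The add/query/remove context operations from the JSON-LD sprint item can be sketched as a tiny in-memory store. `ContextStore` and its method names are illustrative only; the actual service is MongoDB-backed behind API controllers:

```python
import uuid

class ContextStore:
    """In-memory sketch of a JSON-LD context declaration service."""

    def __init__(self):
        self._contexts = {}  # context id -> {"name": ..., "context": ...}

    def add(self, name, context):
        """Register a named @context document; return its generated id."""
        cid = str(uuid.uuid4())
        self._contexts[cid] = {"name": name, "context": context}
        return cid

    def query(self, cid):
        """Return the stored context for an id, or None if unknown."""
        entry = self._contexts.get(cid)
        return entry["context"] if entry else None

    def remove(self, cid):
        """Delete a context; return True if it existed."""
        return self._contexts.pop(cid, None) is not None
```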

  • JSON-LD sprint
  • DownloadAs button for Chrome extension
  • JSON-LD sprint: writing to DB - done
  • JSON-LD sprint: reading from DB - done (except for content)
  • DownloadAs button - started
  • IMLCZO development and deployment
  • SEAD development and wrapping up sprint
  • Brown Dog development and wrapping up 2 sprints
  • EarthCube Geo: identification of technical requirements from use cases
  • Present on SEAD and Medici for other dibbs project
  • GLTR pull requests
  • Fixed bugs with comments in Medici and cleaned up layout
  • Continued development for SEAD sprint
  • Prepared Medici presentation for CSL DIBBs group
  • Merged pull requests for Medici and geodashboard
  • Fixed bug with hot deployment of IMLCZO geodashboard
  1. Find daily (hourly, 5 minutes) temperatures for Stickney for 2005-12, MWRD-142
  2. Create T table in the database, MWRD-143
  3. Add daily average T to the dashboard, MWRD-145
  4. Start NEXRAD workflow in DataWolf
  5. Look at NetCDF-Java for direct conversion of the NEXRAD data format to 1) PNG and 2) the Stickney value
  1. Have daily temperatures for the entire time interval and hourly for 2010 for O'Hare
  2. Changed the script, done
  3. Added REST endpoint
  4. Started, in progress
  5. Started, in progress
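Item 3 above (daily average temperature for the dashboard) amounts to grouping readings by calendar day. A sketch with made-up sample data, assuming ISO-8601 timestamps; the real endpoint reads from the T table instead:

```python
from collections import defaultdict
from datetime import datetime

def daily_averages(readings):
    """Average (timestamp, temperature) readings per calendar day.

    readings: iterable of (ISO-8601 timestamp string, temperature) pairs.
    Returns {"YYYY-MM-DD": mean temperature}, the shape a dashboard
    REST endpoint could serve directly as JSON.
    """
    by_day = defaultdict(list)
    for ts, temp in readings:
        day = datetime.fromisoformat(ts).date().isoformat()
        by_day[day].append(temp)
    return {day: sum(vals) / len(vals) for day, vals in by_day.items()}
```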
  • ERGO/NIST - Discuss next steps with Jong, focus on JIRA tasks for 4.0 release
  • CyberSEES - implement REST calls from flow table editor to save updates to the worldfile
  • MSR - attend NFIE meeting, waiting for feedback from Fernando on bugfix
  • Ergo - fixed summary table reports, fixed RCP launcher to work with both Kepler and Luna, reviewed pull requests, reviewed Ergo text, general code cleanup
  • CyberSEES - flow table editor commits changes via REST service
  • MSR - attended NFIE, still awaiting feedback on DataWolf bug fix
  • Followed up with 3 use-case teams, documenting issues that still need to be discussed
  • Prep for BD 18-month review - follow-up with UMD, NSF
  • Made progress on employee evals
  • BD - YouTube videos, tweet
  • BD JSON-LD sprint item (BD-435)
  • EarthCube development
  • JSON-LD sprint: demoed a per-user approach to Luigi; he suggested a combined global + per-user approach. Finished the item and created a pull request.
  • EarthCube: had a project meeting. Nailed down more concrete feature requirements.
  • The JSON-LD item took most of this week. Next week I need to work more on EarthCube.
  • Recorded virtual tour for Great Lakes to Gulf site
  • Built human preference extractor, wrapping Ankit's MATLAB code
  • Built extractor that runs green index and human preference on uploaded images
  • Small changes to handling of technical metadata for route/streetview extractor
  • Prepared for use case meeting
    • Ran all extractors to get data
    • Created basic PowerPoint of current standing of work
    • Set up laptop to run all extractors
  • Finish Datasets API test suite
  • Check in Collections API test suite
  • Start working on Preview API test suite
  • Completed Datasets API test suite
  • Made pull request for Collections API test suite
  • Started initial efforts on Preview API test suite cases