From charlesreid1

overarching goal

2018/Data Project

Current stage: spy project (preparing Dockerfiles, testing additional components before yeti deployment)

aws things

AWS writeup for DIB lab group: AWS Writeup

Amazon VPC and VPN

  • set up private test node with vpc
  • if there is no internet gateway, how can we update? if there is an internet gateway, does that necessarily mean anyone can access it?
  • networking patterns - private and public subnets, single public subnet, one internet gateway shared by multiple subnets
  • use this as a testbed for banana pi hardware

monitoring and dashboards

netdata: Netdata

prometheus: Prometheus

  • how to structure data directory
  • on the 1st of the month:
    • find any data older than 1 month
    • archive it and put it in a bucket
    • purge data from system
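The monthly rotation steps above could be sketched as a script along these lines (the data directory and archive path are placeholders, and the actual bucket upload, e.g. via the AWS CLI, is left as a separate step):

```python
import os
import tarfile
import time
from pathlib import Path

DATA_DIR = Path("/opt/prometheus/data")   # placeholder data directory
ARCHIVE = Path("/tmp/prometheus-archive.tar.gz")
MAX_AGE_SECONDS = 30 * 24 * 3600          # roughly one month

def find_old_files(data_dir, max_age=MAX_AGE_SECONDS):
    """Return files under data_dir last modified more than max_age ago."""
    cutoff = time.time() - max_age
    return [p for p in data_dir.rglob("*")
            if p.is_file() and p.stat().st_mtime < cutoff]

def archive_and_purge(files, archive_path=ARCHIVE):
    """Tar up the old files, then purge them from the local system.
    Uploading archive_path to a bucket (e.g. `aws s3 cp`) happens
    afterward and is omitted here."""
    with tarfile.open(archive_path, "w:gz") as tar:
        for f in files:
            tar.add(f)
    for f in files:
        f.unlink()
```

Run from cron on the 1st of the month; the find/archive/purge split keeps each step testable on its own.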

grafana: Grafana

  • building more complex dashboards
  • sharing dashboards

toy problem goal

sourmash - dive into the source and instrument it.

Python signal library - send signal 10 (SIGUSR1) to the running program, and that bumps it into the netdata state, where it grabs values of different variables?
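A minimal sketch of that idea using Python's standard signal module (the stats dict and its fields are hypothetical stand-ins for the program's real variables; signal 10 is SIGUSR1 on Linux):

```python
import signal

# Hypothetical application state we want to expose on demand.
stats = {"records_processed": 0, "errors": 0}

def dump_stats(signum, frame):
    """Handler for SIGUSR1 (signal 10 on Linux): print current
    variable values so an external collector like netdata can
    pick them up, without stopping the program."""
    for name, value in stats.items():
        print(f"{name}={value}")

# `kill -10 <pid>` (or `kill -USR1 <pid>`) triggers the dump.
signal.signal(signal.SIGUSR1, dump_stats)
```

A real instrument would write to a file or socket that a netdata plugin reads, rather than printing to stdout.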

focus on greenfield deployments - install netdata, collect stats, visualize them

d3 viz

10 visualizations:

more calendars:

  • wikipedia article edit calendars
  • dates... color dimension, pref. binary...
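The date-binning side of an edit calendar could look like this (assuming ISO-8601 timestamps of the form the MediaWiki API returns; the d3 rendering itself is a separate step):

```python
from collections import Counter

def edits_per_day(timestamps):
    """Bin ISO-8601 edit timestamps (e.g. "2018-03-01T12:00:00Z")
    into per-day counts for a calendar heatmap."""
    return Counter(ts[:10] for ts in timestamps)

def binary_calendar(counts):
    """Collapse counts to a binary color dimension:
    a day is either edited (1) or absent from the map."""
    return {day: 1 for day in counts}
```

The Counter output feeds a count-colored calendar directly; binary_calendar gives the preferred binary color dimension.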

Fix a shrubbery

Also see D3x10

blog posts


Still chugging along with the charlesreid1 spreadsheet. Slogging through the litany of wiki pages. Google Sheets.

With the deployment of charlesreid1 bot, we want to take the time to do it more thoroughly:

  • bot instrumentation
  • dashboard
  • monitoring bot status
  • statistics
  • display latest tweet
  • bot dashboard built with grafana



apollo bots:

  • 14/15/16/17
  • incorporate lunar surface dialogue

apollo references:


Bots/New Apollo


Genealogy photos:

  • Photos cropped/organized by family
    • 2011
    • 2017
    • Rename scheme
    • Notes - A2k11
    • Notes - R2k11
    • Notes - A2k17
    • Notes - K2k17
    • Notes - R2k17
  • Send email to fam with link on Dropbox


  • Pauline and Bruce chapters
  • Historical research planning

stretch goal

test backup and restore scripts

automate automate automate

  • use an AWS node or a GC node to actually test the backups, start to finish.
  • creating a second node that's a complete and total backup of the first.
  • in the process of doing this, we're going to have a lot of questions to answer and software to fix.
  • letsencrypt certificate generation
  • ssh

We are one step closer to that goal, with scripts that automate site updates, commit to git repos, pull the latest versions of data, etc.

Now that we are familiar with subprocess for non-trivial tasks, we can move forward with scripts to automate testing.
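The subprocess pattern those scripts use can be sketched like this (the repo path and commit message are placeholders):

```python
import subprocess

def run(cmd, cwd=None):
    """Run a command, fail loudly on error, return captured stdout."""
    result = subprocess.run(cmd, cwd=cwd, check=True,
                            capture_output=True, text=True)
    return result.stdout

def update_site(repo_path):
    """Pull the latest data and commit any regenerated content.
    repo_path is a placeholder for the actual site repo."""
    run(["git", "pull"], cwd=repo_path)
    run(["git", "add", "-A"], cwd=repo_path)
    # `git commit` exits nonzero when there is nothing to commit,
    # so don't use check=True for this step.
    subprocess.run(["git", "commit", "-m", "automated site update"],
                   cwd=repo_path)
```

check=True turns command failures into exceptions, which is what we want for automated testing: a failed step stops the script instead of silently continuing.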

back burner

See 2018/February#back burner