This week I created a simple dashboard for COVID-19 data released by the state of Rhode Island. It tracks the following key metrics:
Positive COVID-19 cases
Patients on Ventilators (new as of 4/15)
Cases by City/Town (now a map rather than table)
Both current/cumulative and historical data for these metrics are shown as graphs, with day-over-day changes annotated to visualize trends.
Initially, the state's Department of Health shared data only via a public Google Sheet, so Google's Data Studio platform was an easy way to visualize it.
Today that sheet is no longer in use (that's a good thing!), so I've updated the dashboard to use CovidTracking.com's excellent API to continue bringing in live data automatically. The COVID Tracking Project is a volunteer organization that brings disparate data from US states together into a single dataset and provides publicly available APIs to access everything.
Using their API is a bit of overkill, and it'd be nice if the government provided a similar service, but for now it's the best way to get comprehensive data in the US.
I poll the API every few hours with a simple JSON request for a dataset filtered to just RI's data, to avoid pulling in far more than needed. Documentation for CTP's APIs can be found here.
Typically, RI DOH updates its data once daily in the afternoon, but since CTP sometimes synthesizes data from different sources, I wanted to make sure what was being shown was the most up-to-date version.
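A minimal sketch of that polling request, assuming the COVID Tracking Project's v1 per-state endpoint and its field names (the URL and keys below are taken from CTP's public docs at the time, not from this post, so treat them as assumptions):

```python
import json
from urllib.request import urlopen

# Assumed CTP v1 endpoint for Rhode Island's current values.
RI_CURRENT_URL = "https://api.covidtracking.com/v1/states/ri/current.json"

def parse_metrics(record):
    """Pull out just the fields the dashboard displays.

    Keys are assumed CTP field names; missing keys come back as None.
    """
    return {
        "positive": record.get("positive"),
        "onVentilatorCurrently": record.get("onVentilatorCurrently"),
        "lastUpdateEt": record.get("lastUpdateEt"),
    }

def fetch_ri_metrics(url=RI_CURRENT_URL):
    """Fetch the state's current record and reduce it to the dashboard fields."""
    with urlopen(url) as resp:
        return parse_metrics(json.load(resp))
```

Because the endpoint returns a single small JSON object per state, a request like this every few hours is cheap, and the `lastUpdateEt` field makes it easy to check whether the state has actually published new numbers since the last poll.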
Since I originally created this tool, the state has also released a dashboard of its own in a similar format, built with ArcGIS, Power BI, and Datawrapper. I'm now embedding its map in my dashboard to depict cases by city/town visually instead of displaying a table.
I'm a lifelong avgeek, and as I learn to fly for real, I use my home simulator as a really useful training tool. Occasionally I'll stream my flights, and because I have a production background, I wanted a proper audio workflow. The following setup is exactly what I use to send audio to OBS for live streaming on Twitch and elsewhere.
Getting the sound out
To hook into the pro mixer I have, I needed a pair of external USB sound cards, ideally ones that output a pro line- or mic-level signal. The same friend who's working on those overlays recommended a digital DI box by Peavey that outputs to two XLR jacks. It rocks.
It's a step above the similar USB cards I've found, and it includes circuitry to eliminate electrical noise and ground-loop hum. That's really nice, because PCs are noisy and most cheaper external sound cards don't output truly clean audio. I have two of these: one for X-Plane ambient sound, the other for PilotEdge output.
Mixing the sound
Everything gets sent into a Behringer 1002B mixer. With 5 XLR inputs and a total of 10 channels of audio, I have a lot of granular control; it's very easy to adjust the mix to compensate for a quiet controller or an extra-loud plane. To use a typical PC headset with a mixer, you need to supply it with 'plug-in power.' This is essentially a lower-voltage version of phantom power, and it's tricky to find the right adapter. I found a nice one from Sound Professionals, a company that makes a lot of small adapters and power modules.
Routing audio to the right places
Two 1/4″ patch cables run from the mixer's main outs to a Tascam USB audio interface that I typically use for recording into my Mac. I use Tascam's app or the hardware knobs to tweak signal levels, and OBS recognizes the input as a generic USB sound card.
The mixer has both Monitor and FX Send submixes, so I can isolate my headset mic and send just that back to the PC for PilotEdge. With the other submix, I send ATC audio plus my own mic input back to my headset to simulate sidetone.