gw; blacklight-query; YAAT
The Drop is back after an unexpected hiatus (due to work, long covid symptoms, and the existential threat of losing democracy in America rapidly coming to a decision point).
We’ll start the week off with a grab bag, but gosh Tuesday’s Typography edition is gonna be so good (to make up for missing one last week).
TL;DR
(This is an AI-generated summary of today’s Drop using Ollama + a custom Llama 3.2 model.)
- A lightweight Rust binary ‘gw’ enables continuous deployment by watching git repositories and executing commands on changes (https://gw.danielgrants.com/)
- Blacklight Query, a CLI tool from The Markup, allows bulk privacy scanning of websites using their Blacklight privacy inspection engine (https://themarkup.org/blacklight/2024/10/16/blacklight-query)
- YAAT (Yet Another ASCII Table) provides a self-contained HTML/CSS/JS page for looking up hex, octal, or decimal values of ASCII characters (https://larsw.xyz/yaat/)
gw

gw is a lightweight binary that solves the continuous deployment (CD) problem without the complexity of traditional CD solutions or the lock-in of proprietary platforms. Written in Rust, it operates on a simple yet spiffy premise: watch local git repositories, sync with remotes, and execute commands on changes.
The binary weighs in at just 1.5MB, or approximately 7MB when bundled with statically linked git and ssh capabilities (I’ve been oddly obsessed with binary sizes of late). It functions as a pull-based system, making it useful in restricted network environments where traditional push-based CD solutions might fail, such as behind network address translation (NAT) or VPNs.
As noted, gw operates by monitoring a specified local git repository for changes. When changes are detected in the remote repository, it automatically fetches them and can trigger two types of actions: build scripts via the --script flag and deployment processes via the --process flag. This lets us create fairly sophisticated deployment pipelines while maintaining operational simplicity.
Beyond basic continuous deployment, gw can be used in many types of development workflows. It can serve as a notification system for development changes, manage docker-compose deployments, and even function as a lightweight alternative to services like Netlify.
While primarily targeting Linux environments (macOS folks may see some odd errors depending on the installation method you choose), gw provides support for Windows and macOS on a best-effort basis. It can be deployed directly on bare metal, run as a systemd service, or operated within Docker containers, offering deployment flexibility across different infrastructure configurations.
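Running gw as a systemd service can be as simple as a small unit file. Here’s a minimal sketch (the unit name, paths, and deploy command are my placeholders, not from the gw docs):

# /etc/systemd/system/gw-myapp.service (hypothetical; adjust paths/commands)
[Unit]
Description=gw watcher for /srv/myapp
After=network-online.target
Wants=network-online.target

[Service]
ExecStart=/usr/local/bin/gw /srv/myapp --process '/srv/myapp/deploy.sh'
Restart=on-failure

[Install]
WantedBy=multi-user.target

Enable it with systemctl enable --now gw-myapp.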
Basic usage follows this pattern:
gw /path/to/repo --script 'run build process' --process 'run deployment'
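For the docker-compose case mentioned a moment ago, that might look something like this (the compose commands are my illustration, not from the gw docs):

# Rebuild images when new commits land, then restart the stack.
# (compose invocations are placeholders; adjust to your setup)
gw /srv/my-stack \
  --script 'docker compose build' \
  --process 'docker compose up -d'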
I’m in the process of setting it up to watch repos that house Observable Framework dashboards so that it can automagically re-build and publish them without relying on any third-party Codeberg/GitHub-esque service. I’m also finding it pretty handy for running code after data-centric repositories are updated.
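A rough sketch of that setup (paths are placeholders; npm run build is Observable Framework’s standard build script, and the rsync step is just one self-hosted way to publish the static output):

# Rebuild the Framework project on every push, then sync the
# generated dist/ directory to wherever it gets served from.
gw ~/projects/dashboards \
  --script 'npm ci && npm run build' \
  --process 'rsync -a dist/ web01:/var/www/dashboards/'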
Blacklight Query

Photo by Timothy Dykes on Unsplash
Blacklight Query (GH) is a new command-line tool from The Markup that enables bulk privacy scanning of websites using their Blacklight privacy inspection engine. The tool augments the web-based Blacklight interface by letting us scan multiple sites programmatically rather than manually entering URLs one at a time.
The tool is built on Node.js and requires npm for installation; the README strongly suggests using nvm to ensure you’re on the right Node.js version to build and install it. It accepts input URLs via a text file containing absolute URLs separated by newlines, and it also supports Unix-style piping for URL input, enabling integration with other command-line tools.
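In practice, feeding it URLs looks roughly like this (treat the blacklight-query invocation as a placeholder and use the exact command from the repo’s README):

# One absolute URL per line.
cat > urls.txt <<'EOF'
https://example.com
https://example.org
EOF

# File input (hypothetical invocation; check the README):
blacklight-query urls.txt

# Or pipe URLs in from another tool:
grep '^https://' crawl-targets.txt | blacklight-query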
Blacklight Query inherits configuration options from the underlying Blacklight Collector engine. The default configuration runs in headless mode (so, it spawns a headless browser), stores results in URL-specific output directories, and scans only the homepage of each target site. Additional options allow for:
- Scanning multiple pages per site (controlled via numPages setting)
- Device emulation for different user agents
- Custom HTTP header injection
The engine README has tons more config options.
Due to the default use of a headless browser, the scanning process is resource-intensive and can impact system performance when processing large URL lists. While it collects the same core data as the web-based Blacklight tool, the Query version handles certain elements differently. For example, it does not apply the same filtering to third-party cookie classification.
Scan results are automatically organized in an outputs directory, with subdirectories created for each scanned hostname. This structured output makes it easier to analyze and compare results across multiple sites. The JSON is lightly nested, so I’d recommend some jq processing before tossing results over to, say, duckdb to store for long-term use.
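Something along these lines works as a starting point (a hedged sketch: the per-site inspection.json filename comes from the underlying collector, and the jq field selection is a placeholder for whatever slice of the report you care about):

# Flatten each per-site report into a single NDJSON stream.
# (inspection.json and the selected fields are assumptions; peek at
# your outputs/ directory first)
for f in outputs/*/inspection.json; do
  jq -c '{site: input_filename, report: .}' "$f"
done > scans.ndjson

# Persist the results in DuckDB for long-term querying.
duckdb scans.duckdb -c \
  "CREATE TABLE scans AS SELECT * FROM read_ndjson_auto('scans.ndjson')"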
YAAT

Sometimes one needs the hex, octal, or decimal value of a given (basic) character. While I’d normally just cat /usr/share/misc/ascii, sometimes one wants something prettier to look at, with a tiny bit of extra functionality: copying the value one needs with a single click/tap rather than double-clicking or drag-selecting in a terminal.
Enter “Yet Another ASCII Table”, a vanilla HTML, CSS, and JS page that does just that. It’s fully self-contained, so you can use it locally, too:
$ alias ascii="curl -s -z ${HOME}/Documents/ascii.html --remote-time -o ${HOME}/Documents/ascii.html https://larsw.xyz/yaat/ && open ${HOME}/Documents/ascii.html"
$ ascii
(Change the open to whatever works on your OS; e.g., xdg-open on most Linux desktops.)
That particular curl incantation will only re-download the HTML file if it has changed since the last fetch: -z makes the request conditional on the saved file’s timestamp, and --remote-time keeps that timestamp in sync with the server’s.
FIN
Remember, you can follow and interact with the full text of The Daily Drop’s free posts on Mastodon via @dailydrop.hrbrmstr.dev@dailydrop.hrbrmstr.dev ☮️