Bonus Drop #111 (2026-02-22): Screech • Spruce Up • Sandbox

An AWKward Modem; Go F…ix Yourself; Scraping Sandbox

We’ve got some nostalgia and nouveau, with a hint of pragmatism in today’s sections.


TL;DR

(This is an LLM/GPT-generated summary of today’s Drop, produced with Ollama and MiniMax M2.1.)

  • A 5-line AWK script implements a Bell 103 modem entirely in software, encoding arbitrary text into a WAV file that can be used as a data exfiltration technique for air-gapped systems where only a shell with AWK is available (https://seriot.ch/software/awkward_modem.html)
  • The go fix subcommand in Go 1.26 has been completely revamped on the Go analysis framework, running modernizer analyzers that automatically rewrite code to use newer Go idioms like min/max builtins, strings.Cut, range-over-int, and any instead of interface{} (https://go.dev/blog/gofix)
  • Scraping Sandbox is an open-source web scraping playground built on Next.js featuring 500 sample products across 15 categories with pagination and rate limiting at 60 requests per 10 seconds, designed for practicing scraper development in a safe environment (https://scrapingsandbox.com/)

An AWKward Modem


I’ve been thinking back on the privilege I’ve had to be present and aware at a time when analogue, rotary phones were “high tech”, Casio digital watches were “from the future”, and the era of “personal computing” was just being birthed. I saw and moved through all of those technology transformations, can still recollect my CompuServe ID, and likely still have an AOL CD somewhere in a box in the storage garage. An article posted recently triggered yet another one of those walks down memory lane.

If you, too, grew up in the era of analogue phone lines, you know the sound of a modem handshake viscerally. As all Drop readers likely know, that screech-and-hiss when your 300-baud modem (or later, 2400, 14.4k, and even 56k!!!) dialed into a BBS or your uni’s terminal server wasn’t random noise. It was frequency-shift keying (FSK): two distinct audio tones representing binary 1 and 0. The Bell 103 standard from 1962 used 1270 Hz for a mark (1) and 1070 Hz for a space (0). At 300 baud, with start and stop bits framing each byte, you got 30 characters per second. You could literally hear data moving.

What if I told you there’s a 5-line AWK script that implements a Bell 103 modem entirely in software, encoding arbitrary text into a WAV file that can be played through speakers and decoded on the other end by a microphone? That’s already cool on its own, but it’s also a plausible data exfiltration technique for air-gapped systems where you have no network, no compiler, and no install rights, just a shell with awk.
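The original one-liner isn’t reproduced here, but the technique is simple enough to sketch. Purely as an illustration (in Go rather than AWK, and definitely not the author’s script), the whole pipeline is: frame each byte as standard 8-N-1 async serial (start bit, eight data bits least-significant-first, stop bit), emit the Bell 103 originate tones for marks and spaces at 300 baud, and wrap the samples in a bare-bones WAV container. The 8 kHz sample rate, output filename, and test message below are my choices, not the article’s.

    package main

    import (
        "encoding/binary"
        "math"
        "os"
    )

    const (
        sampleRate = 8000
        baud       = 300
        markHz     = 1270.0 // binary 1 (the ultrasonic variant shifts this to 17.5 kHz)
        spaceHz    = 1070.0 // binary 0 (15 kHz in the ultrasonic variant)
    )

    func main() {
        msg := []byte("hello from 1962\n")

        var samples []byte
        phase := 0.0
        writeBit := func(bit byte) {
            freq := spaceHz
            if bit == 1 {
                freq = markHz
            }
            // ~26 samples per bit at 8 kHz / 300 baud; keep the phase continuous
            // across bit boundaries so the tones don't click.
            for i := 0; i < sampleRate/baud; i++ {
                phase += 2 * math.Pi * freq / sampleRate
                samples = append(samples, byte(128+127*math.Sin(phase)))
            }
        }

        for _, b := range msg {
            writeBit(0) // start bit (space)
            for i := 0; i < 8; i++ {
                writeBit((b >> i) & 1) // data bits, least significant first
            }
            writeBit(1) // stop bit (mark)
        }

        // Minimal 8-bit mono PCM WAV container around the raw samples.
        out, _ := os.Create("out.wav")
        defer out.Close()
        dataLen := uint32(len(samples))
        out.Write([]byte("RIFF"))
        binary.Write(out, binary.LittleEndian, 36+dataLen)
        out.Write([]byte("WAVEfmt "))
        binary.Write(out, binary.LittleEndian, uint32(16))         // fmt chunk size
        binary.Write(out, binary.LittleEndian, uint16(1))          // PCM
        binary.Write(out, binary.LittleEndian, uint16(1))          // mono
        binary.Write(out, binary.LittleEndian, uint32(sampleRate)) // sample rate
        binary.Write(out, binary.LittleEndian, uint32(sampleRate)) // byte rate (8-bit mono)
        binary.Write(out, binary.LittleEndian, uint16(1))          // block align
        binary.Write(out, binary.LittleEndian, uint16(8))          // bits per sample
        out.Write([]byte("data"))
        binary.Write(out, binary.LittleEndian, dataLen)
        out.Write(samples)
    }

Play out.wav through the speakers and something on the other side of the air gap (minimodem, per the post) handles the decoding.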

The post shows that this entire protocol is so simple it can be reimplemented in a language that ships with every Unix system ever made. AWK has been standard on Unix since the late 1970s, so it is always there. No need to apt/brew it or FTP some source tarball and compile it. It just exists, like ls or cat. And that’s what makes this cool project a legitimate security concern rather than just a nostalgia trip.

Consider that hardened, air-gapped system with no network, no USB, and no ability to install software. The machine has speakers (or a headphone jack), and there’s a phone nearby (like that iPhone in your pocket if this isn’t a SCIF). The attacker types (either from memory or a tiny piece of paper) or pastes 433 bytes of AWK into a terminal, pipes a file through it, and plays the resulting WAV. The phone records what sounds like a faint tone (or, in ultrasonic mode, nothing audible at all). Later, minimodem decodes the recording back into the original text.

As the post notes, an SSH private key is typically under 2KB, and at 30 bytes per second, that’s about a minute of audio. The ultrasonic variant shifts the mark and space frequencies up to 17.5 kHz and 15 kHz respectively, which is above what most adults over 45 can hear, but well within the recording capability of modern phone microphones and apps like iOS Voice Memo in lossless mode.

What the author has done is take the oldest, most fundamental data-over-audio technique in computing; the thing that literally was the internet before there was an internet; the mechanism by which every BBS caller and early network user moved bits; and demonstrate that it never actually went away. The physics hasn’t changed and sound still carries information. The tools to generate those sounds have been sitting quietly on every Unix box for half a century.

For those of us who watched the transition from acoustic couplers (the rubber-cupped devices you’d press a telephone handset into, exactly as shown in WarGames) to Hayes-compatible modems to DSL to fiber, there’s something both charming and unsettling about this. The old ways still work and the air gap you trust is only as good as your control over every transducer in the room (i.e., every speaker, every microphone, every device capable of recording audio).

I feel compelled to note that the piece was published in Paged Out! issue 8 (February 2026), a free zine focused on fitting technical content onto a single page. The format suits this hack perfectly: the core idea fits on one page because the Bell 103 protocol itself really is that simple.


Go F…ix Yourself


The go fix subcommand that shipped with the recent Go 1.26 rollout has been completely revamped, and is a major upgrade to Go’s static analysis tooling.

The old go fix was a relic from the pre-Go 1.0 days, and was used to keep code compatible during early language churn. The new go fix is rebuilt on top of the Go analysis framework (the same infrastructure behind go vet and the epic gopls). It runs a suite of “modernizer” analyzers that identify opportunities to rewrite your code using newer Go idioms and APIs.

You just run go fix ./... from a clean git state, and it silently rewrites your source files (it does skip generated files). You can preview with -diff, list available fixers with go tool fix help, enable/disable specific analyzers by flag name, and run multiple passes since fixes can be synergistic (one rewrite creates the opportunity for another).

The baked-in modernizers address a real problem: Go has been evolving faster since generics landed in 1.18, adding things like min/max builtins, range-over-int, strings.Cut, the maps and slices packages, etc. But the global corpus of Go code (and consequently, sigh, LLM training data) still reflects older patterns. The post explicitly calls out that LLM coding assistants were producing outdated Go idioms and sometimes refused to use newer features even when directed to. The modernizers exist partly to update the training corpus itself.

So what exactly gets targeted for the rewrites? if/else clamping patterns become min(max(...)); three-clause for i := 0; i < n; i++ loops become for range n; strings.Index plus slicing becomes strings.Cut; interface{} becomes any; redundant loop variable re-declarations from pre-1.22 get removed; and string concatenation in loops gets replaced with strings.Builder.
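To make a few of those concrete, here are hand-written before/after pairs in the spirit of the modernizers. These are my own illustrations of the idioms involved; the exact diffs go fix emits may differ.

    package main

    import (
        "fmt"
        "strings"
    )

    // Old-style clamp: a pair of if comparisons.
    func clampOld(v, lo, hi int) int {
        if v < lo {
            v = lo
        }
        if v > hi {
            v = hi
        }
        return v
    }

    // Modernized clamp: the min/max builtins (Go 1.21+).
    func clampNew(v, lo, hi int) int {
        return min(max(v, lo), hi)
    }

    func main() {
        // Three-clause loop vs. range-over-int (Go 1.22+).
        for i := 0; i < 3; i++ {
            fmt.Println("old", i)
        }
        for i := range 3 {
            fmt.Println("new", i)
        }

        // strings.Index plus slicing vs. strings.Cut.
        s := "key=value"
        if i := strings.Index(s, "="); i >= 0 {
            fmt.Println(s[:i], s[i+1:])
        }
        if k, v, ok := strings.Cut(s, "="); ok {
            fmt.Println(k, v)
        }

        // interface{} vs. any.
        var oldStyle interface{} = clampOld(5, 1, 3)
        var newStyle any = clampNew(5, 1, 3)
        fmt.Println(oldStyle, newStyle)
    }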

Go 1.26 also introduces new(expr), extending the new builtin to accept a value instead of just a type. So new("hello") returns a *string pointing to "hello". This eliminates all the newInt/newString helper functions that litter codebases using JSON or protobuf with optional pointer fields. There’s also a corresponding newexpr fixer.
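Here’s a quick sketch of what that buys you in practice. The struct, its field names, and the newString helper are made up for illustration; the point is just that the pointer-helper boilerplate disappears (and this only compiles on Go 1.26+).

    package main

    import "fmt"

    // A typical optional-field struct of the kind you see with JSON or protobuf.
    type Profile struct {
        Nickname    *string `json:"nickname,omitempty"`
        DisplayName *string `json:"display_name,omitempty"`
    }

    // The old workaround: a helper whose only job is to take a pointer to a literal.
    func newString(s string) *string { return &s }

    func main() {
        before := Profile{
            Nickname:    newString("hrbrmstr"),
            DisplayName: newString("The Daily Drop"),
        }

        // Go 1.26: new accepts an expression, so new("hello") is a *string and
        // the helper can go away (the post says the newexpr fixer handles this).
        after := Profile{
            Nickname:    new("hrbrmstr"),
            DisplayName: new("The Daily Drop"),
        }

        fmt.Println(*before.Nickname, *after.DisplayName)
    }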

go vet and go fix are now nearly identical in implementation, and they differ only in their analyzer suites and what they do with diagnostics. Vet reports problems; fix applies corrections. The analyzers are decoupled from “drivers” (unitchecker, singlechecker, gopls, nogo for Bazel, staticcheck, Google’s Tricorder, etc.), so you write an analyzer once and it runs everywhere. The framework supports cross-package “facts” for interprocedural analysis, and there’s been significant infrastructure investment: an AST cursor type for DOM-like navigation, a type index for fast symbol reference lookup (1000x speedup for finding calls to infrequent functions like net.Dial), and a library of refactoring primitives.
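To give a feel for the framework the post describes, here’s a minimal, hand-rolled analyzer against the real golang.org/x/tools/go/analysis API. It flags interface{} and attaches a suggested fix rewriting it to any, which mirrors the vet/fix split: the diagnostic is what vet would report, the suggested fix is what fix would apply. It’s a toy, not the shipped modernizer, which is far more careful about edge cases.

    package emptyiface

    import (
        "go/ast"

        "golang.org/x/tools/go/analysis"
    )

    // Analyzer reports empty interface types and offers a rewrite to any.
    var Analyzer = &analysis.Analyzer{
        Name: "emptyiface",
        Doc:  "suggest replacing interface{} with any (illustrative sketch)",
        Run:  run,
    }

    func run(pass *analysis.Pass) (any, error) {
        for _, file := range pass.Files {
            ast.Inspect(file, func(n ast.Node) bool {
                iface, ok := n.(*ast.InterfaceType)
                if !ok || iface.Methods == nil || len(iface.Methods.List) > 0 {
                    return true // not an empty interface literal; keep walking
                }
                pass.Report(analysis.Diagnostic{
                    Pos:     iface.Pos(),
                    End:     iface.End(),
                    Message: "interface{} can be written as any",
                    SuggestedFixes: []analysis.SuggestedFix{{
                        Message: "replace interface{} with any",
                        TextEdits: []analysis.TextEdit{{
                            Pos:     iface.Pos(),
                            End:     iface.End(),
                            NewText: []byte("any"),
                        }},
                    }},
                })
                return true
            })
        }
        return nil, nil
    }

A driver (unitchecker, gopls, go vet, go fix, and friends) handles loading packages, running the analyzer, and deciding whether to print the diagnostic or apply the text edits.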

If you are in Go-land even infrequently, I highly recommend poking at the rest of the article.


Scraping Sandbox


It’s always nice to have a legit playground to practice web scraping in, especially when working in a new language.

Scraping Sandbox is an open-source web scraping playground built by Agenty (a SaaS web scraping platform catering to “AI”), designed for developers and automation builders to prototype scrapers and experiment with selectors in a safe environment. It’s essentially a fake e-commerce storefront with synthetic product data.

It has 500 sample products across 15 categories and 20 fictional vendor names. Each product has a name, price (some with sale/original pricing), rating, category, SKU, vendor, and stock status. The data is clearly generated, so you’ll see names like “Lightweight Probiotics” from vendor “SoundWave” and “Limited Edition Vinyl Record” from “TasteBud.”

They built it on Next.js with App Router, Tailwind CSS, Radix UI, and ShadCN for the UI, and deployed it via OpenNext on Cloudflare Workers. Normally that sentence would make me retch, but these are the kinds of sites you encounter in the real world, so practicing on this is valuable.

The site exposes pagination (21 pages of products), search/filtering by category, and individual product detail pages at /product/{id}. The header teases an API endpoint (api.agenty.com/:agent_id?limit=500) which is a reference to Agenty’s commercial scraping API, not a public endpoint on the sandbox itself.

You get 60 requests per 10 seconds, and if you exceed that, you get temporarily blocked. This is intentional, as it’s meant to simulate real-world rate-limiting behavior you’d encounter in production.
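If you want to poke at it from code, here’s a minimal Go sketch of a crawl that respects that limit using only the standard library. The ?page= query parameter is my guess at the pagination scheme (the post only says there are 21 pages), so check the site’s actual pagination links before relying on it.

    package main

    import (
        "fmt"
        "io"
        "net/http"
        "time"
    )

    func main() {
        client := &http.Client{Timeout: 10 * time.Second}

        // 60 requests per 10 seconds is one every ~167ms; spacing requests
        // 200ms apart keeps us comfortably under the limit.
        ticker := time.NewTicker(200 * time.Millisecond)
        defer ticker.Stop()

        for page := 1; page <= 21; page++ {
            <-ticker.C // wait for the next slot before each request

            url := fmt.Sprintf("https://scrapingsandbox.com/?page=%d", page) // assumed URL shape
            resp, err := client.Get(url)
            if err != nil {
                fmt.Println("request failed:", err)
                continue
            }
            body, _ := io.ReadAll(resp.Body)
            resp.Body.Close()

            fmt.Printf("page %2d: HTTP %d, %d bytes\n", page, resp.StatusCode, len(body))
        }
    }

The React-rendered frontend is part of the point (per the post), so treat a plain HTTP fetch like this as the starting point rather than the whole exercise.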

Sites like scrapethissite.com and books.toscrape.com have been the go-to practice targets for years. This one differentiates by offering a more modern, React-rendered frontend (which matters for testing JS-rendered content scraping) and the e-commerce product grid pattern specifically.


FIN

Remember, you can follow and interact with the full text of The Daily Drop’s free posts on:

  • 🐘 Mastodon via @dailydrop.hrbrmstr.dev@dailydrop.hrbrmstr.dev
  • 🦋 Bluesky via https://bsky.app/profile/dailydrop.hrbrmstr.dev.web.brid.gy

☮️
