Plop; Lit Starter Kit; elia
The first two sections in today’s Drop work well as a pair, since section one introduces a very handy, general purpose tool, and section two provides very practical examples of how to work with it. I tossed in a new (to me) LLM TUI resource just to mix things up a bit.
TL;DR
(This is an AI-generated summary of today’s Drop using Ollama + llama 3.2 and a custom prompt.)
- Plop is a micro-generator framework that streamlines boilerplate code creation using consistent patterns, acting as glue between inquirer prompts and handlebar templates. (https://plopjs.com/)
- The Lit Starter Kit is a tool for quickly setting up web components using the Lit framework, providing a structured environment for creation, testing, and deployment. (https://github.com/break-stuff/lit-starter-kit)
- Elia is a terminal-based application that facilitates seamless interaction with large language models directly from the command line, offering a streamlined and engaging user experience. (https://github.com/darrenburns/elia)
Plop

Plop (GH) is a micro-generator framework that streamlines the creation of boilerplate code and files with consistent patterns across projects. It acts as glue code between inquirer prompts and handlebar templates, enabling teams to codify their best practices into easily reproducible templates.
The framework operates through a plopfile.js at your project root, which exports a function accepting a plop object. This object provides the API for defining generators that create files and modify code. Each generator consists of prompts (using inquirer.js) and actions that execute based on the prompt responses.
Key actions within a plopfile include:
- add (creates new files from templates)
- modify (updates existing files using pattern matching or transforms)
- append (adds content to specific locations in files)
- addMany (generates multiple files in one operation)
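A minimal plopfile wiring two of those actions together might look like the sketch below. To make the shapes visible, it's exercised against a tiny mock of the plop API rather than the real library, and the generator name, paths, and template file are all hypothetical:

```javascript
// Tiny mock of the plop API — just enough to capture what a plopfile registers.
const generators = {};
const plop = {
  setGenerator(name, config) {
    generators[name] = config;
  },
};

// What a plopfile's exported function would do with the real plop object:
function plopfile(plop) {
  plop.setGenerator('service', {
    description: 'Scaffold a service module and register it',
    prompts: [{ type: 'input', name: 'name', message: 'Service name:' }],
    actions: [
      // add: create a new file from a handlebars template
      { type: 'add', path: 'src/services/{{name}}.js', templateFile: 'templates/service.hbs' },
      // append: register the new module in a barrel file
      { type: 'append', path: 'src/services/index.js', template: "export * from './{{name}}.js';" },
    ],
  });
}

plopfile(plop);
console.log(Object.keys(generators));                        // [ 'service' ]
console.log(generators.service.actions.map((a) => a.type));  // [ 'add', 'append' ]
```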
You can create dynamic action arrays that adapt based on user input:
export default function (plop) {
  plop.setGenerator('component', {
    prompts: [{
      type: 'confirm',
      name: 'wantTests',
      message: 'Include tests?'
    }],
    actions: (data) => {
      const actions = [{
        type: 'add',
        path: 'src/{{name}}.js',
        templateFile: 'templates/component.hbs'
      }];
      if (data.wantTests) {
        actions.push({
          type: 'add',
          path: 'tests/{{name}}.test.js',
          templateFile: 'templates/test.hbs'
        });
      }
      return actions;
    }
  });
}
The power of Plop lies in its ability to turn “best practices” into the path of least resistance, making it easier for teams to maintain consistency while reducing the cognitive overhead of creating new components, modules, or other repeated patterns in their codebase.
Let’s walk through some places you can leverage plop.
Creating React/Vue/Angular components with associated files:
import { ComponentProps } from "./types";
import "./styles.css";

const Component = ({ className = "" }: ComponentProps) => {
  return <div className={className}>{/* content */}</div>;
};

export default Component;
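A generator for that pattern pairs a plop `add` action with a handlebars template. A sketch of what such a template could look like (the file name `templates/component.hbs` is an assumption; `pascalCase` is one of plop's built-in case helpers):

```handlebars
{{!-- templates/component.hbs — hypothetical template producing the file above --}}
import { {{pascalCase name}}Props } from "./types";
import "./styles.css";

const {{pascalCase name}} = ({ className = "" }: {{pascalCase name}}Props) => {
  return <div className={className}>{/* content */}</div>;
};

export default {{pascalCase name}};
```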
Ivan Saldano has a full breakdown of this use case, and we’ll see a similar idiom in the middle section.
Plop can also generate complete serverless function templates with associated infrastructure code, tests, and configuration files. Don’t worry, though: I won’t proselytize Kubernetes or any particular cloud provider. Instead, here’s a plopfile example that creates systemd unit files for a service/timer pair:
export default function (plop) {
  plop.setGenerator('systemd-units', {
    description: 'Generate systemd service and timer units',
    prompts: [{
      type: 'input',
      name: 'name',
      message: 'Service name:'
    }, {
      type: 'input',
      name: 'description',
      message: 'Service description:'
    }, {
      type: 'input',
      name: 'execStart',
      message: 'Command to execute (ExecStart):'
    }, {
      type: 'input',
      name: 'schedule',
      message: 'Timer schedule (OnCalendar, e.g. *-*-* 04:00:00):'
    }],
    actions: [{
      type: 'add',
      path: '/etc/systemd/system/{{name}}.service',
      template: `[Unit]
Description={{description}}
After=network-online.target
Wants=network-online.target

[Service]
Type=oneshot
ExecStart={{execStart}}

[Install]
WantedBy=multi-user.target`
    }, {
      type: 'add',
      path: '/etc/systemd/system/{{name}}.timer',
      template: `[Unit]
Description=Timer for {{description}}

[Timer]
OnCalendar={{schedule}}
Persistent=true

[Install]
WantedBy=timers.target`
    }]
  });
}
Running this generator produces a service unit that defines the task to execute and a timer unit that controls when it runs. The files follow systemd conventions and are ready to use after running systemctl daemon-reload.
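With example answers plugged in (a service named nightly-backup running at 04:00 — both made up here), the timer unit the generator writes would render as:

```ini
# /etc/systemd/system/nightly-backup.timer
[Unit]
Description=Timer for Nightly backup job

[Timer]
OnCalendar=*-*-* 04:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

Activate the pair with sudo systemctl daemon-reload && sudo systemctl enable --now nightly-backup.timer.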
With plop, you can also, say, manage podcast media episodes by having it create standardized folder structures, generate show notes from templates, and manage sponsor information and tracking links. Amy Dutton has a stellar example of that in this post.
Plop’s template-based approach works with any text-based file format, so it’s a handy tool regardless of the primary project language, purpose, or technology stack.
Lit Starter Kit

The “lit-starter-kit” is a tool designed to help us (OK: me) quickly set up a library of web components using the Lit framework. It provides a structured environment that simplifies the creation, testing, and deployment of Lit-based components. This is super helpful since there is a well-defined structure for Lit’s components (along with quite a bit of required boilerplate), and I need to get better about not one-offing Lit elements.
To begin, you can either fork the repository directly or use the following command to create a new project and start developing:
$ npm init lit-starter-kit your-project-name
The project utilizes “plop” (ref. section uno) to streamline the generation of new components. By running npm run new, you’ll be guided through prompts to set up a new component within your library. Running npm run dev will put the project into development mode (with all the usual goodies).
Running npm run build will generate the necessary assets:
- The dist directory will contain assets for the NPM package.
- A cdn directory at the project’s root will hold content for Content Delivery Networks.
- Metadata for your components, including framework integrations like types and React wrappers.
For development and documentation, the kit employs Storybook, which lets us showcase and document our components. I’ve generally only seen Storybook used for React/Vue components, so this was a super cool feature to see (again, at least for me).
Testing is facilitated by web-test-runner — configurable via a web-test-runner.config.js file — which executes tests in real browsers to ensure APIs function as intended in their target environments. Tests are run with — you guessed it: npm test.
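For reference, a minimal web-test-runner.config.js can be quite small. The glob below is an assumption — match it to wherever the kit actually keeps its test files:

```javascript
// web-test-runner.config.js — a minimal sketch; the glob is an assumption
export default {
  files: 'src/**/*.test.js', // which test files to pick up
  nodeResolve: true,         // resolve bare module imports like 'lit'
};
```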
This project is super new, but I have a vested interest in using it, so expect more about it in a future Drop.
elia

It’s been a minute since we’ve done a Happy ThursdAI edition, and I’ve bored you to tears with Plop and Lit, so we’ll toss in an AI-centric tool just to mix things up.
Elia is a terminal-based application that facilitates seamless interaction with large language models (LLMs) directly from the command line. Designed with a focus on keyboard efficiency, it offers a streamlined and engaging user experience. By storing conversations locally in a SQLite database, Elia ensures that users have easy access to their interaction history. It supports a variety of models, including proprietary ones like ChatGPT and Claude, as well as local models via tools such as Ollama and LocalAI. 
To install Elia, we can use uv (as you can see, I like to live in YOLO mode with uv):
$ uv pip install elia-chat --break-system-packages --system
Depending on the chosen model, setting environment variables like OPENAI_API_KEY or ANTHROPIC_API_KEY may be necessary. I’m using it with local Ollama, with:
[[models]]
name = "ollama/llama3.2:latest"
added to ~/.config/elia/config.toml. (Add entries for as many models as you have loaded into Ollama.)
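For example, to register a couple of local models (the second model name here is just an assumption — use whatever ollama list reports on your machine):

```toml
[[models]]
name = "ollama/llama3.2:latest"

[[models]]
name = "ollama/mistral:latest"
```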
You can launch it directly from the command line for a full-screen TUI chat experience, or initiate a new chat inline using the -i or --inline option. The app also allows specifying different models through the -m or --model parameter.
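Putting those flags together (the prompt text here is made up):

```
$ elia                             # full-screen TUI chat
$ elia -m ollama/llama3.2:latest   # pick a specific model
$ elia -i "explain plop in a sentence"  # inline, one-off chat
```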
You can craft custom system prompts, use different themes, and even import your ChatGPT convo history.
This has been one of the better TUI UX experiences for interactive model operations.
FIN
We all will need to get much, much better at sensitive comms, and Signal is one of the only ways to do that in modern times. You should absolutely use that if you are doing any kind of community organizing (etc.). Ping me on Mastodon or Bluesky with a “🦇?” request (public or faux-private) and I’ll provide a one-time use link to connect us on Signal.
Remember, you can follow and interact with the full text of The Daily Drop’s free posts on Mastodon via @dailydrop.hrbrmstr.dev@dailydrop.hrbrmstr.dev.
Also, refer to this post to see how to access a database of all the Drops with extracted links, and full text search capability. ☮️
Leave a comment