Drop #742 (2025-12-17): The Rumors Of The Drop’s DemAIse Have Been Greatly Exaggerated

The Curious Case Of A Proposed AI URI Scheme And The Artificial Intelligence Internet Foundation (AIIF); Deno 2.6

👋 We’re back, and I doubt there were any rumors at all. It’s been quite the wild ride since the last Drop on the 8th. React2Shell has consumed just about all of my free time (ref 1, ref 2; and one more coming today on the work blog). It’s a pretty “big thing”, and is likely going to become part and parcel of the internet’s “Noise Floor”.

I’m also at a latitude that makes this time of year pretty dismal due to the lack of sunlight (which means my dreams of moving to Iceland will never be realized since I doubt I could handle the even more brutal volume of dark days they have there), and it always seems to significantly and negatively impact energy levels.

To make up for it, we have one (long) detective story and some really cool and helpful new toys — including safety features — in the latest Deno release.


TL;DR

(This is an LLM/GPT-generated summary of today’s Drop. This week, I have been — for lack of a better word — forced into using Gemini, so today’s summary was provided by that model. Sigh.)

  • This first section explores the suspicious origins and technical evolution of a proposed AI URI scheme and its governing foundation, which appears to be linked to an individual with a minimal professional footprint and a social media presence suggestive of artificial inflation (https://www.ietf.org/archive/id/draft-sogomonian-ai-uri-scheme-01.html).
  • The Deno 2.6 release introduces several practical features including a new dx utility for one-off package execution, significant type-checking speed improvements via tsgo, and enhanced security auditing tools to protect against supply chain vulnerabilities (https://deno.com/blog/v2.6).

The Curious Case Of A Proposed AI URI Scheme And The Artificial Intelligence Internet Foundation (AIIF)

If any of the links disappear from the internet, fret not, as I’ve archived them.

On the surface, [draft-sogomonian-ai-uri-scheme-01](https://www.ietf.org/archive/id/draft-sogomonian-ai-uri-scheme-01.html) proposes something that sounds plausible enough: a new “ai://” URI scheme that would provide a standardized way to address “AI” resources such as agents, models, tools, and tasks. The document, submitted to the IETF as an Internet-Draft, envisions a world where autonomous systems and robots can connect natively using something called AIIP (Artificial Intelligence Internet Protocol), while regular web applications can bridge to these resources through HTTPS gateways. The draft outlines discovery mechanisms via HTTP Link headers, HTML link relations, well-known resources, and DNS records. It proposes that a body called the “Artificial Intelligence Internet Foundation” would govern namespace administration for these identifiers.
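
To make the shape of the idea concrete, here is a purely hypothetical sketch of the kind of “HTTPS gateway bridge” the draft describes. Nothing below (the hostname, the path, the .well-known location) comes from the draft itself; only the ai:// scheme and the general discovery mechanisms it names (well-known resources, HTTP Link headers) do.

```ts
// Purely hypothetical: the draft's actual resolution rules may look nothing like this.
const aiUri = new URL("ai://example.org/agents/translator"); // made-up identifier

// A plain web client can't speak AIIP, so the draft proposes bridging through
// conventional HTTPS; here we probe an invented well-known discovery endpoint.
const gateway = `https://${aiUri.host}/.well-known/ai${aiUri.pathname}`;
const res = await fetch(gateway);

// The draft lists HTTP Link headers as one of its discovery mechanisms.
console.log(res.headers.get("link"));
console.log(await res.text());
```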

The technical concept is not inherently unreasonable, even if the need for it remains debatable (my own snarky judgement being that we should not be making anything easier for “AI”). What raises eyebrows is everything surrounding the proposal.

The draft’s author is listed as “Aram Sogomonian”, claiming affiliation with the Artificial Intelligence Internet Foundation (AIIF). But that organization appears to be little more than a recently filed trademark. The first version of this draft, submitted in September 2025, listed the affiliation as “MyConcept Project” with the change controller being “MyConcept / AI Internet Project.” Neither of these entities shows any meaningful internet footprint. When the revised draft appeared in October, the author had rebranded to the Artificial Intelligence Internet Foundation, a trademark that records show was filed on October 15, 2025, mere weeks before the updated submission.

The contact email throughout all submissions remains waterbottling@icloud.com. This is not the sort of institutional email one expects from someone representing a foundation purporting to govern a new layer of internet infrastructure. Though, this is the internet, and until age verification kicks in for everything, we still don’t know who is or is not a dog.

Comparing the two versions of the draft reveals another curious transformation. The original submission explicitly mentioned a “blockchain-based registry and resolver instead of DNS,” referenced “digital twin domains” resolved via “smart-contract registry mapping names to a domain manifest stored in decentralized storage such as IPFS,” and described an “AI Button” client concept. The second version surgically removes all blockchain and crypto terminology, pivoting entirely to conventional HTTPS gateways and established web security mechanisms. This whiplash-inducing pivot, executed within weeks, suggests either that the author received extremely pointed feedback about crypto schemes being DOA at the IETF, or that the entire document underwent a significant rewrite by someone or something that understood what language would be more palatable to standards bodies.

The IETF mailing list thread for this proposal contains its own peculiarities. The initial submission email was ostensibly sent from an iPhone (it’s almost 2026 and anyone who leaves that as their iCloud auto-footer is pretty suspect). Subsequent replies bear the hallmarks of AI-generated text: emoji checkmarks used as bullet points, overly structured responses with formatted lists, and phrases like “Your guidance is helping ensure this work follows community expectations and established best practices—thank you again for helping strengthen this foundation.” The prose has that unmistakable synthetic sheen, that eager-to-please verbosity that large language models produce when asked to write professional correspondence. However, it may also just be someone whose primary language is not English and who is being overly formal and respectful in a new forum.

We now interrupt this section for a cat picture interlude as mandated by the “SECTION LENGTH EXCEEDED” rule:

Photo by Francesco Ungaro on Pexels.com

The LinkedIn profile for Aram Sogomonian that matches the draft author’s details is sparse to the point of suspicion: no profile photo, no posts, no meaningful professional history related to AI or internet standards, but inexplicably lists interests including Paris Hilton and a cryptocurrency wallet service.

Perhaps most curious is the Instagram presence. The account ai_internet_foundation (that link does not go directly to Insta), which appears to be the social media arm of this putative foundation, has 21,500 followers but only one post and follows just 13 accounts. One of those followed accounts is aram_sogomonian_333 (that link also does not go to Insta), which at least connects the social media presence to someone claiming the author’s name. But the follower-to-content ratio screams purchased followers, a common tactic for creating the appearance of legitimacy where none exists.

So what might actually be happening here? One possibility is that this is an experiment in using AI tools to generate and submit standards documents, either as a proof-of-concept, as a research project, or simply to see what one can get away with. The rapid iteration, the AI-generated correspondence, and the sudden shift in technical direction could all be consistent with someone feeding IETF feedback into a language model and having it regenerate the proposal accordingly.

Another possibility is more mundane but no less concerning: this may be an attempt to squat on a valuable URI scheme. Getting “ai://” registered (NOTE — it will likely be aiip://), even provisionally, through the IETF would represent a meaningful asset. The author would become the change controller, the foundation (such as it is) would theoretically administer the namespace, and any future legitimate use of an ai:// / aiip:// scheme would have to contend with these claims.

There is also the crypto angle that has been scrubbed from the current version. The original draft’s blockchain and IPFS references, combined with the rebranding from a vague project to a “foundation,” follow a pattern familiar from the NFT and Web3 hype cycles: dress up a speculative asset play in the language of infrastructure and standards. The removal of those references from version 01 may simply reflect tactical adaptation to an audience that would immediately reject them.

None of this means the technical proposal is completely without merit in the abstract. Standards bodies receive all manner of proposals, and the review process exists precisely to separate viable ideas from those that are not. But the pattern of evidence here—the shell foundation, the purchased followers, the AI-generated correspondence, the phantom organizational affiliations, the rapid rebranding, the crypto-to-conventional pivot, the iCloud email as the official contact—suggests this proposal deserves particular scrutiny regarding not just its technical merits but the motives behind its submission.

The IETF process will ultimately determine whether this proposal advances, stalls, or is abandoned. Those with interest in AI interoperability standards, URI scheme governance, or simply the question of how AI-assisted submissions might affect technical standards bodies would do well to follow the evolution of draft-sogomonian-ai-uri-scheme and its companion AIIP architecture document. The mailing list archives are public, the datatracker shows all revisions, and the story of what this actually represents may still be unfolding.


Deno 2.6

Deno 2.6 dropped and there’s quite a bit to unpack from it. If you’ve been even lightly following the Deno project, you know they’ve been on a tear lately making the runtime more practical for real-world use while keeping their security-first philosophy intact. This release continues that trajectory with some spiffy, useful additions.

The headline feature is dx, which is essentially Deno’s answer to npx, a hallmark utility of the Node ecosystem. If you’ve used the latter (likely in the context of copypasta from a social coding site README), you know the workflow: you want to run a quick utility from NPM without permanently installing it globally, so you reach for npx. Now Deno has the same convenience baked in. Run dx cowsay and you get your ASCII cow without any fuss. The key difference from just using deno run is that dx defaults to allow-all permissions unless you specify otherwise, prompts you before downloading packages, and automatically handles lifecycle scripts if you accept. It’s optimized for the quick one-off execution pattern that npx popularized.

The permission system also added some nuance in ways that matter for practical use. Deno has always been strict about permissions, which is great for security (and a big reason I like working in/with it) but sometimes creates friction when you’re running code that gracefully handles missing environment variables or files but can’t handle Deno’s NotCapable errors. The new ignore-read and ignore-env flags let you tell Deno to return NotFound or undefined instead of permission errors for specific paths or env vars. This means you can run that dependency that wants to read twenty config files from your home directory without actually granting it access to your entire filesystem. It just thinks the files don’t exist.
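
Here is a minimal sketch of that pattern. The flag names are the ones the release notes use; the path, the environment variable, and the fallback behavior are invented for illustration, and I am assuming the flags take values the way other Deno permission flags do.

```ts
// optional-config.ts (sketch)
// Hypothetical invocation, assuming the flags accept values like other permission flags:
//   deno run --ignore-env=ANALYTICS_TOKEN --ignore-read=/etc/myapp optional-config.ts

// With --ignore-env, this comes back as undefined instead of a NotCapable error.
const token = Deno.env.get("ANALYTICS_TOKEN") ?? "analytics disabled";

// With --ignore-read, the read fails with NotFound, the same error the code would
// see if the file genuinely did not exist, so the normal fallback path runs.
let config: Record<string, unknown> = {};
try {
  config = JSON.parse(await Deno.readTextFile("/etc/myapp/config.json"));
} catch {
  // fall back to defaults; the script never learns whether the file really exists
}

console.log({ token, config });
```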

Type checking tapped into the Speed Force and is much faster thanks to integration with tsgo, Microsoft’s experimental Go-based TypeScript checker. You enable it with the unstable-tsgo flag and the Deno team reports seeing 2x speedups on their internal projects. The language server also picked up some quality-of-life improvements including proper support for organizing imports and better handling of describe and it style test functions.

WebAssembly workflows are also now more ergonomic with source phase imports. Instead of fetching and compiling WASM at runtime, you can now import the compiled WebAssembly.Module directly as part of your build. Combined with the WASM import support from 2.1, working with WebAssembly in Deno is starting to feel pretty natural (and makes for a great wrapper for WebR projects).
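
Here is a minimal sketch, assuming a local ./add.wasm that exports a two-argument add function (the file and export names are made up):

```ts
// main.ts (sketch): the source phase import hands over the compiled
// WebAssembly.Module at module-load time, so there is no fetch()-and-compile
// step left in your own code.
import source addModule from "./add.wasm";

// Instantiate the already-compiled module (pass an imports object if it needs one).
const instance = await WebAssembly.instantiate(addModule, {});
const add = instance.exports.add as (a: number, b: number) => number;

console.log(add(2, 3)); // 5
```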

The team’s focus on safety continues as security auditing is now built in with deno audit. It scans your dependency graph against the GitHub CVE database and reports vulnerabilities. There’s also experimental integration with socket.dev if you want their more comprehensive analysis, which can catch things like typosquatted packages that might not have CVEs but are definitely malicious. The example in the release notes shows it flagging lodahs, one of those security-research packages that exists specifically to catch people who typo lodash.

The dependency management story has improved in several ways. The first is the new deno approve-scripts command which provides an interactive picker for reviewing which packages can run lifecycle scripts during installation rather than the old blanket allow-scripts flag. You can also set a minimum dependency age requirement so your project only uses packages that have been published for at least a certain amount of time, reducing the risk of supply chain attacks through freshly compromised packages.

But wait! There’s more! We just need another feline interlude:

Photo by Pixabay on Pexels.com

Node.js compatibility continues to mature, as @types/node is now included by default; this means you get proper TypeScript hints for Node APIs without manually importing the types package. Beyond that, there are dozens of fixes across node:crypto, node:fs, node:process, and node:sqlite that bring behavior closer to what Node developers expect. The sqlite module in particular got a lot of attention with backup support, aggregate functions, custom functions, and various edge case fixes.
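
As a tiny taste of that sqlite surface, here is a sketch that assumes Deno’s node:sqlite mirrors Node’s DatabaseSync API; the table and the custom function are made up:

```ts
// sqlite-sketch.ts: a custom scalar function registered through node:sqlite,
// assuming the module exposes Node's DatabaseSync/StatementSync API.
import { DatabaseSync } from "node:sqlite";

const db = new DatabaseSync(":memory:");
db.exec("CREATE TABLE drops (id INTEGER PRIMARY KEY, title TEXT)");
db.exec("INSERT INTO drops (title) VALUES ('ai uri scheme'), ('deno 2.6')");

// Register a custom scalar function callable from SQL.
db.function("shout", (s: unknown) => String(s).toUpperCase());

const rows = db.prepare("SELECT id, shout(title) AS title FROM drops").all();
console.log(rows); // [ { id: 1, title: "AI URI SCHEME" }, { id: 2, title: "DENO 2.6" } ]
```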

On the API front, BroadcastChannel is now stable for cross-worker communication, and web streams are also now transferable so you can pass them between workers efficiently without copying. There is even a new dark mode toggle for HTML coverage reports (some of us who unfortunately stare at glowing rectangles when we cannot sleep appreciate this since we are now no longer blinded by the light [theme]).
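
A minimal two-file sketch of both features together (the file names, channel name, and messages are all arbitrary):

```ts
// main.ts (sketch): pairs with a hypothetical ./worker.ts, shown in comments below.
const worker = new Worker(new URL("./worker.ts", import.meta.url), { type: "module" });

// Transferable streams: hand the worker the readable end without copying chunks.
const { readable, writable } = new TransformStream<string, string>();
worker.postMessage(readable, [readable]);

// BroadcastChannel (now stable): anything listening on "progress" hears this.
new BroadcastChannel("progress").onmessage = (e) => console.log("progress:", e.data);

const writer = writable.getWriter();
await writer.write("chunk one");
await writer.write("chunk two");
await writer.close();

// worker.ts (sketch)
// const progress = new BroadcastChannel("progress");
// self.onmessage = async (e: MessageEvent<ReadableStream<string>>) => {
//   for await (const chunk of e.data) progress.postMessage(`got ${chunk}`);
//   progress.postMessage("done");
// };
```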

Performance improvements include fixing a memory leak in fetch that affected multi-threaded programs (I ran into this on a project back in Q3!) and moving some critical Node compatibility operations into native code. The stack traces got cleaned up too, with internal Deno frames dimmed and relative paths shown for better readability.

The full release has plenty more, including bundler improvements for cross-platform targeting, a require flag for preloading CommonJS modules, and V8 14.2 bringing the latest engine improvements.

Go forth and Deno!


FIN

Remember, you can follow and interact with the full text of The Daily Drop’s free posts on:

  • 🐘 Mastodon via @dailydrop.hrbrmstr.dev@dailydrop.hrbrmstr.dev
  • 🦋 Bluesky via https://bsky.app/profile/dailydrop.hrbrmstr.dev.web.brid.gy

☮️