HDR: Interfaces for artificial and human engineers

Vers init display

HDR is building a neocloud called Vers — VMs with fully branchable state, like Git for infrastructure. Any machine can be instantly snapshotted and branched, so any agentic workflow or search operation can happen in parallel. I was one of three engineers building the platform over six months. My job was building the interface layer for use by humans and machines.
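To make "Git for infrastructure" concrete, here is a minimal sketch of branchable state as a snapshot tree. This is an illustrative model, not the Vers API: the type names and the `Branch`/`Lineage` methods are invented for this example, and the real system forks live VM state rather than tree nodes.

```go
package main

import "fmt"

// Snapshot is an immutable point-in-time capture of machine state.
type Snapshot struct {
	ID     string
	Parent *Snapshot // nil for the root image
}

// Branch creates a child snapshot. In Vers this is the cheap
// copy-on-write fork of a running VM; here it is just a tree node.
func (s *Snapshot) Branch(id string) *Snapshot {
	return &Snapshot{ID: id, Parent: s}
}

// Lineage walks from a snapshot back to the root, newest first,
// roughly what `git log --oneline` shows for a branch.
func (s *Snapshot) Lineage() []string {
	var ids []string
	for cur := s; cur != nil; cur = cur.Parent {
		ids = append(ids, cur.ID)
	}
	return ids
}

func main() {
	root := &Snapshot{ID: "base"}
	a := root.Branch("experiment-a")
	b := root.Branch("experiment-b") // sibling branch of the same state

	fmt.Println(a.Lineage()) // [experiment-a base]
	fmt.Println(b.Lineage()) // [experiment-b base]
}
```

Because branches share a parent rather than copying it, two experiments can diverge from identical machine state and neither can corrupt the other.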

The first surface was a web UI — auth, billing, usage tracking, API key provisioning, machine status. Multiple backend services were firing usage events that the frontend needed to reflect in real time, and the core infrastructure was evolving week to week, so I was building against a moving target.
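The pattern behind "multiple backend services firing usage events into one real-time view" is a fan-in: merge several event streams into a single stream the frontend subscribes to. A small sketch under assumed names (`UsageEvent` and its fields are invented; the post doesn't show the actual event schema or transport):

```go
package main

import (
	"fmt"
	"sync"
)

// UsageEvent is a hypothetical shape for events emitted by backend
// services (billing, compute, etc.).
type UsageEvent struct {
	Service string
	Credits int
}

// merge fans in events from several services into one stream.
// The output channel closes once every source has closed.
func merge(sources ...<-chan UsageEvent) <-chan UsageEvent {
	out := make(chan UsageEvent)
	var wg sync.WaitGroup
	for _, src := range sources {
		wg.Add(1)
		go func(c <-chan UsageEvent) {
			defer wg.Done()
			for ev := range c {
				out <- ev
			}
		}(src)
	}
	go func() {
		wg.Wait()
		close(out)
	}()
	return out
}

func main() {
	billing := make(chan UsageEvent, 1)
	compute := make(chan UsageEvent, 1)
	billing <- UsageEvent{Service: "billing", Credits: 5}
	compute <- UsageEvent{Service: "compute", Credits: 12}
	close(billing)
	close(compute)

	total := 0
	for ev := range merge(billing, compute) {
		total += ev.Credits
	}
	fmt.Println(total) // 17
}
```

The frontend then only has to consume one ordered-enough stream, which keeps the UI decoupled from however many services are emitting this week.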

I stood up the first CLI in Go when the product was oriented around on-demand provisioning of computer-use agents for arbitrary tasks. I built demos and did sales engineering for this mode, testing whether the platform could solve real problems for developers who needed headless browser automation at scale.

Then our team got VM branching down to 16ms, and suddenly this was a product agents could use. A deep research agent could pursue 100 links in parallel rather than following one at a time. RL environments could be perfectly reset after each run, and run in parallel. I built MCP servers so AI agents could provision and control VMs programmatically. To demo this, I containerized Claude Code to run inside one of our VMs, where it could fork itself to work on multiple branches of code simultaneously (and reset the whole machine state if something went wrong).

Agents don’t want to carefully sequence operations on a single machine — they want to branch, explore, and converge. Vers gives them a substrate that supports that natively. My work was making sure every surface of that substrate, from the dashboard to the CLI to the MCP layer, actually worked for the humans and agents trying to use it.

