Gradient Descent for Code
To turn your terminal into an autonomous programmer that writes code, tests it, saves its work, and learns as it goes, we need just three LinuxToaster tools: one does AI in the command line, one does version control that humans and AI can both use, and one is a shell re-imagined for AI, with loops and networking.
The Tools
toast is composable AI for the terminal. Pipe text in, get intelligence out. It talks to cloud models, local models, or both. It reads a .persona file for its system prompt and a .tools file to know which commands it's allowed to run. In chat mode, it's a pair programmer. In pipe mode, it's a one-shot transform.
ito is intent-first version control. Instead of commits, you have moments. Instead of commit messages, you have why — the intent behind the change. Every moment captures the full tree, content-addressed by SHA-256. You can undo, branch, search by intent, and sync between machines over rsync.
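ito's internals aren't shown here, but content addressing has a simple shape. A sketch using standard utilities — the tree_id helper is hypothetical, not ito's code:

```shell
# Hypothetical sketch of content addressing (not ito's actual code):
# hash every file in sorted path order, then hash that listing. The
# same tree always produces the same short id.
tree_id() {
  find "$1" -type f -print0 | sort -z | xargs -0 sha256sum \
    | sha256sum | cut -c1-7    # short id, in the style of a3f912c above
}
```

Same content, same id; touch one byte and every moment after it gets a new id. That property is what makes full-tree snapshots cheap to deduplicate and safe to undo.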
jam is a shell rebuilt for AI. No quoting nightmares, no $ expansion. It has two constructs that matter here: times runs a command N times, while runs until the command signals completion. These turn toast from a tool you use into an agent that works on its own.
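For readers without jam, the two constructs reduce to familiar shell loops. A plain-sh sketch — the step function is a stand-in, not toast:

```shell
# "7 times CMD" -- a fixed number of rounds
n=0
while [ "$n" -lt 7 ]; do
  echo "round $((n + 1))"
  n=$((n + 1))
done

# "while CMD" -- keep going until the command signals completion
step() { echo DONE; }    # stand-in command (hypothetical)
until [ "$(step)" = "DONE" ]; do :; done
```

jam's versions skip the quoting and counter bookkeeping, but the loop shape is the same.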
$ ito log "handle timeouts so slow clients don't block the server"
logged a3f912c handle timeouts so slow clients don't block the server
✓ saved — you can always get back to this
None of them knows the others exist at the code level. The coupling lives in the .persona — a plain text file that teaches the AI a workflow. Swap the persona, swap the workflow. The tools stay the same.
Three Modes
Pipe — one shot
Pipe anything into toast — diffs, source files, history — and get intelligence back. One command, one answer. No conversation, no state, just Unix pipes.
$ ito changes | toast "review this diff"
$ ito changes | toast "write a one-line log message for this"
$ ito history | toast "when did we last touch auth?"
$ cat error.log | toast server.c "fix"
$ toast server.c "add error handling"
Chat — pair programming
Run toast with no arguments and it enters chat mode. It reads .persona and knows it's a programmer. It reads .tools and knows it can run ito, gcc, cat, grep, and friends.
$ toast
> build me an http server in C
Toast runs ito status and ito history to orient itself. It reads .crumbs to check for prior context. It tells you what it plans to do. It writes code, builds it, tests it. It runs ito log "initial http server — listens on 8080, serves static files". You ask for changes. It makes them, tests them, logs them.
Every action is reversible. If toast breaks something, ito undo takes you back. Say "try something risky" and toast will suggest ito on experiment first, keeping trunk clean.
When you come back tomorrow and type toast, it reads ito history and .crumbs and picks up where you left off.
Autonomous — gradient descent
This is jam's times construct:
🍞 7 times toast server.c "harden this server"
This calls toast seven times in a loop. Each invocation is stateless — toast has no memory of the last call. But the files have memory. And ito has memory. And .crumbs has memory.
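That "the files have memory" claim is easy to demonstrate without toast at all. A toy round function — a hypothetical stand-in for one toast invocation — that starts with zero state and recovers everything from .crumbs:

```shell
# Each call is stateless: round() knows nothing except what it reads
# from disk. The mock agent counts prior rounds by reading .crumbs.
round() {
  prior=0
  [ -f .crumbs ] && prior=$(wc -l < .crumbs)
  echo "round $((prior + 1)): improved one thing" >> .crumbs
}
cd "$(mktemp -d)"
round; round; round
cat .crumbs    # three lines: the memory lived in the file, not the process
```

Swap the mock for a real model call and nothing about the loop changes — the process stays amnesiac, the directory remembers.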
Each round, toast:
1. Reads .crumbs to see what past rounds learned.
2. Reads the code.
3. Finds one thing to improve.
4. Makes the change. Builds. Tests.
5. Runs ito log with intent.
6. Updates .crumbs with what it learned or what's left to do.
7. If nothing left to improve, outputs DONE to stop the loop.
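The round-and-stop protocol above can be sketched as a capped driver loop. The agent function here is a stub that burns down a TODO list; the one assumption taken from the text is that the agent prints DONE when nothing is left to improve:

```shell
cd "$(mktemp -d)"
printf 'TODO validate Content-Length\nTODO null-check malloc\n' > .crumbs

agent() {                        # stub standing in for one toast round
  todo=$(grep '^TODO' .crumbs | head -n 1)
  if [ -z "$todo" ]; then echo DONE; return; fi
  grep -v "^$todo\$" .crumbs > .crumbs.next; mv .crumbs.next .crumbs
  echo "fixed: ${todo#TODO }"    # a real round would build, test, ito log here
}

rounds=0
while [ "$rounds" -lt 20 ]; do   # cap the loop, like "20 while" in jam
  rounds=$((rounds + 1))
  out=$(agent)
  echo "$out"
  [ "$out" = "DONE" ] && break
done
echo "DONE after $rounds rounds"
```

Note that agent runs in a subshell each round and keeps no variables — its only channel to the next round is what it wrote to disk, which is exactly the discipline the seven steps describe.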
After six rounds, ito history reads like a dev journal written by the AI. Here's what a typical run produces:
b7e44d1 validate Content-Length to prevent buffer overflow on malformed requests
c912fa3 add graceful shutdown on SIGTERM so in-flight responses complete
d0a8b72 switch listen backlog from 5 to SOMAXCONN for burst traffic
e4c1190 log peer address on connect for debugging production issues
f882d03 null-check malloc returns in request parser
DONE after 6 rounds
Each change is small. Each is tested. Each is reversible. The AI didn't try to rewrite the server in one pass — it made one improvement per round, verified it worked, and moved on.
If you squint, this is gradient descent applied to code. Each round minimizes one deficiency. The loss function is implicit in the prompt. The learning rate is one change per step. It's a loose analogy — there are no partial derivatives — but the shape is the same: iterative refinement toward a goal, with memory of where you've been.
You can also leave it unbounded — jam's while loop runs until toast says DONE:
🍞 while toast server.c "kaizen until production ready"
# or cap it for safety:
🍞 20 while toast server.c "kaizen until production ready"
Memory
Three layers, all plain files on disk: the source files themselves, which carry the current state of the work; ito's moments, the full history searchable by intent; and .crumbs, the notes the AI leaves for its future self.
Toast doesn't need to know about version control. It just needs to know it can run ito log. ito doesn't need to know about AI. It just stores moments. The .persona teaches the workflow; the tools stay decoupled.
The .tools file is the intent boundary — toast only runs commands listed in it. For enforcement, toast can run inside firejail — seccomp filters, filesystem namespaces, network restrictions at the OS level. The allowlist tells toast what to try; firejail decides what actually executes.
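The allowlist check itself is trivial. A sketch of the gate that .tools implies — run_tool is hypothetical, not toast's code:

```shell
# Sketch of an intent boundary in the style of .tools: permit a command
# only if its bare name appears as a whole line in the allowlist.
cd "$(mktemp -d)"
printf 'ito\ngcc\ncat\n' > .tools

run_tool() {
  if grep -qx "$(basename "$1")" .tools; then
    "$@"                         # allowed: run it as given
  else
    echo "blocked: $1" >&2       # not listed: refuse
    return 1
  fi
}
```

For enforcement rather than politeness, the same invocation can be wrapped in firejail (e.g. `firejail --seccomp --net=none`), so the OS blocks whatever the allowlist missed.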
No orchestration layer. No state machine. No DAG. Just a shell loop, a text file for memory, and a VCS that makes everything reversible.
Getting Started
Two config files in a project directory:
# .persona (abridged)
You are a programmer. You write code, build, test, and version
your work using ito. Read .crumbs at the start of every session.
In autonomous mode: one change per round. If nothing left to
improve, output DONE.
# .tools — one command per line, bare names
ito
gcc
make
cat
ls
find
grep
curl
diff
# Install toast
$ curl -sSL linuxtoaster.com/install | sh
# Start a project
$ mkdir myproject && cd myproject && ito init
# Pair program:
$ toast
> build me a key-value store in C
# Or let it rip:
$ toast "write a naive key-value store in C" > store.c
$ ito log "initial skeleton"
🍞 10 times toast store.c "improve — make it correct, fast, and safe"
$ ito history
Then read the trail of intent it left behind.
All three tools are written in C. ito is ~1,400 lines with zero dependencies. toast is ~3,200 lines (json-c, libsodium). jam is the shell that ties them together. macOS and Linux. linuxtoaster.com
To build an agent you no longer need: Python nonsense. Frameworks. Orchestration layers. Retry logic. State machines. DAGs of tasks. Databases. Vector databases. Embeddings. Context window management. Plugins. SDKs. Tight couplings. Cloud service optional. Account optional. No telemetry. Let the complexity live in the model, not the infrastructure.