orchestrate at scale
Run 10–100 agents in parallel across many projects. Each in its own git worktree, its own isolated context, its own brain. They never step on each other — and you stay one person.
PaellaDoc ADO is the orchestrator that lets one person run multiple projects with 10–100 agents working in parallel — no merge wars, no shared context bleed. Every chat, decision and artifact stays on your machine. Bring any model — Claude, Codex, Gemini, or a local Llama from a cabin in the woods.
Every chat, every decision, every artifact your agents produce stays in plain SQLite on your machine. Your domain expertise, encoded. Not training data for someone else.
Today Claude. Tomorrow Codex. Next week a 70B Llama on your laptop in a cabin with no signal. The orchestrator is the constant — the model is a hot-swappable part.
# your fleet
YOU (1 human)
├── orchestrator · paelladoc-ado · local
│   ├── project: acme/portfolio-v3
│   │   ├── worktree-01 → claude · story-014 · ✓
│   │   ├── worktree-02 → codex · story-015 · ↻
│   │   └── …16 more
│   ├── project: paelladoc/ado-core
│   │   ├── worktree-01 → codex · refactor · ✓
│   │   └── …23 more
│   └── project: cabin/local-llm-bench
│       └── worktree-01 → llama-local · ✓ (offline)
└── context_store · sqlite · ~/.paelladoc/
    ├── chats.db       2.4 GB
    ├── decisions.db   180 MB
    └── artifacts/     11 GB
Every agent runs in its own git worktree. Its own branch, its own filesystem view, its own conversation. Cleanup is automatic. Merge wars stop being a thing.
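A minimal sketch of the one-worktree-per-agent idea. The function and naming scheme here are illustrative, not PaellaDoc ADO's actual API; an orchestrator would follow the plan with `git worktree add -b <branch> <path>` and `git worktree remove` on cleanup.

```python
import os

def worktree_for(repo_root: str, agent: str, story: str) -> dict:
    """Plan an isolated worktree for one agent/story pair (illustrative names).

    The orchestrator would then run:
        git -C <repo_root> worktree add -b <branch> <path>
    and clean up afterwards with `git worktree remove <path>`.
    """
    branch = f"agent/{agent}/{story}"                      # one branch per agent
    path = os.path.join(repo_root, ".worktrees", f"{agent}-{story}")  # own filesystem view
    return {"branch": branch, "path": path}

plan = worktree_for("/repos/acme", "claude", "story-014")
# plan["branch"] → "agent/claude/story-014"
```

Because every agent commits to its own branch in its own directory, two agents can never write to the same checkout, which is what makes merge wars stop being a thing.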
You are not the bottleneck. Queue 50 user stories before bed; wake up to 50 worktrees finished, replayed, and ready to review at your speed.
The orchestrator is on your machine. The context is on your machine. The agents call out to whichever brain you choose — or to a local model you self-host.
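The hot-swappable-brain claim boils down to one interface between the orchestrator and the model. A minimal sketch, assuming a hypothetical `Brain` protocol (the names `complete`, `route`, and `LocalLlama` are illustrative, not the product's real API):

```python
from typing import Protocol

class Brain(Protocol):
    """Any model backend: frontier API or self-hosted."""
    name: str
    def complete(self, prompt: str) -> str: ...

class LocalLlama:
    name = "llama-local"
    def complete(self, prompt: str) -> str:
        # A real backend would call a local inference server here.
        return f"[{self.name}] stub reply"

def route(story: str, brain: Brain) -> str:
    # The orchestrator is the constant; the brain is the swappable part.
    return brain.complete(f"Implement {story}")

reply = route("story-014", LocalLlama())
```

Swapping Claude for Codex, or for a 70B Llama in a cabin, means passing a different object that satisfies the same protocol; nothing upstream changes.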
The real value of working with AI for months isn't the code it generates. It's the knowledge graph you build along the way: your decisions, your edge cases, the way your domain actually works.
Most agentic tools quietly turn that into someone else's training data. Or they lock it inside a single frontier model — switch and you lose everything.
PaellaDoc ADO inverts it. Your context lives in plain SQLite, on your disk, in formats you can read. Point any model at it — Claude today, Codex tomorrow, a local Llama from a cabin next week. The brain changes; your knowledge doesn't.
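"Plain SQLite, in formats you can read" means any tool can query the context store directly. A minimal sketch with Python's built-in `sqlite3`; the table schema is an assumption for illustration, not the actual layout of `decisions.db`:

```python
import sqlite3

# In-memory stand-in for ~/.paelladoc/decisions.db (schema is hypothetical).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE decisions (project TEXT, story TEXT, decision TEXT)")
db.execute(
    "INSERT INTO decisions VALUES (?, ?, ?)",
    ("acme/portfolio-v3", "story-014", "use optimistic locking"),
)

# Any model, script, or human can read the same rows back.
rows = db.execute(
    "SELECT decision FROM decisions WHERE project = ?",
    ("acme/portfolio-v3",),
).fetchall()
# rows → [("use optimistic locking",)]
```

Because the store is an ordinary SQLite file on disk, switching models never strands your knowledge inside a vendor's cloud.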
Air-gapped clients. On-prem mandates. Regulated industries that forbid frontier APIs. Or just a cabin off-grid. PaellaDoc ADO keeps the product methodology intact — PRD, epics, stories, acceptance criteria, golden-gate validation — and routes each story to whatever brain your context allows. Same hierarchy. Same evidence trail. Same shipping bar.
Day-0 frontier MoE, running locally. The orchestrator drives it like any other backend — same PRD, same golden gate, same audit trail.
Six side-products. One human. The orchestrator runs each as its own fleet, and the context never bleeds between them.
100 stories across a legacy migration. Queue them, assign brains, watch the tree turn green. The PM sees what every agent decided, not just what it shipped.
Owns the model weights. Owns the context. Codes from a cabin. PaellaDoc just routes the work.
Client code never leaves the laptop. Frontier APIs are opt-in per project. Audit trail in plain SQLite.
Same task, three brains, side-by-side diff. Keep the best output. Stop guessing which model is good at what.
“I'm one person. I run several products. I don't want to be a manager of agents — I want to be the architect, and let a fleet run.
I also don't want my entire mental model — every conversation, every decision — to live inside a vendor's cloud where it disappears the day I switch tools.
PaellaDoc was built for the new paradigm: context- and spec-driven development over the code itself. The architect writes the spec; the system ships the code. The thesis was 5× productivity. 277★ in April 2025, when nobody was talking about this yet.
Now I'm back. PaellaDoc ADO — chasing 100× for one human.”
free · 100% local · no account · macOS apple silicon