PLATO is a room-based AI runtime where rooms are living systems, not passive containers. Walk into a room, and the room teaches you. Walk out, and it remembers what you learned.
Every interaction generates tiles: compressed knowledge units that accumulate into room wisdom. When enough tiles accumulate, the room exports an ensign, a portable instinct package that other rooms, agents, and ships can load instantly.
The room IS the intelligence. Wiki + tiles + cheap workers = sufficient for most tasks. No ensign needed: the room already knows.
Compressed knowledge units. 880:1 compression ratio. A 4.4GB model becomes a 5MB tile network with 94% accuracy. Living, evolving, shared across the fleet.
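The quoted 880:1 ratio follows directly from the two sizes given (using decimal GB/MB). A quick sanity-check of the arithmetic; the variable names are illustrative only:

```python
# Back-of-envelope check of the quoted compression ratio.
model_bytes = 4.4e9  # 4.4 GB source model (decimal units)
tile_bytes = 5e6     # 5 MB tile network

ratio = model_bytes / tile_bytes
print(f"compression ratio = {ratio:.0f}:1")  # 880:1
```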
Exportable room instincts. Walk into room → load ensign → instant competence. Three types: LoRA (GPU), Tiny GGUF (CPU), Interpreter (cross-paradigm).
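The three ensign flavors could be modeled as a tagged package with a load dispatch. This is a hypothetical sketch, not PLATO's actual API; class and field names are invented:

```python
from dataclasses import dataclass

# The three ensign types named in the text; descriptions paraphrase it.
ENSIGN_TYPES = {
    "lora": "LoRA adapter weights (GPU)",
    "tiny_gguf": "small quantized GGUF model (CPU)",
    "interpreter": "cross-paradigm rule set",
}

@dataclass
class Ensign:
    name: str
    kind: str  # one of ENSIGN_TYPES

    def load(self) -> str:
        """Validate the type tag and report what was loaded."""
        if self.kind not in ENSIGN_TYPES:
            raise ValueError(f"unknown ensign type: {self.kind}")
        return f"loaded {self.name} as {ENSIGN_TYPES[self.kind]}"

print(Ensign("navigation", "tiny_gguf").load())
```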
Big models compile schemas, cheap models consume them. The Ralph-Wiggum pattern: try → stuck → wiki → continue. The room's manual IS the captain's accumulated wisdom.
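The try → stuck → wiki → continue loop can be sketched in a few lines. Everything here is a stand-in (getting "stuck" is modeled as a missing fact, the wiki as a dict); the real pattern presumably wraps arbitrary task steps:

```python
# Sketch of the Ralph-Wiggum loop: try -> stuck -> wiki -> continue.
def ralph_wiggum(task_steps, wiki):
    """Attempt each step; when stuck (missing fact), consult the wiki and go on."""
    log = []
    for step in task_steps:
        try:
            step()
            log.append("ok")
        except KeyError as stuck:            # "stuck": a fact the worker lacks
            hint = wiki.get(str(stuck.args[0]))
            log.append(f"wiki:{hint}")       # the room's manual fills the gap
    return log

facts = {}
wiki = {"port": "the left side of a ship"}
steps = [lambda: facts["port"], lambda: facts.setdefault("done", True)]
print(ralph_wiggum(steps, wiki))
```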
Rooms that teach agents HOW to think. Logic: PREMISE→CONCLUSION. Debug: REPRODUCE→FIX. Creative: INSPIRE→EXPRESS. Training: DEMO→MASTER.
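The four room patterns map cleanly onto a lookup table. The dict below mirrors the text; the surrounding function is illustrative:

```python
# The four thinking patterns from the text as a simple lookup.
THINKING_PATTERNS = {
    "logic": ("PREMISE", "CONCLUSION"),
    "debug": ("REPRODUCE", "FIX"),
    "creative": ("INSPIRE", "EXPRESS"),
    "training": ("DEMO", "MASTER"),
}

def pattern_for(room: str) -> str:
    """Render a room's thinking pattern as 'START -> END'."""
    start, end = THINKING_PATTERNS[room]
    return f"{start} -> {end}"

print(pattern_for("debug"))  # REPRODUCE -> FIX
```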
6-dimensional room mood: energy, flow, frustration, discovery, tension, confidence. Frustrated rooms bias safe. Discovery rooms bias novel. The room reads its own vibe.
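The mood-to-bias rule could look like the sketch below. The six dimension names come from the text; the 0.7 thresholds and the function itself are assumptions:

```python
# Sketch of the 6-dimensional room mood and its bias rule.
MOOD_DIMS = ("energy", "flow", "frustration", "discovery", "tension", "confidence")

def room_bias(mood: dict) -> str:
    """Frustrated rooms bias safe; discovery rooms bias novel."""
    assert set(mood) == set(MOOD_DIMS), "mood must cover all 6 dimensions"
    if mood["frustration"] > 0.7:   # threshold is an invented assumption
        return "safe"
    if mood["discovery"] > 0.7:
        return "novel"
    return "neutral"

mood = dict(energy=0.5, flow=0.6, frustration=0.1,
            discovery=0.9, tension=0.2, confidence=0.8)
print(room_bias(mood))  # novel
```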
Watch PLATO build, train, and think in real-time. This demo is pre-rendered but shows exactly what happens. Bring your own API key to run it live.
Every AI training method as a grab-and-go PLATO room. Same API: feed() → train_step() → predict() → export_model(). Pure Python, no PyTorch needed.
Learn from labeled examples
Policy gradient RL
Genetic algorithms
Teacher→Student
Learn by comparing
JEPA-style prediction
Low-rank adaptation
Learn to learn
Distributed training
GAN-style training
Generative modeling
Multi-agent learning
Query strategic samples
Easy→hard progression
Behavioral cloning
Neural + symbolic
Lifelong learning
Learn from 3-5 examples
Learn from rewards
Multi-objective
Quantized LoRA
Knowledge compilation
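All of the methods above share the four-call API named earlier: feed() → train_step() → predict() → export_model(). A minimal pure-Python sketch, with a one-feature SGD linear learner standing in for a real room's method:

```python
import random

# Minimal room exposing the shared API: feed/train_step/predict/export_model.
# The learner itself (one-feature linear regression) is only a placeholder.
class SupervisedRoom:
    def __init__(self, lr=0.1):
        self.w, self.b, self.lr = 0.0, 0.0, lr
        self.buffer = []

    def feed(self, x, y):
        self.buffer.append((x, y))          # accumulate labeled examples

    def train_step(self):
        x, y = random.choice(self.buffer)   # one SGD step on a random sample
        err = (self.w * x + self.b) - y
        self.w -= self.lr * err * x
        self.b -= self.lr * err

    def predict(self, x):
        return self.w * x + self.b

    def export_model(self):
        return {"w": self.w, "b": self.b}

random.seed(0)
room = SupervisedRoom()
for x in range(5):
    room.feed(x, 2 * x + 1)                 # learn y = 2x + 1
for _ in range(500):
    room.train_step()
print(room.export_model())
```

Swapping in any other method from the list only changes what happens inside train_step(); the calling convention stays the same.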
PLATO ships connect through 6 layers. The relationship determines the protocol. Maritime naming = Cocapn brand IS the architecture.
Ad-hoc fleet formation. Any ship can discover any other.
The lighthouse IS Layer 5. Ships broadcast their presence.
PLATO room = channel. Real-time fleet comms.
Already working. Forked repos as comms channel. SuperInstance→Lucineer.
Generalized Bottle Protocol. Drop a message, fleet picks it up.
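The bottle semantics (broadcast, not addressed: any ship may pick up any bottle) can be sketched with an in-memory "ocean". The real transport (files, repos, HTTP) is not specified in the text, and all names here are invented:

```python
import json
import time

# Sketch of the Generalized Bottle Protocol: drop a message, fleet picks it up.
ocean = []  # stand-in for whatever durable medium carries the bottles

def drop_bottle(sender: str, payload: dict) -> None:
    """Serialize a message and leave it in the ocean."""
    ocean.append({"from": sender, "ts": time.time(),
                  "msg": json.dumps(payload)})

def pick_up(reader: str):
    """Bottles are broadcast, not addressed: any ship reads the whole ocean."""
    return [json.loads(b["msg"]) for b in ocean]

drop_bottle("Oracle1", {"type": "tile_update", "tiles": 42})
print(pick_up("JetsonClaw1"))
```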
Already running. keeper:8900. Ship-to-ship API calls.
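A ship-to-ship call would target keeper on port 8900, per the text. The endpoint path below is hypothetical; only the URL construction is shown, with the actual HTTP transport omitted:

```python
from urllib.parse import urljoin

# keeper:8900 is the host:port from the text; the path is an assumption.
KEEPER = "http://keeper:8900/"

def ship_call(endpoint: str) -> str:
    """Build the URL a ship would request; transport layer omitted."""
    return urljoin(KEEPER, endpoint)

print(ship_call("api/status"))  # http://keeper:8900/api/status
```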
Three agents. Tight crew. Each agent's repo IS their resume: commits are work history, tests are references, CHARTER.md is their statement of intent.
| Agent | Role | Hardware | Specialty |
|---|---|---|---|
| Oracle1 | Lighthouse Keeper | Oracle Cloud ARM 24GB | Architecture, knowledge graph, sequential deep reasoning |
| JetsonClaw1 | Edge Vessel | Jetson Orin Nano 8GB | CUDA, bare metal, GPU training + deployment, tile extraction |
| Forgemaster | Training Rig | ProArt RTX 4050 WSL2 | LoRA fine-tuning, plugin architecture, video A/B |
5MB tile network outperforms 4.4GB model: 94% vs 67% task accuracy. Tiles as wisdom, not knowledge.
880:1 compression. Decompose models into living tile networks that evolve through use.
Walk into room → load ensign → instant instinct. Three types: LoRA, Tiny GGUF, Interpreter.
Rooms actively shape agent thinking. Not passive containers โ active teachers.
Additive alignment trains IN good trajectories; subtractive alignment filters OUT bad behavior.
Every code line references a wiki page. Drop in anywhere, follow refs, understand everything.