A Day With MetaMedium
The Water Tank Project
The MetaMedium proposes that drawing—humanity’s oldest form of external thought—can become the foundation for a new kind of human-AI communication. This is not merely a new interface. It is an evolution of language itself, where AI becomes a meta-word—a new linguistic element that transforms marks into meaning-in-context.
Morning: Kitchen Table
Sarah sketches a rough diagram on her tablet—three circles (tanks) connected by squiggly lines (pipes) near her house outline.
MetaMedium understands the intent but needs details. The canvas gently highlights the sketch: “I see you’re planning a rainwater collection system. To model this properly, I need to know: What’s your average rainfall? How much water do you use for gardening? Can you show me the actual space?”
Sarah hadn’t thought about rainfall data, but MetaMedium surfaces local averages based on her location. She circles one number—“this one”—and writes “garden = 50 gal/week.”
MetaMedium: “Got it. Would help to see the actual site. Want to scan it?”
Mid-Morning: Outside
Sarah uses her phone to capture a 3D scan of the area against her house where she sketched the tanks. The scan uploads to the canvas.
The transformation happens live: Her rough sketch merges with the 3D scan. The three circles become precisely-rendered tanks positioned in real space. The squiggly lines become actual pipe routes with proper angles and connections. Physics constraints apply automatically—gravity flow, structural mounting points, drainage slopes.
MetaMedium shows the system in context: “This configuration gives you 275 gallons capacity, fills in about 2 weeks of average rainfall, supplies your garden for 5+ weeks. Here’s what it looks like installed.”
The canvas rotates the 3D view. Sarah frowns. The tanks feel too big, weirdly arranged.
Late Morning: Revising
She scribbles over the big tanks, redraws them smaller. Redirects one pipe path with a quick stroke.
MetaMedium updates in real-time. New tank volumes calculate. New pipe routes validate. The system re-renders: “Modified to 180 gallons, fills faster, still covers 3+ weeks of garden use. More aesthetically balanced. This work?”
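The supply figures in both configurations follow from a simple ratio of capacity to weekly use. A quick sketch checks them (all numbers come from the scenario; the function name is illustrative, not part of any MetaMedium API):

```python
def weeks_of_supply(capacity_gal, weekly_use_gal):
    """How many weeks a full tank system covers garden use, as a simple ratio."""
    return capacity_gal / weekly_use_gal

# Original design: 275 gallons, garden uses 50 gal/week
print(weeks_of_supply(275, 50))  # 5.5 -> "supplies your garden for 5+ weeks"

# Revised, smaller tanks: 180 gallons
print(weeks_of_supply(180, 50))  # 3.6 -> "still covers 3+ weeks"
```

This ignores rainfall inflow during use, so the real coverage would be somewhat longer; the scenario’s “5+” and “3+” phrasing is consistent with that.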
Sarah smiles. Makes the checkmark—their special gesture, a fast flick of the pen at the end.
MetaMedium knows this gesture: “Parts list coming up.”
A tidy list materializes as a portable artifact: tanks (specific models), pipes (lengths and diameters), connectors, mounting hardware, filter, overflow valve. Complete with links to local suppliers showing stock availability.
Afternoon: The Hardware Store
Sarah’s at the store. They have almost everything except the specific 2” ball valve MetaMedium specified.
She photographs the alternative valve on the shelf, circles it: “Can I use this instead?”
MetaMedium processes the image: “Yes, but this valve has 1.5” threading. You’ll need to change the pipe reducer at junction B. Do they have part #PR-150 in stock?”
Sarah finds it, confirms.
MetaMedium: “Perfect. Updated system will work identically. You’re good to go.”
Late Afternoon: Installation
Home with parts scattered on the ground. Sarah opens her phone’s AR view.
MetaMedium overlays the installation guide directly on the real space. Virtual tanks float where they’ll mount. Pipe routes glow in mid-air showing exact paths. Step numbers hover near components.
She picks up the first tank. MetaMedium highlights the mounting points on the wall through her phone screen, shows drill hole positions, notes: “Level this within 2 degrees for proper drainage.”
She drills, mounts, moves to pipe assembly.
Confused about which connector goes where, she just asks aloud: “Which one is junction B?”
MetaMedium circles it in AR, animated arrow pointing: “This one. Connects the filter output to tank 2 input. Hand-tighten first, then wrench 1.5 turns.”
She encounters a tight spot where the pipe won’t quite fit the way the diagram showed. Circles the area with her finger, writes: “pipe hitting wall here”
MetaMedium recalculates: “Route it 3 inches lower, use the 45-degree elbow instead of 90. See?”
The AR view updates, showing the new path. That works.
Evening: Done
She tightens the last connection. Turns on the water to test. It flows perfectly—tanks filling, overflow working, no leaks.
Sarah sits back, satisfied. Makes another gesture they developed together—double-tap the canvas.
MetaMedium: “Nice work. System’s optimal. You know, with this setup and your existing garden layout, you could expand to the south bed without additional infrastructure. Water pressure’s sufficient, and you’re only using 60% capacity. Want to explore it?”
The canvas sketches a possibility—her garden extended, new beds highlighted, the same tank system feeding them. Seasonal yield projections float nearby.
Sarah grins. “Not today. But save that thought.”
MetaMedium: “Saved to ‘Future Projects.’ Whenever you’re ready.”
Fade to the canvas showing dozens of latent project threads, ready to become real whenever she returns. The water system diagram still there, now marked “completed,” part of an expanding web of her home’s evolution—each project informing the next, the metamedium remembering, learning, always ready to think together.
What Happened:
Started with rough sketch → became precise model
Verbal + visual + gestural communication seamlessly mixed
Real-world constraints (scans, physics) integrated automatically
Problem-solving happened through conversation, not commands
Custom gestures (fast checkmark, double-tap) learned between them
Context persisted across locations (kitchen → store → yard)
AR made virtual and physical spaces continuous
System proactively offered next possibilities based on learned patterns
Never switched “modes”—just continuous collaboration
This is the MetaMedium dream...