Working Paper
Why we confuse optimization with progress
It's 2026. The flying cars never came. The Mars colonies don't exist. The subways we take to work date from the 19th century, while we send cat pictures on our smartphones. The question is: Why have we stopped thinking truly new thoughts?
This paper analyzes why we confuse optimization within existing paradigms with real progress — and why the next era of innovation requires systemic thinking and emergent complexity.
Why faster chips and bigger models don't create new paradigms
How path dependencies prevent real innovation
Intelligence arises from interaction, not instruction
Patterns over mechanisms, context over content
Working Paper
An Introduction to Token-Based Dataflow Architecture
The architecture of modern computers was designed in 1945. For eight decades we've optimized around the sequential von Neumann bottleneck instead of removing it. FLUID asks: what happens when we rethink computation from scratch?
This paper introduces FLUID — a dataflow architecture where self-describing tokens flow between specialized Processing Elements. Parallelism emerges naturally, security lives in the hardware, and computation becomes communication.
256-bit self-describing tokens replace the fetch-decode-execute cycle
Extensible with analog, optical, or quantum Processing Elements
Data flow is the program — no registers, no program counter
Same inputs, same graph → same outputs in the same number of cycles
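The core idea can be sketched in a few lines of Python. This is an illustrative toy, not FLUID itself: the real 256-bit token format and Processing Element set are not specified here, and the names Token, PE, and run are hypothetical. It shows the firing rule (a PE executes as soon as all its input ports hold tokens) and why the dataflow graph, not a program counter, determines execution.

```python
from collections import deque

class Token:
    """A self-describing value: carries its destination PE and input port."""
    def __init__(self, dest, port, value):
        self.dest = dest      # name of the target Processing Element
        self.port = port      # which input slot of that PE
        self.value = value

class PE:
    """A Processing Element fires when every input port holds a token."""
    def __init__(self, arity, fn, outputs):
        self.arity = arity        # number of input ports
        self.fn = fn              # pure function of the inputs
        self.outputs = outputs    # list of (dest, port) for the result
        self.waiting = {}         # port -> value, tokens seen so far

    def accept(self, token):
        self.waiting[token.port] = token.value
        if len(self.waiting) == self.arity:   # firing rule: all inputs present
            args = [self.waiting[p] for p in range(self.arity)]
            self.waiting = {}
            result = self.fn(*args)
            return [Token(d, p, result) for d, p in self.outputs]
        return []

def run(pes, initial_tokens):
    """Drive tokens through the graph; no registers, no program counter."""
    queue = deque(initial_tokens)
    results = []
    while queue:
        tok = queue.popleft()
        if tok.dest == "OUT":                 # sink: collect final values
            results.append(tok.value)
        else:
            queue.extend(pes[tok.dest].accept(tok))
    return results

# Dataflow graph for (a + b) * (a - b): the graph *is* the program.
pes = {
    "add": PE(2, lambda x, y: x + y, [("mul", 0)]),
    "sub": PE(2, lambda x, y: x - y, [("mul", 1)]),
    "mul": PE(2, lambda x, y: x * y, [("OUT", 0)]),
}
tokens = [Token("add", 0, 5), Token("add", 1, 3),
          Token("sub", 0, 5), Token("sub", 1, 3)]
print(run(pes, tokens))  # → [16]
```

Note that "add" and "sub" have no data dependency on each other, so a hardware implementation could fire them in parallel; and because each PE is a pure function of its inputs, the same tokens and the same graph always produce the same outputs, matching the determinism claim above.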