Build apps in kilobytes, not megabytes
Naze is a declarative, AI-native language that compiles to WebAssembly. One source file. Three output formats. Zero dependencies. Designed for a world where agents write the code.
The Agent-First Web
The way we interact with the internet is about to fundamentally change.
Imagine never opening a browser tab again. You tell an AI agent what you need — it fetches, processes, and presents information. No clicking through menus. No scrolling through ads. No loading spinners. Just answers.
But Today's Web Wasn't Built For This
The average web page is 2.5 MB of HTML, CSS, and JavaScript — designed for human eyes, not machine consumption. When AI agents scrape these pages, they waste massive compute tokenizing bloated markup. Every unnecessary <div>, every CSS animation, every bundled framework adds tokens an agent must process and discard.
“A single web page tokenizes to ~165,000 tokens. That's reading a 400-page book — just to extract a restaurant's menu.”
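As a rough sanity check of that figure (our own back-of-envelope, assuming ~4 characters per token and that roughly a quarter of a 2.5 MB page's transfer weight is tokenizable markup rather than images and fonts):

```python
# Back-of-envelope token estimate for an average web page.
# Assumptions (ours, not from the benchmark): ~4 chars per token,
# and ~25% of the 2.5 MB transfer is tokenizable HTML/CSS/JS text.
PAGE_BYTES = 2.5 * 1024 * 1024   # 2.5 MB average page
MARKUP_SHARE = 0.25              # assumed textual fraction
CHARS_PER_TOKEN = 4              # common rule of thumb

tokens = PAGE_BYTES * MARKUP_SHARE / CHARS_PER_TOKEN
print(f"~{tokens:,.0f} tokens")  # ~163,840 — close to the ~165,000 quoted
```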
The agent-first web needs a new foundation. A language that compiles to kilobytes, not megabytes. Where the binary is the API. Where AI agents can generate, serve, and consume applications without the overhead of a stack designed in the 1990s.
That language is Naze.
Why Naze?
Built from first principles for a world where AI writes most of the code.
AI-Native, Human-Readable
Designed as a compilation target for AI agents, but reads like a document humans can understand. One canonical form per concept.
Self-Contained Components
Components compose via ‘use’, but the compiler inlines everything into a flat render tree at build time. Each component is fully self-contained — an AI agent reads one file, not five. That’s why σ stays at 1.
No Middle Layers
No bundler, no transpiler, no virtual DOM. Intent reaches pixels directly: parse, typecheck, serialize, layout, render.
One Source, Every Platform
The same .naze file targets web (WASM + Canvas), desktop (native window), and mobile. Compile once, run everywhere.
Introducing FAAD
A paradigm where AI agents manage the complete software lifecycle. Humans provide intent and approve results. Machines handle everything else.
Today, developers write code and AI assists. With FAAD, AI agents build, test, debug, deploy, and maintain software autonomously. Naze is engineered for this future — its grammar is small enough for local AI models, its components are self-contained for parallel generation, and its binary format is the API.
Token Complexity
A mathematical framework for measuring the true cost of AI-driven development.
The key insight: Naze achieves σ = 1 because every component is self-contained. This makes token complexity linear — Ψ grows as O(n) with component count, instead of the O(n log n) or O(n²) typical of multi-file frameworks.
Cost at 100 Components
Estimated token cost (Ψ) for a 100-component application
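A toy model of that comparison (the per-component token count and the files-read factor below are illustrative assumptions, not measured values):

```python
def psi_self_contained(n, tokens_per_component=500):
    # sigma = 1: an agent reads exactly one file per component
    return n * tokens_per_component

def psi_multi_file(n, tokens_per_file=500, files_per_component=5):
    # sigma = 5: each component drags in imports, styles, and config files
    return n * tokens_per_file * files_per_component

print(psi_self_contained(100))  # 50,000 tokens — grows as O(n)
print(psi_multi_file(100))      # 250,000 tokens — same n, 5x the reading
```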
The Energy Equation
Token reduction isn't just a developer convenience — it's an environmental imperative.
The key insight: Every variable Naze minimizes — through minimal syntax, through self-containment, through unambiguous grammar — multiplicatively reduces energy and carbon. At 98.9% fewer tokens per page, the environmental impact scales accordingly.
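To make the multiplicative claim concrete, a hedged sketch (the per-token energy figure is a placeholder we assume for illustration, not an H100 benchmark number — the ratio is what matters, and it cancels the placeholder out):

```python
TOKENS_PER_PAGE_HTML = 165_000  # from the quote above
TOKEN_REDUCTION = 0.989         # 98.9% fewer tokens (from the text)
ENERGY_PER_TOKEN_J = 0.3        # joules/token — assumed, for illustration only

tokens_naze = TOKENS_PER_PAGE_HTML * (1 - TOKEN_REDUCTION)
energy_html = TOKENS_PER_PAGE_HTML * ENERGY_PER_TOKEN_J
energy_naze = tokens_naze * ENERGY_PER_TOKEN_J

print(f"{tokens_naze:,.0f} tokens/page")                # ~1,815
print(f"~{energy_html / energy_naze:.0f}x less energy per page")  # ~91x
```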
At Scale
Energy use and CO₂ emissions for processing 1 million pages
The IEA projects AI data centers will consume 945 TWh by 2030 — double today's levels. Every token we eliminate matters. Naze doesn't just make AI development faster — it makes the agent-first web sustainable.
Sources: IEA Energy and AI Report, HTTP Archive Web Almanac 2024, NVIDIA H100 benchmarks
The Three-Layer Architecture
One source file compiles into three distinct layers. Humans need all three. AI agents typically need only Layer 1.
Presentation
UI tree, themes, animations, layout, colors, typography
Interaction
Event handlers, navigation, actions, validation
Data
State, computed values, server functions, data bindings
Three Outputs From One Source
The HTML/CSS/JS stack forces AI models to navigate three separate languages, a virtual DOM, bundler configurations, and framework-specific abstractions. Naze eliminates this waste — the compiled binary is the API.
Tiered Grammar
The grammar is partitioned into independent tiers. Lower tiers never depend on higher ones. An agent building dashboards needs only Tier 0.
Tier 0 — Layout, elements, state, events, themes, components
Tier 1 — Fetch, streams, server functions, storage, timers
Tier 2 — Models, declarative queries
Tier 3 — Prompt blocks, provider configuration
Tier 4 — Concurrency, file I/O, networking (future)
The command nazec grammar --format gbnf exports the grammar for constrained decoding, enabling local 3-7B models to match cloud-scale quality at zero cost.
Train Any Model
Naze's grammar is small enough to fine-tune on a single GPU. Local or cloud, every model speaks Naze.
Tiny Training Footprint
The full grammar fits in ~52K tokens. Fine-tune a 3B-parameter model on consumer hardware in hours, not weeks.
Faster Development Cycles
Small grammar means fewer training iterations, faster convergence, and rapid iteration on model improvements.
Local Models
Run Naze-trained models entirely offline with Ollama. Constrained decoding via GBNF export guarantees syntactically valid output from any local model.
Cloud Models
Cloud models already excel at Naze — fewer tokens per prompt means lower cost, faster responses, and higher accuracy than multi-language stacks.
Traditional web stacks require models to master HTML, CSS, JavaScript, framework APIs, and build tooling. Naze replaces all of that with one grammar that exports directly to GBNF for constrained decoding. The result: any model, any size, produces correct code on the first try.
See It in Action
Components compose with use, themes resolve with dot notation — and the compiler inlines it all. σ stays at 1.
app "Counter" {
  state count = 0
  let title = "My Counter"
  column padding: 20px, gap: 16px {
    heading "{title}"
    text "Current count: {count}"
    rect width: 200px, height: 50px,
         color: #2563eb, radius: 8px {
      text "Increment"
      on click: set count = count + 1
    }
    rect width: 200px, height: 50px,
         color: #dc2626, radius: 8px {
      text "Reset"
      on click: set count = 0
    }
  }
}

Why curly braces? Modern LLMs handle indentation well in fresh code, but still miscount whitespace when editing existing files — exactly the agentic workflow Naze targets. In Python, one wrong indent is a syntax error; with braces, it's cosmetic. Braces make Naze fault-tolerant by design. Research source ↗
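For composition itself, a hypothetical sketch: the component keyword, the prop parameter syntax, and the use call below are our guesses at the grammar, not confirmed syntax — only the inlining behavior is described in the text above.

```
component Button {
  prop label = "Click"
  rect width: 200px, height: 50px, color: #2563eb, radius: 8px {
    text "{label}"
  }
}

app "Counter" {
  state count = 0
  column padding: 20px, gap: 16px {
    use Button label: "Increment"
  }
}
```

At build time the compiler would inline Button's subtree into the app's flat render tree, so the shipped file remains one self-contained unit — which is what keeps σ at 1.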
The App Factory
Imagine typing a description into your browser and getting a working app back in seconds. Not a mockup. A real, running application.
In the future, users interact with agents, not browsers. Agents generate Naze apps on the fly to display data, run workflows, or answer questions.
Describe
User describes an app in natural language
Discover
Agent queries the registry for reusable packages
Generate
Agent writes .naze source importing packages
Compile
In-browser compiler produces a working app in seconds
Publish
User saves, forks, edits, or publishes back
Grow
Each app enriches the registry for future generation
The Naze Browser — type a description, get a running app. Every generated app becomes a composable building block in a growing ecosystem.
Help Build the Future
Naze is open source and actively looking for contributors. Whether you're a language designer, compiler engineer, or AI researcher — there's a place for you.