Architecture Overview
Aerostack has four layers that compose together. Understanding how they connect will help you decide what to build and how.
The Full Picture
Layer 1: Edge Runtime (Foundation)
Every function, MCP server, and skill runs on Cloudflare Workers with native bindings to platform primitives. These are not HTTP calls; the bindings give your code direct in-process access.
| Primitive | Binding | What It Does | Latency |
|---|---|---|---|
| Database | env.DB | SQL (SQLite-compatible) + Postgres routing | ~0ms |
| Cache | env.CACHE | Key-value cache with TTL, atomic counters | ~0ms |
| Queue | env.QUEUE | Background jobs with status tracking | ~0ms |
| AI | env.AI | Multi-provider LLM (OpenAI, Anthropic, Google, Groq) | Provider RTT |
| Vector Search | env.VECTORIZE | Semantic similarity, embeddings | ~0ms |
| Storage | env.STORAGE | Object storage with CDN | ~0ms |
Compare this to traditional serverless:
| Traditional | Aerostack |
|---|---|
| Function → HTTP → Database API → response | Function → env.DB.prepare() → response |
| Function → HTTP → Redis → response | Function → env.CACHE.get() → response |
| Function → HTTP → SQS → response | Function → env.QUEUE.send() → response |
| 50-200ms per service call | ~0ms — all in-worker |
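The right-hand column of that table, sketched as a handler. This assumes Aerostack's bindings mirror the standard Cloudflare Workers shapes (D1-style `prepare`/`bind` for `env.DB`, KV-style `get`/`put` with a TTL for `env.CACHE`, Queues-style `send` for `env.QUEUE`); the structural types below are illustrative stand-ins, not the real binding types.

```typescript
// Hedged sketch: one request touching three primitives with zero
// cross-service HTTP hops. Types are stand-ins assuming Aerostack
// mirrors the Cloudflare Workers binding shapes.
interface Db {
  prepare(sql: string): { bind(...args: unknown[]): { first(): Promise<unknown> } };
}
interface Cache {
  get(key: string): Promise<string | null>;
  put(key: string, value: string, opts?: { expirationTtl?: number }): Promise<void>;
}
interface JobQueue {
  send(message: unknown): Promise<void>;
}
interface Env { DB: Db; CACHE: Cache; QUEUE: JobQueue }

const worker = {
  async fetch(req: Request, env: Env): Promise<Response> {
    const id = new URL(req.url).searchParams.get("id") ?? "anonymous";

    // Cache lookup via the binding: no Redis round trip
    const cached = await env.CACHE.get(`user:${id}`);
    if (cached) return new Response(cached, { headers: { "content-type": "application/json" } });

    // SQL query via the binding: no database API call
    const user = await env.DB.prepare("SELECT * FROM users WHERE id = ?").bind(id).first();

    // Background job via the binding: no SQS round trip
    await env.QUEUE.send({ type: "user-viewed", id });

    await env.CACHE.put(`user:${id}`, JSON.stringify(user), { expirationTtl: 60 });
    return new Response(JSON.stringify(user), { headers: { "content-type": "application/json" } });
  },
};

export default worker;
```

One request touches cache, database, and queue without a single cross-service HTTP hop, which is where the ~0ms column comes from.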
Layer 2: Functions, MCP Servers & Skills
These are the building blocks you create and deploy:
Functions
Fullstack edge functions that power everything. They can serve HTTP, process queue messages, run on cron schedules, or be called by other Aerostack components.
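What that looks like in code: a hedged sketch of a single `src/index.ts` that answers HTTP, drains a queue, and runs on a schedule. The `fetch`/`queue`/`scheduled` handler names are the standard Cloudflare Workers module convention, assumed here rather than confirmed Aerostack API.

```typescript
// Hypothetical src/index.ts: one deployable function, three triggers.
// Handler names follow the Cloudflare Workers module convention.
type QueueBatch = { messages: { body: unknown }[] };

const log: string[] = [];

const fn = {
  // Serve HTTP requests
  async fetch(req: Request): Promise<Response> {
    return new Response(`hello from ${new URL(req.url).pathname}`);
  },
  // Process background jobs delivered from the queue
  async queue(batch: QueueBatch): Promise<void> {
    for (const m of batch.messages) log.push(`job: ${JSON.stringify(m.body)}`);
  },
  // Run on a cron schedule
  async scheduled(event: { cron: string }): Promise<void> {
    log.push(`cron: ${event.cron}`);
  },
};

export default fn;
```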
src/index.ts → aerostack deploy → runs on Cloudflare edge (300+ cities)
↓ native access to
DB · Cache · Queue · AI · Vector · Storage
MCP Servers
Expose tools that AI can call. Three deployment modes:
- Hosted — Aerostack runs the server for you
- Proxy — the server stays on your own infrastructure, fronted by Aerostack
- Installed — add a prebuilt server from the Hub
Proxy mode is key for teams: your MCP server stays on your infrastructure, but Aerostack handles secrets, access control, and monitoring. Team members never see API keys.
Skills
Single-purpose tools (e.g., “send email”, “generate PDF”). Lighter than a full MCP server. Deploy as a Worker, install into workspaces.
Layer 3: Workspaces (Composition)
A workspace combines multiple MCP servers, skills, and functions behind one gateway URL. Any MCP-compatible client can connect to it.
Workspace "customer-tools"
├── Stripe MCP (proxy — keys in vault)
├── GitHub MCP (hosted on Aerostack)
├── Notion MCP (installed from Hub)
├── Skill: send-email
└── Function: order-lookup (fullstack edge)
Gateway URL: https://mcp.aerostack.dev/ws/customer-tools
Token: mwt_abc123...
What connects to workspaces:
- Bots — connect via workspace_id, get access to all tools
- Claude Desktop / Cursor — connect via gateway URL + bearer token
- Any MCP client — standard JSON-RPC 2.0 protocol
- Other Aerostack workspaces — composition of compositions
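Because the gateway speaks standard JSON-RPC 2.0, any HTTP client can connect using the gateway URL and bearer token from the example above. A minimal sketch: `tools/list` is a standard MCP method, while the URL, token, and the gateway's exact response shape are assumptions here.

```typescript
// Build a standard JSON-RPC 2.0 request envelope.
function rpcRequest(method: string, params: object = {}, id = 1) {
  return { jsonrpc: "2.0" as const, id, method, params };
}

// POST it to the workspace gateway with the bearer token.
// URL and token below are placeholders from the example above.
async function listWorkspaceTools(gatewayUrl: string, token: string) {
  const res = await fetch(gatewayUrl, {
    method: "POST",
    headers: {
      "content-type": "application/json",
      authorization: `Bearer ${token}`, // workspace token (mwt_...)
    },
    body: JSON.stringify(rpcRequest("tools/list")),
  });
  return res.json();
}
```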
Layer 4: Bots (Consumer-Facing AI)
Bots are the consumer-facing layer. They receive messages from platforms and use workspace tools to take action.
Two modes serve different needs:
| | Agent Loop | Workflow |
|---|---|---|
| Best for | Open-ended chat, Q&A | Deterministic flows, approvals |
| Tool access | LLM decides which tools | You place tools explicitly |
| Auth | Via tools | Native auth_gate node |
| Human approval | Via tools | Native human_handoff node |
| Delegation | No | Yes — call other bots |
| Code execution | No | Yes — sandboxed JavaScript |
| Scheduled messages | No | Yes — delay-based |
The SDK (Client Access)
The SDK gives your frontend or mobile app direct access to the edge runtime primitives.
One install, one provider — all primitives accessible:
// React — one import, everything available
const { user, signIn } = useAuth()
const { query } = useDb()
const { messages } = useSubscription('chat/room-1')
How the Layers Compose
Here’s a real-world example — a customer support system:
User messages Telegram bot
↓
Bot (Workflow Mode)
↓ auth_gate → verify customer email via OTP
↓ mcp_tool → stripe__get_customer (Stripe MCP, proxied)
↓ mcp_tool → order-lookup (Function, queries DB natively)
↓ logic → order.total > $500?
↓ yes → human_handoff → notify manager on Slack
↓ manager approves → mcp_tool → stripe__create_refund
↓ no → mcp_tool → stripe__create_refund (auto-approve)
↓ send_message → "Your refund of $X has been processed"
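The `logic` node in the middle of that flow is an ordinary threshold check. As a plain function (the $500 cutoff comes from the diagram; the route names are illustrative, not workflow node types):

```typescript
// Route a refund request the way the workflow diagram does:
// orders strictly over $500 need a human sign-off, the rest auto-approve.
type Route = "human_handoff" | "auto_refund";

function routeRefund(orderTotalUsd: number, thresholdUsd = 500): Route {
  return orderTotalUsd > thresholdUsd ? "human_handoff" : "auto_refund";
}
```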
Behind the scenes:
- Stripe API key never left Aerostack's encrypted vault
- Manager approved from their phone via Slack notification
- order-lookup function queried the database natively (~0ms)
- Every tool call logged with per-user analytics
- Customer never left the Telegram chat
Next Steps
Now that you understand the architecture, pick where to start:
- Functions — The foundation. Write fullstack edge functions.
- MCP Servers — Host, proxy, or install AI tools.
- Workspaces — Compose tools behind one URL.
- Bots — Build intelligent bots with workflows.
- SDK — Connect your frontend to the edge runtime.