
Architecture Overview

Aerostack has four layers that compose together. Understanding how they connect will help you decide what to build and how.

The Full Picture

Layer 1: Edge Runtime (Foundation)

Every function, MCP server, and skill runs on Cloudflare Workers with native bindings to platform primitives. These are not HTTP calls — they’re direct in-process access.

Primitive     | Binding       | What It Does                                         | Latency
Database      | env.DB        | SQL (SQLite-compatible) + Postgres routing           | ~0ms
Cache         | env.CACHE     | Key-value cache with TTL, atomic counters            | ~0ms
Queue         | env.QUEUE     | Background jobs with status tracking                 | ~0ms
AI            | env.AI        | Multi-provider LLM (OpenAI, Anthropic, Google, Groq) | Provider RTT
Vector Search | env.VECTORIZE | Semantic similarity, embeddings                      | ~0ms
Storage       | env.STORAGE   | Object storage with CDN                              | ~0ms

Compare this to traditional serverless:

Traditional                               | Aerostack
Function → HTTP → Database API → response | Function → env.DB.prepare() → response
Function → HTTP → Redis → response        | Function → env.CACHE.get() → response
Function → HTTP → SQS → response          | Function → env.QUEUE.send() → response
50-200ms per service call                 | ~0ms — all in-worker
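
The difference can be sketched as a cache-aside read that never leaves the worker. The Env shape below is an assumption modeled on the bindings table (env.DB, env.CACHE); the real Aerostack types may differ in detail.

```typescript
// Sketch: cache-aside read using native bindings — no HTTP hop to any service.
// The Env interface is an assumption based on the bindings table above.
interface Env {
  DB: {
    prepare(sql: string): {
      bind(...args: unknown[]): { first<T>(): Promise<T | null> };
    };
  };
  CACHE: {
    get(key: string): Promise<string | null>;
    put(key: string, value: string, opts?: { ttl?: number }): Promise<void>;
  };
}

export async function getUser(env: Env, id: string) {
  const key = `user:${id}`;
  const hit = await env.CACHE.get(key); // in-worker lookup, ~0ms
  if (hit) return JSON.parse(hit);

  const row = await env.DB.prepare("SELECT id, name FROM users WHERE id = ?")
    .bind(id)
    .first<{ id: string; name: string }>();
  if (row) await env.CACHE.put(key, JSON.stringify(row), { ttl: 60 });
  return row;
}
```

Both calls resolve in-process, so the function body reads like local data access rather than service orchestration.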

Layer 2: Functions, MCP Servers & Skills

These are the building blocks you create and deploy:

Functions

Fullstack edge functions that power everything. They can serve HTTP, process queue messages, run on cron schedules, or be called by other Aerostack components.

src/index.ts → aerostack deploy → runs on Cloudflare edge (300+ cities)
                                    ↓ native access to
                                    DB · Cache · Queue · AI · Vector · Storage
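
A minimal src/index.ts might look like the sketch below. The module-handler shape follows Cloudflare Workers conventions, which functions run on; treat the exact Aerostack handler signatures as assumptions and check the Functions reference before relying on them.

```typescript
// Sketch of a fullstack function entry point: one module, three triggers.
// Handler names (fetch/queue/scheduled) follow Cloudflare Workers conventions;
// the Aerostack-specific types are assumptions.
const handler = {
  // HTTP traffic
  async fetch(req: Request): Promise<Response> {
    const url = new URL(req.url);
    if (url.pathname === "/health") return new Response("ok");
    return new Response("not found", { status: 404 });
  },

  // Queue messages (e.g. sent earlier via env.QUEUE.send())
  async queue(batch: { messages: { body: unknown }[] }): Promise<void> {
    for (const msg of batch.messages) {
      void msg; // process msg.body here
    }
  },

  // Cron schedules
  async scheduled(event: { cron: string }): Promise<void> {
    void event; // periodic work here
  },
};

export default handler;
```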

MCP Servers

Expose tools that AI can call. Three deployment modes:

  • Hosted: Aerostack runs the server at the edge for you
  • Proxy: the server stays on your infrastructure, with Aerostack in front
  • Installed: a prebuilt server pulled from the Hub

Proxy mode is key for teams: your MCP server stays on your infrastructure, but Aerostack handles secrets, access control, and monitoring. Team members never see API keys.

Skills

Single-purpose tools (e.g., “send email”, “generate PDF”). Lighter than a full MCP server. Deploy as a Worker, install into workspaces.
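
A skill definition might look like the sketch below. The field names mirror MCP tool definitions (name, description, inputSchema), but the exact Aerostack skill API is an assumption; consult the Skills reference before copying this shape.

```typescript
// Hypothetical shape for a "send email" skill. Field names follow MCP tool
// conventions; the real Aerostack skill API may differ.
const sendEmailSkill = {
  name: "send-email",
  description: "Send a single transactional email",
  inputSchema: {
    type: "object",
    properties: {
      to: { type: "string" },
      subject: { type: "string" },
      body: { type: "string" },
    },
    required: ["to", "subject", "body"],
  },
  async run(input: { to: string; subject: string; body: string }) {
    // A real skill would call an email provider here; this stub just
    // echoes a delivery result so the shape is visible.
    return { delivered: true, to: input.to };
  },
};
```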

Layer 3: Workspaces (Composition)

A workspace combines multiple MCP servers, skills, and functions behind one gateway URL. Any MCP-compatible client can connect to it.

Workspace "customer-tools"
├── Stripe MCP (proxy — keys in vault)
├── GitHub MCP (hosted on Aerostack)
├── Notion MCP (installed from Hub)
├── Skill: send-email
└── Function: order-lookup (fullstack edge)

Gateway URL: https://mcp.aerostack.dev/ws/customer-tools
Token: mwt_abc123...

What connects to workspaces:

  • Bots — connect via workspace_id, get access to all tools
  • Claude Desktop / Cursor — connect via gateway URL + bearer token
  • Any MCP client — standard JSON-RPC 2.0 protocol
  • Other Aerostack workspaces — composition of compositions
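
Under the hood, every connection is a JSON-RPC 2.0 exchange against the gateway URL, authenticated with the workspace token. Real MCP clients (Claude Desktop, Cursor) handle this handshake for you; the single plain-POST call below is a simplified sketch of what they send.

```typescript
// Sketch: one raw JSON-RPC 2.0 "tools/list" call against a workspace gateway.
// The transport shown (a single HTTP POST) is a simplification of the MCP
// handshake; fetchImpl is injectable so the function can be tested offline.
async function listTools(
  gatewayUrl: string,
  token: string,
  fetchImpl: typeof fetch = fetch,
): Promise<{ name: string }[]> {
  const res = await fetchImpl(gatewayUrl, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`, // the workspace token (mwt_...)
    },
    body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "tools/list" }),
  });
  const { result } = (await res.json()) as {
    result: { tools: { name: string }[] };
  };
  return result.tools;
}
```

Because the gateway speaks standard JSON-RPC 2.0, anything that can make this call, from a desktop client to another workspace, can consume the composed toolset.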

Layer 4: Bots (Consumer-Facing AI)

Bots are the consumer-facing layer. They receive messages from platforms and use workspace tools to take action.

Two modes serve different needs:

                   | Agent Loop               | Workflow
Best for           | Open-ended chat, Q&A     | Deterministic flows, approvals
Tool access        | LLM decides which tools  | You place tools explicitly
Auth               | Via tools                | Native auth_gate node
Human approval     | Via tools                | Native human_handoff node
Delegation         | No                       | Yes — call other bots
Code execution     | No                       | Yes — sandboxed JavaScript
Scheduled messages | No                       | Yes — delay-based

The SDK (Client Access)

The SDK gives your frontend or mobile app direct access to the edge runtime primitives. One install, one provider, and everything is accessible:

// React — one import, everything available
const { user, signIn } = useAuth()
const { query } = useDb()
const { messages } = useSubscription('chat/room-1')

How the Layers Compose

Here’s a real-world example — a customer support system:

User messages Telegram bot

Bot (Workflow Mode)
  ↓ auth_gate → verify customer email via OTP
  ↓ mcp_tool → stripe__get_customer (Stripe MCP, proxied)
  ↓ mcp_tool → order-lookup (Function, queries DB natively)
  ↓ logic → order.total > $500?
  ↓   yes → human_handoff → notify manager on Slack
  ↓           manager approves → mcp_tool → stripe__create_refund
  ↓   no → mcp_tool → stripe__create_refund (auto-approve)
  ↓ send_message → "Your refund of $X has been processed"

Behind the scenes:
  - Stripe API key never left Aerostack's encrypted vault
  - Manager approved from their phone via Slack notification
  - order-lookup function queried the database natively (~0ms)
  - Every tool call logged with per-user analytics
  - Customer never left the Telegram chat
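
The logic node in the flow above reduces to a simple threshold check. The function and node names below are illustrative, taken from the workflow sketch rather than a real API.

```typescript
// Sketch of the routing decision from the workflow above: refunds over the
// threshold go to a manager, smaller ones are auto-approved. Names are
// illustrative, not a real Aerostack API.
type RefundRoute = "human_handoff" | "auto_refund";

function routeRefund(orderTotalUsd: number, thresholdUsd = 500): RefundRoute {
  return orderTotalUsd > thresholdUsd ? "human_handoff" : "auto_refund";
}
```

Keeping branch conditions this explicit is what makes workflow mode deterministic: the same order total always takes the same path, with no LLM in the loop.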

Next Steps

Now that you understand the architecture, pick where to start:

  • Functions — The foundation. Write fullstack edge functions.
  • MCP Servers — Host, proxy, or install AI tools.
  • Workspaces — Compose tools behind one URL.
  • Bots — Build intelligent bots with workflows.
  • SDK — Connect your frontend to the edge runtime.