Functions — Fullstack Edge Runtime
Aerostack Functions are fullstack edge functions running on Cloudflare Workers. Every function has native, in-process bindings to six platform primitives — Database, Cache, Queue, AI, Vector Search, and Storage. No HTTP calls. No SDK clients. Direct access at ~0ms latency.
Functions are the foundation layer of Aerostack. They power MCP servers, Skills, bot workflows, and (coming soon) AI Proxy and AI Endpoints. Everything you build on Aerostack runs on Functions under the hood.
Architecture
All six bindings are injected into your function’s env parameter. They are not HTTP calls — they are native Cloudflare bindings that execute within the same worker process. This means a database query or cache lookup adds microseconds, not milliseconds.
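In TypeScript terms, the injected bindings can be modeled as a single `Env` interface. This is a sketch, not the exact generated definitions: the binding names (`DB`, `CACHE`, `QUEUE`, `AI`, `VECTORIZE`, `STORAGE`) are taken from the comparison table below, and the member types are the standard Cloudflare Workers binding types from `@cloudflare/workers-types`.

```typescript
// Sketch of the Env shape your handlers receive. Binding names follow
// the comparison table on this page; types are the standard Cloudflare
// Workers binding interfaces (via @cloudflare/workers-types).
interface Env {
  DB: D1Database            // SQL database: env.DB.prepare(...).all()
  CACHE: KVNamespace        // key-value cache: env.CACHE.get(key)
  QUEUE: Queue              // message queue: env.QUEUE.send(...)
  AI: Ai                    // AI inference: env.AI.run(model, ...)
  VECTORIZE: VectorizeIndex // vector search: env.VECTORIZE.query(...)
  STORAGE: R2Bucket         // object storage: env.STORAGE.put(...)
}
```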
How Functions Compare
| Service | Traditional Serverless | Aerostack Functions |
|---|---|---|
| Database | fetch('https://your-db-api/query') — HTTP round trip | env.DB.prepare('SELECT ...').all() — in-process |
| Cache | redis.get(key) — network hop to Redis | env.CACHE.get(key) — same datacenter |
| Queue | sqs.sendMessage(...) — AWS API call | env.QUEUE.send(...) — native binding |
| AI | openai.chat(...) — external API | env.AI.run(model, ...) — built-in multi-provider |
| Vector | pinecone.query(...) — external API | env.VECTORIZE.query(...) — native binding |
| Storage | s3.putObject(...) — AWS API call | env.STORAGE.put(...) — native binding |
| Latency per call | 50-200ms per service call | ~0ms — all in-worker |
| Auth/config | Manage credentials for each service | Zero config — bindings are pre-wired |
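The cache and database rows above combine naturally into a read-through pattern: check the cache first, fall back to the database, then repopulate the cache. A minimal sketch, assuming a KV-style `CACHE` binding and the `users` table from the example below; the cache key and 60-second TTL are illustrative.

```typescript
// Read-through cache sketch. Both calls are in-process bindings, so the
// cache-hit fast path costs microseconds rather than a network round trip.
async function getActiveUsers(env: Env): Promise<unknown[]> {
  // Fast path: serve from cache if present
  const cached = await env.CACHE.get('users:active', 'json')
  if (cached) return cached as unknown[]

  // Miss: query the database directly
  const { results } = await env.DB
    .prepare('SELECT * FROM users WHERE active = 1')
    .all()

  // Repopulate the cache with a short TTL (seconds)
  await env.CACHE.put('users:active', JSON.stringify(results), {
    expirationTtl: 60,
  })
  return results
}
```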
The Cloudflare Workers Pattern
Aerostack functions follow the standard Cloudflare Workers interface. If you have written a Worker before, you already know how to write a function.
```typescript
// src/index.ts
export default {
  // Handle HTTP requests — required
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const url = new URL(request.url)
    if (url.pathname === '/users') {
      const users = await env.DB.prepare('SELECT * FROM users WHERE active = 1').all()
      return Response.json(users.results)
    }
    return new Response('Not found', { status: 404 })
  },

  // Handle queue messages — optional
  async queue(batch: MessageBatch, env: Env): Promise<void> {
    for (const msg of batch.messages) {
      console.log('Processing job:', msg.body)
      msg.ack()
    }
  },

  // Handle cron triggers — optional
  async scheduled(event: ScheduledEvent, env: Env, ctx: ExecutionContext): Promise<void> {
    // Runs on a schedule (e.g. every hour)
    await env.DB.prepare('DELETE FROM sessions WHERE expires_at < ?').bind(Date.now()).run()
  },
}
```

Three handlers, one file, six platform bindings. That is the entire model.
What You Can Build
| Use Case | What happens |
|---|---|
| Custom API backend | REST endpoints with DB, cache, and auth — no separate server needed |
| Bot intelligence layer | Function queries your DB, runs AI analysis, caches the result, returns it to a bot via MCP |
| Real-time data pipeline | Ingest webhook → enqueue job → process with AI → store in vector DB → cache result |
| RAG pipeline | Ingest documents → chunk → embed → store in vector search → query semantically from bots/MCP |
| Scheduled analytics | Cron function aggregates DB data → caches dashboard results → queues Slack notification |
| Smart form processor | Receive submission → validate → store in DB → queue confirmation email → respond |
| Webhook receiver | Accept incoming webhooks, validate signatures, enqueue for async processing |
| AI-powered search | User query → embed → vector search → re-rank with LLM → return results |
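Several of the pipelines above (RAG pipeline, AI-powered search) begin by splitting documents into overlapping chunks before embedding them. The helper below is a hypothetical sketch of that step in plain TypeScript; the chunk size and overlap values are illustrative defaults, not platform settings.

```typescript
// Split text into fixed-size chunks with overlap, so content cut at a
// chunk boundary still appears intact at the start of the next chunk.
// Each chunk is then a candidate for embedding (env.AI) and insertion
// into vector search (env.VECTORIZE).
function chunkText(text: string, size = 500, overlap = 50): string[] {
  if (size <= overlap) throw new Error('size must exceed overlap')
  const chunks: string[] = []
  for (let start = 0; start < text.length; start += size - overlap) {
    chunks.push(text.slice(start, start + size))
    if (start + size >= text.length) break // final chunk reached the end
  }
  return chunks
}
```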
Next Steps
- Write your first function — step-by-step tutorial from zero to deployed
- Platform bindings — deep dive into each `env.*` binding with code examples
- Patterns and recipes — production-ready code for common architectures
- Deploy — CLI and dashboard deployment workflows
- Testing and debugging — local development, logs, and troubleshooting