AI Operations (sdk.ai)
Access production-ready AI models directly from your server logic.
Introduction
import { AerostackServer } from '@aerostack/sdk';
const sdk = new AerostackServer(env);
const response = await sdk.ai.chat([{ role: 'user', content: 'Hello' }]);

Features
chat(messages, options)
Runs a chat completion against models such as Llama 3.
const response = await sdk.ai.chat(
[
{ role: 'system', content: 'You are a helpful assistant.' },
{ role: 'user', content: 'Explain quantum computing.' }
],
{ temperature: 0.7 }
);
console.log(response.response);

embed(text)
Generates an embedding vector for text, for use in vector search and retrieval-augmented generation (RAG).
const embedding = await sdk.ai.embed('Text to search');
// Returns a number[] vector

generate(prompt)
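In a RAG pipeline, the vectors returned by embed are typically compared with cosine similarity to rank stored documents against a query. The helper below is a minimal sketch of that comparison in plain TypeScript; it assumes only the number[] vectors described above, and the commented usage with sdk.ai.embed is illustrative:

```typescript
// Cosine similarity between two embedding vectors.
// Returns a value in [-1, 1]; higher means more semantically similar.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error('Vector length mismatch');
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Illustrative usage (assumes embed returns number[] as documented):
// const query = await sdk.ai.embed('quantum computing');
// const doc = await sdk.ai.embed('qubits and superposition');
// const score = cosineSimilarity(query, doc); // higher = more related
```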
Free-form text generation/completion from a single prompt.
const story = await sdk.ai.generate('Once upon a time...');

The AI binding (AI) is automatically available in standard production environments.
Default model: @cf/meta/llama-3-8b-instruct.