
Workflows

The workflow engine gives your bot deterministic automation. Instead of letting the LLM decide everything, you design a directed graph of nodes and edges that define exactly how each message is processed. The engine executes the graph step by step, following your branching logic, calling the tools you specify, and sending the messages you define.


When to Use Workflows vs Agent Loop

| Scenario | Agent Loop | Workflow |
|---|---|---|
| Open-ended conversation and Q&A | Best choice | Overkill |
| Predictable multi-step processes (refunds, onboarding) | Unreliable | Best choice |
| Conditional routing and triage | Possible but fragile | Best choice |
| Guaranteed tool execution order | Not possible | Best choice |
| Identity verification before data access | Not available | auth_gate node |
| Human approval for high-stakes actions | Not available | human_handoff action |
| Delegating to specialist bots | Not available | delegate_to_bot node |
| Maximum flexibility for unexpected questions | Best choice | Limited |

Rule of thumb: If you can draw the flow on a whiteboard with boxes and arrows, use a workflow. If the user might ask anything and you want the LLM to figure it out, use agent loop.


How Workflows Work

Workflows are JSON graphs stored in the bot’s workflow_json field. Each workflow has:

  • Nodes — individual processing steps (14 types available)
  • Edges — connections between nodes that define execution order

When workflow_enabled is set to 1, incoming messages are processed through the workflow graph instead of the agent loop.
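
The gating described above can be sketched as a simple dispatch. This is an illustrative model, not Aerostack source; `runWorkflow` and `runAgentLoop` are hypothetical stand-ins for the two engines:

```javascript
// Hypothetical stubs for the two processing engines (not Aerostack source).
function runWorkflow(graph, message) {
  return `workflow (${graph.nodes.length} nodes)`;
}

function runAgentLoop(bot, message) {
  return "agent loop";
}

// When workflow_enabled is 1, the parsed workflow_json graph handles the
// incoming message; otherwise the agent loop does.
function handleMessage(bot, message) {
  if (bot.workflow_enabled === 1) {
    return runWorkflow(JSON.parse(bot.workflow_json), message);
  }
  return runAgentLoop(bot, message);
}
```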


Execution Model

  1. Entry point: The engine finds all trigger nodes and starts execution from them
  2. BFS traversal: Nodes are executed in breadth-first order following edges
  3. Variable passing: Each node can read from and write to a shared variables context
  4. Branching: Logic nodes control which edges are followed via sourceHandle values
  5. Safety limits: Maximum 50 nodes per execution, with cycle detection
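
The steps above can be sketched as a queue-based breadth-first traversal over the edges. This is an illustrative model of the execution semantics, not the engine's actual source; the 50-node cap and shared variables context come from the list above:

```javascript
// Illustrative model of the execution steps above -- not Aerostack's source.
// `handlers` maps node types to functions; each handler reads and writes the
// shared `vars` context.
function executeWorkflow(graph, handlers, vars = {}) {
  // 1. Entry point: start from all trigger nodes.
  const queue = graph.nodes.filter((n) => n.type === "trigger").map((n) => n.id);
  const byId = Object.fromEntries(graph.nodes.map((n) => [n.id, n]));
  let executed = 0;

  // 2. BFS traversal: first-in, first-out over the queue.
  while (queue.length > 0) {
    // 5. Safety limit: the cap also stops runaway cycles in this sketch.
    if (executed >= 50) throw new Error("Node execution limit reached");
    const id = queue.shift();
    const node = byId[id];
    handlers[node.type]?.(node, vars); // 3. Variable passing via shared context
    executed++;
    // 4. Follow outgoing edges to enqueue the next nodes.
    for (const edge of graph.edges) {
      if (edge.source === id) queue.push(edge.target);
    }
  }
  return { executed, vars };
}
```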

Workflow JSON Structure

```json
{
  "nodes": [
    {
      "id": "node_1",
      "type": "trigger",
      "data": { "triggerType": "message_received", "label": "On Message" },
      "position": { "x": 0, "y": 0 }
    },
    {
      "id": "node_2",
      "type": "llm_call",
      "data": {
        "prompt": "Classify the user's intent: {{message}}",
        "outputVariable": "intent",
        "label": "Classify Intent"
      },
      "position": { "x": 200, "y": 0 }
    },
    {
      "id": "node_3",
      "type": "send_message",
      "data": {
        "message": "Your intent was classified as: {{intent.text}}",
        "label": "Respond"
      },
      "position": { "x": 400, "y": 0 }
    }
  ],
  "edges": [
    { "id": "e1", "source": "node_1", "target": "node_2" },
    { "id": "e2", "source": "node_2", "target": "node_3" }
  ]
}
```
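
The `{{message}}` and `{{intent.text}}` placeholders above suggest simple template interpolation over the variables context. A minimal sketch of how such substitution could behave (an assumption for illustration, not the engine's actual implementation):

```javascript
// Hypothetical interpolation helper -- substitutes {{var}} and {{var.path}}
// placeholders with values from the shared variables context. Unknown
// placeholders are left untouched.
function interpolate(template, vars) {
  return template.replace(/\{\{\s*([\w.]+)\s*\}\}/g, (match, path) => {
    // Walk dotted paths like "intent.text" through the context object.
    const value = path.split(".").reduce((obj, key) => obj?.[key], vars);
    return value === undefined ? match : String(value);
  });
}
```

For example, `interpolate("Your intent was classified as: {{intent.text}}", { intent: { text: "refund" } })` would yield `"Your intent was classified as: refund"`.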

All 14 Node Types

| Node Type | Purpose | Key Capability |
|---|---|---|
| trigger | Entry point | Sets message, user_id, platform variables |
| llm_call | Call the LLM | Variable interpolation in prompts |
| logic | if/else, switch, while | Deterministic branching |
| mcp_tool | Call workspace tools | Any MCP, Skill, or Function |
| send_message | Reply to user | Template interpolation |
| action | Side effects | set_variable, end_conversation, human_handoff, create_ticket |
| loop | Iteration | for_each, count, while (max 50 iterations) |
| code_block | Run JavaScript | Sandboxed execution with variable access |
| auth_gate | Identity verification | OTP, magic link, 4 providers + custom HTTP |
| schedule_message | Delayed messages | Send at a future time |
| send_proactive | Cross-channel messages | Send to a different channel or user |
| delegate_to_bot | Bot-to-bot routing | Call another bot (max 3 hops) |
| error_handler | Graceful fallback | Catch and handle errors |
| parallel | Fan-out execution | Multiple paths in parallel |

See Node Types for complete configuration details and examples for each type.


Enabling Workflows

Via Dashboard

  1. Navigate to your bot’s detail page
  2. Open the Workflow tab
  3. Use the visual builder to add and connect nodes
  4. Toggle Enable Workflow to activate

Via API

```bash
curl -X PATCH https://api.aerostack.dev/api/bots/YOUR_BOT_ID \
  -H "Authorization: Bearer YOUR_JWT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "workflow_json": "{\"nodes\":[...],\"edges\":[...]}",
    "workflow_enabled": 1
  }'
```
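
Note that `workflow_json` is sent as a JSON *string*, not a nested object, so the graph ends up serialized twice. A small sketch of building that request body programmatically (the helper name is made up):

```javascript
// Builds the PATCH body shown above. The key point: workflow_json must be a
// serialized string, so the graph is stringified into a string field before
// the whole body is stringified.
function buildWorkflowPatch(graph, enabled = true) {
  return JSON.stringify({
    workflow_json: JSON.stringify(graph),
    workflow_enabled: enabled ? 1 : 0,
  });
}

// The body could then be sent with fetch, e.g.:
// fetch(`https://api.aerostack.dev/api/bots/${botId}`, {
//   method: "PATCH",
//   headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
//   body: buildWorkflowPatch(graph),
// });
```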

When a workflow is enabled, the bot uses the workflow engine exclusively. The agent loop is bypassed entirely. Disable the workflow to return to agent loop mode.


Validation

When you update workflow_json, Aerostack validates the structure:

  • The JSON must parse successfully
  • It must contain nodes (array) and edges (array)
  • Invalid JSON returns a 400 error

Runtime validations:

  • A workflow with no trigger node returns “No trigger node in workflow”
  • A workflow with no nodes returns “Workflow is empty”
  • Cycles are detected and prevented (max 50 node executions)
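
Taken together, the structural checks above could look something like this (an illustrative sketch, not the actual validator):

```javascript
// Sketch of the structural checks listed above -- illustrative only.
// Returns an error string, or null if the workflow passes.
function validateWorkflow(workflowJson) {
  let graph;
  try {
    graph = JSON.parse(workflowJson);
  } catch {
    return "Invalid JSON"; // the API would respond with a 400 error
  }
  if (!Array.isArray(graph.nodes) || !Array.isArray(graph.edges)) {
    return "workflow_json must contain nodes and edges arrays";
  }
  if (graph.nodes.length === 0) return "Workflow is empty";
  if (!graph.nodes.some((n) => n.type === "trigger")) {
    return "No trigger node in workflow";
  }
  return null;
}
```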

Workflow Runs

Every workflow execution is recorded:

| Field | Description |
|---|---|
| status | running, completed, or error |
| nodes_executed | How many nodes were executed |
| nodes_total | Total nodes in the workflow |
| execution_log | Details of each node's execution and timing |
| error_message | Error details if the workflow failed |
| duration_ms | Total execution time |

Use the Testing & Debugging page to learn how to inspect workflow runs.


Combining AI and Deterministic Logic

Workflows and agent loop are mutually exclusive per message. But llm_call nodes within a workflow still use the bot’s configured LLM — you get AI reasoning at specific points while maintaining deterministic flow control.

Example pattern:

  1. trigger on message
  2. llm_call to classify intent (AI decides the category)
  3. logic to branch (deterministic routing)
  4. mcp_tool to fetch data (guaranteed execution)
  5. llm_call to generate a response using the data (AI writes the reply)
  6. send_message to the user

This gives you the best of both: structured flow with AI-powered decision-making where it matters.
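
The six-step pattern above could be expressed as a graph along these lines. This is a sketch using the node types and JSON structure from this page; the `condition` field, tool name, and prompts are hypothetical:

```javascript
// Sketch of the classify -> branch -> fetch -> respond pattern above.
// The logic node's condition field, the tool name, and the prompts are
// made-up examples, not documented configuration.
const workflow = {
  nodes: [
    { id: "n1", type: "trigger", data: { triggerType: "message_received" } },
    { id: "n2", type: "llm_call",
      data: { prompt: "Classify intent: {{message}}", outputVariable: "intent" } },
    { id: "n3", type: "logic", data: { condition: "{{intent.text}} == 'order_status'" } },
    { id: "n4", type: "mcp_tool", data: { tool: "lookup_order" } },
    { id: "n5", type: "llm_call",
      data: { prompt: "Draft a reply using: {{order}}", outputVariable: "reply" } },
    { id: "n6", type: "send_message", data: { message: "{{reply.text}}" } },
  ],
  edges: [
    { id: "e1", source: "n1", target: "n2" },
    { id: "e2", source: "n2", target: "n3" },
    { id: "e3", source: "n3", target: "n4", sourceHandle: "true" }, // branch taken
    { id: "e4", source: "n4", target: "n5" },
    { id: "e5", source: "n5", target: "n6" },
  ],
};
```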


Next Steps