
CORE CONCEPTS

Streaming

Streaming delivers agent responses token-by-token as they are generated, enabling real-time chat interfaces with sub-100ms time to first token.

Basic Streaming

Enable streaming by setting stream: true in your execution request:

Basic streaming
```typescript
const stream = await client.agents.execute(agent.id, {
  input: "Explain how AI agents work",
  stream: true,
});

for await (const event of stream) {
  switch (event.type) {
    case "text_delta":
      process.stdout.write(event.text);
      break;
    case "tool_call_start":
      console.log("\nCalling tool:", event.name);
      break;
    case "tool_call_end":
      console.log("Tool result:", event.result);
      break;
    case "done":
      console.log("\n\nComplete. Tokens:", event.usage.totalTokens);
      break;
    case "error":
      console.error("Error:", event.message);
      break;
  }
}
```

Event Types

Event            Description
text_delta       A chunk of generated text. Concatenate to build the full response.
tool_call_start  Agent begins calling a tool. Includes the tool name and parameters.
tool_call_end    Tool execution completed. Includes the result object.
thinking         The agent's internal reasoning (if extended thinking is enabled).
done             Generation complete. Includes final usage statistics.
error            An error occurred during generation.
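The event types above form a natural discriminated union. As a sketch (the shapes below are inferred from this table and the examples on this page; the SDK's actual exported types may differ), you can model them in TypeScript and narrow on the `type` field:

```typescript
// Hypothetical event shapes inferred from the table above.
type StreamEvent =
  | { type: "text_delta"; text: string }
  | { type: "tool_call_start"; name: string; parameters: Record<string, unknown> }
  | { type: "tool_call_end"; name: string; result: unknown }
  | { type: "thinking"; text: string }
  | { type: "done"; usage: { inputTokens: number; outputTokens: number; totalTokens?: number } }
  | { type: "error"; message: string };

// Concatenate text_delta chunks into the full response text.
function collectText(events: StreamEvent[]): string {
  return events
    .filter((e): e is Extract<StreamEvent, { type: "text_delta" }> => e.type === "text_delta")
    .map((e) => e.text)
    .join("");
}
```

Narrowing on `event.type` this way gives you compile-time checks that each branch only touches fields that exist on that event.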

Server-Sent Events (SSE)

The REST API uses Server-Sent Events for streaming. Here is the raw SSE format:

SSE format
```
// POST /v1/agents/:id/execute with stream=true
// Returns text/event-stream

data: {"type":"text_delta","text":"I'd"}
data: {"type":"text_delta","text":" be"}
data: {"type":"text_delta","text":" happy"}
data: {"type":"text_delta","text":" to help"}
data: {"type":"tool_call_start","name":"lookup_order","parameters":{"orderId":"ORD-123"}}
data: {"type":"tool_call_end","name":"lookup_order","result":{"status":"shipped"}}
data: {"type":"text_delta","text":"Your order has been shipped!"}
data: {"type":"done","usage":{"inputTokens":142,"outputTokens":89}}
```
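If you consume the REST API directly rather than through the SDK, you need to parse these frames yourself. A minimal sketch (not the SDK's internal implementation): split the body on blank lines and JSON-parse each `data:` payload.

```typescript
// Minimal SSE frame parser for a fully buffered text/event-stream body.
// Each frame is separated by a blank line; the JSON payload follows "data:".
function parseSSE(raw: string): unknown[] {
  return raw
    .split("\n\n")
    .map((frame) => frame.trim())
    .filter((frame) => frame.startsWith("data:"))
    .map((frame) => JSON.parse(frame.slice("data:".length).trim()));
}
```

Note this sketch assumes the whole body is in memory; for incremental network chunks you must buffer partial frames across reads, since a chunk boundary can fall mid-frame.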

React Integration

Use streaming with React for real-time chat UIs:

React chat component
```typescript
import { useState, useCallback } from "react";
import { YourAutomation } from "@yourautomation/sdk";

const client = new YourAutomation({ apiKey: process.env.NEXT_PUBLIC_YA_KEY! });

function Chat({ agentId }: { agentId: string }) {
  const [messages, setMessages] = useState<string[]>([]);
  const [streaming, setStreaming] = useState("");

  const send = useCallback(async (input: string) => {
    setMessages((prev) => [...prev, input]);
    setStreaming("");

    const stream = await client.agents.execute(agentId, {
      input,
      stream: true,
    });

    let full = "";
    for await (const event of stream) {
      if (event.type === "text_delta") {
        full += event.text;
        setStreaming(full);
      }
      if (event.type === "done") {
        setMessages((prev) => [...prev, full]);
        setStreaming("");
      }
    }
  }, [agentId]);

  return (
    <div>
      {messages.map((m, i) => <p key={i}>{m}</p>)}
      {streaming && <p>{streaming}</p>}
    </div>
  );
}
```

Next.js API Route

Forward streaming responses through a Next.js API route to keep your API key server-side:

app/api/chat/route.ts
```typescript
import { YourAutomation } from "@yourautomation/sdk";

const client = new YourAutomation({ apiKey: process.env.YA_API_KEY! });

export async function POST(req: Request) {
  const { input, agentId, conversationId } = await req.json();

  const stream = await client.agents.execute(agentId, {
    input,
    conversationId,
    stream: true,
  });

  // Convert to a ReadableStream for the browser
  const encoder = new TextEncoder();
  const readable = new ReadableStream({
    async start(controller) {
      for await (const event of stream) {
        controller.enqueue(
          encoder.encode(`data: ${JSON.stringify(event)}\n\n`)
        );
      }
      controller.close();
    },
  });

  return new Response(readable, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    },
  });
}
```
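On the browser side, the route's response body arrives as a byte stream in which a network chunk can split an SSE frame. One way to handle this (a sketch, not part of the SDK) is an async generator that buffers partial frames and yields each parsed event:

```typescript
// Decode an SSE byte stream (e.g. the body returned by the route above)
// into parsed event objects. Partial frames are buffered across chunks.
async function* readEvents(body: ReadableStream<Uint8Array>): AsyncGenerator<unknown> {
  const decoder = new TextDecoder();
  const reader = body.getReader();
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // A complete frame ends with a blank line ("\n\n").
    let idx: number;
    while ((idx = buffer.indexOf("\n\n")) !== -1) {
      const frame = buffer.slice(0, idx).trim();
      buffer = buffer.slice(idx + 2);
      if (frame.startsWith("data:")) {
        yield JSON.parse(frame.slice("data:".length).trim());
      }
    }
  }
}
```

You would then call it with the `fetch` response body, e.g. `for await (const event of readEvents(res.body!)) { ... }` after POSTing to `/api/chat`.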

Backpressure & Cancellation

You can cancel a stream at any time using an AbortController:

Cancel streaming
```typescript
const controller = new AbortController();

const stream = await client.agents.execute(agent.id, {
  input: "Write a long essay about...",
  stream: true,
  signal: controller.signal,
});

// Cancel after 5 seconds
setTimeout(() => controller.abort(), 5000);

try {
  for await (const event of stream) {
    process.stdout.write(event.type === "text_delta" ? event.text : "");
  }
} catch (err) {
  if (err.name === "AbortError") {
    console.log("\nStream cancelled by user");
  }
}
```

Note

When a stream is cancelled, you are only charged for tokens generated up to the cancellation point.