Quick Start
Get ByteBrew Engine running with Docker in under 5 minutes. By the end of this guide, you will have a working AI agent that responds to messages over a REST API.
Step 1: Start the Engine
Download the Docker Compose file and start the engine.
```
curl -fsSL https://bytebrew.ai/releases/docker-compose.yml -o docker-compose.yml
```

Create a .env file in the same directory:
```
BYTEBREW_AUTH_MODE=local
LLM_API_KEY=sk-your-key
POSTGRES_PASSWORD=bytebrew
```

Then start the stack:

```
docker compose up -d
```

This spins up three containers: the ByteBrew Engine, a PostgreSQL database, and the db-migrate service that runs schema migrations. The engine listens on port 8443; both the REST API and the Admin Dashboard are served on this single port. Verify it is running:
```
curl http://localhost:8443/api/v1/health
# {"status":"ok","version":"1.0.0","agents_count":0}
```

Open the Admin Dashboard in your browser at http://localhost:8443/admin. You will see a login form; create your first admin account by entering your desired username and password.
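If you are scripting the setup, you can wait for the engine to come up before continuing. A minimal Python sketch, assuming only the health endpoint and response shape shown above (the helper names, polling interval, and timeout are illustrative, not part of ByteBrew's API):

```python
import json
import time
import urllib.request

def is_healthy(body: str) -> bool:
    """Return True if a /api/v1/health response body reports status ok."""
    payload = json.loads(body)
    return payload.get("status") == "ok"

def wait_for_engine(url: str = "http://localhost:8443/api/v1/health",
                    timeout: float = 60.0) -> bool:
    """Poll the health endpoint until it reports ok or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if is_healthy(resp.read().decode()):
                    return True
        except OSError:
            pass  # engine not up yet; retry
        time.sleep(2)
    return False
```

Call wait_for_engine() after docker compose up -d; it returns True once the health check passes.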
Step 2: Create your first agent
Create an agents.yaml file in the same directory as your docker-compose.yml. This file defines your agents and models:
```
agents:
  my-agent:
    model: glm-5
    system: "You are a helpful assistant for our product."

models:
  glm-5:
    provider: openai
    api_key: ${OPENAI_API_KEY}
```

Step 3: Send your first message
Before sending messages, enable chat on your schema in the Admin Dashboard: open Schemas, select your schema, toggle Chat Enabled on, and save. Then copy the schema ID from the URL or the schemas list.
Use the REST API to talk to your agent. The response streams back as Server-Sent Events (SSE), so you see tokens as they are generated.
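A chat request is a POST to the schema's chat endpoint with a bearer token and a JSON body. The sketch below only assembles those parts; the helper name and the BASE_URL constant are illustrative, not part of ByteBrew's API:

```python
import json
from typing import Dict, Optional, Tuple

BASE_URL = "http://localhost:8443"  # engine address from Step 1

def build_chat_request(schema_id: str, token: str, message: str,
                       session_id: Optional[str] = None) -> Tuple[str, Dict[str, str], str]:
    """Assemble the URL, headers, and JSON body for a chat request."""
    url = f"{BASE_URL}/api/v1/schemas/{schema_id}/chat"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = {"message": message}
    if session_id is not None:
        body["session_id"] = session_id  # continue an existing conversation
    return url, headers, json.dumps(body)
```

Pass the returned parts to any HTTP client that can stream the response body. With curl, the same request looks like this: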
```
curl -N http://localhost:8443/api/v1/schemas/{schema_id}/chat \
  -H "Authorization: Bearer bb_your_api_token" \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello, what can you do?"}'
```

Step 4: See the response
The engine returns a stream of SSE events. Each event has an event type that tells you what kind of data it contains:
```
event: message_delta
data: {"content":"Hello! I'm your product assistant. "}

event: message_delta
data: {"content":"I can help you with product questions, "}

event: message_delta
data: {"content":"documentation search, and more."}

event: message
data: {"content":"Hello! I'm your product assistant. I can help you with product questions, documentation search, and more."}

event: done
data: {"session_id":"a1b2c3d4"}
```

The session_id in the done event lets you continue the conversation. Pass it in subsequent requests to maintain context.
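If you are consuming the stream from code, events can be parsed by splitting on blank lines. A minimal Python sketch, assuming the exact event:/data: framing shown above (the function names are illustrative):

```python
import json

def parse_sse(stream: str):
    """Yield (event_type, data_dict) pairs from an SSE-formatted string."""
    for chunk in stream.strip().split("\n\n"):
        event_type, data = None, None
        for line in chunk.splitlines():
            if line.startswith("event: "):
                event_type = line[len("event: "):]
            elif line.startswith("data: "):
                data = json.loads(line[len("data: "):])
        if event_type is not None:
            yield event_type, data

def collect_reply(stream: str):
    """Assemble the full reply from message_delta events and grab the session id."""
    text, session_id = [], None
    for event_type, data in parse_sse(stream):
        if event_type == "message_delta":
            text.append(data["content"])
        elif event_type == "done":
            session_id = data["session_id"]
    return "".join(text), session_id
```

To continue the conversation, send the returned session id with your next message: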
```
curl -N http://localhost:8443/api/v1/schemas/{schema_id}/chat \
  -H "Authorization: Bearer bb_your_api_token" \
  -H "Content-Type: application/json" \
  -d '{"message": "Tell me more about that", "session_id": "a1b2c3d4"}'
```

Step 5: Open the Admin Dashboard
Navigate to http://localhost:8443/admin in your browser. Log in with the admin credentials you created in Step 1. From the dashboard you can manage agents, models, schemas, MCP servers, and API keys, all without editing YAML.
Connect your AI assistant
ByteBrew provides an MCP server with full documentation search. Connect it to your AI coding assistant for instant answers.
Claude Code
```
claude mcp add bytebrew-docs --transport sse https://mcp.bytebrew.ai/sse
```

OpenAI Codex
```
codex mcp add bytebrew-docs https://mcp.bytebrew.ai/sse
```

Other MCP clients
SSE endpoint: https://mcp.bytebrew.ai/sse. Use it with Cursor, Windsurf, or any MCP-compatible client.
What’s next
- Configuration Reference
- API Reference
- Bootstrap & GitOps — deploy a fully-configured installation from a git repository
- Core Concepts: Agents
- Core Concepts: Schemas