
Quick Start

Get ByteBrew Engine running with Docker in under 5 minutes. By the end of this guide, you will have a working AI agent that responds to messages over a REST API.

Download the Docker Compose file:

curl -fsSL https://bytebrew.ai/releases/docker-compose.yml -o docker-compose.yml

Create a .env file in the same directory:

.env
BYTEBREW_AUTH_MODE=local
LLM_API_KEY=sk-your-key
POSTGRES_PASSWORD=bytebrew
Then start the stack:
docker compose up -d

This spins up three containers: the ByteBrew Engine, a PostgreSQL database, and the db-migrate service that runs schema migrations. The engine starts on port 8443 — both the REST API and the Admin Dashboard are served on this single port. Verify it is running:

curl http://localhost:8443/api/v1/health
# {"status":"ok","version":"1.0.0","agents_count":0}
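In scripts and CI pipelines it is handy to wait for the engine to report healthy before proceeding. The sketch below assumes only the JSON payload shown above; parse_health and wait_for_health are illustrative helpers, not part of ByteBrew.

```python
import json
import time
import urllib.request

def parse_health(payload: str) -> bool:
    """Return True if the health payload reports status ok."""
    return json.loads(payload).get("status") == "ok"

def wait_for_health(url: str, attempts: int = 30, delay: float = 1.0) -> bool:
    """Poll the health endpoint until it reports ok or attempts run out."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if parse_health(resp.read().decode()):
                    return True
        except OSError:
            pass  # engine not up yet; retry after a short delay
        time.sleep(delay)
    return False

# The documented response parses as healthy:
print(parse_health('{"status":"ok","version":"1.0.0","agents_count":0}'))  # True
```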

Open the Admin Dashboard in your browser at http://localhost:8443/admin. You will see a login form — create your first admin account by entering your desired username and password.

Create an agents.yaml file in the same directory as your docker-compose.yml. This file defines your agents and models:

agents.yaml
agents:
  my-agent:
    model: glm-5
    system: "You are a helpful assistant for our product."
models:
  glm-5:
    provider: openai
    api_key: ${LLM_API_KEY}

Before sending messages, enable chat on your schema in the Admin Dashboard: open Schemas, select your schema, switch Chat Enabled on, and save. Then copy the schema ID from the URL or the schemas list; you will need it for the API calls below.

Use the REST API to talk to your agent, authenticating with an API token (you can create one under API Keys in the Admin Dashboard). The response streams back as Server-Sent Events (SSE), so you see tokens as they are generated:

curl -N http://localhost:8443/api/v1/schemas/{schema_id}/chat \
-H "Authorization: Bearer bb_your_api_token" \
-H "Content-Type: application/json" \
-d '{"message": "Hello, what can you do?"}'

The engine returns a stream of SSE events. The event name on each event: line tells you what kind of data follows:

event: message_delta
data: {"content":"Hello! I'm your product assistant. "}

event: message_delta
data: {"content":"I can help you with product questions, "}

event: message_delta
data: {"content":"documentation search, and more."}

event: message
data: {"content":"Hello! I'm your product assistant. I can help you with product questions, documentation search, and more."}

event: done
data: {"session_id":"a1b2c3d4"}
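Consuming the stream takes only a few lines of parsing. The sketch below follows the standard SSE wire format (event: and data: lines, events separated by blank lines) and reassembles the reply from a stream like the one above; parse_sse is an illustrative helper, not an official ByteBrew client.

```python
import json

def parse_sse(stream: str):
    """Yield (event_name, data_dict) pairs from raw SSE text."""
    event, data_lines = None, []
    for line in stream.splitlines() + [""]:  # trailing "" flushes the last event
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "" and event is not None:
            yield event, json.loads("\n".join(data_lines))
            event, data_lines = None, []

sample = """event: message_delta
data: {"content":"Hello! "}

event: message_delta
data: {"content":"How can I help?"}

event: done
data: {"session_id":"a1b2c3d4"}
"""

# Concatenate the deltas to reconstruct the full reply as it streams in.
reply = "".join(d["content"] for e, d in parse_sse(sample) if e == "message_delta")
print(reply)  # Hello! How can I help?
```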

The session_id in the done event lets you continue the conversation. Pass it in subsequent requests to maintain context:

curl -N http://localhost:8443/api/v1/schemas/{schema_id}/chat \
-H "Authorization: Bearer bb_your_api_token" \
-H "Content-Type: application/json" \
-d '{"message": "Tell me more about that", "session_id": "a1b2c3d4"}'
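In application code, continuing a conversation amounts to remembering the session_id from each done event and including it in the next request body. A minimal sketch (the ChatSession class and its methods are illustrative, not part of a ByteBrew SDK):

```python
class ChatSession:
    """Tracks the session_id so each request continues the same conversation."""

    def __init__(self):
        self.session_id = None

    def build_payload(self, message: str) -> dict:
        # The first request carries no session_id; later ones reuse the stored one.
        payload = {"message": message}
        if self.session_id is not None:
            payload["session_id"] = self.session_id
        return payload

    def handle_event(self, event: str, data: dict) -> None:
        # The done event carries the session_id to use on the next turn.
        if event == "done":
            self.session_id = data.get("session_id")

session = ChatSession()
print(session.build_payload("Hello, what can you do?"))
session.handle_event("done", {"session_id": "a1b2c3d4"})
print(session.build_payload("Tell me more about that"))
```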

Navigate to http://localhost:8443/admin in your browser. Log in with the admin credentials you created earlier. From the dashboard you can manage agents, models, schemas, MCP servers, and API keys, all without editing YAML.


ByteBrew provides an MCP server with full documentation search. Connect it to your AI coding assistant for instant answers.

Claude Code:
claude mcp add bytebrew-docs --transport sse https://mcp.bytebrew.ai/sse
Codex CLI:
codex mcp add bytebrew-docs https://mcp.bytebrew.ai/sse

SSE endpoint: https://mcp.bytebrew.ai/sse — use with Cursor, Windsurf, or any MCP-compatible client.
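For clients configured through a JSON file, such as Cursor (typically .cursor/mcp.json), the endpoint can be registered with a fragment like the following. The exact file location and schema depend on your client; this is a sketch, not ByteBrew-official configuration:

```json
{
  "mcpServers": {
    "bytebrew-docs": {
      "url": "https://mcp.bytebrew.ai/sse"
    }
  }
}
```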