
Configuration

The CLI stores its configuration in ~/.agent-to-bricks/config.yaml. The bricks config commands read and write this file, but you can also edit it directly.

Here’s a complete config file with every available option:

site:
  url: https://your-site.com
  api_key: atb_a3b7c9d2e8f4...
llm:
  provider: openai
  api_key: sk-proj-abc123...
  model: gpt-4o
  base_url: ""
  temperature: 0.3
Key           Description                                           Example
site.url      Your WordPress site URL, including the protocol       https://your-site.com
site.api_key  API key from Settings > Agent to Bricks in WordPress  atb_a3b7c9d2e8f4...

Key              Description                                                               Default
llm.provider     The AI provider to use                                                    (none)
llm.api_key      API key for the provider                                                  (none)
llm.model        Model name                                                                (none)
llm.base_url     Custom API endpoint URL                                                   "" (uses provider default)
llm.temperature  Controls randomness in generation (0.0 = deterministic, 1.0 = creative)   0.3

Use bricks config set to update individual keys:

Terminal window
bricks config set site.url https://your-site.com
bricks config set site.api_key atb_a3b7c9d2e8f4
bricks config set llm.provider anthropic
bricks config set llm.model claude-sonnet-4-20250514

To set llm.temperature, edit the config file directly at ~/.agent-to-bricks/config.yaml.

Or run the interactive wizard to set everything at once:

Terminal window
bricks config init

You can override any config value with an environment variable. This is useful for CI pipelines, Docker containers, or keeping API keys out of the config file.

The pattern is ATB_ followed by the config key in uppercase with underscores replacing dots:

Config key       Environment variable
site.url         ATB_SITE_URL
site.api_key     ATB_SITE_API_KEY
llm.provider     ATB_LLM_PROVIDER
llm.api_key      ATB_LLM_API_KEY
llm.model        ATB_LLM_MODEL
llm.base_url     ATB_LLM_BASE_URL
llm.temperature  ATB_LLM_TEMPERATURE
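Because the mapping is mechanical, you can derive any variable name in plain shell. This is just a sketch of the documented pattern, not a bricks subcommand:

```shell
# Derive the ATB_ environment variable name for a config key:
# uppercase the key and replace dots with underscores.
key="llm.base_url"
env_var="ATB_$(printf '%s' "$key" | tr '[:lower:]' '[:upper:]' | tr '.' '_')"
echo "$env_var"   # ATB_LLM_BASE_URL
```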

Environment variables take precedence over the config file. Example:

Terminal window
ATB_SITE_URL=https://staging.your-site.com \
ATB_SITE_API_KEY=atb_staging_key_here \
bricks site info

This connects to your staging site without modifying config.yaml.
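One way to picture the precedence rule is shell parameter expansion: the environment value is used when set, and the file value otherwise. The `config_site_url` variable below is a stand-in for whatever config.yaml contains; it is not how the CLI is implemented internally:

```shell
# config_site_url stands in for the value parsed from config.yaml.
config_site_url="https://your-site.com"

# Environment wins when set; otherwise fall back to the config-file value.
site_url="${ATB_SITE_URL:-$config_site_url}"
echo "$site_url"   # the staging URL if ATB_SITE_URL is exported, else https://your-site.com
```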

The CLI works with OpenAI, Anthropic, Cerebras, and any provider that exposes an OpenAI-compatible API.

Terminal window
bricks config set llm.provider openai
bricks config set llm.api_key sk-proj-abc123...
bricks config set llm.model gpt-4o

Other OpenAI models that work well: gpt-4o-mini (faster, cheaper), gpt-4-turbo.

Terminal window
bricks config set llm.provider anthropic
bricks config set llm.api_key sk-ant-api03-abc123...
bricks config set llm.model claude-sonnet-4-20250514

Anthropic models use a different API format than OpenAI. The CLI handles the translation automatically when you set the provider to anthropic.

Terminal window
bricks config set llm.provider cerebras
bricks config set llm.api_key csk-abc123...
bricks config set llm.model llama-4-scout-17b-16e-instruct

Cerebras runs open-source models on custom hardware. Response times tend to be fast, which makes it a good option for iterating quickly.

Any provider with an OpenAI-compatible API works. Set the provider to openai and point base_url at your endpoint:

Terminal window
bricks config set llm.provider openai
bricks config set llm.base_url http://localhost:11434/v1
bricks config set llm.api_key ollama
bricks config set llm.model llama3.1

This works with Ollama, LM Studio, vLLM, and similar tools. The api_key field still needs a value even if your local server doesn’t require authentication — just set it to any non-empty string.
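The four config set commands above are equivalent to this llm section in config.yaml (same placeholder values), if you prefer to edit the file directly:

```yaml
llm:
  provider: openai
  base_url: http://localhost:11434/v1
  api_key: ollama    # any non-empty string works for local servers
  model: llama3.1
```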

Terminal window
# Together AI
bricks config set llm.provider openai
bricks config set llm.base_url https://api.together.xyz/v1
bricks config set llm.api_key your-together-key
bricks config set llm.model meta-llama/Llama-3-70b-chat-hf

Terminal window
# OpenRouter
bricks config set llm.provider openai
bricks config set llm.base_url https://openrouter.ai/api/v1
bricks config set llm.api_key your-openrouter-key
bricks config set llm.model anthropic/claude-sonnet-4-20250514

The temperature setting controls how creative or predictable the AI output is:

  • 0.0–0.2: Very consistent output. Good for structured tasks like converting a specific layout description into HTML.
  • 0.3–0.5: Balanced. The default of 0.3 works well for most generation tasks.
  • 0.6–1.0: More varied output. Can produce more interesting copy but may also introduce unexpected structural choices.

To change the temperature, edit ~/.agent-to-bricks/config.yaml directly:

llm:
  temperature: 0.2

The config file supports a single site by default. To work with multiple sites, use environment variables to switch between them:

Terminal window
# Production
ATB_SITE_URL=https://your-site.com \
ATB_SITE_API_KEY=atb_prod_key_here \
bricks site info
# Staging
ATB_SITE_URL=https://staging.your-site.com \
ATB_SITE_API_KEY=atb_staging_key_here \
bricks site info

For convenience, wrap these in shell aliases or a small script:

Terminal window
# In your .bashrc or .zshrc
alias bricks-prod='ATB_SITE_URL=https://your-site.com ATB_SITE_API_KEY=atb_prod_key_here bricks'
alias bricks-staging='ATB_SITE_URL=https://staging.your-site.com ATB_SITE_API_KEY=atb_staging_key_here bricks'

Then use them like any other command:

Terminal window
bricks-staging site info
bricks-prod site pull 1460

The config file lives at ~/.agent-to-bricks/config.yaml, where ~ is your home directory on every platform:

Platform     Path
Mac / Linux  ~/.agent-to-bricks/config.yaml
Windows      C:\Users\YourName\.agent-to-bricks\config.yaml

The CLI creates this directory and file automatically the first time you run bricks config init or bricks config set.

To see what’s currently configured:

Terminal window
bricks config list

This prints the active configuration with API keys partially redacted.