
OpenCode Agent

OpenCode is a multi-provider coding agent with native Language Server Protocol (LSP) support. LSP integration provides accurate code intelligence features like go-to-definition, find references, and hover information.

Prerequisites

  1. Go: Version 1.21 or higher (for installation)
  2. API Keys: For your chosen AI provider (Anthropic, OpenAI, etc.)

Installation

See the official OpenCode documentation for installation instructions.

After installation, verify it's working:

opencode --version

Configuration

Basic Configuration

# ~/.bloom/config.yaml
agent:
  defaultInteractive: opencode
  defaultNonInteractive: opencode

opencode:
  defaultModel: anthropic/claude-sonnet-4 # REQUIRED for streaming mode
  models:
    - anthropic/claude-sonnet-4
    - openai/gpt-4o
    - github-copilot/claude-sonnet-4

Model Selection (Required)

OpenCode requires explicit model specification in non-interactive mode. Use the provider/model format:

agent:
  defaultNonInteractive: opencode

opencode:
  defaultModel: anthropic/claude-sonnet-4 # REQUIRED

Configuration Commands

# Set opencode as default
bloom config set-interactive opencode
bloom config set-noninteractive opencode

# Set default model (required for streaming mode)
bloom config set-model opencode anthropic/claude-sonnet-4

# Discover available models from opencode CLI
bloom config models opencode --discover

# Discover and save to config
bloom config models opencode -d -s

Available Models

To see available models for your configured providers:

# Via bloom
bloom config models opencode --discover

# Or directly via opencode
opencode models

Models use the provider/model format (e.g., anthropic/claude-sonnet-4, openai/gpt-4o).
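Because the provider prefix is significant, scripts that handle model strings often need to split the two halves. The snippet below is an illustrative helper using standard POSIX parameter expansion, not part of either CLI:

```shell
# Split a provider/model string into its parts (illustrative helper).
model="anthropic/claude-sonnet-4"
provider="${model%%/*}"   # everything before the first slash
name="${model#*/}"        # everything after the first slash
echo "$provider $name"    # prints: anthropic claude-sonnet-4
```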

Capabilities

| Capability      | Supported | Notes                             |
| --------------- | --------- | --------------------------------- |
| File Read       | Yes       | Read files in working directory   |
| File Write      | Yes       | Create and modify files           |
| Bash/Terminal   | Yes       | Execute shell commands            |
| Git Operations  | Yes       | Full git support                  |
| Web Search      | No        | Not supported                     |
| Web Fetch       | No        | Not supported                     |
| Session Resume  | Yes       | Via -s <session_id> or -c flag    |
| LSP Integration | Yes       | Native LSP for code intelligence  |
| Human Questions | No        | Runs to completion                |

Unique Features

Native LSP Support

OpenCode has built-in Language Server Protocol support, providing:

  • Go-to-Definition: Navigate to symbol definitions
  • Find References: Locate all usages of a symbol
  • Hover Information: Get type info and documentation
  • Diagnostics: Real-time error and warning detection
  • Code Actions: Suggested fixes and refactors

This means OpenCode can:

  • Understand code structure more accurately
  • Navigate large codebases efficiently
  • Provide precise code modifications

Multi-Provider Support

Use models from different providers through a unified interface:

# Anthropic
opencode -m anthropic/claude-sonnet-4 "task"

# OpenAI
opencode -m openai/gpt-4o "task"

Session Export/Import

Export sessions for debugging or sharing:

# Export session
opencode session export <session_id> > session.json

# Import session
opencode session import < session.json

Provider-Specific Options

| Option      | Type    | Default  | Description                    |
| ----------- | ------- | -------- | ------------------------------ |
| autoApprove | boolean | true     | Auto-approve all tool calls    |
| model       | string  | REQUIRED | Model in provider/model format |

Auto-Approve Configuration

Bloom sets autoApprove: true via the OPENCODE_CONFIG_CONTENT environment variable:

{
  "permission": {
    "*": "allow"
  }
}
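To reproduce this behavior outside Bloom, for example while debugging a run by hand, you can export the same variable yourself. The JSON mirrors the permission block above; the model and prompt are placeholders:

```shell
# Reproduce Bloom's auto-approve setup manually (same JSON as above).
export OPENCODE_CONFIG_CONTENT='{"permission":{"*":"allow"}}'
opencode run --format json -m anthropic/claude-sonnet-4 "task"
```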

CLI Flags Reference

When Bloom runs OpenCode:

# Interactive mode
opencode --prompt "system + user prompt" -m provider/model

# Streaming mode (autonomous)
opencode run --format json -m provider/model "prompt"

# Resume session
opencode run --format json -s <session_id> -m provider/model "prompt"
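When post-processing the streamed output, it helps to drop any non-JSON lines (warnings, progress noise) before parsing. The filter below is a sketch that makes no assumption about opencode's event schema, only that each event is one JSON object per line:

```shell
# Pass through only lines that parse as JSON (schema-agnostic filter sketch).
json_lines_only() {
  while IFS= read -r line; do
    if printf '%s' "$line" | python3 -c 'import json,sys; json.loads(sys.stdin.read())' 2>/dev/null; then
      printf '%s\n' "$line"
    fi
  done
}
```

Usage: `opencode run --format json -m provider/model "prompt" | json_lines_only`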

Troubleshooting

"opencode: command not found"

Cause: OpenCode not installed or not in PATH

Solution: Install OpenCode using the official docs and ensure the binary is in your PATH.

"Model selection is REQUIRED"

Cause: No model specified in streaming mode

Solution: OpenCode requires explicit model selection in non-interactive mode:

# ~/.bloom/config.yaml
agent:
  defaultNonInteractive: opencode

opencode:
  defaultModel: anthropic/claude-sonnet-4 # Add this

Or via command line:

bloom config set-model opencode anthropic/claude-sonnet-4

"Invalid model format"

Cause: Model not in provider/model format

Solution: Use the correct format:

  • Incorrect: claude-sonnet-4 (no provider prefix)
  • Correct: anthropic/claude-sonnet-4
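A quick way to catch the bad format before launching a run is a regex check. The pattern below is an approximation of valid provider/model strings, not an official grammar:

```shell
# Approximate check that a model string has a provider/ prefix.
model="anthropic/claude-sonnet-4"
if printf '%s\n' "$model" | grep -Eq '^[A-Za-z0-9_-]+/[A-Za-z0-9._/-]+$'; then
  echo "format looks OK"
else
  echo "missing provider/ prefix" >&2
fi
```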

"Authentication failed" or "Invalid API key"

Cause: Missing or invalid API key for the provider

Solution:

# For Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."

# For OpenAI
export OPENAI_API_KEY="sk-..."

# Verify the key is set
echo $ANTHROPIC_API_KEY

"Provider not found"

Cause: Unknown provider name

Solution: Run opencode models to see models for your configured providers.

LSP Features Not Working

Cause: Language server not available for the file type

Solution:

  • Install the relevant language server
  • Ensure it's in PATH
  • Check OpenCode logs for LSP errors

Best Practices

For Code-Heavy Work

  1. Use OpenCode when doing significant refactoring
  2. Rely on its LSP integration for accurate code understanding
  3. Prefer it when navigating large codebases

Model Selection

Run opencode models to see models for your configured providers. Models are specified in provider/model format.

Session Management

# Continue most recent session
opencode -c "continue working"

# Resume specific session
opencode -s abc123 "continue"

# Export for debugging
opencode session export <id> > debug.json

Example Session

# Set up API key
export ANTHROPIC_API_KEY="sk-ant-..."

# Run autonomous task (model required)
bloom run
# OpenCode will:
# 1. Use LSP for code intelligence
# 2. Execute tasks with provider model
# 3. Auto-approve tool calls
# 4. Stream JSON output for monitoring

Comparison with Other Agents

| Aspect          | OpenCode | Claude | Goose |
| --------------- | -------- | ------ | ----- |
| LSP Support     | Yes      | No     | No    |
| Web Search      | No       | Yes    | No    |
| Human Questions | No       | Yes    | Yes   |
| Multi-Provider  | Yes      | No     | Yes   |
| MCP Extensions  | No       | No     | Yes   |

Use OpenCode when:

  • Need precise code intelligence via LSP
  • Doing significant refactoring
  • Want multi-provider flexibility
  • Working with large codebases

Use Claude when:

  • Need web search capabilities
  • Want human-in-the-loop features
  • Prefer TodoWrite progress tracking

Use Goose when:

  • Want extensibility via MCP
  • Need scheduled automation
  • Prefer open-source, local execution