# opencode-bridge — Claude Code Plugin

by Musab Kara

Claude Code plugin for integrating with OpenCode -- providing access to both Zen (free cloud models) and Ollama (local models).

## Install

```bash
bash <(curl -fsSL https://raw.githubusercontent.com/SkyWalker2506/claude-marketplace/main/install.sh) opencode-bridge
```

Or via the Claude Code native marketplace:

```bash
claude plugin install opencode-bridge@musabkara-claude-marketplace
```

## What is this?

When your Claude Code quota runs out or you need a free or local alternative, OpenCode bridges the gap:

- **Zen (cloud):** free models such as `gpt-5-nano`, `minimax-m2.1-free`, `glm-4.7-free`, `kimi-k2.5-free`, and `big-pickle` -- no GPU required; runs via OpenCode's Zen cloud service
- **Ollama (local):** fully local models such as `qwen2.5-coder:7b` and `gemma3:4b` -- no API key and no internet connection needed; runs on your machine

Both providers are configured in a single `~/.config/opencode/opencode.json` file and can be switched on the fly.
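Switching the default model by hand is just a JSON edit. The helper below is a sketch, not part of the plugin (`set_opencode_model` and the `OPENCODE_CONFIG` override are our own names); it rewrites the top-level `model` key using python3 so no extra tools are needed:

```shell
# Hypothetical helper (not part of the plugin): point the top-level
# "model" key at a different provider/model pair.
set_opencode_model() {
  local cfg="${OPENCODE_CONFIG:-$HOME/.config/opencode/opencode.json}"
  python3 - "$1" "$cfg" <<'EOF'
import json, sys

model, path = sys.argv[1], sys.argv[2]
with open(path) as f:
    cfg = json.load(f)
cfg["model"] = model  # the key the config file uses for the default model
with open(path, "w") as f:
    json.dump(cfg, f, indent=2)
print(f"default model -> {model}")
EOF
}
```

For example, `set_opencode_model ollama/qwen2.5-coder:7b` would switch to the local model and `set_opencode_model opencode/gpt-5-nano` back to Zen.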

## Commands

| Command | Description |
| --- | --- |
| `/opencode` | Show current OpenCode setup status |
| `/opencode zen` | Connect to Zen cloud and configure free models |
| `/opencode local` | Switch to an Ollama local model |
| `/opencode models` | List all available models (cloud + local) |
| `/opencode pull <model>` | Pull an Ollama model |
| `/opencode config` | Show or edit the OpenCode configuration |
| `/opencode install` | Install or update the OpenCode CLI |

## Setup

### 1. Install the OpenCode CLI

```bash
npm install -g opencode-ai
```

### 2. Install Ollama (for local models)

```bash
# macOS
brew install ollama

# Then pull a model
ollama pull qwen2.5-coder:7b
ollama pull gemma3:4b
```
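Before pulling, it can help to confirm the Ollama daemon is actually reachable. A minimal check (the function name is ours; `localhost:11434` is Ollama's default address and `/api/tags` its model-listing endpoint):

```shell
# Sketch: check that the ollama CLI exists and its server answers on
# the default port before attempting a pull.
ollama_ready() {
  if ! command -v ollama >/dev/null 2>&1; then
    echo "ollama not installed (try: brew install ollama)"
    return 1
  fi
  if ! curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
    echo "ollama server not responding (try: ollama serve)"
    return 1
  fi
  echo "ollama ready"
}
```

Then `ollama_ready && ollama pull qwen2.5-coder:7b` only pulls when the server is up.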

### 3. Configure Zen (for free cloud models)

```bash
# Option A: use the TUI
opencode
# Then: /connect -> select "opencode" -> paste your API key from https://opencode.ai/auth

# Option B: set an environment variable
export OPENCODE_ZEN_API_KEY='sk-...'  # add to ~/.zshrc
```
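If you go the environment-variable route, a small guard can fail fast instead of producing an auth error mid-session. This is a sketch (`require_zen_key` is our name, not a plugin command); the variable name matches the config below:

```shell
# Sketch: refuse to proceed when the Zen API key is missing.
require_zen_key() {
  if [ -z "${OPENCODE_ZEN_API_KEY:-}" ]; then
    echo "OPENCODE_ZEN_API_KEY is not set; get a key at https://opencode.ai/auth" >&2
    return 1
  fi
  echo "Zen key present"
}
```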

### 4. Configuration file

The plugin uses `~/.config/opencode/opencode.json`:

```json
{
  "enabled_providers": ["opencode", "ollama"],
  "model": "opencode/gpt-5-nano",
  "small_model": "opencode/gpt-5-nano",
  "provider": {
    "opencode": {
      "options": { "apiKey": "{env:OPENCODE_ZEN_API_KEY}" },
      "models": {
        "gpt-5-nano": { "name": "Zen -- GPT-5 Nano (free tier)" },
        "minimax-m2.1-free": { "name": "Zen -- MiniMax M2.1 (free, limited)" },
        "glm-4.7-free": { "name": "Zen -- GLM 4.7 (free, limited)" }
      }
    },
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": { "baseURL": "http://localhost:11434/v1" },
      "models": {
        "qwen2.5-coder:7b": { "name": "Qwen 2.5 Coder 7B (local)" }
      }
    }
  }
}
```
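After hand-editing the file, a quick sanity check catches JSON typos early. A sketch (the `check_opencode_config` helper is ours, not a plugin command):

```shell
# Sketch: confirm the config parses as JSON and names at least one
# provider, printing the default model on success.
check_opencode_config() {
  python3 - "${1:-$HOME/.config/opencode/opencode.json}" <<'EOF'
import json, sys

with open(sys.argv[1]) as f:
    cfg = json.load(f)  # raises on malformed JSON
assert cfg.get("enabled_providers"), "no enabled_providers configured"
print("OK, default model:", cfg.get("model", "(unset)"))
EOF
}
```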

## Shell Aliases

Run `install.sh` to add these aliases directly to your shell rc (no claude-config required):

```bash
bash install.sh
source ~/.zshrc  # or ~/.bashrc
```

| Alias | What it does |
| --- | --- |
| `claude-free` | Opens OpenCode with the Zen model (`opencode/gpt-5-nano`) |
| `claude-local` | Opens OpenCode with the Ollama model (`ollama/qwen2.5-coder:7b`) |
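For reference, the aliases amount to something like the lines below. This is an illustration, not the script's exact contents, and whether your `opencode` version accepts a `--model` flag is an assumption worth verifying with `opencode --help`:

```shell
# Illustrative only -- the real lines are written by install.sh.
# The --model flag is an assumption; verify with `opencode --help`.
alias claude-free='opencode --model opencode/gpt-5-nano'
alias claude-local='opencode --model ollama/qwen2.5-coder:7b'
```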

## Available Free Models

### Zen Cloud (via OpenCode)

| Model | Notes |
| --- | --- |
| `opencode/gpt-5-nano` | Free tier (default) |
| `opencode/minimax-m2.1-free` | Free, limited time |
| `opencode/glm-4.7-free` | Free, limited time |
| `opencode/kimi-k2.5-free` | Free, limited time |
| `opencode/big-pickle` | Free, limited time |

### Ollama Local

| Model | Size | Notes |
| --- | --- | --- |
| `qwen2.5-coder:7b` | ~4.7 GB | Best for coding tasks |
| `gemma3:4b` | ~3.3 GB | Lightweight, general purpose |
| `qwen2.5-coder:14b` | ~9.0 GB | Higher quality, needs more RAM |
| `gemma3:12b` | ~8.1 GB | Larger Gemma, better quality |

Pull any model with `ollama pull <model>`.

## License

MIT
