OpenCode is an AI coding assistant. It ships a desktop app, IDE plugins, and a CLI that runs in your terminal, which makes it a great fit for agentic tasks. Here's how to set it up.
Install it
The fastest way is the one-liner installer:
curl -fsSL https://opencode.ai/install | bash
If you prefer a package manager, it's on npm, Homebrew, and several Linux package managers:
# npm / bun / pnpm / yarn
npm i -g opencode-ai@latest
# Homebrew (macOS or Linux)
brew install anomalyco/tap/opencode
# Arch Linux
sudo pacman -S opencode
Windows users can reach it via Chocolatey (choco install opencode) or Scoop (scoop install opencode).
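Whichever route you take, it's worth confirming the binary actually landed on your PATH before moving on. A quick sketch (the --version flag is an assumption; fall back to opencode --help if it isn't recognized):

```shell
# Confirm the install: `command -v` checks PATH without running anything
if command -v opencode >/dev/null 2>&1; then
  opencode --version   # assumed flag; try `opencode --help` if unrecognized
else
  echo "opencode not on PATH yet; open a fresh shell or re-check the installer output"
fi
```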
Start it
Run opencode in any project directory to open the terminal interface:
opencode
Type / to see available commands — connecting a provider, managing sessions, and so on.
Use the web UI
If you're working inside a dev container or on a remote machine, opencode web is the command you want. It starts a local server and opens a browser with a web interface: session management, a visual diff viewer, model selection, and cost tracking.
opencode web
By default, it binds to localhost. To reach it from outside the container or over the network, set the hostname:
opencode web --hostname 0.0.0.0 --port 4242
You can also lock it down with a password via the OPENCODE_SERVER_PASSWORD environment variable (HTTP basic auth, username opencode).
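With the password set, any HTTP client talking to the server has to send basic-auth credentials for the fixed username opencode. A sketch of what that looks like on the wire (the password value here is a made-up example):

```shell
# Example password; in practice this is whatever you exported
# as OPENCODE_SERVER_PASSWORD before starting `opencode web`.
OPENCODE_SERVER_PASSWORD='s3cret'

# HTTP basic auth: base64 of "<username>:<password>", username fixed to "opencode"
printf 'Authorization: Basic %s\n' \
  "$(printf 'opencode:%s' "$OPENCODE_SERVER_PASSWORD" | base64)"
# → Authorization: Basic b3BlbmNvZGU6czNjcmV0

# With curl you'd normally let -u build this header for you:
#   curl -u "opencode:$OPENCODE_SERVER_PASSWORD" http://localhost:4242/
```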
One nice trick: if you want both the browser UI and the terminal TUI running against the same session, start the server first and then attach from another shell:
opencode attach http://localhost:4242
Connect to GitHub Copilot
GitHub Copilot became a first-class OpenCode provider in early 2026. If you already have Copilot running through the gh CLI or VS Code, OpenCode detects the token automatically. Just start it and pick Copilot when prompted.
If you need to connect from scratch, type /connect inside OpenCode, select GitHub Copilot, and follow the device login flow. Any paid Copilot tier (Pro, Pro+, Business, or Enterprise) works with no extra subscription.
If, like me, you're on GitHub Enterprise, you can set your GHE instance endpoint during the connect flow.
Extend it with skills
Skills are reusable instruction packs that teach OpenCode new workflows. Because skills are an open standard, they work across many agentic harnesses, OpenCode included.
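For a sense of what a skill actually is: under the open skills format, a skill is a directory containing a SKILL.md file whose YAML frontmatter tells the agent what the skill is for and when to load it, followed by the instructions themselves. A minimal sketch (the skill name and steps are made up for illustration):

```markdown
---
name: changelog-writer
description: Use when the user asks to draft or update a CHANGELOG entry
---

# Changelog writer

1. Read the diff of the current branch against main.
2. Group the changes into Added / Changed / Fixed.
3. Write the entry in Keep a Changelog style.
```

The frontmatter description is what the agent reads up front; the body is only pulled into context when the skill is triggered, which is what keeps skills cheap.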
There are skill directories online; skills.sh is one of them. The skills CLI from Vercel Labs manages skills across OpenCode, Claude Code, Cursor, Codex, and 40+ other agents.
Install a skill pack with:
npx skills add <github-org/repo>
You can browse what's in a pack before committing:
npx skills add vercel-labs/agent-skills --list
Install just one skill, or target specific agents:
# One skill only
npx skills add vercel-labs/agent-skills --skill frontend-design
# Target OpenCode specifically
npx skills add vercel-labs/agent-skills -a opencode
# Install globally (available across all projects)
npx skills add vercel-labs/agent-skills -g
Skills land in ./<agent>/skills/ per project, or ~/<agent>/skills/ when installed globally.
A few skills worth installing:
- agent-browser — Gives the agent a browser: navigate pages, fill forms, take screenshots, and extract data from any site
- find-skills — Discovers and installs skills from the open ecosystem when you need to extend the agent's capabilities
- create-skill — Helps you create and refine effective skills to automate repetitive tasks without filling your context window
Manage your installed skills with the usual suspects: npx skills list, npx skills update, and npx skills remove.
Check your quota
The built-in stats command shows token usage and cost across sessions:
opencode stats
Narrow it down with flags:
# Last 7 days, broken down by model
opencode stats --days 7 --models
# Filter by project
opencode stats --project my-app
That's the setup. From here it's mostly about building habits — good skills, a provider you trust, and keeping an eye on what you're spending. OpenCode moves fast, so it's worth checking the changelog every few weeks; the gap between what it could do when I set it up and what it can do now is already pretty wide.