
How to Set Up OpenCode with Ollama (Step-by-Step)

5 min read
#AI

OpenCode is an open-source AI coding agent. In this guide I am installing it with Ollama on an Ubuntu system with a GPU, using the Qwen 3 Coder 30B model, but you can use any model of your choice.

Ollama is one of the easiest tools for running models locally. If you are starting out, I think it is a good option.

How-To Install OpenCode: Installation and Setup

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 3

You can use OpenCode as a CLI, as a TUI (terminal user interface), or as a VS Code extension. I am going with the TUI.

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 4

  • Install OpenCode by running the shell install command (a sketch of the full sequence follows this list).
  • Add it to your PATH.
  • Reload your shell configuration.
  • Run opencode.
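A typical sequence looks like the sketch below. The install script URL and the ~/.opencode/bin path are the defaults as far as I know; adjust them if your install places the binary somewhere else.

```bash
# Install OpenCode via the official install script
# (it can also be installed through npm or a package manager)
curl -fsSL https://opencode.ai/install | bash

# Add the install location to PATH if the script did not do it for you
# (assuming the default ~/.opencode/bin location)
echo 'export PATH="$HOME/.opencode/bin:$PATH"' >> ~/.bashrc

# Reload your shell configuration
source ~/.bashrc

# Start the TUI
opencode
```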

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 5

How-To Install OpenCode: Initialize a Project

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 6

The first step is to run /init. It initializes the existing project in whatever directory you are in. I am already in a directory with an app.py file.

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 7

  • /init creates a file called AGENTS.md with context about your project. In my case it has already created that AGENTS.md file (see the sketch below).
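In practice the flow is roughly this; the project path is hypothetical, and /init is typed inside the TUI:

```bash
cd ~/projects/my-app   # hypothetical project directory that already contains app.py
opencode               # start the OpenCode TUI in that directory

# inside the TUI, run:
#   /init
# it scans the project and writes an AGENTS.md file with context about the codebase
```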

How-To Install OpenCode: Connect Ollama as a Provider

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 9

The next step is to connect a provider, which in this case means connecting Ollama.

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 10

  • Run /connect to see all available providers.
  • Under Others, there is no local Ollama entry. Ollama Cloud is listed, but that is a paid option.
  • So we create an Ollama provider manually.

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 11

Create the Ollama provider configuration

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 12

Go to your OpenCode config directory, open opencode.json in your editor, and add a new provider entry for local Ollama.

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 13

What this configuration does (a sketch follows this list):

  • It defines a new provider for Ollama.
  • The provider name is ollama-local.
  • It points OpenCode to the URL where your Ollama service is running.
  • It sets the model name. I am using Qwen 3 Coder 30B.
  • Use a model with at least an 8K context window; 32K is even better.
  • The model should support tool use so the agent can call external functionality.
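Putting those points together, here is a sketch of what the provider block can look like. It assumes Ollama's OpenAI-compatible endpoint on the default port 11434 and the qwen3-coder:30b tag from the Ollama library; check the OpenCode provider docs for the exact schema.

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama-local": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "qwen3-coder:30b": {
          "name": "Qwen 3 Coder 30B"
        }
      }
    }
  }
}
```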

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 14

If you do not already have the model, pull it by name in Ollama. After that, start OpenCode again.
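For example, with the model I am using (the tag must match whatever you put in opencode.json):

```bash
# Pull the model into the local Ollama library
ollama pull qwen3-coder:30b

# Optionally confirm it is available
ollama list

# Then restart OpenCode so it picks up the new provider configuration
opencode
```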

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 15

Connect to the Ollama provider

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 16

  • Run /connect, then type ollama to filter the list.
  • Select the new provider you created.
  • Select your model. I am selecting Qwen 3 Coder 30B.
  • You can also define multiple models under the provider in the configuration file, as in the sketch after this list.
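For instance, the models block of the ollama-local provider could hold two entries; both tags are assumed to already be pulled in Ollama, and both will then show up in the model picker:

```json
"models": {
  "qwen3-coder:30b": { "name": "Qwen 3 Coder 30B" },
  "llama3.1:8b": { "name": "Llama 3.1 8B" }
}
```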

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 17

How-To Install OpenCode: First Interactions

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 18

You can ask a general question to confirm things are working, for example: What is a Python decorator? It will think and give you the answer.

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 19

From here you can start talking with it about your codebase. You can debug your code and add new features.

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 20

Planning and Build modes

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 21

  • Press Tab to toggle between the two modes: Plan and Build.
  • In Plan mode, it does not execute changes; it only shows you the plan.
  • Switch back to Build mode when you want the changes applied.

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 22

Commands

  • Press Ctrl+P to see all commands.
  • Examples include rename, session, switch, and more.
  • Press Escape to exit the command list.

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 23

Codebase operations

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 24

  • You can chat with your codebase, ask it to describe files, review code, and add features.
  • Example: describe a file by referencing it, such as @app.py. It will explore, read, and describe the file.
  • Ask it to optimize code in a file like app.py. It will review and optimize it.
  • On the right side there is a to-do list. It tracks the steps and ticks them off as it proceeds.

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 25

Protocol and skills support

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 26

  • LSP (Language Server Protocol) support for different languages.
  • MCP (Model Context Protocol) support (a configuration sketch follows this list).
  • Agent skills, which give your AI coding agents consistent, reusable expertise.
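As an illustration of the MCP side, an entry in opencode.json can look roughly like the sketch below. The server name and command are hypothetical (here, the filesystem reference server run via npx), and the exact field names should be checked against the OpenCode docs.

```json
{
  "mcp": {
    "filesystem": {
      "type": "local",
      "command": ["npx", "-y", "@modelcontextprotocol/server-filesystem", "."],
      "enabled": true
    }
  }
}
```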

How-To Install OpenCode: Model Notes for Production

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 27

If you are looking to do this sort of agentic coding in a production environment, then as of late December 2025 I would say Opus 4.5 is one of the best models for pair programming, if you can afford it. I do not know what will happen next week, but that is where things stand today.

Set Up OpenCode with Ollama on Ubuntu (Step-by-Step) screenshot 28

Final Thoughts

  • OpenCode installs cleanly, runs in a TUI, and initializes your project with /init.
  • Add Ollama as a custom provider via opencode.json, pick a suitable model with a large context window and tool use support, and connect.
  • Use Plan vs Build modes, the command palette, and codebase-aware features to describe, review, and modify files.
  • LSP, MCP, and agent skills expand language support and repeatable expertise.