Claude Code × Ollama × Termux

Install Claude Code in Termux Using Ollama


A lightweight setup to turn your Android device into an AI-powered coding environment using Termux, Ollama, and Claude Code.

termux ~ claude-code
$ ollama serve
Ollama running on :11434
$ claude --model kimi-k2.5:cloud
Welcome to Claude Code
Model: kimi-k2.5:cloud
Backend: Ollama :11434
> How do I build a REST API in Node.js?
6 simple steps · Zero API cost · 4GB+ RAM needed
Why This Setup

Key Features

Everything you need to run AI coding on Android.

Runs on Android

Full AI coding environment inside Termux — no root, no PC required.

Ollama Backend

Ollama acts as the local LLM server. Claude Code talks to it on port 11434.

Kimi K2.5 Cloud

Uses the powerful Kimi K2.5 cloud model — smart, fast, and free to pull.

Claude Code CLI

Anthropic's official coding CLI, installed via npm and pointed at Ollama.

6-Step Setup

Copy-paste commands. Up and running in under 10 minutes.

Zero API Cost

No Anthropic API key needed. Claude Code is pointed at your local Ollama server with a placeholder token instead.

Installation

Setup Guide

Copy, paste, done. Get running in minutes on your Android phone.

Requirements

  • Android device with 4GB+ RAM
  • Stable internet connection
  • Termux (from F-Droid)
Part 1 — Install Ollama (Termux)
01

Update & Install Dependencies

Update Termux packages and install required dependencies.

termux
$ pkg update && pkg upgrade -y
$ pkg install git nodejs-lts python -y
$ pkg install ollama
02

Start Ollama Server

Launch the Ollama LLM server on port 11434.

termux
$ ollama serve
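`ollama serve` occupies the session it runs in. The simplest option is to open a second Termux session for it; alternatively, here is a sketch that pushes it to the background and keeps a log (the log path `~/ollama.log` is just a convention, not required):

```shell
# Start the server in the background and capture its output in a log.
ollama serve > "$HOME/ollama.log" 2>&1 &
sleep 2                        # give the server a moment to start
tail -n 1 "$HOME/ollama.log"   # last log line for a quick look
```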
03

Download Kimi Model

Pull the Kimi K2.5 cloud model. Requires strong internet connection.

termux
$ ollama pull kimi-k2.5:cloud
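Once the pull finishes, the model should show up in `ollama list`. A guarded check (harmless if the server or binary is not available yet):

```shell
# List locally available models; fall back to a message on failure.
MODELS="$(ollama list 2>/dev/null || echo "ollama unavailable")"
echo "$MODELS"
```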
Part 2 — Install Claude Code (Termux)
04

Install Claude Code

Install the Anthropic CLI globally via npm.

termux
$ npm install -g @anthropic-ai/claude-code
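To confirm the install, check that the binary landed on PATH; the version call is guarded in case the install failed:

```shell
# `command -v` prints the path of the installed CLI, if any.
CLAUDE_BIN="$(command -v claude || echo "not found")"
echo "claude: $CLAUDE_BIN"
claude --version 2>/dev/null || true
```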
05

Configure Ollama Connection

Set environment variables to connect Claude Code to Ollama.

termux
$ cat >> ~/.bashrc <<'EOF'
# Claude Code with Ollama Config
export ANTHROPIC_BASE_URL="http://localhost:11434"
export ANTHROPIC_AUTH_TOKEN="ollama"
EOF
$ source ~/.bashrc
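To verify the config loaded, re-source your shell config and print both values; `unset` in the output means the step above did not take effect:

```shell
# Show the two settings Claude Code reads for its backend connection.
. ~/.bashrc 2>/dev/null || true
echo "base url:   ${ANTHROPIC_BASE_URL:-unset}"
echo "auth token: ${ANTHROPIC_AUTH_TOKEN:-unset}"
```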
06

Run Claude Code with Kimi

You're ready. Launch Claude Code with the Kimi model.

termux
$ claude --model kimi-k2.5:cloud
Connected to Ollama :11434
Model: kimi-k2.5:cloud
Ready to code
Architecture

System Overview

How the pieces connect on your phone.

system-diagram
Termux
├── Ollama (LLM Server :11434)
└── Claude Code (AI CLI)
Claude Code → sends requests to Ollama
Entire system runs locally on your phone
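To watch that request path without Claude Code in the loop, you can hit Ollama's native generate endpoint directly (assumes the server from step 02 is running and the model from step 03 is pulled; guarded so it fails soft otherwise):

```shell
# One prompt, straight to the Ollama HTTP API on :11434.
RESP="$(curl -s http://localhost:11434/api/generate \
  -d '{"model":"kimi-k2.5:cloud","prompt":"Say hi","stream":false}' \
  || echo "server not reachable")"
echo "$RESP"
```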
Workflow

Daily Usage

Start Ollama:
$ ollama serve

Run Claude Code:
$ claude --model kimi-k2.5:cloud
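The two-command routine above can be wrapped in a small launcher. A sketch (the filename `start-claude.sh` and the Termux shebang path are assumptions; adjust to taste):

```shell
# Write a launcher that starts Ollama only if nothing answers on
# :11434 yet, then opens Claude Code with the Kimi model.
cat > "$HOME/start-claude.sh" <<'EOF'
#!/data/data/com.termux/files/usr/bin/sh
if ! curl -s http://localhost:11434 > /dev/null 2>&1; then
  ollama serve > "$HOME/ollama.log" 2>&1 &
  sleep 3   # give the server a moment to bind the port
fi
claude --model kimi-k2.5:cloud
EOF
chmod +x "$HOME/start-claude.sh"
```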

Important Notes

  • Keep Ollama running before using Claude Code
  • :cloud models require internet (not fully offline)
  • Performance depends on network + device
  • Close background apps for stability
Support

Troubleshooting

Common issues and quick fixes.

Cannot Connect to Ollama

  • Ensure ollama serve is running
  • Check port 11434 is accessible
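A quick probe from inside Termux (needs `curl`, installable with `pkg install curl`): a healthy Ollama answers its root endpoint with the text "Ollama is running".

```shell
# Probe the server; fall back to a message if nothing is listening.
REPLY="$(curl -s http://localhost:11434 || echo "nothing listening on :11434")"
echo "$REPLY"
```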

Model Pull Fails

  • Check your internet connection
  • Retry: ollama pull kimi-k2.5:cloud

Slow Performance

  • Close background apps for more RAM
  • Use a stable WiFi connection

Claude Code Not Found

  • Ensure npm global bin is in PATH
  • Reinstall: npm i -g @anthropic-ai/claude-code
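A sketch of the PATH fix (the `$PREFIX` fallback assumes Termux, where npm's prefix normally already sits on PATH):

```shell
# npm installs global binaries under "$(npm config get prefix)/bin";
# append that directory to PATH if `claude` is not being found.
NPM_BIN="$(npm config get prefix 2>/dev/null || echo "$PREFIX")/bin"
export PATH="$PATH:$NPM_BIN"
command -v claude || echo "claude still missing; try reinstalling"
```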

Ready to Code with Claude Code?

Turn your Android into an AI-powered coding machine.