
MCP Client for Ollama

by jonigl

Health: 60 (good)
Popularity: 71

A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server support, model switching, streaming responses, tool management, human-in-the-loop confirmation, thinking mode, model parameter configuration, MCP prompts, custom system prompts, and saved preferences. Built for developers working with local LLMs.

561 stars
82 forks
Python
MIT
Updated March 11, 2026
#agentic-ai #ai #command-line-tool #generative-ai #linux #llm #local-llm #macos #mcp #mcp-client #mcp-server #model-context-protocol #ollama #open-source #pypi-package #sse #stdio #streamable-http #tool-management #windows

Add to Claude Desktop

Paste this into your claude_desktop_config.json file:

{
  "mcpServers": {
    "client-for-ollama": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-client-for-ollama"
      ]
    }
  }
}
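
The project is a Python package published to PyPI (note the #pypi-package tag), so it can also be launched without Node via uv. A sketch, assuming the PyPI distribution shares the npm package name `mcp-client-for-ollama`; if its console script is named differently, run it with `uvx --from mcp-client-for-ollama <script-name>` instead:

```json
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "uvx",
      "args": [
        "mcp-client-for-ollama"
      ]
    }
  }
}
```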

Config file location: ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
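
A malformed config file will silently break Claude Desktop's server loading, so it is worth validating the JSON after editing. A minimal sketch using Python's stdlib `json.tool` — the `/tmp` path and sample content here are stand-ins; point `CONFIG` at your real `claude_desktop_config.json`:

```shell
# Write a sample config to a stand-in path (replace with your real config file).
CONFIG=/tmp/claude_desktop_config.json
cat > "$CONFIG" <<'EOF'
{
  "mcpServers": {
    "client-for-ollama": {
      "command": "npx",
      "args": ["-y", "mcp-client-for-ollama"]
    }
  }
}
EOF

# json.tool exits non-zero on invalid JSON, so this only prints on success.
python3 -m json.tool "$CONFIG" > /dev/null && echo "config OK"
```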

Installation

npx mcp-client-for-ollama

Maintenance Health: good

Freshness: 32/40
Issue response: 15/30
Org backing: 10/20
Longevity: 3/10
Updated recently

Server Info

Category: tools
Language: Python
License: MIT
Status: ACTIVE
