
Ollama Integration Server
Query models running with Ollama from within Claude Desktop or other MCP clients.
Installation
Installing for Claude Desktop
Option 1: One-Command Installation
npx mcpbar@latest install emgeee/mcp-ollama -c claude
This command will automatically install and configure the Ollama Integration Server MCP server for your selected client.
Option 2: Manual Configuration
Run the command below to open your configuration file:
npx mcpbar@latest edit -c claude
After opening your configuration file, copy and paste this configuration:
{
  "mcpServers": {
    "Ollama Integration Server": {
      "command": "uvx",
      "args": [
        "mcp-ollama"
      ]
    }
  }
}
MCP Ollama
A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.
Requirements
- Python 3.10 or higher
- Ollama installed and running (https://ollama.com/download)
- At least one model pulled with Ollama (e.g., ollama pull llama2)
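Before wiring the server into an MCP client, you can confirm the prerequisites are in place. A minimal check, assuming Ollama is running on its default local port 11434:

# List locally downloaded models; should show at least one entry
ollama list

# Verify the HTTP API the MCP server will talk to is reachable
curl http://localhost:11434/api/tags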
Configure Claude Desktop
Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):
{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": [
        "mcp-ollama"
      ]
    }
  }
}
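If your Ollama instance is not at the default localhost address, Claude Desktop's configuration format also accepts an "env" block for passing environment variables to the server process. A sketch, assuming mcp-ollama honors Ollama's standard OLLAMA_HOST variable (this README does not confirm that):

{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": ["mcp-ollama"],
      "env": {
        "OLLAMA_HOST": "http://192.168.1.50:11434"
      }
    }
  }
}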
Development
Install in development mode:
git clone https://github.com/emgeee/mcp-ollama.git
cd mcp-ollama
uv sync
Test with MCP Inspector:
mcp dev src/mcp_ollama/server.py
Features
The server provides the following tools:
- list_models - List all downloaded Ollama models
- show_model - Get detailed information about a specific model
- ask_model - Ask a question to a specified model
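For a sense of how tools like these map onto Ollama's API, here is an illustrative sketch (not the actual mcp-ollama source) using the MCP Python SDK's FastMCP helper together with the ollama Python client:

# Illustrative sketch only -- not the real mcp-ollama implementation.
# Assumes the `mcp` and `ollama` Python packages are installed.
from mcp.server.fastmcp import FastMCP
import ollama

mcp = FastMCP("ollama")

@mcp.tool()
def list_models() -> str:
    """List all downloaded Ollama models."""
    return str(ollama.list())  # queries the local Ollama daemon

@mcp.tool()
def show_model(name: str) -> str:
    """Get detailed information about a specific model."""
    return str(ollama.show(name))

@mcp.tool()
def ask_model(name: str, question: str) -> str:
    """Ask a question to the specified model and return its reply."""
    response = ollama.chat(
        model=name,
        messages=[{"role": "user", "content": question}],
    )
    return response["message"]["content"]

if __name__ == "__main__":
    mcp.run()  # serves over stdio, as MCP clients expect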
License
MIT
Stars: 21 · Forks: 9 · Last commit: 5 months ago · Repository age: 5 months