
Ollama MCP Server

An MCP Server for Ollama

Installation

Installing for Claude Desktop

Option 1: One-Command Installation

npx mcpbar@latest install rawveg/ollama-mcp -c claude

This command automatically installs and configures the Ollama MCP server for your selected client.

Option 2: Manual Configuration

Run the command below to open your configuration file:

npx mcpbar@latest edit -c claude

After opening your configuration file, copy and paste this configuration:

{
  "mcpServers": {
    "Ollama MCP Server": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}

Ollama MCP Server

An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.

Features

  • List available Ollama models
  • Pull new models from Ollama
  • Chat with models using Ollama's chat API
  • Get detailed model information
  • Automatic port management
  • Environment variable configuration

Prerequisites

  • Node.js (v16 or higher)
  • npm
  • Ollama installed and running locally

Installation

Manual Installation

Install globally via npm:

npm install -g @rawveg/ollama-mcp

Installing in Other MCP Applications

To install the Ollama MCP Server in other MCP-compatible applications (like Cline or Claude Desktop), add the following configuration to your application's MCP settings file:

{
  "mcpServers": {
    "@rawveg/ollama-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@rawveg/ollama-mcp"
      ]
    }
  }
}

The settings file location varies by application:

  • Claude Desktop: claude_desktop_config.json in the Claude app data directory
  • Cline: cline_mcp_settings.json in the VS Code global storage

Usage

Starting the Server

Simply run:

ollama-mcp

The server will start on port 3456 by default. You can specify a different port using the PORT environment variable:

PORT=3457 ollama-mcp

Environment Variables

  • PORT: Server port (default: 3456). Can be set when running the server directly:
    PORT=3457 ollama-mcp
    
  • OLLAMA_API: Ollama API endpoint (default: http://localhost:11434)
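Both variables can be combined in a single invocation. The remote host address below is a hypothetical example for pointing the server at an Ollama instance on another machine:

```shell
# Run on a custom port and target a remote Ollama instance
# (192.168.1.50 is a placeholder address, not a real default)
OLLAMA_API=http://192.168.1.50:11434 PORT=3457 ollama-mcp
```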

API Endpoints

  • GET /models - List available models
  • POST /models/pull - Pull a new model
  • POST /chat - Chat with a model
  • GET /models/:name - Get model details
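As an illustration, the endpoints above can be exercised with curl once the server is running on the default port. The chat request body shown here is an assumed shape (model name and messages array), not taken from the upstream documentation:

```shell
# List available models
curl -s http://localhost:3456/models

# Get details for a specific model (model name is an example)
curl -s http://localhost:3456/models/llama3.2

# Chat with a model -- the payload shape is an assumption
curl -s -X POST http://localhost:3456/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3.2", "messages": [{"role": "user", "content": "Hello"}]}'
```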

Development

  1. Clone the repository:
    git clone https://github.com/rawveg/ollama-mcp.git
    cd ollama-mcp
  2. Install dependencies:
    npm install
  3. Build the project:
    npm run build
  4. Start the server:
    npm start

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

However, accepting contributions does not grant permission to incorporate this project into third-party services or commercial platforms without prior discussion and agreement. While I previously accepted contributions (such as a Dockerfile and related README updates) to support integration with services like Smithery, recent actions by a similar service — Glama — have required a reassessment of this policy.

Glama has chosen to include open-source MCP projects in their commercial offering without notice or consent, and subsequently created issue requests asking maintainers to perform unpaid work to ensure compatibility with their platform. This behaviour — leveraging community labour for profit without dialogue or compensation — is not only inconsiderate, but ethically problematic.

As a result, and to protect the integrity of this project and its contributors, the licence has been updated to the GNU Affero General Public License v3.0 (AGPL-3.0). This change ensures that any use of the software — particularly in commercial or service-based platforms — must remain fully compliant with the AGPL's terms and obtain a separate commercial licence. Merely linking to the original source is not sufficient where the project is being actively monetised. If you wish to include this project in a commercial offering, please get in touch first to discuss licensing terms.

License

AGPL v3.0

This project was previously MIT-licensed. As of 20th April 2025, it is now licensed under AGPL-3.0 to prevent unauthorised commercial exploitation. If your use of this project predates this change, please refer to the relevant Git tag or commit for the applicable licence.
