
LM Studio Bridge

MCP server to bridge Claude with local LLMs running in LM Studio

Installation

Installing for Claude Desktop

Manual Configuration Required

This MCP server requires manual configuration. Run the command below to open your Claude configuration file:

npx mcpbar@latest edit -c claude

Then add the LM Studio Bridge MCP server entry to the file manually.


Overview

This tool allows Claude to interact with your local LLMs running in LM Studio, providing:

  • Access to list all available models in LM Studio
  • The ability to generate text using your local LLMs
  • Support for chat completions through your local models
  • A health check tool to verify connectivity with LM Studio
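Under the hood, the bridge talks to LM Studio's OpenAI-compatible REST API. As a rough illustration (not code from this repo, stdlib-only, assuming the default endpoint at 127.0.0.1:1234), listing the available models comes down to a single GET request:

```python
import json
import urllib.request


def models_url(host="127.0.0.1", port=1234):
    # LM Studio serves an OpenAI-compatible API; models are listed at /v1/models
    return f"http://{host}:{port}/v1/models"


def list_models(host="127.0.0.1", port=1234):
    # Returns the ids of the models currently available in LM Studio
    with urllib.request.urlopen(models_url(host, port), timeout=5) as resp:
        payload = json.load(resp)
    return [m["id"] for m in payload.get("data", [])]
```

The bridge itself uses httpx for these calls, but the endpoint and response shape are the same.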

Prerequisites

  • Claude Desktop with MCP support
  • LM Studio installed and running locally with API server enabled
  • Python 3.8+ installed

Quick Setup

For macOS/Linux:

  1. Clone the repository
     git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
     cd claude-lmstudio-bridge
  2. Run the setup script
     chmod +x setup.sh
     ./setup.sh
  3. Follow the setup script's instructions to configure Claude Desktop

For Windows:

  1. Clone the repository
     git clone https://github.com/infinitimeless/claude-lmstudio-bridge.git
     cd claude-lmstudio-bridge
  2. Run the setup script
     setup.bat
  3. Follow the setup script's instructions to configure Claude Desktop

Manual Setup

If you prefer to set things up manually:

  1. Create a virtual environment (optional but recommended)
     python -m venv venv
     source venv/bin/activate  # On Windows: venv\Scripts\activate
  2. Install the required packages
     pip install -r requirements.txt
  3. Configure Claude Desktop:
    • Open Claude Desktop preferences
    • Navigate to the 'MCP Servers' section
    • Add a new MCP server with the following configuration:
      • Name: lmstudio-bridge
      • Command: /bin/bash (on macOS/Linux) or cmd.exe (on Windows)
      • Arguments:
        • macOS/Linux: /path/to/claude-lmstudio-bridge/run_server.sh
        • Windows: /c C:\path\to\claude-lmstudio-bridge\run_server.bat
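Claude Desktop stores MCP servers in its claude_desktop_config.json file, so the settings above correspond to an entry like the following (the path is a placeholder; substitute your actual clone location):

```json
{
  "mcpServers": {
    "lmstudio-bridge": {
      "command": "/bin/bash",
      "args": ["/path/to/claude-lmstudio-bridge/run_server.sh"]
    }
  }
}
```

On Windows, set "command" to "cmd.exe" and "args" to ["/c", "C:\\path\\to\\claude-lmstudio-bridge\\run_server.bat"].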

Usage with Claude

After setting up the bridge, you can use the following commands in Claude:

  1. Check the connection to LM Studio:
     Can you check if my LM Studio server is running?
  2. List available models:
     List the available models in my local LM Studio
  3. Generate text with a local model:
     Generate a short poem about spring using my local LLM
  4. Send a chat completion:
     Ask my local LLM: "What are the main features of transformers in machine learning?"
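The chat-completion command maps onto a POST to LM Studio's /v1/chat/completions endpoint. A stdlib-only sketch of the kind of request the bridge sends (the model name and temperature here are illustrative defaults, not values from this repo):

```python
import json
import urllib.request


def build_chat_request(prompt, model="local-model", host="127.0.0.1", port=1234):
    # OpenAI-style chat payload; LM Studio routes it to whichever model is loaded
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }).encode("utf-8")
    return urllib.request.Request(
        f"http://{host}:{port}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def ask(prompt):
    # Sends the request and extracts the assistant's reply text
    with urllib.request.urlopen(build_chat_request(prompt), timeout=60) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```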

Troubleshooting

Diagnosing LM Studio Connection Issues

Use the included debugging tool to check your LM Studio connection:

python debug_lmstudio.py

For more detailed tests:

python debug_lmstudio.py --test-chat --verbose

Common Issues

"Cannot connect to LM Studio API"

  • Make sure LM Studio is running
  • Verify the API server is enabled in LM Studio (Settings > API Server)
  • Check that the port (default: 1234) matches what's in your .env file

"No models are loaded"

  • Open LM Studio and load a model
  • Verify the model is running successfully

"MCP package not found"

  • Try reinstalling: pip install "mcp[cli]" httpx python-dotenv
  • Make sure you're using Python 3.8 or later

"Claude can't find the bridge"

  • Check Claude Desktop configuration
  • Make sure the path to run_server.sh or run_server.bat is correct and absolute
  • Verify the server script is executable: chmod +x run_server.sh (on macOS/Linux)

Advanced Configuration

You can customize the bridge behavior by creating a .env file with these settings:

LMSTUDIO_HOST=127.0.0.1
LMSTUDIO_PORT=1234
DEBUG=false

Set DEBUG=true to enable verbose logging for troubleshooting.
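The bridge loads the .env file via python-dotenv, after which the variables are visible through the process environment. A sketch of how those settings resolve to defaults when the file is absent (variable names from the .env above; the BASE_URL helper is illustrative):

```python
import os

# Defaults mirror the values shown above; python-dotenv merges .env into os.environ
LMSTUDIO_HOST = os.getenv("LMSTUDIO_HOST", "127.0.0.1")
LMSTUDIO_PORT = int(os.getenv("LMSTUDIO_PORT", "1234"))
DEBUG = os.getenv("DEBUG", "false").lower() == "true"

# Base URL for all LM Studio API calls
BASE_URL = f"http://{LMSTUDIO_HOST}:{LMSTUDIO_PORT}/v1"
```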

License

MIT

