Comfy Server
A server using FastMCP framework to generate images based on prompts via a remote Comfy server.
Installation
Installing for Claude Desktop
Option 1: One-Command Installation
npx mcpbar@latest install lalanikarim/comfy-mcp-server -c claude

This command will automatically install and configure the Comfy Server MCP server for your selected client.
Option 2: Manual Configuration
Run the command below to open your configuration file:
npx mcpbar@latest edit -c claude

After opening your configuration file, copy and paste this configuration:
{
"mcpServers": {
"Comfy Server": {
"command": "uvx",
"args": [
"comfy-mcp-server"
],
"env": {
"COMFY_URL": "${input:comfy_url}",
"OLLAMA_API": "${input:ollama_api}"
}
}
}
}
Overview
This script sets up a server using the FastMCP framework to generate images based on prompts using a specified workflow. It interacts with a remote Comfy server to submit prompts and retrieve generated images.
Prerequisites
- uv package and project manager for Python.
- Workflow file exported from Comfy UI. This code includes a sample `Flux-Dev-ComfyUI-Workflow.json`, which is only used here as a reference. You will need to export your own workflow and set the environment variables accordingly.
You can install the required packages for local development:
uvx "mcp[cli]"
Configuration
Set the following environment variables:
- `COMFY_URL` to point to your Comfy server URL.
- `COMFY_WORKFLOW_JSON_FILE` to point to the absolute path of the API export JSON file for the ComfyUI workflow.
- `PROMPT_NODE_ID` to the id of the text prompt node.
- `OUTPUT_NODE_ID` to the id of the output node with the final image.
- `OUTPUT_MODE` to either `url` or `file` to select the desired output.
Optionally, if you have an Ollama server running, you can connect to it for prompt generation.
- `OLLAMA_API_BASE` to the URL where Ollama is running.
- `PROMPT_LLM` to the name of the model hosted on Ollama for prompt generation.
Example:
export COMFY_URL=http://your-comfy-server-url:port
export COMFY_WORKFLOW_JSON_FILE=/path/to/the/comfyui_workflow_export.json
export PROMPT_NODE_ID=6 # use the correct node id here
export OUTPUT_NODE_ID=9 # use the correct node id here
export OUTPUT_MODE=file
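If you are unsure which node ids to use, note that the API-format export is a JSON object keyed by node id. A short script like the following can list them (illustrative only; `list_nodes` is not part of this project):

```python
import json
import sys

def list_nodes(path: str) -> dict:
    """Map each node id in an API-format workflow export to its class_type."""
    with open(path) as f:
        workflow = json.load(f)
    return {node_id: node.get("class_type", "?") for node_id, node in workflow.items()}

if __name__ == "__main__":
    # e.g. python list_nodes.py /path/to/the/comfyui_workflow_export.json
    for node_id, class_type in list_nodes(sys.argv[1]).items():
        print(f"{node_id}: {class_type}")
```

The prompt node is typically a `CLIPTextEncode` node and the output node a `SaveImage` (or similar) node.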
Usage
Comfy MCP Server can be launched by the following command:
uvx comfy-mcp-server
Example Claude Desktop Config
{
"mcpServers": {
"Comfy MCP Server": {
"command": "/path/to/uvx",
"args": [
"comfy-mcp-server"
],
"env": {
"COMFY_URL": "http://your-comfy-server-url:port",
"COMFY_WORKFLOW_JSON_FILE": "/path/to/the/comfyui_workflow_export.json",
"PROMPT_NODE_ID": "6",
"OUTPUT_NODE_ID": "9",
"OUTPUT_MODE": "file",
}
}
}
}
Functionality
generate_image(prompt: str, ctx: Context) -> Image | str
This function generates an image using a specified prompt. It follows these steps:
- Checks if all the environment variables are set.
- Loads a prompt template from a JSON file.
- Submits the prompt to the Comfy server.
- Polls the server for the status of the prompt processing.
- Retrieves and returns the generated image once it's ready.
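The submit-and-poll cycle above can be sketched against ComfyUI's HTTP API roughly as follows. This is a minimal sketch, not the server's actual code: `/prompt`, `/history`, and `/view` are standard ComfyUI API routes, but the helper names (`submit_prompt`, `wait_for_image`, `image_url`) are hypothetical.

```python
import json
import os
import time
import urllib.request

def submit_prompt(workflow: dict) -> str:
    """POST the workflow to /prompt and return the queued prompt id."""
    body = json.dumps({"prompt": workflow}).encode()
    req = urllib.request.Request(
        f"{os.environ['COMFY_URL']}/prompt", data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["prompt_id"]

def wait_for_image(prompt_id: str, output_node_id: str, delay: float = 1.0) -> dict:
    """Poll /history/<id> until the output node reports a generated image."""
    url = f"{os.environ['COMFY_URL']}/history/{prompt_id}"
    while True:
        with urllib.request.urlopen(url) as resp:
            history = json.load(resp)
        if prompt_id in history:
            outputs = history[prompt_id]["outputs"]
            return outputs[output_node_id]["images"][0]
        time.sleep(delay)

def image_url(image: dict) -> str:
    """Build the /view URL for a generated image record (url output mode)."""
    return (f"{os.environ['COMFY_URL']}/view?filename={image['filename']}"
            f"&subfolder={image.get('subfolder', '')}"
            f"&type={image.get('type', 'output')}")
```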
generate_prompt(topic: str, ctx: Context) -> str
This function generates a comprehensive image generation prompt from a specified topic.
Dependencies
- `mcp`: For setting up the FastMCP server.
- `json`: For handling JSON data.
- `urllib`: For making HTTP requests.
- `time`: For adding delays in polling.
- `os`: For accessing environment variables.
- `langchain`: For creating a simple LLM prompt chain that generates an image generation prompt from a topic.
- `langchain-ollama`: For Ollama-specific modules for LangChain.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Stars: 25
Forks: 7
Last commit: 8 months ago
Repository age: 9 months
License: MIT
Auto-fetched from GitHub.