
ask-ollama (io.github.Lykhoyda/ask-ollama)

by Lykhoyda

Bridge Claude with local Ollama LLMs for private AI-to-AI collaboration — no API keys, fully local

Step 1: Install

Run this MCP server locally and expose it via stdio so your MCP client can connect.

Run via npm:

ask-gemini-mcp
npx ask-gemini-mcp

ask-llm-mcp
npx ask-llm-mcp

ask-ollama-mcp@0.3.0
npx ask-ollama-mcp@0.3.0
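
Because the server bridges to local Ollama models, a local Ollama instance must be running for it to have anything to talk to. A minimal sketch using the standard Ollama CLI (the model name is only an example; pull whichever model you want the bridge to query):

# start the local Ollama API (listens on localhost:11434 by default)
ollama serve

# download a model for the bridge to use (example model tag)
ollama pull llama3.2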

Step 2: Connect

Once this MCP server is running locally, connect it from your MCP client using a configuration like the example below.

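Most MCP clients are configured with a small JSON entry that names the command to launch and then communicate with over stdio. A minimal sketch, assuming a Claude Desktop-style claude_desktop_config.json (the "ask-ollama" key is just a label you choose; pin the version to whichever package you ran above):

{
  "mcpServers": {
    "ask-ollama": {
      "command": "npx",
      "args": ["ask-ollama-mcp@0.3.0"]
    }
  }
}

Restart the client after saving so it launches the server and discovers its tools.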