Bridge Claude with local Ollama LLMs for private AI-to-AI collaboration — no API keys, fully local
ask-ollama · v0.3.0
by Lykhoyda
Step 1: Install
Run this MCP server locally and expose it via stdio so your MCP client can connect.
Run via npm:

    npx ask-ollama-mcp@0.3.0

Step 2: Connect
Connect in your MCP client
Once the server is running locally, add it to your MCP client's server configuration so the client can launch it and communicate over stdio.
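Most MCP clients register stdio servers through a JSON configuration file. As a sketch, an entry in a client config such as Claude Desktop's `claude_desktop_config.json` might look like the following (the `"ask-ollama"` key name is illustrative; the command and version come from the install step above):

```json
{
  "mcpServers": {
    "ask-ollama": {
      "command": "npx",
      "args": ["ask-ollama-mcp@0.3.0"]
    }
  }
}
```

After editing the file, restart the client so it picks up the new server. The client then spawns the `npx` process and exchanges MCP messages with it over stdio; since the server talks only to a local Ollama instance, no API keys are involved.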