139a038505b296351b42ab0a8d4926dd0c9ac57c
MCP server exposing local Ollama models via LiteLLM proxy to Claude Code

Tools: query_local_model, review_code, summarize, generate_boilerplate,
list_models. Deployed to the k8s ai-inference namespace via ArgoCD.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
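A minimal sketch of the forwarding path such a server's query_local_model tool implies: the MCP tool hands the prompt to the LiteLLM proxy over its OpenAI-compatible /v1/chat/completions endpoint, and LiteLLM routes it to Ollama. The proxy URL, model name, and function body here are assumptions for illustration, not code from this repo; MCP SDK registration is omitted.

```python
import json
import urllib.request

# Assumed in-cluster service URL for the LiteLLM proxy (not from this repo).
LITELLM_PROXY = "http://litellm.ai-inference.svc.cluster.local:4000"

def build_chat_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat request body for the LiteLLM proxy."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def query_local_model(model: str, prompt: str) -> str:
    """Forward a prompt to a local Ollama model through the LiteLLM proxy."""
    req = urllib.request.Request(
        f"{LITELLM_PROXY}/v1/chat/completions",
        data=json.dumps(build_chat_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # LiteLLM returns the standard OpenAI response shape.
    return body["choices"][0]["message"]["content"]
```

In a real deployment each of the listed tools would wrap a call like this and be registered with the MCP server so Claude Code can invoke it.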
Languages: Python 96.4%, Dockerfile 3.6%